type of project: fellow research project
published: 2021
by: Luise Ehrenwerth
website: www.luise-ehrenwerth.de
contact: luise@ehrenwerth.de
connecting:stitches (Luise Ehrenwerth)
In her research project “connecting:stitches”, costume designer Luise Ehrenwerth explores the possibilities of bringing together costume making and digital technologies. Her focus lies on electronic textiles and sensors made of conductive materials that are connected to microcontrollers, as well as on the implementation of textile AR markers in costumes. In addition to the question of what role costume design plays in the theater of digitality, the project is also a search for changing body images in a digital (art-)world.
About
Luise Ehrenwerth is a costume designer, stage designer and scenographer. She studied at the Dresden Academy of Fine Arts and at the Accademia di Belle Arti in Rome. She has contributed to projects at the Nationaltheater Weimar (“Reichstag Reenactment”, Kunstfest 2019), for the documentary film “ANAMNESIS” (set design, Berlinale Forum 2021) and at the Landesbühnen Sachsen and the Theater Baden-Baden for “setup.school(). – Die Lernmaschine“ (in collaboration with the media theater collective machina eX).
One of the main areas of interest in her artistic work is the diversity of sensorial ways of perceiving scenic spaces and their influence on the people who interact with them. To investigate the narrative potential of acoustic spaces using binaural 3D audio recordings, she developed the augmented audio reality project SCHRAPP SCHRAPP BUFF ZONG in collaboration with Nele Bühler in 2018. In 2020, as part of the Denkzeit-Stipendium of the Cultural Foundation of the Free State of Saxony, Luise Ehrenwerth researched how virtual and at the same time haptic scenographies can be created with Augmented Reality.
For the stage and costume design of “It’s not that way, it’s over here – Scenes by Eugène Ionesco”, realized together with Nele Bühler at the Academy for Performing Arts in Ludwigsburg, she and Bühler received the MARTA Award at the SETKÁNÍ Encounter Festival Brno 2018 in the category “Best Scenography”.
Luise Ehrenwerth is also a passionate maker, meaning that she loves to build things with all kinds of materials and different handcrafting techniques.
Research questions
General research questions
- How is digitality influencing our understanding of the body, and how could digital technologies change the art of costume making?
- Is there a digital body, and if so, how does it differ from an analogue one?
- What is a digital costume – and is a digital costume still a costume?
- What new narrative possibilities for storytelling open up when a costume is extended through digital technology?
Hands-on research questions
- Which textile manufacturing methods work best as image markers for Augmented Reality? How can they be integrated into a costume?
- How to make a virtual costume with Blender? How is the process of 3D modeling a costume different from the usual (analogue) workflow of costume creation?
- What is needed to make the costume an active digital play partner (a textile interface) with which one could change lights or sounds in the scenographic surroundings?
Fields of research
The project is divided into three paths:
Augmented Reality and costumes
First of all, there is the plan to produce different types of textile image markers for Augmented Reality that can be integrated into costumes and reveal a digital costume overlay (or any other information) in AR. For this experimental setup, diverse handcrafting techniques will be used, such as stitching, patchwork or textile printing.
eTextiles
Secondly, there is the interest in eTextiles, especially textile sensors made of conductive materials. With those implemented in a costume, the wearer would have a textile digital interface with which to change things in the surroundings (like lights or sound). The idea is to elevate the costume's status from non-technological “clothing” to an active digital play partner.
Virtual costume making
The third topic refers to digital-only performances like VR or other virtual theater formats. How can costume designers still be part of the creative process, especially if they don't have 3D modelling or programming skills? How does their artistic work differ from classical costume making when the goal is to design a costume for a digital theater piece?
Research process
Augmented Reality and costumes
Stitching, weaving, patchworking - these are just some of the textile manufacturing methods I would like to test for their suitability as textile image markers. The AR app I am using for the tests is a basic setup made in Unity + Google's AR Foundation - no scripting is needed at this point! For a start, I just need to know whether and how well the tracking of my textile markers works. There are some tutorials online on how to build an app like this. I followed this one: Tutorial AR Foundation Reference Image Library in Unity. But if you search the internet for “unity ar foundation reference image library tutorial” you can find other tutorials, too.
The following pictures show my textile AR image markers:
Stitched marker on white cotton (tracking works well)
Patchwork marker with checkered and dotted fabric (tracking works well, but sometimes depends on the lighting situation)
Ornamental marker, implemented in a self-made shirt (I got the fabric from the theater's costume department as it is - the tracking works fine)
Laser-cut markers with double-sided iron-on fleece (first the fleece was ironed onto the fabric, then both materials were cut together in the laser cutter and afterwards attached to another fabric by ironing them on - tracking is really good, thanks to the shape of the markers and the reference points and contrasts they provide)
Only disadvantage: the iron-on fleece will eventually stop sticking, especially because costumes need to be washed from time to time.
Another idea at this point is to use “Flexfolie” instead, a material that is used for prints on shirts etc. and can be washed without concerns. For time reasons I did not try Flexfolie during my fellowship, so unfortunately I cannot say much about its suitability for AR image markers.
These are try-outs that did not work so well as markers! They were tracked, but not stably enough, and often only in the right lighting situation. So it took quite some time for the tablet to recognize them as markers and show the AR content.
The first one is probably too three-dimensional and not flat enough, because it is made of 8-10 layers of fabric, sewn together and then cut open to reveal the fabric layers underneath.
The second one probably lacks contrast and/or has too much of the same repetitive pattern with all the stripes.
And with the third one I am not really sure what the problem is, because the triangular shapes should work fine and I expected the stitchings to provide enough reference points. But maybe it lacks contrast, too.
The following videos show some of the textile AR image markers in action:
Virtual costume making
For this part I am learning Blender, and at the moment I am working with an add-on called “Garment Tool” (costs about $50). It gives you the opportunity to digitally sew multiple 2D patterns together to get a 3D garment. I am interested in working with this because it seems to be close to the actual tailoring process (drawing a paper pattern, sewing the parts together and fitting the finished costume to a human body). Software solutions like Clo3D or Marvelous Designer probably do a better job at this than the “Garment Tool” add-on, but they cost a lot. Blender is free to use and also has a lot of features and possibilities apart from “3D sewing”.
Comment at the end of my fellowship: the “Garment Tool” add-on works okay, but it definitely has its limits. So I think that for a more fluent and faster workflow, and also for better-looking results, Clo3D or Marvelous Designer probably work much better.
The videos give an insight into the workflow of the digital sewing process in Blender:
eTextiles
I started my learning in the field of electronic textiles by making simple soft circuits that connect e.g. LEDs with a battery, using conductive thread made of metallic materials like copper, silver or steel. There is already a wide body of knowledge and a lot of tutorials in this field that I could fall back on.
Here are some literature tips:
“Making Things Wearable - Intelligente Kleidung selber schneidern” by René Bohne, ISBN: 9783868991918 (in German)
“Crafting Wearables - Blending Technology with Fashion” by Sibel Deren Guler, Madeline Gannon, Kate Sicchio, ISBN: 9781484218075 (in English)
“Make: Wearable Electronics - Design, prototype, and wear your own interactive garments” by Kate Hartman, ISBN: 9781449336516 (in English)
The following pictures show my try-outs:
This is the very first soft circuit I made (before starting my fellowship at the Academy). The microcontroller I used for it is the Arduino LilyPad, which is specially designed for eTextile projects. Thanks to its big pin holes it is easy to sew onto fabric. I attached some LEDs and a light sensor to it. It is powered by a small lithium polymer battery.
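For readers who want to try something similar: a LilyPad circuit like this is programmed with a small Arduino sketch. The following is a minimal sketch of the idea; the pin numbers, the number of LEDs and the “light up when it gets dark” behaviour are my assumptions for the example, not necessarily what the pictured circuit does.

```cpp
// Minimal LilyPad Arduino sketch: read a light sensor, switch some LEDs.
// Assumed wiring: light sensor on analog pin A2, LEDs on digital pins 5, 6, 9.

const int SENSOR_PIN = A2;
const int LED_PINS[] = {5, 6, 9};

void setup() {
  for (int pin : LED_PINS) {
    pinMode(pin, OUTPUT);
  }
}

void loop() {
  int brightness = analogRead(SENSOR_PIN); // 0 (dark) .. 1023 (bright)
  bool dark = brightness < 300;            // threshold found by trying out
  for (int pin : LED_PINS) {
    digitalWrite(pin, dark ? HIGH : LOW);  // LEDs glow when it gets dark
  }
  delay(50);
}
```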
I started to think of possibilities to attach a microcontroller to fabric with techniques other than sewing - because when you sew the microcontroller onto the fabric, it is hard to detach it again without breaking your whole soft circuit. Having a laser cutter at the Academy gave me the possibility to make small conductive pads out of copper fabric:
Before laser cutting the copper fabric I ironed it onto a double-sided iron-on fleece, so that the small copper pads could easily be attached to another fabric, as you can see in the following picture:
I can now attach the microcontroller (in this case an Adafruit Flora board, quite similar to the LilyPad Arduino) with little screws (M2). The pictures show the front and the back side:
For time reasons, my learning progress in the field of eTextiles did not get as far as I had imagined before starting my fellowship. The whole topic of eTextile sensors in particular unfortunately remained largely unexplored. But by the last month of my fellowship I had gained enough knowledge to make my first costume project with implemented microcontrollers and soft circuits.
The following section gives an insight into this final project.
THE INTERACTIVE AUGMENTED REALITY COSTUME
Project draft
Coming closer to the end of my fellowship, I started to think of a way to combine the three fields of my research in one costume project. The following pictures show drafts of a possible communication between the costume (and its wearer) and a Unity AR app on a smart device (held by a member of a participating audience).
In the beginning there seems to be just a “normal” costume…
…but when the tablet tracks the textile AR marker implemented in the fabric…
…LEDs will light up in the costume. They are attached to a microcontroller that communicates with the Unity app on the smart device. The lit LED tells the wearer of the costume that the tracking of the AR marker worked and that the audience member sees the virtual content on the tablet.
In addition to the lights in the costume, there could also be switches, buttons or other (eTextile) sensors that give the wearer of the costume the possibility to change the AR content on the tablet (change the object, change the colour, change the size…).
The costume is now a digital interface that functions as a real-time controller of the experience of the person holding the smart device. But it is also possible to give the person with the smart device the opportunity to communicate with the intelligent costume and, for example, to interact with the AR objects on the costume.
Workshop on MQTT and microcontrolling
In the fourth month of my fellowship I was given a three-day workshop by Anton Kurt Krause, who is a theater director and software developer. I learned about network technology and how to set up an MQTT broker on a Raspberry Pi. Using MQTT (with the Mosquitto broker) gave me the possibility to connect the microcontrollers in the costume with an AR Unity app on a tablet.
This chart shows the technical setup of my project and gives a basic overview of how MQTT machine-to-machine communication connects multiple clients (microcontrollers, Raspberry Pi, Unity app on a tablet) in one network:
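To give an idea of what the microcontroller side of this setup looks like in code, here is a minimal Arduino sketch of how an ESP32 board like the Feather S2 could connect to the Mosquitto broker on the Raspberry Pi, using the common PubSubClient library. The network credentials, broker address, client ID and topic name are placeholders for illustration, not the values from my project.

```cpp
#include <WiFi.h>          // ESP32 WiFi
#include <PubSubClient.h>  // widely used Arduino MQTT client library

const char* WIFI_SSID = "costume-net";   // placeholder credentials
const char* WIFI_PASS = "secret";
const char* BROKER_IP = "192.168.0.10";  // Raspberry Pi running Mosquitto

WiFiClient wifi;
PubSubClient mqtt(wifi);

// Called for every message that arrives on a subscribed topic.
void onMessage(char* topic, byte* payload, unsigned int length) {
  // react to messages from the Unity app here
}

void setup() {
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(250);

  mqtt.setServer(BROKER_IP, 1883);  // 1883 is the default MQTT port
  mqtt.setCallback(onMessage);
}

void loop() {
  if (!mqtt.connected() && mqtt.connect("costume-sleeve")) { // placeholder client id
    mqtt.subscribe("costume/from-tablet");                   // placeholder topic
  }
  mqtt.loop();  // keeps the connection alive and dispatches incoming messages
}
```

Every client in the chart (each microcontroller and the Unity app) connects to the broker in this way and then only needs to publish and subscribe to agreed topics; no client has to know the others' addresses.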
Making the interactive costume prototype
Based on this machine-to-machine communication technology, I combined my fellowship learnings on microcontrolling (eTextiles), Augmented Reality (Unity) and 3D modelling (Blender) in one costume project:
How it's made
In one sleeve there are six Adafruit NeoPixel RGB LEDs, connected to an Adafruit Feather S2. This microcontroller works with an ESP32 chip, has WiFi and can connect to the network and the MQTT broker. It is also small and easy to hide in the costume. Unfortunately, it does not have big pin holes for easy sewing like the Adafruit Flora or the LilyPad Arduino, so I sewed it to the fabric for permanent use.
My original plan was to put a second microcontroller (a Tiny S2, even smaller than the Feather S2) in the other sleeve and to attach six ON/OFF switches to it, using common metallic snap buttons (metal = conductive). It took me some days to finish the soft circuit in the sleeve and put everything together - and it worked very well in the beginning!
But later on, the circuit started to cause problems: little short circuits, probably caused by the individual traces sewn with conductive thread touching each other when the sleeve moved in the wrong direction. I came to the conclusion to redo it all from scratch and to put the microcontroller with the snap buttons in the pants of the costume.
This time I used the Adafruit Feather S2 as well, not the Tiny S2 as before in the sleeve. The GPIO pin holes of the Tiny S2 are so close to each other that it is really hard to make sure the conductive threads are not touching. The Feather S2 is only slightly bigger, and the distance between the GPIOs is bigger, too. Also, when redoing everything I chose GPIOs with at least one spare pin in between.
In the pants the circuit worked much better and more reliably - I came to the conclusion that the sleeves may not be the best part of a garment to implement electronics in, because we simply move our arms way too much.
This is the circuit piece before sewing it into the pants. This time I took better care to isolate the conductive thread traces from each other by sewing “cable tunnels” with normal, non-conductive thread. Also, I sewed everything with the sewing machine (using the conductive thread as the bobbin thread!), which makes the traces more rectilinear.
With the costume as it is, the wearer can decide which objects are visible in the AR app on the tablet by closing or opening the metallic snap buttons in the pants. Each snap button is assigned to one virtual AR object in the app. If all six buttons are closed, all six AR objects are visible in the virtual space. Opening a button “hides” the virtual object. The NeoPixel LEDs in the right sleeve are mapped to the six snap buttons: snap button one - LED one, snap button two - LED two, and so on. When a button is closed, the corresponding LED turns blue.
The person with the tablet has six buttons of their own in the app's user interface. Tapping one of those buttons changes the colour of the corresponding LED (one to six) to red - which tells the wearer of the costume that the tablet holder wants to change the current state of that snap button (either opening or closing it).
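To sketch how this two-way logic could look in code, here is a condensed Arduino example building on the MQTT sketch above. For simplicity it handles both the snap buttons and the LEDs on one board, while in the real costume they sit on two separate microcontrollers communicating over MQTT; the GPIO numbers, topic names and payload format are also my own assumptions for illustration.

```cpp
#include <WiFi.h>
#include <PubSubClient.h>
#include <Adafruit_NeoPixel.h>

const int NUM = 6;
const int SNAP_PINS[NUM] = {5, 7, 9, 11, 13, 15}; // assumed GPIOs, one spare pin in between
const int PIXEL_PIN = 17;                         // assumed NeoPixel data pin

WiFiClient wifi;
PubSubClient mqtt(wifi);
Adafruit_NeoPixel pixels(NUM, PIXEL_PIN, NEO_GRB + NEO_KHZ800);

bool lastClosed[NUM];
bool requested[NUM]; // true while the tablet asks the wearer to toggle this snap

// Payload from the Unity app: a single digit 0-5 naming the snap button to change.
void onMessage(char* topic, byte* payload, unsigned int length) {
  if (length < 1) return;
  int i = payload[0] - '0';
  if (i >= 0 && i < NUM) requested[i] = true;
}

void setup() {
  for (int i = 0; i < NUM; i++) pinMode(SNAP_PINS[i], INPUT_PULLUP);
  pixels.begin();
  WiFi.begin("costume-net", "secret");  // placeholder credentials, as above
  while (WiFi.status() != WL_CONNECTED) delay(250);
  mqtt.setServer("192.168.0.10", 1883); // Raspberry Pi with Mosquitto
  mqtt.setCallback(onMessage);
}

void loop() {
  if (!mqtt.connected() && mqtt.connect("costume-pants")) {
    mqtt.subscribe("costume/from-tablet"); // the tablet's requests arrive here
  }
  mqtt.loop();

  for (int i = 0; i < NUM; i++) {
    bool closed = (digitalRead(SNAP_PINS[i]) == LOW); // closed snap pulls the pin to ground
    if (closed != lastClosed[i]) {
      lastClosed[i] = closed;
      requested[i] = false; // the wearer reacted, clear the tablet's request
      char topic[32];
      snprintf(topic, sizeof(topic), "costume/button/%d", i);
      mqtt.publish(topic, closed ? "closed" : "open"); // app shows/hides AR object i
    }
    if (requested[i])    pixels.setPixelColor(i, pixels.Color(80, 0, 0)); // red: change requested
    else if (closed)     pixels.setPixelColor(i, pixels.Color(0, 0, 80)); // blue: snap closed
    else                 pixels.setPixelColor(i, pixels.Color(0, 0, 0));  // off
  }
  pixels.show();
  delay(20);
}
```

On the tablet side, the Unity app would then subscribe to the costume/button/# topics to show or hide the corresponding AR objects, and publish a button index to costume/from-tablet whenever the user taps one of its six UI buttons.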