type of project: fellow research project

published: 2020

by: Markus Wagner & Christoph Wirth

AUTO_NOMOS (Markus Wagner & Christoph Wirth)

AUTO_NOMOS is a research project on cars, crashes and VR. Starting from the play "The One-Legged Runner" by playwright Katharina Schmitt, it investigates transmissions between performative environments as well as theatrical tools of staging and VR technology, with a strong focus on sound and (choreographic) interaction design.

AUTO_NOMOS critically revisits the dispositifs of the Western history of technological progress, its phantasms, desires and catastrophes, and takes the car as a complex allegory of the latter. Within the virtual environments AUTO_NOMOS suggests, the car becomes a vehicle for an aesthetic and poetic reflection on proxy desires and fantasies of self-optimization induced by modern technology, as well as on the infrastructures and ecologies of life, fantasy and drive connected to them.

The aim was to find productive and interesting ways of transmission between performative questions of embodiment, disembodiment, presence etc. and the virtual space, specifically virtual reality environments as predesigned interfaces of embodied first-person experience. How can one use the critical knowledge within performance and live arts, its knowledge about perceptual, sensory and affective questions, and its aesthetic strategies of staging to design virtual environments which allow a conscious and non-illusionist experience of immersive potentials?

How can the virtual spaces of visuality, sound and the imaginary landscapes of text converge within VR, and how can they productively disrupt each other? Do the (perceptual) gaps of such disruptions leave or open up potentials for interaction, freedom of somatic experience or inter-passive enjoyment?

Overall: which interesting ways exist to explore cross-medial forms of staging theatrical and performative content in a new digital medium? Which unknown ways of storytelling and aesthetic experience can emerge that are simultaneously capable of questioning the medium's tendency towards a totally closed and illusionistic fictional environment?

February

- Designing first prototypes of virtual environments, which would serve as "containers of imagination" and which strongly relate to the main acts of the play. Driving questions were: How to orient oneself? What are the relations of the player to the VR avatar? Which relations hold between inside and outside, environment and first-person perspective? Questions of orientation, navigation and movement within the containers.

- First experiments with 3D sound, voice recordings and the dearVR plugin for 3D sound. Questions: How to abstract the voice from concrete embodiments? How to make the sound environment become a second virtual environment within the virtual world?

March

- Developing first scripts for interaction design, navigation and orientation within the prototypes

April

- Experimenting with motion capture and the OptiTrack system from a choreographic perspective, using Forsythe improvisation techniques. Experiments with movement improvisation within VR and a "mirroring" avatar, so that the dancer could explore his movements and their functionality live within the virtual environment

- Working with the animated material to explore more filmic techniques of narration

- Developing the sound with the aim of bridging soundtrack aesthetics and song-like structures with purely radio-play aesthetics

May

- Voice recordings with actors and work with the playwright to implement the text into the virtual environments. Adjusting the sound composition alongside the further, more detailed development of the virtual spaces in terms of look, interaction and strategies of storytelling

- Setting up a first dramaturgical structure of the virtual spaces according to the macrostructure of the play's development

- Simultaneously working with the text to strengthen the communication between the virtual environments and the fantasy the text suggests

- Developing responsive scripts so that the virtual worlds shift shape according to interactions within those environments, creating different experiences of embodiment related to different forms of interaction and linking them to the progress of the drama's narration (see the sketch after this list)

- Preparing voice recordings and the final sound design and composition
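As an illustration of the responsive scripts mentioned above, here is a minimal Unity C# sketch of an environment element that shifts shape with interaction; the interaction metric and the names (interactionLevel, narrationProgress) are illustrative assumptions, not the production code:

```csharp
using UnityEngine;

// Minimal sketch of a "responsive" environment element: it blends between
// two shapes depending on how much the player interacts with it and on
// how far the narration has progressed. All parameters are illustrative.
public class ResponsiveEnvironment : MonoBehaviour
{
    public Transform playerHand;          // a tracked controller transform
    public Vector3 calmScale = Vector3.one;
    public Vector3 agitatedScale = new Vector3(2f, 0.5f, 2f);
    [Range(0f, 1f)] public float narrationProgress; // driven by the dramaturgical timeline

    float interactionLevel;               // 0 = idle, 1 = heavy interaction
    Vector3 lastHandPos;

    void Start() { lastHandPos = playerHand.position; }

    void Update()
    {
        // Treat fast hand movement near the object as "interaction".
        float handSpeed = (playerHand.position - lastHandPos).magnitude / Time.deltaTime;
        lastHandPos = playerHand.position;
        float proximity = Mathf.Clamp01(1f - Vector3.Distance(playerHand.position, transform.position) / 3f);
        float target = Mathf.Clamp01(handSpeed * 0.2f) * proximity;

        // Smooth the signal, then let the narration progress bias the response.
        interactionLevel = Mathf.Lerp(interactionLevel, target, Time.deltaTime * 2f);
        float blend = Mathf.Clamp01(interactionLevel + narrationProgress * 0.5f);

        transform.localScale = Vector3.Lerp(calmScale, agitatedScale, blend);
    }
}
```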

June

- Final wrap-up of the composition, visual design and interaction design of the environments. Adjusting details, killing bugs and keeping the interesting ones. Creating tutorial texts for navigation within the virtual environments and linking them to the style and questions of the drama. Final work with the writer on questions of dramaturgy and staging

Hardware we used:

Oculus Quest 2 with Oculus Link cable / Wi-Fi Air Link connection

OptiTrack system for motion capture

Software we used:

Unity 2020.3

Cinema 4D R24

Sound:

dearVR PRO plug-in for 3D sound spatialisation

Ableton Live 10 as DAW

Different virtual and analogue synthesisers

Crucial Unity Assets: Final IK (made the creation of a well-fitting VR body much easier, although it would also be possible with built-in Unity animation tools) https://assetstore.unity.com/packages/tools/animation/final-ik-14290

Obi Softbody (to create particle-based soft-body simulations) https://assetstore.unity.com/packages/tools/physics/obi-softbody-130029

PuppetMaster (to work with ragdoll physics and collisions) https://assetstore.unity.com/packages/tools/physics/puppetmaster-48977

Realistic Car Controller (a good starting point for creating a VR car rig) https://assetstore.unity.com/packages/tools/physics/realistic-car-controller-16296

MiVRy 3D Gesture Recognition (to train an AI with movements that can then be recognized) https://assetstore.unity.com/packages/templates/systems/mivry-3d-gesture-recognition-143176

As one of the starting points, we developed several "worlds" in Unity, in which we combined spatial design, scripted behaviors of the elements, and animations. We think of these "worlds" as virtual scenographies and describe them as frameworks or infrastructures of possible/potential experiences, sensations and actions. Like a stage design in a real-world theatre production, these virtual scenographies were inspired by the theatre piece as well as the creative concept and especially the sound design, which developed in parallel. At the same time, the virtual scenographies influenced the dramaturgy, the treatment of the text etc. After exploring and unfolding these scenographies in "open worlds", we narrowed down the open structure of the spaces and implemented a sequential order and a timeline-based structure, in which the scripted behaviors / scenographical mechanics were one of the building blocks. Here are some examples of these scenographical scripts:

The VR Body:

A fairly big part of the work was to create a well-working VR body. By VR body we refer to the avatar that is attached to the player and controlled by the player as her "own" body. We called it "inhabiting a puppet" and tried to make the feeling of "having" a body in VR as organically glitchy as possible.
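A minimal sketch of how such a body can be wired up with the Final IK asset listed above, using its VRIK solver; the tracked-transform names are illustrative, and a real setup additionally needs per-avatar target offsets and calibration:

```csharp
using UnityEngine;
using RootMotion.FinalIK; // Final IK asset (VRIK solver)

// Minimal sketch: hand the tracked HMD and controller transforms to VRIK,
// which then estimates the rest of the humanoid body procedurally, so the
// player can "inhabit the puppet".
public class VRBodySetup : MonoBehaviour
{
    public VRIK ik;             // VRIK component on the avatar
    public Transform hmd;       // tracked head (the VR camera)
    public Transform leftHand;  // tracked left controller
    public Transform rightHand; // tracked right controller

    void Start()
    {
        // Illustrative wiring only; the offsets between controller and wrist,
        // and between camera and head bone, are omitted here.
        ik.solver.spine.headTarget = hmd;
        ik.solver.leftArm.target = leftHand;
        ik.solver.rightArm.target = rightHand;
    }
}
```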

The Viewpoint Traveler:

Enables the VR camera (which is always tracked by the movements of the player's head) to travel in and out of the VR body, so that the player can, for example, look at herself from outside or merge with another VR body or (animated) avatar. This script was created to experiment with out-of-body experiences and with the feeling of synchronously having a first-, second- and third-person perspective.
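A minimal sketch of the idea, assuming two anchor transforms; only the rig root is moved, never the camera itself, so head tracking stays the player's own throughout the travel:

```csharp
using UnityEngine;

// Sketch of the "Viewpoint Traveler" idea: move the camera rig root between
// an anchor inside the VR body (first person) and an external anchor
// (out-of-body view). Head tracking is untouched because only the rig root
// is moved. Anchor names and the travel time are illustrative assumptions.
public class ViewpointTraveler : MonoBehaviour
{
    public Transform rigRoot;        // parent of the tracked VR camera
    public Transform insideAnchor;   // eye position inside the VR body
    public Transform outsideAnchor;  // external viewpoint
    public float travelTime = 3f;

    float blend;                     // 0 = inside, 1 = outside
    float direction;                 // -1, 0 or +1

    public void TravelOut() { direction = 1f; }
    public void TravelIn()  { direction = -1f; }

    void LateUpdate()
    {
        blend = Mathf.Clamp01(blend + direction * Time.deltaTime / travelTime);
        rigRoot.position = Vector3.Lerp(insideAnchor.position, outsideAnchor.position, blend);
        rigRoot.rotation = Quaternion.Slerp(insideAnchor.rotation, outsideAnchor.rotation, blend);
    }
}
```

Calling TravelOut() during a scene then slowly detaches the viewpoint from the body, while head rotation remains the player's own, which produces the out-of-body effect described above.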

The Looping Walker:

A VR rig that adjusts to floors in every rotation/world orientation, so that you can, for example, walk along a road that bends into a vertical loop. With the help of this script you can also teleport onto every face of a cube, so that a wall or a ceiling becomes your new floor. The script thus creates possibilities in the virtual world that do not exist in the real world; interestingly, this feeling of freedom disappears after a while and gives way to an uncanny feeling of a dead end or a never-ending race.
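A minimal sketch of the core mechanism, assuming a simple raycast-based alignment of the rig's up-axis to the surface normal:

```csharp
using UnityEngine;

// Sketch of the "Looping Walker": raycast along the rig's local "down",
// then rotate the rig so its up-axis matches the surface normal. This lets
// the player walk a vertically looping road or stand on any face of a cube.
public class LoopingWalker : MonoBehaviour
{
    public Transform rig;            // the VR rig root
    public float alignSpeed = 5f;
    public float groundDistance = 2f;

    void FixedUpdate()
    {
        // "Down" is relative to the rig, not to world space.
        Vector3 origin = rig.position + rig.up * 0.5f;
        if (Physics.Raycast(origin, -rig.up, out RaycastHit hit, groundDistance))
        {
            // Rotate the rig so its up-vector aligns with the surface normal,
            // preserving the current facing direction as far as possible.
            Quaternion target = Quaternion.FromToRotation(rig.up, hit.normal) * rig.rotation;
            rig.rotation = Quaternion.Slerp(rig.rotation, target, Time.fixedDeltaTime * alignSpeed);

            // Stick to the surface.
            rig.position = hit.point;
        }
    }
}
```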

The Movement Blender:

This script is about the VR body. With its help, we can blend in animation data (for example from mocap recordings) and "occupy" the VR body. This "occupation" can be tied to certain rules, for example: if you move your arms fast enough, the movement of "your" VR body stays synchronized with you, but if you move slowly or stop moving, the animation data takes over and controls the movement of your arms. The script dissociates the feeling and the seeing of a body that is supposed to be "mine".
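A minimal sketch of that blending rule for a single hand; the speed threshold and the source of the animated target are illustrative assumptions:

```csharp
using UnityEngine;

// Sketch of the "Movement Blender": per frame, decide whether the VR body's
// arm follows the tracked controller or pre-recorded animation data, based
// on how fast the player actually moves.
public class MovementBlender : MonoBehaviour
{
    public Transform trackedHand;       // live controller position
    public Transform animatedHand;      // hand position from a mocap/animation clip
    public Transform bodyHandTarget;    // IK target the VR body's arm follows
    public float speedThreshold = 0.5f; // m/s; above this the player "owns" the arm
    public float blendSpeed = 2f;

    float ownership = 1f;               // 1 = player controls, 0 = animation controls
    Vector3 lastTrackedPos;

    void Start() { lastTrackedPos = trackedHand.position; }

    void Update()
    {
        float speed = (trackedHand.position - lastTrackedPos).magnitude / Time.deltaTime;
        lastTrackedPos = trackedHand.position;

        // Move ownership towards the player when fast, towards the animation when slow.
        float target = speed > speedThreshold ? 1f : 0f;
        ownership = Mathf.MoveTowards(ownership, target, Time.deltaTime * blendSpeed);

        bodyHandTarget.position = Vector3.Lerp(animatedHand.position, trackedHand.position, ownership);
        bodyHandTarget.rotation = Quaternion.Slerp(animatedHand.rotation, trackedHand.rotation, ownership);
    }
}
```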

VR-Gesture-Recognizer:

Training an AI with gestures and movement qualities that will then be recognized, in order to trigger or conduct certain reactions.
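For this we used the MiVRy asset listed above, whose API differs from what follows; this is only a simplified stand-in showing the underlying idea: record a controller stroke, resample it to a fixed length and match it against stored templates by average point distance:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Simplified stand-in for the gesture-recognition idea. Record the
// controller positions of a stroke, normalize the stroke, and return the
// name of the closest stored template (or null if nothing matches).
public class SimpleGestureRecognizer : MonoBehaviour
{
    const int Samples = 32;

    readonly Dictionary<string, Vector3[]> templates = new Dictionary<string, Vector3[]>();
    readonly List<Vector3> stroke = new List<Vector3>();

    public void BeginStroke() => stroke.Clear();
    public void AddPoint(Vector3 controllerPos) => stroke.Add(controllerPos);
    public void SaveAsTemplate(string name) => templates[name] = Normalize(stroke);

    public string EndStroke(float maxError = 0.3f)
    {
        if (stroke.Count < 2 || templates.Count == 0) return null;

        Vector3[] candidate = Normalize(stroke);
        string best = null;
        float bestError = maxError;
        foreach (var kv in templates)
        {
            float error = 0f;
            for (int i = 0; i < Samples; i++)
                error += Vector3.Distance(candidate[i], kv.Value[i]);
            error /= Samples;
            if (error < bestError) { bestError = error; best = kv.Key; }
        }
        return best;
    }

    // Resample to a fixed count and shift so the stroke starts at the origin.
    static Vector3[] Normalize(List<Vector3> points)
    {
        var result = new Vector3[Samples];
        if (points.Count == 0) return result;
        for (int i = 0; i < Samples; i++)
        {
            float t = (float)i / (Samples - 1) * (points.Count - 1);
            int j = Mathf.FloorToInt(t);
            int k = Mathf.Min(j + 1, points.Count - 1);
            result[i] = Vector3.Lerp(points[j], points[k], t - j) - points[0];
        }
        return result;
    }
}
```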
