
What You See Is What You Get.
Virtual Production in Independent Performing Arts
This project explores the possibilities of virtual production in the performing arts.
We created a virtual space and used a setup of different tools to enable a real-time performance in VR.
It was a very exciting journey, though we are only at the beginning of real-time VR as a new aesthetic experience.
On this page we present the different stages of the work, the tools we are working with, and what virtual production can look like for the performing arts. The blog entries below give an overview, and you can watch the presentation as well as a tech talk in the videos.
Independent Virtual Production in Performing Arts
"Virtual production is where the physical and digital worlds meet."
(Weta Digital)
"It combines virtual and augmented reality with computer generated imagery and game-engine technologies to enable production crews to see their scenes unfold as they are composed and captured on set."
(Moving Picture Company)





ABOUT THIS PROJECT
WYSIWYG is an artistic research project challenging the possibilities of HOW to work together in virtual space.
For this project we chose to work entirely in a virtual reality environment with VR headsets. To capture the motions we use special suits by Rokoko, which generate the movement data for the 3D avatars in real time. This technique combines high-quality motion data with the benefits of well-made 3D models.
This setup allowed us to experiment with a completely virtual production, as it is already used for films.
We asked: How can virtual production be used in the production process of dance theater? What synergies arise when physical and virtual production flow together? What obstacles does the virtual environment bring up for the dancers? What can be simulated and what cannot?
And finally: If people can work together in real time in a virtual world, could this be the point where performativity (re)enters the virtual space?






You will find all our results on real-time VR and more detailed information about the programming in the videos below.
Project Files
CREDITS
Artistic director, programming, concept:
Erich Lesovsky
Project management:
Stefanie Fischer
Virtual Production in Performing Arts - What it is and how it can work
This article is for all the artists who are interested in VR but don’t have a clue about programming – like me. In this project I learned a lot, and I want to show you the very basics of virtual production and what it can mean for the performing arts.
But first things first: What is virtual production and how is it related to VR?
Virtual production is a bridge between analogue and virtual creative work. It is used in film productions to synchronize the staging of a scene with its postproduction. Many recent films are shot in front of a green screen: the scene is filmed, and afterwards the special-effects crew adds the background, the lights, digital details and so on. To bring this process back together, large LED walls are used to display the virtual set. This is a great benefit for actors and directors, because virtual adjustments can be made in real time and the virtual and the physical camera are synchronized. So you film the real actor and the virtual background at the same time! This saves a lot of time and makes the production more vivid. It has changed the whole production process, and the film industry's interest has pushed the development of game engines and the hardware needed as interfaces for virtual production even further.
But the main factor is not the LED walls; it is using a game engine to enable real-time work in a virtual space with high-quality visuals.

And this is where “WYSIWYG” steps in: Why not produce entirely in a real-time VR space?! Why not enable dancers, the lighting crew, directors and the whole team to work simultaneously on a virtual stage, as they would in the real world, but with all the creative benefits of virtuality? Abstract stage designs, costumes and special effects could match the real-time, live experience of a performance.
This is where we want to go.
Yes, this is possible.
No, it is not completely unaffordable.
We are now at a point where the necessary VR hardware has reached a new level:
- Graphics processors have become extremely powerful.
- Displays (especially the head-mounted displays used in VR) have improved enormously in the last two years.
- And of course: all this hardware has become affordable.
We are talking about a development that started two years ago; nothing we are doing would have been possible earlier. We are working with the state of the art, combining techniques and hardware from different fields such as game development, 3D animation and VR. Since all of this is very new and we are not using it the way it was originally intended (in order to make it serve the needs of the performing arts), Erich is constantly troubleshooting the problems that come up daily. One of the most difficult things is to create a real-time performance with high-quality visuals for a VR and a 2D audience at the same time. High-quality visuals are the focus of animation, which is normally never rendered in real time. Real-time virtual interaction is the focus of game development, whose visuals are good but demand a very powerful graphics processor. So we are working on bringing the two together to open up a new space for dance in VR.
To save you some time in your own project, we will document our VR production setup as well as we can, including hints and tricks.
But now, let’s get specific. Here is the hardware and the software we are using to develop a two-person VR performance stage:
Hardware
- Oculus Quest 2 (it appeared on the US market on 13 October 2020; regrettably you need a Facebook account to use it, because it is made by Facebook. This is supposed to change during Facebook's transformation into "Meta".)
- Rokoko motion capture suits (a suit that tracks your motions precisely. Normally used to record 3D animations, but it can also be used for highly precise real-time motion capture. One of the most wanted gadgets this year! Since they are so successful, you have to wait 3-6 months for your order.)
- Battery packs for the suits
- HTC Vive Trackers (while Rokoko tracks the motions, these capture your location within the room.)
- HTC Vive Base Stations (these receive the signal from the trackers and cover the actual play area, approx. 3.5 x 3.5 m. Don’t forget to organize some tripods, because the base stations have to be installed 2 metres above the floor to cover this area.)
- Wi-Fi 6 router to support the data link between the motion capture suits, the VR headsets and the computers, since we don’t want to work with restrictive cables.
- One computer for each VR headset, one of which works as the main computer and project server
- Technical details, computer 1:
CPU: AMD Ryzen 7 2700
GPU: Nvidia RTX 3090
SSD: Intel 660p NVMe PCIe M.2 2TB
SSD: Kingston SA2000M81000G 1TB
HDD: Toshiba P300 3TB
RAM: Corsair Vengeance LPX DDR4 2666 C16 2x16GB
MBD: Asus ROG STRIX B450-F GAMING
- Technical details, computer 2:
CPU: AMD Ryzen 3 3200G
GPU: Nvidia GTX 1660-Ti
SSD: Kingston SA400M8240G 240GB
If you wonder how much to budget for this setup: we are working with approximately €12,000 (before taxes) for the technology.
What does each computer do?
Since we don’t want to work with a cloud where everyone downloads the VR setup like a game (because that would not allow us to stay in production mode in the software), we have to build a network between all the data-generating and computing elements.
Therefore, we need two computers:
- Each computer runs the same virtual setup and renders the characters' appearance. Each computer is responsible for one VR headset.
- The first computer is the main one and is used as a server. It receives all motion and location data from the Rokoko suits and the HTC trackers for both players.
- The second computer only runs the VR headset for the second player and computes its 3D surroundings. The motion data from the Rokoko Studio software on PC 1 is mirrored and sent via UDP to the second PC and its UE4 instance (see the sketch below this list).
- The server gets the "player state information" about the second player from the second computer. This is possible because the second player logs in to the VR setup on the server. The server then synchronizes the location data of both players and the 3D information about the second player.
- So there is a constant data stream between both computers, in which the second one follows the server instance.
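To give a rough impression of the mirroring step, here is a minimal sketch in plain C++ with POSIX sockets. It is not the project's actual code: the IP address and port numbers are placeholders, and in practice the existing Rokoko and Unreal tooling handles the streaming for you. The sketch only shows the underlying idea of forwarding the UDP motion-data stream unchanged to the second PC.
```cpp
// Minimal UDP relay sketch: listen for motion-capture packets on one port
// and forward each datagram unchanged to the second PC's UE4 instance.
// All addresses and ports below are placeholders, not the project's values.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <sys/types.h>
#include <unistd.h>

int main()
{
    const int   listenPort = 14043;          // assumed port the mocap data arrives on
    const char* targetIp   = "192.168.0.2";  // placeholder address of PC 2
    const int   targetPort = 14043;          // assumed port UE4 on PC 2 listens on

    int sock = socket(AF_INET, SOCK_DGRAM, 0);

    // Bind to the local port where the motion-capture software sends its packets.
    sockaddr_in local{};
    local.sin_family      = AF_INET;
    local.sin_addr.s_addr = INADDR_ANY;
    local.sin_port        = htons(listenPort);
    bind(sock, reinterpret_cast<sockaddr*>(&local), sizeof(local));

    // Destination: the second computer running its own UE4 instance.
    sockaddr_in target{};
    target.sin_family = AF_INET;
    target.sin_port   = htons(targetPort);
    inet_pton(AF_INET, targetIp, &target.sin_addr);

    char buffer[2048];
    while (true)
    {
        // Receive one packet and resend it unchanged to the second computer.
        ssize_t received = recvfrom(sock, buffer, sizeof(buffer), 0, nullptr, nullptr);
        if (received > 0)
        {
            sendto(sock, buffer, received, 0,
                   reinterpret_cast<sockaddr*>(&target), sizeof(target));
        }
    }

    close(sock);
    return 0;
}
```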
Which software do we use?
All the magic happens in the UNREAL GAME ENGINE. On their website they describe it as follows: “With Unreal Engine, you can bring amazing real-time experiences to life using the world’s most advanced real-time 3D creation tool.”
And they are right. The tool is complex and allows truly mind-blowing visuals and the connection of different data sources, from gamepads over LIDAR inputs to web interfaces. It can be programmed with Epic's node-based visual scripting language "Blueprint" or with the commonly used C++ language. Thanks to the huge library of ready-made functions in Blueprint, you can basically program a whole game without writing all the code yourself, or customize functions where needed.
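Most of our own logic lives in Blueprints, but to give a rough impression of what C++ in Unreal looks like, here is a minimal, hypothetical actor (the class and property names are our own invention, not from the project) that moves towards a position received from a tracking source every frame:
```cpp
// AvatarSyncActor.h - hypothetical UE4 actor sketch, not the project's code.
// It smoothly moves an actor towards the latest tracked position each frame.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "AvatarSyncActor.generated.h"

UCLASS()
class AAvatarSyncActor : public AActor
{
    GENERATED_BODY()

public:
    AAvatarSyncActor()
    {
        PrimaryActorTick.bCanEverTick = true; // run Tick() every frame
    }

    // Latest position delivered by the tracking pipeline (set from Blueprint or C++).
    UPROPERTY(BlueprintReadWrite, Category = "Tracking")
    FVector IncomingLocation = FVector::ZeroVector;

    virtual void Tick(float DeltaTime) override
    {
        Super::Tick(DeltaTime);

        // Interpolate towards the tracked position instead of snapping to it.
        const FVector NewLocation =
            FMath::VInterpTo(GetActorLocation(), IncomingLocation, DeltaTime, 10.f);
        SetActorLocation(NewLocation);
    }
};
```
In Blueprint, the same behaviour would be roughly a handful of nodes: Event Tick feeding a VInterp To node into SetActorLocation.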



Basically, that’s all you need!
We are now working out the details of the workflow during a rehearsal. So next time we will present the interface for jumping between 3D scenes and different characters.
funded by


made with


