Virtual Production of The Mandalorian
Monday. Called Eki to catch up. NA. When I got him, he said he'd been on a shoot. That always gets my attention. Asked him what the project was. He said they were doing Virtual Reality tests. And began to tell me how it works. Whoa. I was totally lost.
He sent a STAR WARS video of a crew working on VR. I could see a circular screen. Still didn't have a clue. Called him back. He tried to explain. But it only started to make sense after I compared it to 1930s pre-on-location movies, where the action might take place in Monte Carlo. The French café is on a set, and the MC background is stock film. But VR surrounds the actors and they can see the action. I think. He said he'd been working on the technique on and off for three years. And sent a video of the set-up in his home office. Three years sounded like a long time for Eki to nail something. So it must be tough to crack VR. I want a Dummies guide: VIRTUAL REALITY 101. Over to you, EKI.
Source: personal experience
Next week: FLYING through COVID
Note: Okay, this is a big one to bite off in one go, but let's try ;-)
First of all, it's not Virtual Reality, but rather Virtual Production. The two are related but not the same. Virtual Reality is when one puts on VR glasses and is immersed in a video game. Virtual Production is when similar techniques are used to produce movies or other video content.
There are a few different disciplines of Virtual Production, but I'll concentrate on the two that best apply here. What they have in common is that both combine real footage from a camera with elements (usually a background) that are computer-generated with a game engine, in real time.
Virtual Production Demo Day, Helsinki 2020
The first is what was used on The Mandalorian, a Star Wars spin-off series. As Maggy writes, it's somewhat similar to the rear projection technique famously used on the original 1933 King Kong and other films from that era and well into the 1960s. In those, a background scenery film was projected onto the rear side of a translucent movie screen, and the live-action set was built in front of it. The camera captured the actors and the background at the same time. The effect was sometimes quite convincing, and no post-processing was required. These days, the film screen is replaced with an LED wall - essentially a huge television set. This technique is also in wide use in television programs, where LED walls often replace practically built sets.
King Kong, RKO Radio Pictures 1933
The challenge with this method is that the angle of view is fixed. There is no depth to the background image, which is fine if the screen is used just as an abstract set piece, as in television shows, but becomes a problem if the backing is supposed to be a real environment, as in movies. The illusion breaks the moment the camera starts moving, because the backing is just a flat 2D image - the perspective works correctly from only a single angle.
The solution is to use a computer-generated 3D background that takes the changes in perspective into account. The location and rotation of the real-world camera are tracked in real-time and the background seen on the screen is always rendered from that point of view. When the camera moves, the backdrop changes accordingly, and everything lines up. One can think of it as if the camera were a character in a 3D game - which is actually quite close to how the effect is really done: the backgrounds are usually made with the same game engines that are used for making computer games.
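To make the idea concrete, here's a minimal numpy sketch (not from any particular engine; all names and values are made up) of the core step: the tracked camera's pose becomes the virtual camera's view matrix, so the background is rendered from exactly the real camera's viewpoint.

```python
# Sketch: a tracked camera pose drives a virtual camera. The engine
# re-renders the background from this viewpoint every frame.
import numpy as np

def view_matrix(position, rotation):
    """Build a 4x4 world-to-camera (view) matrix from a tracked pose.

    position: (3,) camera location in world space
    rotation: (3, 3) camera orientation as a rotation matrix
    """
    view = np.eye(4)
    view[:3, :3] = rotation.T           # inverse of a rotation is its transpose
    view[:3, 3] = -rotation.T @ position
    return view

# Each frame: read the tracker, update the virtual camera, re-render.
tracked_pos = np.array([0.5, 1.6, -2.0])  # meters, hypothetical reading
tracked_rot = np.eye(3)                   # facing straight ahead
print(view_matrix(tracked_pos, tracked_rot))
```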
On the high end of the production scale, the video screens are not just flat panels but surround the set and the actors almost 360 degrees - a setup often called a "volume". Even the ceiling can be a screen. This has the benefit of immersing the actors in the scene. The set looks and feels real to those in it - if there's a distant city on the horizon, it's not only seen by the camera but also by the actors and crew on the set. Another important benefit is that the screens act as a light source, giving realistic ambient light to the set. They also show up in reflections, which can be very important if the subject being shot is, say, a car: most of the look of car paint actually comes from the reflected environment.
My Virtual Studio setup at an early prototype stage.
The second Virtual Production technique is similar but uses a green screen instead of an LED screen. The actors are in front of a green wall, and the computer-generated backdrop is keyed in behind them on a computer, in real time. The same game engine is used as in LED screen productions, and the real camera is tracked in the same way. The main difference is that the camera does not capture the final image, just the foreground element, and the actors need to imagine their surroundings.
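A rough Python/OpenCV sketch of the per-frame loop - capture, key, composite - may help here. The game engine's render is stubbed out with a placeholder function, and the crude green test is only for illustration; a real keyer is far more refined.

```python
# Per-frame loop of a real-time green screen pipeline (illustrative).
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # camera or HDMI capture device

def render_background(shape):
    # Placeholder: in a real setup this frame comes from the game
    # engine, rendered from the tracked camera's point of view.
    return np.full(shape, (80, 40, 20), dtype=np.uint8)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    background = render_background(frame.shape)
    # Crude key: treat pixels where green clearly dominates as transparent.
    fi = frame.astype(np.int16)
    green_dominance = np.minimum(fi[:, :, 1] - fi[:, :, 0],
                                 fi[:, :, 1] - fi[:, :, 2])
    mask = green_dominance > 40
    composite = np.where(mask[:, :, None], background, frame)
    cv2.imshow("composite", composite)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
```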
Lucia – a Christmas story, littlemargieproductions ©2008
This technique also has a long history, and even moving cameras have been used for a long time - but not in real time. Tracking and matching the camera moves was usually done in post and could be a very slow and tedious process. Because of this, most productions opted for a stationary, locked camera. This is also what we did for the acted "old black & white" segments of littlemargieproductions' Lucia - a Christmas Story.
On television, real-time virtual sets have been used since the late 1990s, but they required the state-of-the-art supercomputers of the day and were rather cost-prohibitive. In Finland, only the large television broadcasters could afford setups like this. They were also horribly user-unfriendly. I worked with one of these systems back in the day - I created the virtual sets for a digital sports channel in 1999 or so - and it was quite frustrating, to say the least.
The largest benefit of modern green-screen-based Virtual Production is cost. Large LED volumes cost millions to own and tens of thousands a day to rent. Green screens are relatively cheap. A working basic setup can be built for a few thousand, including the green screen, cameras, computers and so on. This is what I've done at our SW5 studio. It's still a work in progress, but getting there.
The green screen stage at SW5 Studio, freshly painted.
The basis of all operations at our studio is of course the green screen stage. We built it from basic construction materials and some flexible vinyl flooring for the curved areas, painted with Rosco green screen paint. It's not the prettiest in the world, but it was affordable, can be torn down if we ever have to move again, and it does the job.
The cameras are tracked using a consumer-grade Vive system. Essentially, we take the data from the Vive VR system's trackers and apply it to our virtual cameras. This gives us an area a few meters across where the computer knows exactly where the real camera is positioned and how it is oriented.
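For the curious: OpenVR, the API behind the Vive, reports each tracked device's pose as a 3x4 device-to-world matrix. The sketch below (with made-up numbers standing in for a real tracker reading) shows how a position and view direction fall out of such a matrix.

```python
# Pulling position and orientation out of an OpenVR-style 3x4 pose
# matrix. The numbers are hypothetical, not a real tracker reading.
import numpy as np

pose_3x4 = np.array([
    [1.0, 0.0, 0.0,  0.5],   # columns 0-2: rotation, column 3: translation
    [0.0, 1.0, 0.0,  1.6],
    [0.0, 0.0, 1.0, -2.0],
])

rotation = pose_3x4[:, :3]   # 3x3 orientation of the tracker
position = pose_3x4[:, 3]    # tracker location in room space (meters)
forward = -rotation[:, 2]    # OpenVR convention: -Z points "forward"

print("camera at", position, "looking toward", forward)
```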
The video camera's signal is fed to the computer via an HDMI capture interface. I use an affordable Sony a6300 hybrid mirrorless as the main camera. This is one place where the low budget shows - while the image quality is actually great, the cabling and connections are not as robust as with dedicated professional systems. Our streaming computer is a decent, but by no means high-end, graphics workstation.
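If the capture interface shows up as a regular video device - many affordable USB HDMI capture boxes do - grabbing frames on the computer side can be as simple as this OpenCV sketch. The device index and resolution here are assumptions for illustration.

```python
# Reading frames from an HDMI capture device that presents itself
# as a standard video device (device index is system-dependent).
import cv2

cap = cv2.VideoCapture(1)                     # capture box, not the webcam
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1920)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 1080)

ok, frame = cap.read()
if ok:
    print("got a frame of shape", frame.shape)
cap.release()
```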
UE5 demo shows the rendering capabilities of the next generation of the Unreal Engine. All this film-quality animation is real-time - no more waiting.
The virtual background is created using Unreal Engine. Unreal is a popular and very powerful piece of software: it can be used to create first-class video games - and the photorealistic virtual sets we need. Best of all, it's free. The data from the Vive tracker drives a virtual camera inside a "video game". The position, orientation, and lens properties are matched as closely as possible to the real camera.
If done right, the two align perfectly and the live actor can be placed within the virtual world.
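Lens matching in particular is easy to get wrong. As a concrete example (a simple pinhole model that ignores lens distortion), the horizontal field of view the virtual camera needs can be computed from the real lens's focal length and the sensor width:

```python
# Horizontal field of view from focal length and sensor width,
# using the pinhole camera model (distortion ignored).
import math

def horizontal_fov(focal_length_mm, sensor_width_mm):
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# e.g. a 16 mm lens on an APS-C sensor like the a6300's (~23.5 mm wide)
print(f"{horizontal_fov(16, 23.5):.1f} degrees")  # ~72.6 degrees
```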
The video and background could be combined within Unreal Engine (or a third-party add-on like Aximmetry), but for now, I am using Open Broadcaster Software, OBS, for compositing the two. This may change in the future. OBS is also used for switching between different angles, mixing audio, adding on-screen graphics and so on, as well as streaming the live video to YouTube or other platforms. OBS is another great piece of free software.
I coded a simple green screen keyer for OBS, as I was not happy with the results I got with the built-in tools. Once I have perfected it, I'll probably share the code so that others can benefit from it too - hopefully I can contribute a little to this amazing free tool.
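The actual OBS keyer isn't shown here, but to give an idea of the kind of math such a keyer involves, here's a small numpy sketch of one common approach: chroma-distance keying, where each pixel's distance from the key color is measured in the luminance-independent Cb/Cr plane and mapped to alpha. The threshold values are made-up placeholders.

```python
# Chroma-distance keying sketch: distance from the key color in
# Cb/Cr space is turned into a soft alpha matte.
import numpy as np

def chroma_key_alpha(rgb, key_rgb, inner=0.08, outer=0.18):
    """Return a 0..1 alpha matte; pixels near the key color go transparent."""
    def to_cbcr(c):
        r, g, b = c[..., 0], c[..., 1], c[..., 2]
        cb = -0.1687 * r - 0.3313 * g + 0.5 * b
        cr = 0.5 * r - 0.4187 * g - 0.0813 * b
        return np.stack([cb, cr], axis=-1)

    dist = np.linalg.norm(to_cbcr(rgb) - to_cbcr(np.asarray(key_rgb, float)),
                          axis=-1)
    # Smooth ramp: fully transparent inside `inner`, fully opaque past `outer`.
    return np.clip((dist - inner) / (outer - inner), 0.0, 1.0)

frame = np.random.rand(4, 4, 3)          # stand-in for a 0..1 RGB video frame
alpha = chroma_key_alpha(frame, key_rgb=(0.1, 0.8, 0.2))
print(alpha)
```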
Having these free tools at anyone's disposal is nothing short of a miracle. What once cost millions is now at the fingertips of anyone with an internet connection and the will to learn.
CU
--
Eki