EXPLORING THE FUTURE OF VIRTUAL PRODUCTION

Create a real-time, reactive background that perfectly syncs with the camera for a seamless and fully immersive filming experience.

Mandalorian | Radiodetection showcases advanced technology

The Star Wars series The Mandalorian, launched as the flagship title for Disney’s streaming platform Disney+, was a major success.

The series earned high ratings of 8.8 on IMDb and 9.2 on Douban, and won at the 18th Visual Effects Society (VES) Awards in 2020 for its visual effects.

The Mandalorian’s stunning visuals are the result of an entirely new shooting and production workflow, ILM’s StageCraft, which replaces the traditional green screen with a giant curved LED volume and combines it with virtual production techniques.

During the shoot, several production and technology teams worked closely together, combining virtual production filmmaking with a game engine and real-time rendering software synchronized to the camera, ultimately achieving the stunning visual effect.

The giant LED volume of The Mandalorian studio is built from Radior’s flagship products, the BP2 and CB5: 4,450 BP2 panels form a 6-meter-high, 270° semicircular LED video wall, and 1,180 CB5 panels form an 850-square-meter ceiling screen.

Once the screen is set up, the 3D environments created by the production’s technical team are played back interactively on the LED volume. During shooting, the content can be edited in real time, tracked with pixel accuracy, and the high-resolution 3D imagery rendered with correct perspective.

In addition, the environment is lit and rendered according to the camera angle, providing real-time parallax, while the interactive light from the LED volume illuminates the actors and the practical set pieces on the stage.
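
To give a flavour of the “perspective corrected” rendering described above: the renderer effectively builds an off-axis (asymmetric) projection from the tracked camera position and the physical geometry of the wall, so the imagery on the panels lines up with the camera’s viewpoint and shifts with true parallax as the camera moves. The sketch below is only a minimal illustration of that idea for a single flat wall section, following the well-known generalized perspective projection formulation; it is not the StageCraft implementation, and the wall dimensions and camera position are invented example values.

```python
# Minimal sketch of camera-tracked "perspective correction" for one flat LED
# wall section (generalized perspective projection, after Kooima 2008).
# Not production code: real LED volumes handle curved walls, ceilings,
# colour pipelines and genlock on top of this.
import numpy as np

def off_axis_projection(pa, pb, pc, pe, near, far):
    """pa, pb, pc: lower-left, lower-right, upper-left wall corners (metres);
    pe: tracked camera position. Returns a 4x4 OpenGL-style projection."""
    pa, pb, pc, pe = (np.asarray(p, dtype=float) for p in (pa, pb, pc, pe))
    vr = pb - pa; vr /= np.linalg.norm(vr)            # wall "right" axis
    vu = pc - pa; vu /= np.linalg.norm(vu)            # wall "up" axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # wall normal, towards camera

    va, vb, vc = pa - pe, pb - pe, pc - pe            # camera -> corner vectors
    d = -np.dot(va, vn)                               # camera-to-wall distance
    left   = np.dot(vr, va) * near / d                # frustum extents at near plane
    right  = np.dot(vr, vb) * near / d
    bottom = np.dot(vu, va) * near / d
    top    = np.dot(vu, vc) * near / d

    P = np.zeros((4, 4))                              # standard glFrustum matrix
    P[0, 0] = 2 * near / (right - left); P[0, 2] = (right + left) / (right - left)
    P[1, 1] = 2 * near / (top - bottom); P[1, 2] = (top + bottom) / (top - bottom)
    P[2, 2] = -(far + near) / (far - near)
    P[2, 3] = -2 * far * near / (far - near)
    P[3, 2] = -1.0
    # The full render transform also needs a view matrix that rotates the world
    # into the wall's basis (rows vr, vu, vn) and translates by -pe.
    return P

# Example values only: a 10 m wide, 6 m high wall section at z = 0,
# camera tracked 4 m back and 1 m to the right at lens height 1.7 m.
proj = off_axis_projection(pa=[-5, 0, 0], pb=[5, 0, 0], pc=[-5, 6, 0],
                           pe=[1.0, 1.7, 4.0], near=0.1, far=500.0)
```

As the tracked camera position changes frame by frame, the frustum extents change with it, which is what produces the real-time parallax described above.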
Compared with a traditional green screen stage, the LED volume brings many technical advantages:

First, it creates an immersive shooting environment in which actors can see and interact directly with the virtual scene. Compared with an empty green screen stage, this is far friendlier for performers and helps them get into character quickly.

Second, because the environment is generated on the LED volume, scenes can be switched freely at any time without leaving the studio.

No location shoots were used during the entire production of The Mandalorian; the desert and forest scenes were all created on the LED volume, which greatly improved shooting efficiency.

Third, the LED volume and its playback technology dramatically reduce post-production time for the visual effects department, which in turn lowers production costs significantly.

Fourth, the highlights, reflections and bounce light that the LED volume casts on reflective costumes and props are far more accurate; shot the traditional way, the effect would be noticeably worse.

Compared with conventional LED screen applications, studio use imposes far stricter requirements: accurate color reproduction, high refresh rates, high brightness, high contrast, and wide viewing angles with no color shift.
Meeting these requirements places unprecedented demands on LED products.
Before shooting began, Disney spent seven months testing dozens of products from across the industry, screening them round after round before finally settling on the flagship Black Pearl BP2 and Carbon CB5 panels.

LIFELIKE COLOUR

The true-to-life colours and image bit depth of HDR transform an LED screen into a realistic 3D background on camera, without compromise

IMAGINATION TO SCREEN

Create huge sets and worlds limited by imagination, not budget. LED screens offer consistent lighting, engaging environments on set, and streamline the lengthy post-production process.

ON CAMERA AND IN CONTROL

Low-latency, genlocked LED processing combined with new camera tracking technology creates a seamless virtual set that synchronises perfectly with the action.

THE FUTURE OF TV AND FILM

Virtual sets – using LED screens as green screen replacements – are game changers for filmmaking.

Lengthy and expensive location shoots can now be streamlined into one trip – capture the ideal light once then bring it back to the studio and recreate it as many times as needed in a controlled environment.

VFX teams can create intricate, fantastical worlds that actors can see and react to in real time. Directors of sci-fi and fantasy see their worlds through the lens immediately, and can work collaboratively to change them in real-time, rather than after filming wraps.

The Brompton team has already been part of several pioneering film and TV projects using virtual production and LED walls as green screen replacements, as well as having years of experience delivering great results on camera.

We work closely with companies who are at the forefront of this exciting new technology to deliver immersive, believable worlds that take film beyond LED panels.

BETTER BY DESIGN

Brompton products are designed from the outset to make LED panels look as good on camera as they do in person, providing LED processing features that are ideal for creating lifelike 3D sets on a flat LED screen.

HDR: Precise reproduction of the colours in your content

Genlock: Perfect frame and PWM synchronisation with advanced options for phase offset adjustment

Great at any frame rate: PWM settings are automatically adjusted to be optimal for the frame rate actually being used, not just for 60Hz (see the sketch after this list)

High Bit Depth: Receive up to 12 bits per colour via HDMI, with internal processing and transmission to panels that maintains this level of precision

PureTone: Eliminate unsightly colour casts in your greyscale for balanced, neutral output

Ultra Low Latency: Lowest possible LED system latency
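
The “Great at any frame rate” point above comes down to simple arithmetic: if the LED refresh is not matched to the camera, each exposure window captures a varying portion of the PWM cycle and the result shows up as flicker or banding. The sketch below is not Brompton’s algorithm, just an illustration of the constraint; the refresh floor, frame rates and 180° shutter are invented example values.

```python
# Illustrative arithmetic only (not Brompton's actual algorithm): choose an
# LED refresh rate that is an integer multiple of the camera frame rate and
# that fits a whole number of PWM cycles inside each exposure, so every
# captured frame integrates the same amount of light.
import math

def pick_refresh_rate(fps, shutter_angle_deg=180.0, min_refresh_hz=1920.0):
    """Smallest integer multiple of fps that meets an example refresh floor
    and puts a whole number of cycles inside the exposure window."""
    shutter_fraction = shutter_angle_deg / 360.0
    multiple = max(1, math.ceil(min_refresh_hz / fps))
    while abs(round(multiple * shutter_fraction) - multiple * shutter_fraction) > 1e-9:
        multiple += 1                       # keep cycles-per-exposure integral
    return fps * multiple

for fps in (23.976, 25.0, 29.97, 50.0, 60.0):
    refresh = pick_refresh_rate(fps)
    exposure = 0.5 / fps                    # 180-degree shutter
    print(f"{fps:7.3f} fps -> refresh {refresh:7.2f} Hz, "
          f"{refresh * exposure:4.1f} PWM cycles per exposure")
```

A refresh tuned only for 60Hz does not divide evenly into film frame rates such as 23.976 or 25 fps, which is exactly why per-frame-rate adjustment matters on camera.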

PROCESSING POWER

The award-winning Tessera SX40 LED Processor is the perfect solution for green screen replacement options, and is already the processor of choice for several TV and film projects using virtual scenery.

FURTHER READING

“Production design is further transformed via virtual production with realtime LED wall technology to seamlessly blend foreground sets with digital set extensions… Real-time live visual effects via LED walls and screens enable final-pixel visual effects to be achieved live on set instead of relegated to post-production.”

Epic Games: The Virtual Production Field Guide

Epic Games also has a central hub full of resources from a host of companies working in virtual production: Exploring the Future of Virtual Production

Latest technology
Creative solution
Superior Service

AT THE MOST COMPETITIVE PRICE

Videos

Real-Time In-Camera VFX for Next-Gen Filmmaking | Project Spotlight | Unreal Engine

Behind the Scenes: Treehouse Digital - LED Screen / Virtual Production Test Shoot

Exclusive Unreal Engine video: the ins and outs of virtual production

PXO Virtual Production

Virtual Production - LED wall R&D Shoot

Why 'The Mandalorian' Uses Virtual Sets Over Green Screen | Movies Insider

Virtual Production Can be Real for Everybody—Here’s How

Think virtual production is the preserve of James Cameron? The confluence of games engines with faster PCs, LED backlots and off-the-shelf tools for anything from performance capture to virtual camera is bringing affordable real-time mixed reality production to market.

Cameron saw this coming, which is why he has upped the ante to where no filmmaker has gone before and decided to shoot the first “Avatar” sequel as a virtual production under water. Not CG fluids either, but with his actors holding their breath in giant swimming pools.

“The technology has advanced leaps and bounds at every conceivable level since ‘Avatar’ in 2009,” says Geoff Burdick, SVP of Production Services & Technology for Cameron’s production company Lightstorm Entertainment.

Massive amounts of data are being pushed around live on the set of “Avatar 2,” Burdick says. “We needed high frame rate (48fps) and high res (4K) and everything had to be in 3D. This may not be the science experiment it was when shooting the first ‘Avatar’ but … our setup is arguably ground-breaking in terms of being able to do what we are doing at this high spec and in stereo.”

This is just the live action part. Performance capture of the actors finished two years ago and is being animated at Weta, then integrated with principal photography at Manhattan Beach Studios.

“Avatar 2” may be state of the art, but it’s far from alone. Most major films and TV series created today already use some form of virtual production. It might be previsualization, it might be techvis or postvis. Epic Games, the makers of Unreal Engine, believe the potential for VP to enhance filmmaking extends far beyond even these current uses.

Examined one way, virtual production is just another evolution of storytelling—on a continuum with the shift to color or from film to digital. Looked at another way it is more fundamental, since virtual production techniques ultimately collapse the traditional sequential method of making motion pictures.

The production line from development to post can be costly in part because of the timescales and in part because of the inability to truly iterate at the point of creativity. A virtual production model breaks down these silos and brings color correction, animation and editorial closer to camera. When travel to far flung locations may prove challenging, due to COVID-19 or carbon neutral policies, virtual production can bring photorealistic locations to the set.

Directors can direct their actors on the mocap stage because they can see them in their virtual forms composited live into the CG shot. They can even judge the final scene with lighting and set objects in detail.

What the director is seeing, either through the tablet or inside a VR headset, can be closer to final render—which is light years from where directors used to be before real-time technology became part of the shoot.

In essence, virtual production is where the physical and the digital meet. The term encompasses a broad and constantly growing spectrum of computer-aided production and visualization tools and techniques, which means you don’t need the $250 million budget of “Avatar 2” to compose, capture, manipulate and all but finish pixel-perfect scenes live, mixing physical and augmented reality.

GAME ENGINES

The software at the core of modern, graphics-rich video games is able to render imagery on the fly to account for the unpredictable movements of a video-game player. Adapted for film production, the tech consigns the days of epic waits for epic render farms to history.

The most well-known is Epic’s Unreal Engine, which just hit version 5 with enhancements intended to achieve photorealism “on par with movie CG and real life”. Its virtualized micropolygon geometry system, Nanite, for example, frees artists to create as much geometric detail as the eye can see. It means that film-quality source art comprising hundreds of millions or billions of polygons can be imported directly into the engine—anything from ZBrush sculpts to photogrammetry scans to CAD data—and it just works. Nanite geometry is streamed and scaled in real time, so there are no more polygon count budgets, polygon memory budgets or draw count budgets; there is no need to bake details to normal maps or manually author LODs; and there is no loss in quality.
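
The core idea behind scaling detail to “as much as the eye can see” is to drive level of detail from projected screen-space error rather than hand-authored budgets. The sketch below is a deliberately simplified illustration of that principle, not Nanite’s actual cluster-based, GPU-driven implementation; the error values, field of view and distance are made-up examples.

```python
# Simplified illustration of screen-space-error LOD selection (not Nanite's
# actual algorithm): render the coarsest representation whose simplification
# error projects to less than about a pixel at the current viewing distance.
import math

def choose_lod(geometric_errors, distance, fov_y_deg, screen_height_px,
               max_error_px=1.0):
    """geometric_errors: per-LOD simplification error in world units,
    index 0 = full detail. Returns the index of the coarsest acceptable LOD."""
    # world units covered by one pixel at this distance (perspective camera)
    world_per_px = (2.0 * distance * math.tan(math.radians(fov_y_deg) / 2.0)
                    / screen_height_px)
    for lod in reversed(range(len(geometric_errors))):   # try coarsest first
        if geometric_errors[lod] / world_per_px <= max_error_px:
            return lod
    return 0  # nothing coarse enough is acceptable: fall back to full detail

# Example: a photogrammetry scan with five LODs, viewed from 25 m
# on a 2160-pixel-high frame with a 40-degree vertical field of view.
print(choose_lod([0.0, 0.002, 0.01, 0.05, 0.25],
                 distance=25.0, fov_y_deg=40.0, screen_height_px=2160))
```

Because the acceptable LOD follows the camera automatically, no per-asset polygon budget has to be authored by hand, which is the workflow change the paragraph above describes.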

Epic also aims to put the technology within practical reach of development teams of all sizes by partnering with developers to offer productive tools and content libraries.

It’s not the only game in town. Notch has a new real-time chroma keyer which, when combined with its automated Clean Plate Generation, produces “fantastic” results with almost no setup or tweaking, while providing all the features you’d expect, such as hair and liquid handling and hold-up mattes, all in less than a millisecond.
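
For readers unfamiliar with the underlying technique, a chroma key simply turns “how strongly does this pixel match the key colour” into a transparency matte. The NumPy sketch below shows that bare principle for a green screen; it is nothing like Notch’s GPU keyer, and the threshold and softness values are arbitrary examples.

```python
# Bare-bones green-screen key (illustrative only; real keyers add clean
# plates, spill suppression, edge refinement and run on the GPU).
import numpy as np

def simple_green_key(rgb, threshold=0.05, softness=0.10):
    """rgb: float image in [0, 1] with shape (H, W, 3).
    Returns an alpha matte: 1.0 keeps the pixel, 0.0 keys it out."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    greenness = g - np.maximum(r, b)            # how much green dominates
    # ramp from opaque below the threshold to fully keyed above threshold+softness
    return 1.0 - np.clip((greenness - threshold) / softness, 0.0, 1.0)

def composite(fg, bg, alpha):
    """Place the keyed foreground over a background plate."""
    return fg * alpha[..., None] + bg * (1.0 - alpha[..., None])
```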

ILM, which uses a variety of engines, also uses proprietary real-time engine Helios, based on technology developed at Pixar.

“The Jungle Book,” “Ready Player One” and “Blade Runner 2049” all made use of Unity Technologies’ Unity engine at various stages of production thanks to custom tools developed by Digital Monarch Media.

For example, on “Blade Runner 2049,” director Denis Villeneuve was able to re-envision shots for some of the digital scenes well after much of the editing was complete, creating the desired mood and tempo for the film using DMM’s virtual tools.

Game engines rely on the grunt power of GPU processing from the likes of Intel, Nvidia and AMD, which has become exponentially faster, enabling real-time compositing.

DIGITAL BACKLOTS

More advanced set-ups, playing pre-rendered sequences on screens behind the actors, were deployed by ILM on “Rogue One: A Star Wars Story” and its follow-up “Solo,” and during a sequence set on a Gotham metro train in “Joker.” A system is also being used on the latest James Bond film, “No Time To Die.”

The most sophisticated set-ups combine LED walls (and ceilings) with camera tracking systems and games engines to render content for playback not only in real-time, but in dynamic synchronicity with the camera’s viewpoint. The result allows filmmakers to stage scenes with greater realism than with a green or blue screen and with far more chance of making decisions on set.

“The big change has come with more powerful GPUs combined with games engines providing the software for real-time rendering and ray tracing,” says Sam Nicholson, who heads Stargate Studios. “When you put that together with LED walls or giant monitors, we think that at least 50% of what we do on set can be finished pixels.”

For HBO comedy-thriller “Run,” the production built two cars outfitted to resemble an Amtrak carriage on a soundstage in Toronto. These rested on airbags that could be shaken to simulate movement. Instead of LEDs, a series of 81-inch 4K TV monitors were mounted on a truss outside each train window displaying footage preshot by Stargate from cameras fixed to a train traveling across the U.S.

CAMERA TRACKING

Another essential component is the ability to track the virtual backlot to the camera movement via a wireless sensor. This means that as the cinematographer or director frames a shot, the display, which is often the main lighting source, adjusts to the camera’s perspective. That’s no mean feat, and it requires minimal to zero latency in order to work.
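
One common way of hiding the few milliseconds between a tracker sample and the pixels appearing on the wall is to extrapolate the camera pose forward by the measured system latency. The sketch below shows the simplest constant-velocity version of that idea; it is illustrative only, not the method used by any particular commercial tracking system, and the sample rate and latency figure are assumed values.

```python
# Constant-velocity pose prediction to compensate tracker-to-screen latency
# (illustrative sketch; commercial systems use more robust filtering).
import numpy as np

def predict_position(p_prev, p_curr, dt, latency):
    """p_prev, p_curr: tracked camera positions (metres) at times t-dt and t.
    Returns the position estimated for time t + latency."""
    p_prev, p_curr = np.asarray(p_prev, float), np.asarray(p_curr, float)
    velocity = (p_curr - p_prev) / dt
    return p_curr + velocity * latency

# Example: 240 Hz tracker samples, roughly 25 ms of render + LED processing delay.
predicted = predict_position([1.000, 1.700, 4.000], [1.004, 1.700, 3.996],
                             dt=1 / 240, latency=0.025)
```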

Professional camera tracking systems from Mo-Sys and N-Cam are the go-to technologies here, but if you are filming purely inside a game engine there are budget ways of creating a virtual camera.

To create raw-looking handheld action in his short film “Battlesuit,” filmmaker Haz Dulull used DragonFly, a virtual camera plugin (available for Unity, UE and Autodesk Maya) built by Glassbox Technologies with input from Hollywood pre-viz giants The Third Floor.

Another option is the HTC VIVE tracker, which costs less than $150 and has been tested at OSF. “If you want to shoot fully virtual, shooting in engine cinematic is amazing with a VIVE as your camera input,” it sums up. “If you want to do any serious mixed reality virtual production work or real-time VFX previz, you are still going to need to open your pocket and find a professional budget to get the right equipment for the job.”

PLUG-IN ASSETS

The Rokoko mocap suit can stream directly into UE via Live Link, as demoed by OSF. The facility explains that the suit connects over the wireless network to the UE render engine and into Rokoko Studio, where OSF assigns the suit a personal profile for the performer. It then begins streaming the data into UE by selecting the Unreal Engine option in the Rokoko Studio Live tab (a feature only available to Rokoko Pro licence users). The system is being refined at OSF, with tests for facial capture in the works.
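
The shape of that pipeline, a performer’s pose sampled on the suit and pushed over the local network into whatever is rendering, can be sketched generically. The snippet below is not the Rokoko or Live Link protocol; the JSON-over-UDP packet layout and the port number are purely hypothetical stand-ins used to show how a receiving end might consume a streamed skeleton.

```python
# Generic pose-stream receiver (hypothetical packet format, NOT the Rokoko or
# Unreal Live Link protocol): each UDP packet is assumed to carry JSON like
# {"joints": {"hip": [x, y, z], "spine": [x, y, z], ...}}.
import json
import socket

def receive_poses(port=14043):          # port number is an arbitrary example
    """Yield one joint dictionary per received packet."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        packet, _addr = sock.recvfrom(65535)
        frame = json.loads(packet.decode("utf-8"))
        yield frame["joints"]

# for joints in receive_poses():
#     drive_character(joints)           # hypothetical downstream step
```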

Reallusion makes software for 3D character creation and animation, including iClone and Character Creator 3D. The Unreal Live Link Plug-in for iClone streams characters, lights, cameras and animation into UE. The simplicity of iClone combined with UE rendering delivers a digital human solution to create, animate and visualize superior real-time characters. It’s also free for indie filmmakers.

Character Creator includes a plug-in called Headshot, which generates real-time 3D digital humans from a single photo. Apart from intelligent texture blending and head mesh creation, the generated digital doubles are fully rigged for voice lipsync, facial expression and full body animation. Headshot has two AI modes: Pro Mode and Auto Mode. Pro Mode includes 1,000-plus sculpting morphs plus Image Mapping and Texture Reprojection tools, and is designed for production-level high-res texture processing and fine face-shape refinement. Auto Mode makes lower-res virtual heads, with additional 3D hair, in a fully automatic process.

OSF put Headshot through its paces, using it to automatically create a facial model that was then animated within iClone 7 using data from actors performing in Rokoko mocap suits, streamed live to iClone for real-time previews and recorded animation takes. OSF also used Apple’s LiveFace app (available for any iPhone with a depth-sensing camera) and its own motion capture helmets to capture the facial animations. The next part of the pipeline is to transfer the assets over to UE with the Unreal Engine LiveLink plugin and the Auto Character Setup plugin, which creates skin textures in the same way as Epic Games’ digital humans.

VIRTUAL PRODUCTION ON A BUDGET

British filmmaker Dulull made the animated sci-fi short “Battlesuit” using Unreal Engine, on a skeleton budget and with a team of just three, including himself.

Rather than creating everything from scratch, they licensed 3D kits and pre-existing models (from Kitbash3D, Turbosquid and Unreal). Dulull animated the assets and VFX in real-time within Unreal’s sequencer tool.

They retargeted off-the-peg mocap data (from Frame Ion Animation, Filmstorm, Mocap Online) onto the body of the film’s main characters. For facial capture they filmed their actor using the depth camera inside an iPad and fed the data live into UE.

“We had to do some tweaks on the facial capture data to bring some of the subtle nuance it was missing, but this is a much quicker way to create an animated face performance without spending a fortune on high end systems,” Dulull says.

Powering it all, including real-time ray tracing, Dulull used a Razer Blade 15 Studio Edition laptop with an Nvidia Quadro RTX 5000 card.

Every single shot in the film is straight out of Unreal Engine. There’s no compositing or external post apart from a few text overlays and color correction done on Resolve.

“If someone had said I could pull off a project like this a few years ago that is of cinematic quality but all done in real-time and powered on a laptop, I’d think they were crazy and over ambitious,” he says. “But today I can make an animated film in a mobile production environment without the need for huge desktop machines and expensive rendering.”

This story originally appeared on TVT’s sister publication Creative Planet Network.

GROUNDBREAKING LED STAGE PRODUCTION TECHNOLOGY CREATED FOR HIT LUCASFILM SERIES 'THE MANDALORIAN'

Industrial Light & Magic (ILM), and Epic Games (maker of the Unreal Engine), together with production technology partners Fuse, Lux Machina, Profile Studios, NVIDIA, and ARRI unveiled a new filmmaking paradigm in collaboration with Jon Favreau’s Golem Creations to bring The Mandalorian to life. The new virtual production workflow allows filmmakers to capture a significant amount of complex visual effects shots in-camera using real-time game engine technology and LED screens to represent dynamic photo-real digital landscapes and sets with creative flexibility previously unimaginable.

Over 50 percent of The Mandalorian Season One was filmed using this ground-breaking new methodology, eliminating the need for location shoots entirely. Instead, actors in The Mandalorian performed in an immersive and massive 20’ high by 270-degree semicircular LED video wall and ceiling with a 75’-diameter performance space, where the practical set pieces were combined with digital extensions on the screens. Digital 3D environments created by ILM played back interactively on the LED walls, edited in real-time during the shoot, which allowed for pixel-accurate tracking and perspective-correct 3D imagery rendered at high resolution via systems powered by NVIDIA GPUs. The environments were lit and rendered from the perspective of the camera to provide parallax in real-time, as if the camera were really capturing the physical environment with accurate interactive light on the actors and practical sets, giving showrunner Jon Favreau, executive producer and director Dave Filoni, visual effects supervisor Richard Bluff, and cinematographers Greig Fraser and Barry “Baz” Idoine, and the episodic directors the ability to make concrete creative choices for visual effects-driven work during photography and achieve real-time in-camera composites on set.

The technology and workflow required to make in-camera compositing and effects practical for on-set use combined the ingenuity of partners such as Golem Creations, Fuse, Lux Machina, Profile Studios, and ARRI together with ILM’s StageCraft virtual production filmmaking platform and ultimately the real-time interactivity of the Unreal Engine platform.

“We’ve been experimenting with these technologies on my past projects and were finally able to bring a group together with different perspectives to synergize film and gaming advances and test the limits of real-time, in-camera rendering,” explained Jon Favreau adding, “We are proud of what was achieved and feel that the system we built was the most efficient way to bring The Mandalorian to life.”

“Merging our efforts in the space with what Jon Favreau has been working towards using virtual reality and game engine technology in his filmmaking finally gave us the chance to execute the vision,” said Rob Bredow, Executive Creative Director and Head of ILM. “StageCraft has grown out of the culmination of over a decade of innovation in the virtual production space at ILM. Seeing our digital sets fully integrated, in real-time on stage providing the kind of in-camera shots we’ve always dreamed of while also providing the majority of the lighting was really a dream come true.”

Richard Bluff, Visual Effects Supervisor for The Mandalorian added, “Working with Kim Libreri and his Unreal team, Golem Creations, and the ILM StageCraft team has opened new avenues to both the filmmakers and my fellow key creatives on The Mandalorian, allowing us to shoot principal photography on photoreal, virtual sets that are indistinguishable from their physical counterparts while incorporating physical set pieces and props as needed for interaction. It’s truly a game-changer.”
