In February 2022, 1.5 million households tuned in for the season finale of The Book of Boba Fett, the latest entry in the Star Wars saga and Lucasfilm Ltd.’s most ambitious fan-propelled project to date. As the episode scaled emotional heights, so did the visual spectacle, with the literal ascension of the dreaded Rancor to the top of the tower. It was homage at its finest — a tip of the hat to King Kong (1933) and stop-motion animator Willis H. O’Brien, who excelled in immersion and execution. These critical attributes of polished presentation are paramount in all forms of immersive storytelling. Whether it’s the oral traditions, the circular iconography of the Ouroboros or the silver screen, it’s the duty of the craftsmen, stage directors and storytellers to look for new ways to engross the populace. “The Volume” builds upon this lineage and the evolving/revolving forms of special effects storytelling first brought into this world by VFX wizards… from a century… far, far away.
The brainchild of ILM (Industrial Light & Magic), the Volume is an enclosed, semicircular walled space that may prove to be the film industry’s final stage. Deserving of the most recognition for this technological treasure are visual effects artists Richard Bluff and Rob Bredow, along with Jon Favreau, the director whose vision and motivational instigation ignited results. Visualize something massive: it’s best to think of the Volume as a cylinder-shaped stage theater, hence its original name, StageCraft. Still, as with most technology, the nitty-gritty makes the difference. These “walls” are, in fact, fine-pixel-pitch LED panels forming a curved space 75 feet in diameter, topped by an LED ceiling. Within this space, 11 computers running Epic Games’ Unreal Engine generate graphics in real time to “paint” the walls and immerse the actors, viewers and crew in fabricated worlds. The composition and function of the team are equally game-changing. Visual effects crews can ride shotgun with the director and tweak effects before editing begins. Each showrunner can efficiently incorporate analog special effects alongside digital ones. According to Bryce Dallas Howard, “It’s the convergence of an ancient art form with an entirely new one.” The cyclorama is one of those ancient art forms, and it is from the cyclorama that the Volume traces its physical shape.
In 1787, artist Robert Barker created large, immersive circular paintings of events and places. His “panorama” concept spread and gave birth to the cyclorama. By the late 1800s, these massive works, suspended like a giant circular shower curtain, gave patrons the most realistic illustrations they had ever seen. One drawback was the necessity of having patrons gather on a circular viewing platform. Similarly, there are physical limitations to the Volume, as its range restricts actors’ movements. The payoff with the Volume is that, unlike the cyclorama, its objects and creatures “come to life” through digital rendering and the transfer of data. In the grander scheme, this type of Hollywood magic is the culmination of years of technical tinkering.
As one repeatedly hears in Disney Gallery: Star Wars: The Mandalorian (2020), “Necessity is the mother of all invention.” The best ideas usually come about when great inventors work adjacent and in tandem. In the case of the 1933 classic King Kong (Favreau’s favorite film), these principles hold true. As the cyclorama arrived on the market, so did technological advancement. Inventors began to miniaturize the cylinder-shaped experience and add moving images. Eadweard Muybridge was at the forefront of this push. The English photographer broke ground with his famous sequential photographs of a horse at full gallop, a testament to his technical skill. To freeze the steed, Muybridge developed new fast-acting emulsions, an electro-mechanical trigger and a shutter speed of 1/1000 of a second. Most important was his creative application of the images to a spinning glass disc: the zoopraxiscope. The entertainment wheel started turning again toward the holy grail of moving images.
In the decades that followed, the Lumière brothers and Georges Méliès built upon Muybridge’s progress. A breakthrough of early cinema was the matte, whose sole purpose was to obscure certain sections of the film’s background. Later, in the editing process, editors could combine the live footage with complementary illustrations and special effects. Méliès incorporated this technique into many of his films. A significant limitation was that it could only be used to create static imagery. Then, as if overnight, a ripple effect of inventions evolved: traveling mattes, matte paintings and, best of all, the Dunning-Pomeroy process, which used matte paintings to create the illusion that exotic lands and strange new worlds were just beyond the horizon. The Dunning process was cutting-edge for its day — yet just as ILM would later need to update StageCraft and the production cycle, the creation of 1933’s King Kong called for a faster and more reliable system.
To create Skull Island, Kong’s native homeland, the special effects team relied upon the newly christened Williams process, which followed the same underlying principles and techniques as the Dunning process but utilized an optical printer to streamline the world-building and keep the filming schedule on course. However, the production team’s improvements to projection — the technique known as rear-screen projection — ultimately proved to be the ticket and gave King Kong a more authentic and immersive edge over the competition. RKO’s very own Sidney Saunders was in charge of these advanced techniques. One could now plaster images of monsters onto a backdrop and capture the actors’ reactions in real time. From the performers’ point of view, a limitation of this technique was a blurring effect. In her autobiography, King Kong actress Fay Wray recalled the process as challenging: “From my position, all I could do was see large blurry shadows on the screen.” Today, with credit to ILM’s ingenuity and Epic’s Unreal Engine, these issues have vanished. Through a multi-step process involving thousands of images, scanning and the generation of 3D models, filmmakers can now render environments in true 3D. But to reach this point, the industry had to overcome additional growing pains, some of which trace back to THE 3D film, James Cameron’s Avatar (2009).
On the eve of Avatar’s release, critics hailed the film as the first truly immersive 3D experience. In hindsight, that assessment was a bit of a stretch. Still, I concede that the Volume workflow would be unachievable without Avatar’s motion capture technology. The champion of Avatar’s virtual production is Robert Legato. Under his leadership, the R&D team developed a revolutionary virtual display process, one that allowed Cameron to create a “director-centric” vision. This process required the placement of hundreds of motion capture cameras in a spatial volume (essentially a warehouse) and the placement of markers on the subjects. Through a signal relayed to the primary monitor camera, the viewer could watch the actors transform into the Na’vi. Compare this with director Peter Jackson’s production of King Kong (2005), and it’s easy to recognize the advancement. During that remake, Jackson did not have the advantage of seeing Andy Serkis’s CGI portrayal of Kong in advance, nor did he know the tech’s limitations, which led to significant post-production revisions. That said, even with these advancements, motion capture remains a cumbersome process, leaving VFX teams stressed to meet deadlines with no way of seeing the final product until it arrives at their doorstep.
Before the Volume, the typical film workflow was Filming > Editing > VFX. Now, with the miracles of technology and a Disney platform that leaves audiences constantly wanting shows, the production flow is streamlining toward VFX > Filming > Editing/Finalizing. The perks of this new dynamic are twofold. First, the tech crews finally get to work with directors and actors in real time. They no longer have to key green screens, correct light reflections and rectify green-screen spillover. Secondly, actors need not rely on improv for their green-screen reactions. Virtual cues are now placed on the set using LED floors. Actors can join in on the pre-visualization fun and memorize their scenes in advance; directors have time to draw inspiration from filmmaking classics, like Akira Kurosawa’s Seven Samurai (1954). All of these wonders evolved from Legato’s motion capture technology. To further appreciate the technological and logistical enhancements that ILM has introduced to the medium, let’s reminisce upon the various adaptations of Ben-Hur.
Ben-Hur has been the subject of multiple adaptations. The first of these cinematic spectacles was Ben-Hur: A Tale of the Christ (1925). The production cost was estimated at four million dollars, and the shoot required the construction of an entire arena solely for the chariot race and the use of roughly 45 cameras. In contrast to such Herculean productions, Werner Herzog’s futuristic office in The Mandalorian required only a desk and chair, with the rest rendered entirely on LED screens. For the shooting of the famous chariot scene in MGM’s Ben-Hur (1959), both cast and crew were in Italy for over five months. On that production, exacting measures were required to achieve the right lighting, and the staging of the cameras was painstaking. According to Libero Grandi in American Cinematographer (April 2019), “An ever-present problem during the filming of Ben-Hur was maintaining a vigilant check on the color temperature of the light.” This required extensive tests of light conditions and corrective filters on all the Panavision cameras. Those variations in color temperature challenged the crew, who struggled to correct for changes in the actors’ faces caused by sun and wind exposure. In stark contrast, the lighting and reflections of the Volume can remain at “the magic hour” for as many days as it takes to get the scene right.
At the dawn of the new millennium, ILM doubled the size of its visual graphics department and boldly took the plunge into the Star Wars prequels. The tech giant rose to the challenge of managing an ever-increasing magnitude of data and balanced digital and analog VFX. In March 2022, ILM announced plans to build an even bigger Volume workshop in Vancouver, Canada. It seems inevitable that the film industry will soon go through its own Ouroboros recreation process, one in which teamwork is vital. Last June, a curated program at the Tribeca Film Festival gave indie film companies the chance to learn how to apply Unreal Engine’s gaming technology to smaller projects. Technicians are learning to previsualize their camera angles long before physical execution begins. Where that leaves viewers, and artists with singular, idiosyncratic visions, remains to be seen.
Last month, 60 Minutes correspondent Anderson Cooper stepped inside Laurie Anderson’s mind. Not literally, of course; this was a virtual reality world she created, a digital representation of her innermost thoughts. For now, VR is still best employed as a production tool. User-friendly VR art installations are getting closer but are not yet of prime quality. One can envision a digital cyclorama for exhibitions: think of a 360-degree canvas, not of analog paintings or works, but one to which filmmakers add digital effects and images in 24 hours or less, viewable from all sides.
Meanwhile, 2022 productions like Obi-Wan Kenobi and Thor: Love and Thunder will likely be great hits. Still, the old silver screens have reached their limit, and even 4K will become passé. Perhaps it’s time for Hollywood and theater chains to up the ante by investing in venues and screens similar to the Volume. Epcot’s Circle-Vision 360° has already laid the groundwork; now, it’s just a matter of figuring out something cost-effective. Until then, consider this quote from David Jonathan, a producer on the Harry Potter film series: “The most important thing is that you have to have the visual effects working for you, instead of you working for the visual effects.”
Peter Bell (@PeterGBell25) is a 2016 Master of Arts in Film Studies graduate of Columbia University’s School of the Arts in New York City. His interests include film history, film theory and film criticism. Ever since watching TCM as a child, Peter has had a passion for film, always trying to add greater context to film for others. His favorite films include Chinatown, Blade Runner, Lawrence of Arabia, A Shot in the Dark and Inception. Peter believes movie theaters are still the optimal forum for film viewing, discussion and discovering fresh perspectives on culture.