Mixed reality technology was central to the production of an extraordinary augmented reality feature at the Baltimore Ravens’ M&T Bank Stadium, where a giant raven swooped into the stadium, landed, and responded to live in-game moments.
“The goal with Augmented/Mixed Reality for this project was to produce a dynamic, in-stadium fan experience for our client, the Baltimore Ravens, and bring their Raven mascot to life, creating a true ‘wow’ moment. We wanted to give people a glimpse into what is possible with mixed reality production,” said Jon Slusser, partner and owner of The Famous Group.
The Future Group initially provided technical guidance as the digital raven was designed using CG. Three days before the event, The Future Group’s experienced team arrived at the stadium to integrate the various elements of content and prepare it for rehearsals, adaptations and, eventually, delivery of the final live production. Beyond supplying the core technology to execute the mixed reality visuals, The Future Group also acted as a liaison between the various departments, including camera, live program vision mixing and in-stadium display systems.
Jon Slusser continued: “First and foremost, The Future Group took time to understand the project and help strategize with us as we communicated back to the client. Their expertise in the space was a big value-add, and when it came time to develop the various animations, Pixotope helped us with refining the look and feel of the character.”
The Future Group installed its mixed reality platform Pixotope as the central hub of the complex production, which would later layer the real-time augmented content over the live-action shots. Pixotope creates a virtual environment, which just like a real studio or set, contains lights, a camera and objects to be photographed (or “rendered”). In this case, the virtual scene in Pixotope contained the animated raven, a laser-scanned model of the stadium, as well as a camera and lights to match those in the real-world stadium. Additional video hardware was provided by Quince Imaging.
An important task for The Future Group’s team was to accurately sample the stadium lights, so that when the digital raven was incorporated into Pixotope’s virtual environment, it would be lit exactly as it would have been had it really existed in the stadium itself. Frank Daniel Vedvik, senior product specialist at The Future Group, explained, “We initially used 360-degree high-dynamic range photography to measure both the location and the relative brightness of each of the light sources. Accurate replication of the lighting is essential to ensure the augmented elements look real when added to the live background. In fact, we had almost 30 real-time light sources within Pixotope’s virtual environment, including bounced green light to mimic the effect of reflected light from the green turf of the sports field.”
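The idea of deriving relative brightness from HDR samples can be sketched as follows. This is a minimal illustration, not Pixotope's actual pipeline: the light names, radiance values, and the use of Rec. 709 luma weights are all assumptions for the example.

```python
# Sketch: deriving relative light intensities from linear HDR samples.
# All names and values here are illustrative, not from the actual production.

def relative_intensities(hdr_samples):
    """Normalize linear HDR radiance samples so the brightest light is 1.0.

    hdr_samples: mapping of light name -> (R, G, B) linear radiance.
    Returns light name -> scalar relative intensity using Rec. 709 luma weights.
    """
    def luminance(rgb):
        r, g, b = rgb
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    luma = {name: luminance(rgb) for name, rgb in hdr_samples.items()}
    peak = max(luma.values())
    return {name: value / peak for name, value in luma.items()}

# Example: two stadium floodlights and a dim green bounce for the turf.
samples = {
    "floodlight_north": (900.0, 900.0, 850.0),
    "floodlight_south": (450.0, 450.0, 425.0),
    "turf_bounce":      (10.0, 60.0, 10.0),
}
print(relative_intensities(samples))
```

Because HDR photography preserves linear radiance rather than clipping to display range, the ratios between lights survive, which is what makes this kind of normalization meaningful.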
The laser-scanned model of the stadium, which was also imported into Pixotope, was used as a “shadow catcher”. This is a CG model that accurately replicates the shape and form of a real-world scene, but is never rendered to screen directly. Instead, only the shadows that fall upon the stadium model are rendered, and these are composited over the live-action shots. This technique resulted in the augmented raven accurately and realistically casting shadows over the live shots of the stadium.
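The compositing step can be sketched in a few lines. This is a toy grayscale version under stated assumptions: the pixel values, the `density` constant, and the function names are invented for illustration, not taken from Pixotope.

```python
# Sketch of the shadow-catcher compositing idea: the stadium model itself is
# never drawn; only the shadows the raven casts onto it are rendered, and
# those darken the live camera plate. Values here are illustrative.

def composite_shadow(plate, shadow_mask, density=0.6):
    """Darken the live plate wherever the catcher model received a shadow.

    plate: list of grayscale pixel values in [0, 1] (the live camera feed).
    shadow_mask: per-pixel shadow coverage in [0, 1] rendered from the
        invisible stadium model (1 = fully shadowed).
    density: how dark the shadow reads on screen.
    """
    return [p * (1.0 - density * s) for p, s in zip(plate, shadow_mask)]

plate = [0.8, 0.8, 0.8, 0.8]           # evenly lit turf
shadow = [0.0, 0.5, 1.0, 0.0]          # the raven's shadow falls on two pixels
print(composite_shadow(plate, shadow))  # shadowed pixels come back darkened
```

The key property is that unshadowed pixels pass through the live plate unchanged; only where the invisible model caught a shadow does the output darken.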
Another task for The Future Group’s team was to prepare the raven digital model for real-time production. As The Future Group’s chief creative officer, Øystein Larsen described,
“One key difference between real-time (live) use of computer animation and the more traditional use within a post-production environment, is that there are many more variables to prepare for with live events. Improvements and adaptations to the augmented elements occur right up until the last second and, therefore, the duration of shots cannot be precisely known ahead of time.”
Normally, CG animated objects are pre-keyframed to run their designated animation and then stop. But this does not work for a live scenario. For example, a brief for the raven to fly into the stadium and land on the goal post, squawk and then take off again would require the timing of those segments to be pre-set into the animation. However, in a live scenario, it is not known how long the shot will be because the duration will depend upon unfolding game-play events. Therefore, the show director will want to call and dismiss the raven on cue.
To allow for this scenario, The Future Group relied on the ability of Pixotope to fully access the powerful underlying Unreal Engine. This enabled the use of game logic to transition on demand between different states and animations of the CG raven model. By doing this, the raven could be instructed to loop a certain section of the animation, for example, while it waited on the goalposts, until the director’s cue. At this point, the game engine could be triggered via Pixotope to merge to a different animation, such as making the raven take off and fly away. This merging process essentially creates “live” animation to move each part of the raven model from the position it was last in, to the position set out in the next animation segment, over a short period of time. This process leads to a seamless, jump-free connection between the various animations. “It is this hybrid of game infrastructure and broadcast-ready services that makes Pixotope so powerful”, stated Øystein Larsen.
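The state-and-merge logic described above can be sketched as a tiny state machine. In practice this lives inside Unreal Engine's animation system; the class, the pose data, and the blend length below are all invented for illustration.

```python
# Minimal sketch of the merging idea: the raven loops its "perch" cycle until
# cued, then every joint value is interpolated from wherever it happens to be
# toward the first pose of the next animation, so there is no visible jump.
# Pose data (lists of joint angles) is invented for illustration.

def lerp(a, b, t):
    return a + (b - a) * t

class RavenAnimator:
    def __init__(self, loop_cycle, takeoff_start, blend_frames=10):
        self.loop_cycle = loop_cycle        # list of poses to loop while waiting
        self.takeoff_start = takeoff_start  # first pose of the next animation
        self.blend_frames = blend_frames
        self.frame = 0
        self.state = "loop"
        self.blend_from = None

    def cue_takeoff(self):
        # Capture the current pose so the blend starts from it, mid-cycle.
        self.blend_from = self.current_pose()
        self.state = "blend"
        self.frame = 0

    def current_pose(self):
        if self.state == "loop":
            return self.loop_cycle[self.frame % len(self.loop_cycle)]
        t = min(self.frame / self.blend_frames, 1.0)
        return [lerp(a, b, t) for a, b in zip(self.blend_from, self.takeoff_start)]

    def tick(self):
        pose = self.current_pose()
        self.frame += 1
        return pose

anim = RavenAnimator(loop_cycle=[[0.0, 10.0], [5.0, 12.0]],
                     takeoff_start=[40.0, 90.0], blend_frames=4)
anim.tick(); anim.tick(); anim.tick()   # raven loops on the goalposts
anim.cue_takeoff()                      # director's cue arrives mid-loop
for _ in range(5):
    pose = anim.tick()
print(pose)                             # fully merged onto the take-off pose
```

Because the cue can arrive at any frame of the loop, the blend always starts from the captured current pose rather than from a fixed keyframe, which is exactly what makes the connection jump-free.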
To ensure that the raven could be positioned anywhere within the stadium and be able to properly react to the lighting in any zone, The Future Group adjusted the “shaders” used to render the digital raven model. Shaders are sub-programs that describe how a given surface of a CG model reacts to light. Frank Daniel Vedvik notes further, “The modifications ensured maximum flexibility to match the raven to the high contrast lighting changes between different stadium areas, while at the same time also ensuring that the digital raven could be easily rendered within the Pixotope system in real-time at 59.94 frames per second.”
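A shader's basic job of describing how a surface reacts to light can be illustrated with a toy diffuse (Lambert) model. This is a generic sketch of the concept, not the production shaders: the vectors, intensities, and albedo value are assumptions.

```python
# Toy Lambert shader sketch: how a surface's response to each light source can
# be computed per frame, so a model reads correctly in differently lit zones.
# Vectors and intensities are illustrative, not from the actual production.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def shade(normal, lights, albedo=0.8):
    """Sum clamped N-dot-L terms over all lights (a basic diffuse shader)."""
    total = 0.0
    for direction, intensity in lights:
        total += max(dot(normal, direction), 0.0) * intensity
    return min(albedo * total, 1.0)   # clamp for display

normal = (0.0, 1.0, 0.0)              # surface facing straight up
lights = [((0.0, 1.0, 0.0), 1.0),     # overhead floodlight
          ((0.0, -1.0, 0.0), 0.3)]    # light from below contributes nothing here
print(shade(normal, lights))          # only the overhead light counts
```

Real-time shaders must evaluate something like this for every pixel, every frame, which is why keeping them cheap matters at 59.94 frames per second.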
Another requirement was to have the raven perching between the goalposts. Adding augmented content to a background shot might seem simple enough, but this only allows for the augmented content to sit in front of the live shot scene. Due to the viewing angle of the goalposts, the closest post to the raven would have to appear in front of it.
In normal post-production, this would be achieved with a key (such as a green screen chroma key) or by using a matte (rotoscoping). Neither is possible, however, in a live scenario. Keying cannot be relied upon because the colors that will appear around the object to be keyed cannot be predicted. Rotoscoping is likewise ruled out because the framing of the target object could not be known ahead of production time, as it was shot from a freely moving camera with a variable zoom lens.
To overcome these complex challenges, the clever solution The Future Group employed was to build the goalposts as a 3D object within Pixotope. These accurately matched the size and position of the real posts. Frank Daniel Vedvik added, “The goalposts model was used to create a ‘hold out mask’, which created a hole in the alpha channel of the raven, so that when it was added to the background, the part that would cover the foreground post was effectively erased.” This presented the audience with the illusion that the raven appeared to sit perfectly between the goalposts.
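The hold-out mask idea can be sketched on a tiny one-dimensional strip of pixels. This is an illustrative toy, with invented function names and values, assuming straight (non-premultiplied) alpha and a standard "over" composite.

```python
# Sketch of the hold-out mask: the invisible goalpost model punches a hole in
# the raven's alpha, so the real post in the live feed shows through when
# composited. Pixels are (color, alpha) pairs; all values are illustrative.

def apply_holdout(raven, post_mask):
    """Erase the raven's alpha wherever the goalpost model covers it."""
    return [(color, alpha * (1.0 - m))
            for (color, alpha), m in zip(raven, post_mask)]

def over(fg, bg):
    """Standard 'over' composite of the masked raven onto the live plate."""
    return [color * alpha + b * (1.0 - alpha)
            for (color, alpha), b in zip(fg, bg)]

raven = [(0.1, 1.0), (0.1, 1.0), (0.1, 1.0)]   # dark raven, fully opaque
post_mask = [0.0, 1.0, 0.0]                    # middle pixel is the near post
plate = [0.6, 0.9, 0.6]                        # bright post pixel in the feed
print(over(apply_holdout(raven, post_mask), plate))
```

The middle pixel, where the mask is 1, comes out as the untouched plate pixel: the real post appears in front of the raven even though the raven layer is composited on top.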
It is the small, finer details added to CG that make it look realistic. Real-world shots tend to have blurs, flares and noise due to optical limitations in lenses and cameras. CG is naturally devoid of such imperfections and so can appear unrealistically sharp and vivid. However, adding organic and natural effects such as flares and blurs is processor-intensive. This is why they are used very sparingly in computer games, where real-time frame rate is more important than realism. To combat this, The Future Group’s Pixotope platform comes with its own set of highly efficient custom post-processing effects, which, when combined with Pixotope’s bespoke single-pass renderer, enable very realistic images to be created in real-time.
One such post-processing effect used on the Baltimore Ravens project was a custom “light-wrap” effect. Light wrapping happens when a foreground object travels in front of a bright light source. In the real world, the bright background light seems to eat away at the edge of a foreground object, with its light spilling over in front. Think of the image of a person standing in front of the sun, with the sun just peeping out from one side. This makes a large flare become visible, which covers part of the person even though the sun is behind them. Pixotope’s light wrap feature simulates this phenomenon, while still impressively maintaining real-time performance, even at 60 frames per second.
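A common way to approximate light wrap is to blur the foreground's alpha edge and let background brightness spill into it. The sketch below is a generic one-dimensional toy under that assumption; the 3-tap blur, the `strength` constant, and all pixel values are invented, and Pixotope's actual implementation is not public.

```python
# Toy light-wrap sketch: bright background light spills over the edge of a
# foreground object. The wrap amount here comes from blurring the alpha edge
# and scaling by background brightness; all constants are illustrative.

def light_wrap(fg, alpha, bg, strength=0.5):
    n = len(fg)

    def blur(values, i):
        # Crude 3-tap box blur, clamped at the strip's edges.
        neighbours = [values[j] for j in (i - 1, i, i + 1) if 0 <= j < n]
        return sum(neighbours) / len(neighbours)

    out = []
    for i in range(n):
        base = fg[i] * alpha[i] + bg[i] * (1.0 - alpha[i])
        # Edge pixels are opaque here but transparent nearby, so the blurred
        # alpha dips below 1 and lets blurred background light spill in.
        spill = strength * alpha[i] * (1.0 - blur(alpha, i)) * blur(bg, i)
        out.append(min(base + spill, 1.0))
    return out

fg = [0.1, 0.1, 0.1, 0.1]      # dark foreground object
alpha = [1.0, 1.0, 0.0, 0.0]   # object covers the left half
bg = [0.2, 0.2, 1.0, 1.0]      # bright light behind, on the right
print(light_wrap(fg, alpha, bg))
```

Only the foreground pixel adjacent to the bright region brightens; interior pixels and the background itself are unaffected, which matches the "light eating into the edge" look described above.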
Once the virtual environment within Pixotope was set up, the resulting images were then layered over the background camera shot. In order to achieve this, Pixotope had to simulate the position, viewing direction and lens focal length of the real-world camera. This process ensured that the virtual raven was “filmed” from exactly the right angle. Stype was used to provide the camera tracking information to Pixotope.
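What the tracking data drives can be illustrated with a minimal pinhole-camera projection. This is a simplified sketch, not Stype's or Pixotope's actual math: the camera has only a pan angle (a real tracked camera supplies full orientation plus zoom and lens distortion), and every number is illustrative.

```python
# Pinhole-camera sketch of what camera tracking drives: given the real
# camera's position, viewing direction (here, a simple pan angle) and focal
# length, project a 3-D point in the stadium onto the virtual camera's image
# plane. Numbers are illustrative, not actual tracking data.

import math

def project(point, cam_pos, pan_deg, focal_px, principal=(960.0, 540.0)):
    """Project a world-space point to pixel coordinates (yaw-only camera)."""
    # Move into camera space: translate, then rotate by the pan angle.
    x, y, z = (p - c for p, c in zip(point, cam_pos))
    a = math.radians(pan_deg)
    xc = x * math.cos(a) - z * math.sin(a)
    zc = x * math.sin(a) + z * math.cos(a)
    if zc <= 0:
        raise ValueError("point is behind the camera")
    # Perspective divide scaled by the focal length in pixels.
    u = principal[0] + focal_px * xc / zc
    v = principal[1] - focal_px * y / zc
    return u, v

# A raven perched 10 m up, 50 m straight ahead of an unpanned camera:
print(project((0.0, 10.0, 50.0), (0.0, 0.0, 0.0), 0.0, 1000.0))
```

If the virtual camera's pose or focal length drifts from the real one, the rendered raven slides against the live plate, which is why per-frame tracking data is essential.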
The quality of the background shot must be preserved when compositing augmented reality on top. As Øystein Larsen sets out, “If the background image is brought into the virtual environment and then rendered back out again, it will almost certainly be degraded due to the addition of anti-aliasing, motion blur, unnecessary color conversions and the like.” Pixotope takes an ingenious approach to combat this. The background shot is present in the virtual environment only as a light source to affect the CG objects; the objects themselves are then rendered over a direct feed of the background shot, in a process exclusive to Pixotope. This guarantees that when Pixotope augments material onto a camera feed, the original image qualities are left perfectly intact.
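Why keying the rendered CG over the untouched feed preserves it can be shown with a one-line composite. This is a generic premultiplied-alpha "over" sketch, not Pixotope's internals; the pixel values are invented.

```python
# Sketch of why compositing over the direct feed preserves it: wherever the
# CG layer's alpha is zero, the output pixel equals the original camera pixel
# exactly, with no round trip through the renderer. Values are illustrative.

def key_over_clean_feed(cg, feed):
    """cg: list of (premultiplied color, alpha); feed: original camera pixels."""
    return [color + f * (1.0 - alpha) for (color, alpha), f in zip(cg, feed)]

feed = [0.31, 0.62, 0.47]                  # original camera pixels
cg = [(0.0, 0.0), (0.2, 0.5), (0.0, 0.0)]  # CG only covers the middle pixel
out = key_over_clean_feed(cg, feed)
print(out)  # outer pixels are bit-identical to the feed
```

Contrast this with rendering the feed through the engine and back out: every background pixel would then pass through the renderer's anti-aliasing and color pipeline, even where no CG exists.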
Once all the technical aspects had been set up, rehearsals could begin. Since Pixotope works in real-time, adjustments and improvements to most aspects of the production can (and do) occur right up until the last minute. In the case of the Baltimore Ravens project, the agility of Pixotope allowed the creatives and show director to try out alternatives in pursuit of the perfect augmented experience.
The final execution worked flawlessly, with the giant raven swooping into the stadium on cue, exhilarating the attending audience. Positive social media messages about the event were a testament to how much the mixed reality additions enhanced the audience’s experience and how realistic they appeared.
Creative agency partner and owner Jon Slusser was thrilled at the outcome: “The overwhelmingly positive results were felt in the stadium, on social media and with traditional media outlets. We received over 11 million views of the Raven in flight on various social media platforms, all referring back to the Ravens’ original social media posts. We also saw a huge spike with traditional media outlets like ESPN, Bleacher Report and Sports Illustrated.” CBS Sports reported that the mixed reality segments “had fans (and anyone who saw the video on social media) in awe.”
The Future Group’s CEO Marcus Blom Brodersen concluded, “Mixed reality content has the power to grab the attention and drive engagement of a wider audience, by providing extra dimensions to the viewer’s experience and creating cut-through, stand-out moments that are so shareable on social media. We are very proud of our work with The Famous Group to bring the Baltimore Raven to life.”