SPORTS' VOLUMETRIC DASH FOR THE HOLOGRAPHIC END ZONE

By Beth Marchant

September 20, 2021

Reading Time: 8 Minutes

Sports broadcasters, leagues and rights holders have long increased fan immersion through ever-greater camera and display resolutions. The fidelity is striking, but flat 2D broadcasts never quite replicate the feeling of being in a stadium with thousands of other fans. Now, thanks to rapid advances driven by sports broadcasting’s recent adoption of real-time game engines and 3D volumetric video capture, that’s about to change. Game on, volcap.

Rich in lifelike detail because it records moving images of the real world in three dimensions, volumetric capture, or free viewpoint video, adds depth to virtual applications in a way 360-degree video never could. The footage is captured with legions of cameras, both on green-screen stages and out on the field, producing more realistic digital human avatars than earlier 2D and 3D graphics techniques. For sports, that means a virtually unlimited choice of views of real athletes, shot by real cameras, doing what they do best during game play. It also means any number of viewers can watch those avatars from any angle at the same time. Today, these digital twins can be examined in freeze-frame on the field or transported from the broadcast studio to a stage, or to fans in the stadium and at home.
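To make the "any angle" idea concrete, here is a minimal sketch, assuming nothing about any vendor's pipeline, of what free viewpoint means once a frame exists as 3D data: the same captured points can be re-projected through whatever virtual camera each viewer picks. The `look_at` and `project` helpers and the random stand-in point cloud are illustrative inventions, not part of any broadcast system.

```python
# A minimal sketch of the free-viewpoint idea: one 3D frame, many chosen views.
import numpy as np

def look_at(eye, target, up=np.array([0.0, 1.0, 0.0])):
    """Build a world-to-camera rotation and translation from a chosen viewpoint."""
    fwd = target - eye
    fwd = fwd / np.linalg.norm(fwd)
    right = np.cross(fwd, up)
    right = right / np.linalg.norm(right)
    true_up = np.cross(right, fwd)
    R = np.stack([right, true_up, -fwd])   # rows are the camera axes
    t = -R @ eye
    return R, t

def project(points_xyz, eye, target, focal=1000.0, cx=960.0, cy=540.0):
    """Project Nx3 world-space points into a virtual pinhole camera."""
    R, t = look_at(np.asarray(eye, float), np.asarray(target, float))
    cam = points_xyz @ R.T + t             # world space -> camera space
    cam = cam[cam[:, 2] < -1e-6]           # keep only points in front of the camera
    z = -cam[:, 2]
    u = focal * cam[:, 0] / z + cx
    v = focal * cam[:, 1] / z + cy
    return np.stack([u, v], axis=1)

# One captured frame (random stand-in for a player's point cloud)
frame = np.random.rand(10_000, 3) * [2.0, 2.0, 1.0]
# Two different fans, two self-chosen camera angles, same underlying data
pixels_a = project(frame, eye=[5, 2, 5], target=[1, 1, 0.5])
pixels_b = project(frame, eye=[-4, 0.5, 3], target=[1, 1, 0.5])
```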


Augmented and mixed reality applications on current display devices are giving fans a sense of the holographic future, but it’s only a taste of what will be possible when volumetric video capture and light field-based rendering devices like the displays being developed by Light Field Lab combine to recreate the action in front of groups of fans — no headsets required. Being able to move autonomously in a truly holographic 3D space and choose our view will make us a part of the action, whether in a stadium or in our homes. We’ll see sports that take place far from stadiums, like sailing or cycling, and we’ll experience them from exhilarating perspectives, like a player’s-eye view on the field or the court. None of it is possible without volumetric video capture.

“You know when you feel it in your gut, and you know this is going to be the future — that’s how I feel about volcap,” says Simon Thompson, head of enterprise partnerships at Verizon Media’s Yahoo Ryot. Thompson isn’t just passionate about volcap; he’s also a long-time rugby fan who wants more access to his favorite games and players. He knows he’s getting closer as volumetric enters the workflow. “Some of the barriers are starting to fall away, so you can start to see where, in a few years, it’s going to be.”

Volumetric video capture, especially in gaming, is already an explosive market, one that analysts expect to grow from the $578.3 million business it was in 2018 to a $2.78 billion industry by 2023. Most trace its earliest days to the set of 2007’s Beowulf, where actors, dotted with motion-capture sensors inside an infrared cube, were virtually filmed from every angle in three-dimensional space. But the potential gold mine of volumetric video captured by multiple high-resolution cameras in a live sports environment is what drove Intel Sports, Sony, Canon, sports leagues and sports broadcasters to invest heavily in the technology. By 2019, a critical mass of tangible visions of what volumetric capture could mean to the sports viewing experience began to appear.


Sky Sports and its creative partners at London’s Dimension Studio presented the world’s first volumetrically captured golfer during the Open Championship that July at Northern Ireland’s legendary Royal Portrush Golf Club on the North Atlantic coast. That same year, Intel Sports teamed with English Premier League champions Manchester City to introduce True View, its volumetric camera system, and later partnered with the NFL to roll it out in the U.S. NTT Docomo, the Japanese telco giant, created a volumetrically captured digital twin of the competing cyclists and the entire course of the 2019 Tour de France. Canon may have gotten there first when it introduced its Volumetric Free Viewpoint Video System in 2016 during the final of Japan’s premier soccer J1 League Cup. The company bowed an updated version of the system at the 2019 Rugby World Cup, hosted that year in Japan. Segments of those volumetrically captured rugby matches grabbed Thompson’s attention and became the centerpiece of a buzz-worthy demo Verizon presented at this year’s Cannes Lions.

To create the digital golf avatars shown during live broadcast coverage of The Open, Sky Sports teamed with the London-based content studio Dimension. The studio captured the golfers on location inside its mobile Polymotion Stage, then rendered and cleaned up the files. An expanded version of Dimension’s stage, which began in 2019 as a compact green-screen dome outfitted with 120 calibrated Nikon cameras, is now housed inside a massive semi-trailer the studio takes on the road with its broadcast partners to film athletes from every angle, wherever they are.
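A simplified sketch of what "calibrated" means for a rig like this (not Dimension's actual workflow): each camera's intrinsics are estimated from views of a known pattern so that every camera's pixels can later be related to the same 3D space. The checkerboard size and file paths below are placeholders, and OpenCV's standard calibration routine stands in for whatever tooling the studio actually uses.

```python
# Hypothetical per-camera calibration pass for one camera in a multi-camera rig.
import glob
import cv2
import numpy as np

BOARD = (9, 6)                                  # inner corners of the checkerboard
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2)  # board-plane coords

obj_points, img_points = [], []
for path in glob.glob("calibration_frames/cam042_*.png"):        # placeholder paths
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Per-camera intrinsics (focal lengths, principal point) plus lens distortion
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("camera matrix:\n", K)
```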

Beyond the cool factor, this kind of site-specific volcap still tends to be very expensive. Some are trying to solve these production and cost challenges with software that makes it easier for content creators to work with volumetric data inside the tools they already use. That’s what motivated software maker The Foundry to add volumetric compositing technology to an upcoming version of Nuke, its industry-standard 3D compositing platform. To get there, the UK-based company launched an 18-month, government-funded research project, dubbed VoluMagic, that concluded last year.

Working with volumetric data in post-production is one thing. But live-streaming volumetrically captured data is another. Condense Reality, a startup based in Bristol, England, announced last year that its software technology, when paired with a robust game engine, lets broadcasters stream volcap-recorded live events to viewers instantly. Quintar, founded by former Intel Sports executives, has created a platform for AR app developers with minimal latency, letting fans in the stands visualize statistics and game data through AR apps and giving fans at home a 3D tabletop version in near-sync with the live game.
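As a rough illustration of the low-latency streaming pattern described above, and not Condense Reality's or Quintar's actual protocol, one common approach is to compress each captured frame and push it to clients as a length-prefixed binary message. The `send_frame` and `recv_frame` helpers below are hypothetical.

```python
# Hypothetical per-frame streaming over a plain TCP socket.
import socket
import struct
import zlib
import numpy as np

def send_frame(conn: socket.socket, frame_id: int, points: np.ndarray) -> None:
    """Compress one volumetric frame (Nx3 float32 points) and send it."""
    payload = zlib.compress(points.astype(np.float32).tobytes(), level=1)  # fast, shallow compression
    header = struct.pack("!IQ", len(payload), frame_id)   # 4-byte length + 8-byte frame id
    conn.sendall(header + payload)

def recv_frame(conn: socket.socket):
    """Read one length-prefixed frame and rebuild the point array."""
    header = _recv_exact(conn, 12)
    length, frame_id = struct.unpack("!IQ", header)
    points = np.frombuffer(zlib.decompress(_recv_exact(conn, length)),
                           dtype=np.float32).reshape(-1, 3)
    return frame_id, points

def _recv_exact(conn: socket.socket, n: int) -> bytes:
    """Block until exactly n bytes have arrived."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed mid-frame")
        buf += chunk
    return buf
```

In practice the compression, transport and sync with the live broadcast are far more sophisticated; the point of the sketch is only that each frame must travel as a self-contained, quickly decodable unit for the "near-sync" experience to hold up.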

But a broader shift in consumer behavior must come before volumetric goes mainstream. That’s what Verizon and its content partners sought to illustrate with their fantasia on Canon’s Free Viewpoint capture of the Rugby World Cup Semi-Final. In a keynote presentation this June at Cannes Lions, the advertising industry’s premier awards event, Verizon Media CEO Guru Gowrappan showed a striking set of volumetrically captured examples created with assets that included Formula 1 racing cars and the rugby match, composited in wildly different ways by different studios. One transported the players to a forest, a city and a battlefield; the other turned them into spectral particle effects. All of it streamed over 5G. (Start watching the “Mainstage session” video on Verizon Media’s website at about 12:20 to see the presentation.)




“There’s this interesting thing of changing user behavior from the passive way of taking in a game on a huge TV screen where you just sit back, eat your chips and watch your sport,” says Thompson, who worked closely with Gowrappan on the presentation. “If we are trying to get people to pick the camera angles themselves, then that’s the kind of change we want.”

VR wearables and, eventually, light field-based holograms, he says, will drive that change. “Once they can walk around and see it all from every angle, of course more users will want their sport in 3D.”

Defining a standard volumetric format is key to pushing content forward. Thompson works closely with Denny Breitenfeld, director of volumetric technology at Verizon and the president of the newly launched Volumetric Format Association (VFA). The group wants to demystify the volcap process through education and outreach, but also to map out a standard set of specs that content creators across entertainment, live events and sports can use as the volumetric ecosystem evolves.

Thompson has also been working with VFA members Canon and Unity. Market-leading game-engine developers Epic and Unity are turning to sports for proof-of-concept projects. Unity is investing heavily in volcap and is poised to ramp up its involvement in live sports with the recent hire of former Liverpool Football Club CEO Peter Moore, an Electronic Arts, Microsoft Xbox and Sega alum. At Super Bowl LV, Epic’s Unreal Engine sent a fiery particle-effect logo and a football avatar in augmented reality across the field during the CBS Sports broadcast.

For the Cannes example, Ryot gave each agency, including its own studio, two weeks to create projects that would show fellow creatives how easy volumetric data was to work with. The results were impressive given the short development window. “But, honestly, it’s not super-easy at all,” admits Thompson. “It was a massive amount of data that we had to play around with. Right now, the data comes through like a big soup, and the players aren’t individually tagged up.” Once players can be more easily isolated, he says, a creative team could work with players individually or group them as needed. And, as volumetric rendering quality improves, close-ups on the athletes will become more common — a development that will come with new rights issues.
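One plausible way to pull individual players out of that undifferentiated "soup", sketched here purely as an assumption rather than Ryot's actual tooling, is density-based clustering on the raw point cloud. Scikit-learn's DBSCAN is used for illustration, and the `load_volumetric_frame` loader in the usage note is hypothetical.

```python
# Splitting one raw volumetric frame into per-player point clouds.
import numpy as np
from sklearn.cluster import DBSCAN

def split_players(points: np.ndarray, eps: float = 0.25, min_samples: int = 50):
    """Group a raw Nx3 point cloud (in meters) into candidate per-player clouds."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points)
    players = []
    for label in sorted(set(labels) - {-1}):      # -1 is DBSCAN's noise label
        players.append(points[labels == label])
    return players                                # one array per detected body

# Hypothetical usage on a single captured frame (loader not shown):
# frame = load_volumetric_frame("rugby_semifinal_frame_01234.ply")
# players = split_players(frame)
# print(f"isolated {len(players)} candidate players")
```

Real pipelines would also have to track identities across frames and attach metadata such as jersey numbers, which is presumably what "tagged up" would mean in practice.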

Beyond rights, other barriers like bandwidth and data processing have several laps to go before we get to the finish line. “The tipping point for this will be broadcast quality, real time and people having devices that will let them really see in 3D,” he adds. “Light-field devices would let them move around unencumbered, but I think that will come later. Once we’ve got that real-time content at broadcast quality, then the implications are so big. It’ll no longer just be about the broadcast, which will benefit, but about all brands.”

For Matthew O’Connor, Innovation Leader at NTT Distribution in London, new capture formats will always create new opportunities that inch closer to a truly holographic ideal. He says NTT Docomo’s R&D Center has been experimenting with images that look similar to holograms for some time, showcasing proofs of concept as early as the mid-2000s. “One video they released in Japan showed where these concepts are all being pointed during live events,” he said. “You have a stadium of people who are watching a gigantic sumo wrestler fight but they see them as two giant entities being projected in front of them. These guys are not any bigger than the average sumo wrestler and they are fighting somewhere else, but they’re being projected as two superhuman, almost Ghostbuster-type characters, like the Stay-Puft Marshmallow Man. It’s created this unique experience above and beyond an ordinary sumo fight, where the immersiveness is more important than the actual event. You’re creating new formats and viewing experiences.”


O’Connor works with a team helping to bring ultra-reality viewing to Major League Baseball and to transmit 12K composite images of the ball game to giant screens for fans in other viewing locations. NTT’s “Tokyo 2020 5G Project” deployed a similar example at the Olympic Games, bringing the typically remote sailing competition closer to spectators on shore. In both NTT projects, the action is captured by a fleet of 4K cameras and sent in real time over 5G. The feeds are then stitched into an ultra-wide 12K image by NTT’s Kirari! technology, using volumetric compression and rendering, and shown on a giant screen close to spectators. Referring to continued pandemic-based restrictions on live events, NTT calls it a “new way to watch sports events in the age of the new normal.”
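As a rough sketch of the ultra-wide compositing step, under the assumption that the synchronized feeds are stitched into one panorama before display (this is not NTT's Kirari! code), the idea might look like the following, using OpenCV's off-the-shelf stitcher. File names are placeholders.

```python
# Stitching several time-aligned camera frames into one ultra-wide panorama.
import cv2

def stitch_wide_frame(camera_frames):
    """Stitch a list of time-aligned frames into a single panoramic image."""
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(camera_frames)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return panorama

# Hypothetical usage with four synchronized 4K feeds:
# frames = [cv2.imread(f"cam_{i}.png") for i in range(4)]
# wide = stitch_wide_frame(frames)
# cv2.imwrite("ultrawide_frame.png", wide)
```

Doing this at 12K, in real time, for every frame of a live game is exactly the kind of bandwidth and processing challenge the earlier quotes allude to; the sketch only shows the geometric idea, not the throughput problem.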

To predict where all this is headed, O’Connor cites the analysis of Matthew Ball, a venture capitalist and the former head of Global Strategy for Amazon Studios. A prolific and much-quoted essayist on the interconnected past and future of technology, Ball describes a pattern of innovation in sports coverage on television and other media. “Basically, he says that while a live sports experience may never change, new formats will continually emerge to enhance that experience,” creating new billion-dollar businesses along with them, O’Connor says. “Take, for example, esports, or the fact that the Olympics is now full of sporting events that didn’t exist 10 years ago. The concept of esports wouldn’t exist without gaming engines or global concurrency. Holographic technology will do the same thing. It will create new formats, and they will be tested in sports and entertainment before they will be anywhere else, because those are the places that will always draw the biggest audiences.”