IDEA: BUILDING IMMERSIVE MEDIA STANDARDS

By Debra Kaufman

June 28, 2023

Reading Time: 5 Minutes

Since their inception, the media and entertainment industries have cycled through innumerable standards, formats, and media. Highly flammable nitrate film gave way to safety film; Ampex 2-inch “quad” videotape of the 1950s gave way to ¾-inch U-matic and then 1-inch Type C; and digital, file-based formats gave birth to a plethora of further standards. Throughout it all, standards-setting bodies and their members, representing all aspects of the industry, have ensured the adoption of uniform technical parameters.

IDEA presentation at CableLabs - Image courtesy IDEA

Hence IDEA, the Immersive Digital Experiences Alliance, a non-profit trade alliance formed in 2019 to create specifications and tools that enable interchange across the burgeoning immersive media sector. IDEA’s proposed suite of tools is royalty-free, open source, and extensible to allow for continual improvement. IDEA chairman Jules Urbach, who is also founder and chief executive of OTOY, says that IDEA’s mission is to solve problems by working as an alliance in cooperation with many interested parties, including the Metaverse Standards Forum, MPEG (the Moving Picture Experts Group), ASWF (the Academy Software Foundation) and the Khronos Group, an open, non-profit, member-driven consortium of over 150 industry-leading companies creating advanced, royalty-free interoperability standards for 3D graphics, augmented and virtual reality, parallel programming, vision acceleration and machine learning.

Shortly after its inception, in October 2019, IDEA released the first version of the Immersive Technologies Media Format (ITMF), designated version 0.9, as a baseline for handling the immersive media ecosystem, built on already-accepted technologies for representing immersive images and environments. The concepts baked into ITMF leverage 3D schemas used across the 3D industry and support plugins for all major tool sets, including the Unity and Unreal game engines.

Early this year, on February 17, IDEA released ITMF Version 2.0 with three new specification documents: the ITMF Scene Graph Specification, the ITMF Data Encoding Specification and the ITMF Container Specification. “It’s available for a free download,” says Urbach. “It’s a major refresh of three standards in the suite that will be useful for other standards groups and leading-edge users.” Urbach reveals that ITMF embodies a paradigm shift: it replaces the typical raster, used to transmit images from 720p to 4K, with a full 3D environment that is reconstructed during playback by a renderer. “The high-density holographic images, which might have 50 billion light rays, exceed the capabilities of the raster,” he notes. “This is a fascinating way to resolve the underlying problem.”

ITMF Scene Graph - Graphic courtesy IDEA

In Version 2.0, the image characteristics of the very dense 3D environments are described in terms of computer graphics primitives in a way the renderer can understand. “The data is a very light, compressed file that describes a very dense image,” Urbach explains. “The renderer rebuilds the images as described and sends them to the holographic display, avoiding the raster portion completely.” He adds that the technique was “perfected in the computer graphics industry,” and ITMF Version 2.0 simply uses it as the basis for image transmission. “It’s not a proprietary format,” he says. “We might use the term a confident interchange. Once you have your holographic content, you can compress it and send it to hundreds of holographic displays and headsets. The same image format can be represented on all of these different heterogeneous devices.”
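To make the concept concrete, the toy Python sketch below describes a scene as a small graph of computer-graphics primitives, compresses that description into a light payload, and lets a stand-in “renderer” on each device rebuild it. The node names and fields are purely illustrative and do not reflect the actual ITMF Scene Graph schema.

```python
# Minimal sketch of the "send the scene, not the raster" idea described above.
# Node and field names are illustrative only, not the ITMF Scene Graph schema.
import json
import zlib
from dataclasses import dataclass, field, asdict

@dataclass
class SceneNode:
    name: str
    primitive: str                      # e.g. "mesh", "light", "group"
    position: list = field(default_factory=lambda: [0.0, 0.0, 0.0])
    params: dict = field(default_factory=dict)
    children: list = field(default_factory=list)

def serialize_scene(root: SceneNode) -> bytes:
    """Encode the scene description as a small compressed payload.

    The payload describes how to rebuild the imagery, so it stays light
    even when the rendered result would be extremely dense.
    """
    return zlib.compress(json.dumps(asdict(root)).encode("utf-8"))

def render_on_device(payload: bytes, device_profile: dict) -> dict:
    """Stand-in for a device-side renderer.

    A real renderer would reconstruct the light field for the target
    display; here we just decode the scene and note the device profile.
    """
    scene = json.loads(zlib.decompress(payload))
    return {"scene": scene["name"], "target": device_profile["type"]}

if __name__ == "__main__":
    scene = SceneNode(
        name="lobby",
        primitive="group",
        children=[
            SceneNode("sofa", "mesh", [1.0, 0.0, 2.0],
                      {"asset": "sofa.obj", "material": "fabric"}),
            SceneNode("key_light", "light", [0.0, 3.0, 0.0],
                      {"intensity": 1200}),
        ],
    )
    payload = serialize_scene(scene)
    # The same compact payload can be sent to many heterogeneous displays.
    for profile in ({"type": "holographic_panel"}, {"type": "vr_headset"}):
        print(render_on_device(payload, profile), len(payload), "bytes")
```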

Jules Urbach, Chairman of IDEA

Urbach points out that the solution is an analog of the MPEG decoder chip used in digital TV. “In the future it will be possible to replace a simple decoder chip with a sophisticated physics-based renderer,” he says. “It’s an important part of the device ecosystem that reconstructs the ITMF format.” IDEA is working closely with display manufacturers including co-founding IDEA member Light Field Lab.

IDEA’s development takes place in working groups, two of which have been particularly active. “The Media Format Working Group is looking at descriptors of image environments,” says Urbach. “The group is looking for the best ways to reconstruct image parameters and build that into the ITMF format.” The working groups are staffed by representatives of IDEA member companies, “with four or five selected engineers from each company,” he adds.

The second priority is the Network Architecture Working Group, which includes the cable industry and other transmission enterprises. “They’re interested in optimizing carriage networks to work best for immersive media,” says Urbach. “The goal is to be compatible with legacy networks as opposed to making them obsolete. There are many things we can do to enhance the network and this group is considering what kinds of features – parsing, split-rendering and content caching – might work for ITMF performance.” He notes that a portion of the rendering could also be done at the headend, with the rest completed at the end-user’s device. “Very interesting considerations are being taken into account,” he says.
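The split-rendering idea can be illustrated with a hedged sketch in which an expensive, view-independent pre-pass runs (and is cached) at the headend, while each viewer’s device finishes the cheap, view-dependent work. The function names and cache policy here are hypothetical and are not drawn from any IDEA specification.

```python
# Hedged sketch of split rendering with headend content caching.
# Function names and the cache policy are hypothetical.
from functools import lru_cache

@lru_cache(maxsize=128)            # content caching at the headend
def headend_prepass(scene_id: str) -> dict:
    """Expensive, view-independent work (e.g., a lighting bake), done once."""
    print(f"headend: baking lighting for {scene_id}")
    return {"scene": scene_id, "baked_lighting": True}

def device_finish(prepass: dict, viewer_pose: tuple) -> dict:
    """Cheap, view-dependent work completed on the end-user's device."""
    return {**prepass, "pose": viewer_pose, "frame_ready": True}

if __name__ == "__main__":
    # Two viewers of the same scene reuse the cached headend pre-pass;
    # only the per-viewer finishing pass runs twice.
    for pose in [(0.0, 1.6, 0.0), (2.0, 1.6, -1.0)]:
        print(device_finish(headend_prepass("lobby"), pose))
```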

IDEA demonstration - Image courtesy IDEA

Also of interest are NeRFs (neural radiance fields), a way to generate novel views of complex 3D scenes based on 2D image capture. “A lot of progress has been made to reconstruct these photogrammetric 2D images into 3D images using neural networks,” Urbach says. “There will likely be some very good applications, although it might be too early to integrate this technology into ITMF.” On its website, IDEA gives a tip of the hat to burger chain McDonald’s and artist Karen X Cheng, who created a 30-second ad for Lunar New Year that used NeRF technology to create a 3D scene.
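For readers unfamiliar with the technique, the toy sketch below shows the core NeRF recipe: a radiance field maps a 3D point to color and density, and a novel view is formed by volume-rendering samples along each camera ray. Here the field is an analytic stand-in (a solid sphere) rather than a trained neural network.

```python
# Toy sketch of the NeRF idea: query a radiance field along a camera ray
# and alpha-composite the samples. A real NeRF replaces radiance_field()
# with a trained MLP; this stand-in is purely illustrative.
import numpy as np

def radiance_field(points: np.ndarray, view_dir: np.ndarray):
    """Return (rgb, density) per sample point; a real NeRF uses an MLP."""
    dist = np.linalg.norm(points, axis=-1)
    density = np.where(dist < 1.0, 5.0, 0.0)           # solid unit sphere
    rgb = np.tile(np.array([0.9, 0.4, 0.1]), (len(points), 1))
    return rgb, density

def render_ray(origin, direction, n_samples=64, near=0.5, far=4.0):
    """Alpha-composite samples along one ray (standard volume rendering)."""
    t = np.linspace(near, far, n_samples)
    delta = t[1] - t[0]
    points = origin + t[:, None] * direction
    rgb, sigma = radiance_field(points, direction)
    alpha = 1.0 - np.exp(-sigma * delta)
    # Transmittance: how much light survives up to each sample.
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1] + 1e-10]))
    weights = alpha * trans
    return (weights[:, None] * rgb).sum(axis=0)

if __name__ == "__main__":
    origin = np.array([0.0, 0.0, -3.0])
    direction = np.array([0.0, 0.0, 1.0])
    print("rendered pixel colour:", render_ray(origin, direction))
```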

IDEA is also looking at the issue of provenance: the origins of 3D objects and the associated intellectual property considerations. “Emerging in the last couple of years are ways to track where all these 3D objects came from and to reconstruct that path into an asset database with the contractual and copyright information,” says Urbach. “ITMF might be able to offer solutions for this. Because ITMF’s scene graph tracks all the objects, we have a framework in place that we could leverage to provide provenance as well. The combination of scene graph provenance and blockchain cryptography can expand to provide important tool sets for media authentication and verification in an age of synthetic artificial intelligence (AI) imagery.”
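As a rough illustration of how a scene graph could carry provenance, the sketch below attaches an origin record to each asset and chains the records with hashes so that tampering is detectable. The record fields and chaining scheme are hypothetical and are not part of ITMF or any blockchain standard.

```python
# Illustrative sketch of scene-graph provenance: per-asset origin records
# chained with SHA-256 hashes. Record fields are hypothetical.
import hashlib
import json

def record_hash(record: dict, prev_hash: str) -> str:
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def build_provenance_chain(assets: list) -> list:
    """Return (record, hash) pairs; each hash covers its predecessor,
    so the whole asset history can be verified in order."""
    chain, prev = [], "genesis"
    for record in assets:
        h = record_hash(record, prev)
        chain.append((record, h))
        prev = h
    return chain

def verify_chain(chain: list) -> bool:
    prev = "genesis"
    for record, h in chain:
        if record_hash(record, prev) != h:
            return False
        prev = h
    return True

if __name__ == "__main__":
    assets = [
        {"node": "sofa", "source": "scan_2022_03", "license": "CC-BY-4.0"},
        {"node": "key_light", "source": "studio_rig", "license": "proprietary"},
    ]
    chain = build_provenance_chain(assets)
    print("chain valid:", verify_chain(chain))        # True
    chain[0][0]["license"] = "public domain"          # tamper with a record
    print("after tampering:", verify_chain(chain))    # False
```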