OBS produces its video content in ultra-high definition and high dynamic range, which should improve the level of detail and color in each shot. Content is also captured in all sorts of formats: vertical video for watching clips on phones, 8K video for high-definition broadcasts, and 360-degree shots for total immersion.
OBS claims to have more than doubled its multi-camera systems, which use traditional sports cameras to capture multiple angles of the action and slow-motion replays. It will also use cinematic camera lenses, which can capture more artistic shots, like the dramatic depth-of-field shifts you’ve probably seen in movies. The problem with these complex shots is that the time it typically takes to process them has prevented their use in live production. But OBS is leveraging AI and cloud technologies to speed up processing enough to use these shots in its live coverage. Exarchos says its new processes enable shots that were previously impossible to capture and present live, like 360-degree replays that spin the viewer around an athlete as they navigate the air.
“It’s the effect that exists in The Matrix,” Exarchos explains. “A film that you could make in a cinema, you can make live.”
OBS is also recording audio in 5.1.4 surround sound, with the aim of capturing immersive sound from venues during events and during pitchside interviews with athletes. This, along with elements such as augmented reality stations that give people a glimpse of what the Olympic field looks like, is intended to give those at home a sense of being closer to the Games.
“If we repeat the previous Games, which were successful, we will have failed,” says Exarchos. “Because as in sports, it’s all about innovating, breaking new boundaries and going further and further.”
Technological testing grounds
As expected in 2024, artificial intelligence tools will be widely used during the Olympic Games.
Broadcasters like Olympic Broadcasting Services and NBC will use AI to gather key moments from thousands of hours of footage, neatly package them, and deliver them directly to viewers. Some companies are betting even bigger on AI: NBC will use an AI-generated version of legendary sportscaster Al Michaels’ voice to narrate daily highlight recaps on Peacock. The team trained its generative AI voice engine on Michaels’ past TV appearances, and the results sound smooth yet still undeniably weird.
As you watch live events, AI will generate key information in real time and display it on screen: statistics about athletes, the probability that they’ll make a shot or beat the clock, and artificially augmented views of what is happening on the field. AI’s foray extends beyond the competition itself; NBC is incorporating AI into its advertising platform, with the aim of better personalizing the ads broadcast during breaks.
This exorbitant broadcast bacchanal also doubles as a training ground for these new technologies. NBC is using the Olympics as the first major test of its Multiview capability and user customization features, so expect these things to pop up more often in regular live sports broadcasts. According to an NBC representative, the company hopes the technology debuting during the Paris Olympics can be deployed for other live sporting events, and even non-sports broadcasts like the Macy’s Thanksgiving Day Parade.
Ultimately, Exarchos says, the goal of these technologies is to make people feel more connected to these events and the people participating in them, especially after the last two Olympics were mired in pandemic-related restrictions that limited the number of people who could attend.
“We’re in a time where people have a huge desire and longing for physical experiences, especially with other people,” says Exarchos. “Sport is an important catalyst for that.”