499,422
2024-04-01 to 2025-03-31
Collaborative R&D
The process of lighting a scene is one of the fundamental elements of creative expression in film and TV production: moving beams of light creating drama in a game show; a child carrying a torch into a darkened room; the silhouette of a neon-lit actor reflected in rain puddles. Technical lighting solutions and creative practices are well established for content shot on location or in a studio. In Virtual Production (VP), however, lighting is still a frontier for innovation.

VP refers to the process of blending physical and virtual environments. This is commonly achieved by filming actors in front of an evenly lit green-screen environment; these shots are then isolated and composited within a virtual scene that typically has numerous light sources. In this process, lighting does not inherently transfer between the physical and virtual elements. As a consequence, the visual cues that are crucial for a believable composition need to be 'faked': how actors' shadows and reflections are projected into the virtual scene, and how the lighting of the virtual world affects the actor. In filmmaking, such compositing effects can be added in post-production. In live and 'as live' TV production, however, they cannot currently be achieved in real time, severely limiting the commercial and creative potential of live virtual production.

This project will leverage the latest advances in Computer Vision AI and Generative AI, alongside green-screen virtual studio technology, to achieve real-time light interactions between physical and virtual worlds. In this innovative approach, a neural network is trained on synthesised green-screen data to simulate the different pathways light travels between the physical and virtual elements of a scene.
This innovation will enable live TV productions to apply established creative lighting practices to mixed-reality environments, achieving fully dynamic light simulation across both physical and virtual worlds. Bringing together leading creatives, engineers and VP specialists from dock10, AI researchers from the University of York and entertainment content producers from 2LE, the project will produce a proof-of-concept technology stack, new creative practices and a demonstrator production. The proposed R&D will be a critical innovation in responding to rapidly growing audience demand for live content that lies at the cross-over between physical and virtual worlds.
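The lighting gap the abstract describes can be made concrete with a minimal chroma-key composite. The sketch below (function name and thresholds are illustrative, not the project's method) keys out green-screen pixels and pastes the foreground over a virtual background; note that the foreground keeps its flat studio lighting, since no light from the virtual scene reaches it — exactly the limitation the proposed R&D targets.

```python
import numpy as np

def chroma_key_composite(fg, bg, key=(0.0, 1.0, 0.0), tol=0.4):
    """Naive green-screen composite (illustrative sketch only).

    Pixels close to the key colour become transparent; everything else
    is pasted over the virtual background. The pasted foreground keeps
    its studio lighting -- no light transfers from the virtual scene.
    """
    fg = np.asarray(fg, dtype=float)
    bg = np.asarray(bg, dtype=float)
    # Per-pixel distance to the key colour (0 for pure green-screen pixels).
    dist = np.linalg.norm(fg - np.asarray(key), axis=-1)
    # Hard-edged matte: 0 where the key colour dominates, 1 elsewhere.
    alpha = np.clip(dist / tol, 0.0, 1.0)[..., None]
    return alpha * fg + (1.0 - alpha) * bg
```

A real-time light-interaction pipeline would replace this fixed matte with learned shadow, reflection and relighting terms, which is where the trained neural network comes in.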
130,160
2020-12-01 to 2021-06-30
Collaborative R&D
306,221
2019-01-01 to 2021-06-30
Collaborative R&D
Esports is the term used to describe video games that are played competitively and watched by massive audiences. In 2017, over 388 million people worldwide watched esports, and the number of esports fans is projected to grow a further 50% by 2020 (Newzoo, 2018). Today's esports audience truly is an "audience of the future": esports fans are tech-savvy early adopters who regularly engage with new immersive experiences such as AR, VR and XR (Nielsen, 2017). Fans are highly social, engaging with each other via chat and social media. The typical esports fan consumes esports broadcasts on multiple screens, complementing coverage with statistics and visualisations of game data. In esports, every match ever played is recorded in depth and made publicly available.

This project will produce a new platform called WEAVR that leverages the data-rich environment of esports to transform the way esports (and, further down the line, physical sports) are experienced by remote audiences. WEAVR envisions immersive experiences that seamlessly stretch across virtual and physical spaces, multiple displays, mobile devices, VR video telepresence and augmented-reality overlays, enabling viewers to teleport between the live arena, virtual game worlds and augmented living rooms. Responding to fans' eagerness to learn and become better players, WEAVR will create cross-reality spaces in which fans immerse themselves in high-fidelity statistics, visualisations and data-driven stories that give them deep insights into the live match. WEAVR will move away from linear "one-for-all" coverage towards hyper-personalised experiences: tailored to each viewer's interests, fully interactive, and providing individualised insights by comparing each viewer's own amateur performance statistics to those of professional players.

Viewers will be able to share their individual viewing experiences with other WEAVR users and via social networks in real time, blurring the boundaries between consuming and creating. WEAVR will integrate large-scale, live audience analytics, enabling the project to generate insight into how audiences of the future engage with immersive experiences, and how this engagement can be exploited commercially. Through a consortium that includes ESL, the largest esports content producer in the world, as well as leading academics and innovators across VR/AR, AI, data-driven content production and broadcast, WEAVR will transform the experiences of millions of esports fans.
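The individualised insight idea — comparing a viewer's own amateur statistics to professional baselines — can be sketched in a few lines. The metric names and schema below are purely hypothetical, not WEAVR's actual data model.

```python
from statistics import mean

def personal_insight(viewer_stats, pro_stats):
    """Hypothetical sketch of a per-viewer insight.

    Expresses each of the viewer's per-match metrics as a fraction of
    the professional average (1.0 == pro level). Metric names are
    illustrative only.
    """
    return {
        metric: round(value / mean(pro_stats[metric]), 2)
        for metric, value in viewer_stats.items()
    }

# Example (invented numbers): a viewer averaging 0.3 kills/min against
# pros averaging 0.6 would see a "kills_per_min" score of 0.5.
```

A production system would of course draw both sides of the comparison from live match telemetry rather than hand-entered dictionaries.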