Public Funding for Jingo Juice Limited

Registration Number 09880283

STREAM [Synchronous Tools for Realtime Experiential Activity Management]

£100,771
2022-10-01 to 2023-12-31
Responsive Strategy and Planning
STREAM [Synchronous Tools for Realtime Experiential Activity Management] is a suite of tools designed to support the streamlined delivery of co-present, high-fidelity XR experiences to mass audiences. The project is developed by Marshmallow Laser Feast, All Seeing Eye and The VR Lab at the Digital Cultures Research Centre, University of the West of England (UWE Bristol). At present there is a critical barrier to offering the immersive industry high-quality, real-time content alongside high audience or user numbers, and this collaborative R&D bid will address that issue. STREAM will investigate removing technical dependency on key hardware in the delivery of All Seeing Eye and Marshmallow Laser Feast's industry-leading location-based experiences, as well as evaluating the effectiveness of the tools through industry-leading user-centred testing. This research is applicable to all stakeholders in Location Based VR (LBVR) and the broader immersive sector. The tools will shift industry standards for the delivery of immersive Location Based Experiences (LBEs), with impact across entertainment, visitor experience, arts & culture and beyond. Our industrial research and outputs will provide vital insight for the region to build on through future work with immersive technology. This work will provide a new model for immersive content delivery, which will create market differentiation for the partners whilst offering the sector at large a more inclusive and commercially viable model.

Project objectives include:

* Disrupt the way content is delivered to audiences by re-thinking the delivery pipeline.
* Augment and improve existing solutions by increasing audience throughput at least threefold.
* Offer a suite of new tools for deployment solutions.
* Validate the processes developed, through user testing in environments representative of real-life operating conditions.
* Explore the commercial application of the toolkit through licensing.
* Increase skills and expertise within the team and the wider sector, which will support commercial viability and drive future sector development through innovation.
* Share relevant learnings with the WECA region, targeted at those who can benefit the most from such tools, namely: 1. those who need to increase confidence in commissioning and presenting immersive work, or 2. those who could offer audiences compelling new content if there were commercial viability.
* Offer the wider sector market-leading solutions to support robust commercial delivery of immersive content as LBEs.
* Share world-first research from the West of England region internationally, giving insight into the immersive audience experience and the innovation processes between the cultural and technology sectors.

Immersive performances of the future

£320,440
2020-12-01 to 2021-12-31
Collaborative R&D

The Virtual Being Entertainment System (ViBES)

£74,994
2020-06-01 to 2021-03-31
Feasibility Studies
The Virtual Being Entertainment System (ViBES) by Marshmallow Laser Feast (MLF) is a solution to strengthen the virtual production tools available in the market, based on observations made in real time during the Covid-19 pandemic. ViBES is about the ability to build content for advanced digital offerings remotely and natively. The problem we will focus on is the virtual and distributed production of characters in performative environments. ViBES will allow remote contribution and collaboration: creatives will be able to contribute assets (motion capture data and audio) remotely from input devices. The MLF system will take that data as input, then output it to multiple platforms for viewing. A performer captures movement from home and, through ViBES, can puppeteer an avatar that a director can view on their smartphone, browser or VR headset. With ViBES you could take data from anyone in the world with the right hardware, then stream and share it via general plugins for Unreal and Unity. Working with multiple inputs and outputs will make the tool flexible enough to accommodate a wide range of users. Creatives such as dancers and actors will be able to provide inputs such as movement remotely, which can then be translated into a usable format that drives an avatar or similar character on different platforms and devices, such as an Android phone, a Chrome web browser or an iOS device. ViBES provides agility, allowing efficient remote work to take place alongside cost-effective solutions that companies in the virtual production sector (once re-opened) can integrate into their workflow.

Our objectives are to:

1. Contribute to a more innovative and financially resilient offering from organisations in UK arts and culture.
2. Support the continued employment of UK-based performance talent, even in isolation or remote locations.
3. As a creative organisation ourselves, support our own pipeline and pivot to offering innovative at-home experiential content.

Completed to a functioning MVP in November 2020, the team will thereafter use the extension-for-impact grant to further refine the quality of the tool and implement learnings through user testing. This work will ensure production-level quality across a variety of fields in the creative industries. This feature, along with the existing features in the MVP, will be extensively tested in existing MLF productions as well as by selected specialist users. With this understanding we can fundraise or bootstrap the product to launch and sales. We also intend to use the extension-for-impact funding to bring in specialists to support strategy-making for marketing, audience reach / market penetration, pricing, competitor scanning and launch.
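The data flow described above runs from performer input devices, through the MLF system, out to multiple viewing platforms. As a loose illustration only, the sketch below shows a minimal relay that accepts motion-capture frames from a performer and fans them out to connected viewers. It assumes a JSON-over-WebSocket transport via the third-party `websockets` package; the role handshake, frame schema and port are hypothetical and are not part of ViBES.

```python
# Illustrative sketch only: a performer-to-viewers relay loosely modelled on
# the capture pipeline described above. Assumes the third-party `websockets`
# package; the role handshake and frame schema are invented for illustration.
import asyncio
import json

import websockets

viewers = set()  # connected viewer sockets (phone, browser or headset clients)


async def handle(ws, path=None):  # `path` kept for older websockets versions
    role = await ws.recv()  # first message declares "performer" or "viewer"
    if role == "viewer":
        viewers.add(ws)
        try:
            async for _ in ws:  # ignore viewer messages; exits on disconnect
                pass
        finally:
            viewers.discard(ws)
        return

    # Performer connection: forward every mocap frame to all current viewers.
    async for raw in ws:
        frame = json.loads(raw)      # e.g. {"t": 1234, "joints": {...}} (hypothetical)
        payload = json.dumps(frame)
        await asyncio.gather(
            *(v.send(payload) for v in viewers),
            return_exceptions=True,  # one slow viewer must not stall the stream
        )


async def main():
    async with websockets.serve(handle, "0.0.0.0", 8765):
        await asyncio.Future()  # serve until the process is stopped


if __name__ == "__main__":
    asyncio.run(main())
```

In the real system the fan-out would be handled through the Unreal and Unity plugins mentioned above rather than a hand-rolled relay; the sketch is only meant to show the one-to-many shape of the data flow.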

Immersive Performances of the Future: Marshmallow Laser Feast Continuity Funding

£146,092
2020-06-01 to 2020-11-30
Feasibility Studies
no public description

Immersive performances of the future

£376,277
2019-01-01 to 2021-12-31
Collaborative R&D
This Audience of the Future demonstrator project, led by the Royal Shakespeare Company (RSC), brings together for the first time a unique team of cultural industry practitioners and researchers who are ideally placed to inform and guide the next developmental stage of Live Performance. Virtual Reality (VR), Augmented Reality (AR) and Mixed Reality (MR) offer under-exploited opportunities for the UK cultural sector to make Live Performance more immersive. We have already seen a new era of theatre borrowing conventions from video games to tell stories, and we know that audiences want "to find fun, interactive experiences and share them with the world" (Brian Schwab, Head of Interaction Lab at Magic Leap).

Our consortium will explore what it means to perform live through multiple channels, and the future of real-time immersive performance connected across multiple stages. Audiences will no longer be bound by their location; instead they will connect through devices such as mobile phones, Extended Reality (XR) headsets and streams, whether in a Live Performance environment or at home. We will investigate the new workflows required to deliver this, and the new ways of making creative content that bring VR/AR/MR into theatre-making and Live Performance.

During 2019 each partner will develop a series of prototype projects, drawing on their extensive expertise in site-specific performance, whether in music and audio, audience development or theatre. We will develop new models for Live Performance that focus on the future needs of audiences alongside the development of new technologies, working in partnership with Magic Leap, Intel and Epic Games, and specialist companies such as Vicon and FBFX. This research and development (R&D) work will lead to our main demonstrator performance, at the heart of the RSC's autumn 2020 programme, connecting to a main stage Shakespeare production in the Royal Shakespeare Theatre in Stratford-upon-Avon. This project will broaden the possibilities of Live Performance, from digital broadcast, as is the case for live-to-digital work currently, to a mass distributed digital model on multiple platforms. Audiences will connect with the performance live, wherever their location, celebrating the strengths of digital connectivity and establishing a high quality Live Performance to be enjoyed, in a variety of ways, around the world.

Deformable Objects for Virtual Experiences (DOVE)

£303,669
2017-06-01 to 2018-11-30
Collaborative R&D
At present there are no solutions on the market that can rapidly generate a virtual reality 'prop' from a generic object and then render it into an interactive virtual environment outside of a studio. A portable solution such as this would enable the creation of deployable immersive experiences in which users could interact with virtual representations of physical objects in real time, opening up new possibilities for applications of virtual reality technologies in entertainment, but also in the sports, health and engineering sectors. Project DOVE, 'Deformable Objects for Virtual Environments', will combine novel algorithmic software for tracking deformable objects developed at the University of Bath's (UoBath) CAMERA research centre, interactive stereoscopic graphics for virtual reality, and an innovative configuration of existing hardware to create the Marshmallow Laser Feast (MLF) DOVE system. The project objective is to create turn-key tools for repeatably developing unique immersive experiences and training environments. The DOVE system will enable MLF to create mixed reality experiences such as live productions, serialised apps and VR products/experiences to underpin significant business growth and new job creation opportunities. The demonstrator application, Sweet Dreams, will showcase the innovations achieved through DOVE in the world's first VR dining experience, created in collaboration with FDEK.
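The system described above pairs deformable-object tracking with real-time rendering of the tracked 'prop' in VR. As a loose illustration of that per-frame loop only (the CAMERA tracking research itself is not reproduced here), the sketch below uses a naive nearest-point blend as a stand-in for the tracking step; the function names and the `renderer` interface are hypothetical.

```python
# Illustrative sketch only: a per-frame "capture -> track deformation ->
# update the virtual prop" loop in the spirit of the pipeline described above.
# A naive nearest-point blend stands in for CAMERA's tracking algorithms, and
# the renderer interface is hypothetical.
import numpy as np


def deform_template(template: np.ndarray, observed: np.ndarray) -> np.ndarray:
    """Pull each template vertex towards its nearest observed capture point.

    template: (N, 3) vertices of the virtual prop's rest-pose mesh.
    observed: (M, 3) points captured from the physical object this frame.
    """
    # Distance from every template vertex to every observed point.
    dists = np.linalg.norm(template[:, None, :] - observed[None, :, :], axis=-1)
    nearest = observed[dists.argmin(axis=1)]  # closest observation per vertex
    return 0.5 * template + 0.5 * nearest     # crude blend as a placeholder


def run_frame(template: np.ndarray, capture_frame: np.ndarray, renderer) -> np.ndarray:
    """One tick of the loop: track the deformation, then update the VR view."""
    deformed = deform_template(template, capture_frame)
    renderer.submit_mesh(deformed)            # hand the updated mesh to the renderer
    return deformed
```

A real system would replace the blend with the project's tracking model and stream the deformed mesh into a stereoscopic renderer on every frame; the sketch only shows the shape of the loop.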
