University College London and Mo-Sys Engineering KTP 21_22 Round 4
Knowledge Transfer Partnership
The aim is to make Mo-Sys the leader in AI-enabled virtual and remote production in the film, TV and broadcast industry. The KTP will give the company's robotic camera heads a new auto-framing capability, which will transform virtual and remote production by making it smarter, simpler and more accessible.
SMARTLENS: Closing the gap between real and virtual filming through smart high-end lens calibration
The SMARTLENS project leverages Mo-Sys' leadership in studio-quality virtual production and opens broad new markets for affordable, high-quality video production.
Video is ubiquitous: it comprises 82% of all Internet traffic and is displacing other forms of media in advertising, education, news and beyond, attracting professionals from across the creative industries (£268 Bn) to explore and participate in new forms of content production.
'Virtual Production' technology allows filmmakers to combine images captured on camera with computer-generated elements. The SMARTLENS project creates a new automated method for the essential step of lens calibration, delivering significant cost savings (and carbon savings): production teams no longer need to travel to remote locations to capture difficult shots.
This is a collaborative project building on a long-standing collaboration with Professor Simon Julier (UCL/TMI) to develop novel AI-based automated lens calibration. All lenses distort the path of light in some way: changing the focal length changes the field of view of the lens, and straight lines in the real world do not necessarily appear as straight lines in the image plane. Failure to account for this distortion means that the virtual graphics do not line up with the real world, and any notion of the graphics being anchored in the real world is lost. Conventional lens calibration techniques are slow and cumbersome, requiring the operator to repeatedly measure targets and camera movements by hand.
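To make the distortion problem concrete, the sketch below (illustrative Python, not Mo-Sys code) applies the standard two-parameter radial model from the computer-vision literature to points on a straight line; the coefficient values are assumptions chosen purely for demonstration.

```python
import numpy as np

def apply_radial_distortion(points, k1, k2):
    # Standard two-parameter radial model on normalised image
    # coordinates: each point is scaled by (1 + k1*r^2 + k2*r^4).
    x, y = points[:, 0], points[:, 1]
    r2 = x**2 + y**2
    factor = 1.0 + k1 * r2 + k2 * r2**2
    return np.stack([x * factor, y * factor], axis=1)

# Five points on a straight horizontal line in the scene...
line = np.stack([np.linspace(-1.0, 1.0, 5), np.full(5, 0.8)], axis=1)
# ...acquire different y values under barrel distortion (k1 < 0),
# so the imaged line is curved, and virtual graphics rendered with a
# distortion-free camera model would no longer line up with it.
print(apply_radial_distortion(line, k1=-0.2, k2=0.05))
```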
SMARTLENS develops a set of algorithms and techniques to automate the calibration approach. First, a suitable model of lens parameters will be chosen. Conventional computer vision models do not describe effects such as depth-dependent radial distortion for out-of-focus images, so we will choose a suitable model from the photogrammetry literature. Second, we will develop the back-end optimisation techniques that fit the parameters to suitable data. Third, we will investigate the use of various kinds of calibration targets, including points and lines, to determine which are most robust across the range of operating conditions. Finally, we will identify the working practices that give the best performance.
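The back-end fitting step could, in its simplest form, look like the least-squares sketch below. The two-parameter radial model and the synthetic "detections" are assumptions for illustration only; the project will select a richer photogrammetric model (e.g. one with depth-dependent distortion) and fit it to real target observations.

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(params, ideal_pts, observed_pts):
    # Gap between where the current parameter estimate predicts the
    # calibration targets should appear and where they were observed.
    k1, k2 = params
    r2 = np.sum(ideal_pts**2, axis=1)
    factor = 1.0 + k1 * r2 + k2 * r2**2
    return (ideal_pts * factor[:, None] - observed_pts).ravel()

# Synthetic target detections standing in for real measurements:
# points distorted by a "true" k1 of -0.2 and k2 of 0.0.
rng = np.random.default_rng(0)
ideal = rng.uniform(-1.0, 1.0, (50, 2))
observed = ideal * (1.0 - 0.2 * np.sum(ideal**2, axis=1))[:, None]

fit = least_squares(residuals, x0=[0.0, 0.0], args=(ideal, observed))
print(fit.x)  # recovers approximately [-0.2, 0.0]
```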
Mo-Sys is the largest virtual production technology provider today, with a substantial IP portfolio and a sophisticated high-end customer base (Sony, Warner Bros, Disney, Netflix, BBC, NHK, Fox, ESPN, Sky, CNN). Credits range from BBC's Match of the Day to films such as Gravity and Life of Pi. Through this project, the collaboration continues to innovate, extending Mo-Sys' lens calibration and tracking systems.
Translating Advanced Camera Tracking Technology to High Precision and High Reliability Indoor Navigation Sensing Enabling More Applications for the Growing Robotics Market
"Mobile robotics have already changed the face of industry. By moving materials and products in an automated manner, they offer efficient and cost-effective ways to store and manage the flow of products through many industrial activities.
Automated Guided Vehicles (AGVs) are used for this purpose with minimal human intervention. Amazon, in particular, uses AGVs extensively within its warehouses to speed up the collection and shipping of goods.
However, to carry this out well, the robots must know where they are (localisation) and where they need to go, and must move in such a way that the materials are transported safely and without damage (control). AGVs use a range of guidance systems to do this, ranging from simple wire, tape, or magnetic systems to more complex inertial, laser-target, or Simultaneous Localisation And Mapping (SLAM) systems.
For simple, repetitive motions, and for open environments that remain largely unchanged, wire/tape and similar guidance solutions are usually adequate. However, where the factory environment is difficult (e.g. complex pipework), where high precision is required for coupling or joining containers and parts, and particularly in safety-critical environments with dangerous chemicals, most current AGV solutions fall short in terms of accuracy, reliability, and functionality.
Mo-Sys has developed a world-leading camera positioning and tracking system for use in the media industries. "StarTracker" is an upwards-looking visual sensor that orientates itself relative to randomly applied reflective stickers. The method is mature, patented, and has been used for over three years in TV green-screen studios. It is robust, immune to contrast changes, and can work in darkness.
This project intends to develop this technology, primarily in the areas of accuracy and control, for AGV applications. It would require neither floor-based guiding systems nor laser scanners that try to recognise a position within a dynamically changing environment. Once the stickers are on the ceiling, all objects refer to the same world frame and can collaborate in a smart factory. StarTracker would be able to guide an AGV robustly and precisely within an unlimited indoor area.
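StarTracker's actual estimator is proprietary, but the underlying principle can be illustrated with a generic Kabsch/Procrustes sketch: given matched sticker positions between a stored ceiling map and the current camera view, a rigid transform recovers the sensor's pose in the map frame. Everything below is an assumed, simplified 2D version for illustration.

```python
import numpy as np

def estimate_pose_2d(map_pts, observed_pts):
    # Kabsch/Procrustes alignment: find the rotation R and translation
    # t such that observed ~= R @ map + t, given matched sticker
    # correspondences between the ceiling map and the current image.
    mc, oc = map_pts.mean(axis=0), observed_pts.mean(axis=0)
    H = (map_pts - mc).T @ (observed_pts - oc)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # reject a mirrored (reflection) solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = oc - R @ mc
    return R, t  # inverting (R, t) gives the AGV's pose in the map
```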
Aside from benefits in cost, time, and productivity for the relevant industries, the technology would have numerous wider benefits: reducing human error, increasing safety in hazardous environments, and freeing humans from monotonous tasks for more value-added activity. It also has the potential to trickle down into healthcare, domestic, and other robotic applications.