
106,543
2025-10-01 to 2025-12-31
Feasibility Studies
The UK electricity network stands to gain significant advantages from deep learning-based computer vision models for asset management: reduced analysis and data collection costs, improved output quality, and lower lifecycle costs from consistent assessments and better predictive models. Independent development of these models by individual networks is costly and inefficient, and yields inferior results, because no single network has sufficient data for rare defects and components. A "foundational model" is proposed: a central model, or set of models, continuously updated with shared industry-wide datasets. This collaborative approach would produce more reliable and robust models, be more cost-effective for consumers, and free networks to focus on value-adding activities.
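The core argument, that pooling industry-wide data makes rare defect classes trainable where any single network's data would not, can be illustrated with a minimal sketch. All network names, defect class names, and the example-count threshold below are hypothetical, purely for illustration:

```python
from collections import Counter

# Hypothetical per-network label inventories: each network alone
# holds too few examples of rare defect classes to train on.
network_labels = {
    "network_a": ["corrosion"] * 500 + ["cracked_insulator"] * 3,
    "network_b": ["corrosion"] * 400 + ["flashover_damage"] * 2,
    "network_c": ["vegetation_encroachment"] * 600 + ["cracked_insulator"] * 4,
}

MIN_EXAMPLES = 5  # illustrative threshold for a trainable class


def trainable_classes(labels, threshold=MIN_EXAMPLES):
    """Return defect classes with at least `threshold` labelled examples."""
    counts = Counter(labels)
    return {cls for cls, n in counts.items() if n >= threshold}


# Individually, each network can only train on its common classes...
per_network = {name: trainable_classes(lbls) for name, lbls in network_labels.items()}

# ...but pooling labels industry-wide makes some rare classes trainable too.
pooled = trainable_classes([l for lbls in network_labels.values() for l in lbls])

print(sorted(pooled))
```

Here "cracked_insulator" is untrainable for any single network (3 or 4 examples each) but crosses the threshold once the datasets are shared; genuinely scarce classes (2 examples industry-wide) remain a gap that only further shared collection closes.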
32,840
2022-04-01 to 2023-09-30
Collaborative R&D
Autonomous flight is a general-purpose technology with enormous commercial potential. As a result, substantial resources are being invested in solving it for Logistics and Advanced Air Mobility (AAM). We want to provide these companies with the world's most advanced solution for autonomous unmanned flight, and we took the first steps in FFP2. In FFP2 the consortium led by [sees.ai][0] developed a technology that enables BVLOS flights on demand, at low altitude and close to obstacles. In our system the pilot designs the mission; our UAS senses and maps the world in 3D using lidar and cameras, allowing it to fly autonomously close to and amongst infrastructure. The 3D map is sent in real time to the pilot, who supervises the mission remotely for safety. This technology provided the safety and risk mitigations that allowed us to obtain the first authorisation for routine BVLOS approved by the CAA. Although the system is designed to work in any congested area (including urban), the authorisation is limited to a number of predefined industrial sites in which we will build our safe flight record. In FFP3 we are pushing the technology and our operational safety case to extend our current capabilities to:

* Enable Atypical Airspace (AA) BVLOS inspection of assets in the public domain by:
  * Advancing the capabilities of our DAA (Detect and Avoid) system so it can detect vehicles and people on the ground and any approaching aircraft
  * Embedding our operations into the wider aviation ecosystem by integrating them into a commercial UTM system
  * Developing a system that can provide communications between UAS and pilot in areas with poor or no 4G/5G coverage
  * Leveraging our advanced spatial awareness and our integration into the aviation ecosystem to create a solid Concept of Operations that will allow us to obtain one of the world's first approvals for AA-BVLOS
* Enable a pilot to control multiple UAS, an important step towards increasing the efficiency and scalability of UAS operations. We will also aim to be one of the first companies to obtain regulatory authorisation to fly multiple UAS simultaneously in AA.

With these advancements our consortium hopes to contribute to the BVLOS infrastructure of the future. At the end of the project we will have a number of systems that are tried, tested, and ready to be deployed regularly by our clients.

[0]: http://sees.ai/
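One standard calculation behind detect-and-avoid of approaching aircraft is the closest point of approach (CPA): given an intruder's relative position and velocity, predict the minimum separation and alert if it falls below a threshold. The sketch below is illustrative only, assuming straight-line motion; the separation threshold is a made-up figure, not the project's DAA implementation or any regulatory value:

```python
import math


def time_to_cpa(rel_pos, rel_vel):
    """Time (s) until closest point of approach, assuming constant velocities."""
    v2 = sum(v * v for v in rel_vel)
    if v2 == 0.0:
        return 0.0  # no relative motion: closest approach is now
    return max(0.0, -sum(p * v for p, v in zip(rel_pos, rel_vel)) / v2)


def cpa_distance(rel_pos, rel_vel):
    """Separation (m) at the closest point of approach."""
    t = time_to_cpa(rel_pos, rel_vel)
    return math.dist([p + v * t for p, v in zip(rel_pos, rel_vel)],
                     [0.0, 0.0, 0.0])


# Intruder 500 m east and 40 m north of a hovering UAS, flying due west
# at 20 m/s: relative position (east, north, up) in m, velocity in m/s.
rel_pos = (500.0, 40.0, 0.0)
rel_vel = (-20.0, 0.0, 0.0)

SAFE_SEPARATION_M = 100.0  # illustrative threshold only
alert = cpa_distance(rel_pos, rel_vel) < SAFE_SEPARATION_M
print(alert)
```

In this example the intruder passes 40 m away after 25 s, so the sketch raises an alert, at which point a real system would trigger an avoidance manoeuvre or hand control to the remote pilot.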
66,530
2020-05-01 to 2021-02-28
Study
Invasive Non-Native Species (INNS) are organisms introduced into areas outside their native region, where they then threaten ecosystems. They are regarded as one of the top five threats to biodiversity worldwide (IPBES, 2019) and have significant economic impacts, with companies in sectors such as transport and utilities spending considerable time and resources identifying and removing them. Current methods for identifying the presence of INNS rely on ecological surveys, which are time-consuming and costly, especially across road and rail infrastructure.

Keen AI, the UK Centre for Ecology and Hydrology (CEH) and Time-Lapse Systems are combining their expertise in Artificial Intelligence (AI), INNS and image collection to provide a faster and more efficient method of conducting surveys of this kind. Keen AI has expertise in providing AI solutions to companies such as National Grid, helping to streamline their visual condition assessment process. CEH have a long track record of research on invasive species and are pioneering image recognition services for Japanese Knotweed with the conveyancing sector. Time-Lapse Systems are experts in capturing imagery for specialist applications. Our complementary experience, skills and resources provide an opportunity to develop a novel AI platform for detecting the presence of invasive species.

Current solutions for surveying an area for INNS include sending ecologists to perform a manual survey, which is time-consuming and costly, or the manual review of photographs taken from high-definition digital cameras attached to drones or planes. Using AI technology, our proposal would reduce the time it takes to conduct an ecological survey of this kind, producing cost and time savings for the customer and providing location-specific information to support decision-making and management actions.
Our project vision is to assess the feasibility of developing an AI platform for detecting the presence of invasive plant species within linear infrastructure. This innovation will provide a rapid, high-quality vegetation survey methodology, resulting in cost and time savings for our customers and an increased understanding of market requirements for an AI innovation of this type. The project has four key objectives:

1. Collection of vegetation imagery of sufficient quality;
2. Training of AI algorithms to identify INNS in the image dataset;
3. Processing high volumes of images to locate INNS geospatially; and
4. Evaluation of the AI model performance.
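Objectives 2 and 3 together form a score-then-geolocate pipeline: run a trained detector over each survey image and report the capture coordinates of positive detections. The sketch below is a minimal illustration of that shape only; the filenames, coordinates, and threshold are invented, and the stub detector stands in for a real trained model:

```python
from dataclasses import dataclass


@dataclass
class SurveyImage:
    path: str
    lat: float  # GPS latitude recorded at capture
    lon: float  # GPS longitude recorded at capture


def detect_inns(image: SurveyImage) -> float:
    """Stand-in for the trained model: returns a detection score in [0, 1].

    A real platform would run an image classifier here; this stub fakes
    a score from the filename, purely so the pipeline is runnable.
    """
    return 0.9 if "knotweed" in image.path else 0.1


def survey(images, threshold=0.5):
    """Objectives 2-3: score each image and geolocate positive detections."""
    return [
        {"path": img.path, "lat": img.lat, "lon": img.lon, "score": score}
        for img in images
        if (score := detect_inns(img)) >= threshold
    ]


images = [
    SurveyImage("rail_km12_knotweed.jpg", 52.48, -1.89),
    SurveyImage("rail_km13_clear.jpg", 52.49, -1.90),
]
detections = survey(images)
print(detections)
```

Attaching capture coordinates to each detection is what turns per-image classification (objective 2) into the geospatial survey output of objective 3; objective 4's evaluation would then compare such outputs against ecologist ground truth.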