In recent years the technologies required to localise a person in space with centimetre-level precision have matured, e.g. Simultaneous Localisation and Mapping (SLAM), computer vision and highly accurate GPS. Despite the obvious applications in helping blind people navigate, these _AI-driven localisation_ technologies have not found widespread use within the vision-impaired community. This is because communicating enough navigation information to someone who cannot see a screen has previously been impossible: audio and vibration cues do not provide the fidelity required to enable independent wayfinding. Thus, of the 292,000 blind people in the UK, 40% cannot make necessary journeys and 50% need support leaving the house.
This is where MakeSense comes in: a robotics start-up founded by a team of post-doctoral researchers from Imperial College in July 2021. MakeSense has developed a remarkable new capability to efficiently communicate 3D spatial information via the sense of touch. Formal scientific trials with blind volunteers in March 2023 demonstrated efficacy more comparable to natural sight than to vibration or audio cues. The trials were conducted to peer-review standard at Imperial College London, in collaboration with Bravo Victor, a sight loss charity.
Combining our patented _haptic interface_ with AI-driven localisation technologies, we have demonstrated a capacity for totally autonomous navigation. However, due to the device's electromechanical complexity, its size/weight constraints and the prolonged periods of use (a Vision Impaired Person (VIP) may use the device all day), the aid faces significant durability and design-for-repair challenges. MakeSense has developed a functional product that is proven to benefit VIPs. It now urgently needs to conduct a programme of design for durability, lifecycle optimisation and design for repair to ensure that VIPs can be supplied with a navigational aid that is not only reliable in use but also suitable for rapid repair/servicing in the field or at a repair centre.
MakeSense is seeking _Design Foundations: Repairability_ investment to conduct a fundamental reliability review of the current device in consultation with design-for-durability experts (Astrimar). Following this review, MakeSense will work with experienced industrial designers to incorporate the findings. This will help ensure our product is both dependable and easily repairable.
We hope to boost employment opportunities and enhance the quality of life of the blind community. Simultaneously, the project aligns with governmental priorities, seeking to reduce the economic burden of blindness and promote a fairer society.
332,893
2024-04-01 to 2025-03-31
Collaborative R&D
The enclosed application is focused on providing Vision Impaired Persons (VIPs) with a radical new way to navigate and, crucially, regain their independence. Due to the high cost of guide dogs (typically £55,000) and long waiting lists, our aim is to provide VIPs with an alternative solution for effective local navigation.
Over the past 2 years, the MakeSense team has developed a next-generation handheld personal navigation device that uses cameras, sensors, digital localisation technology and software to provide VIPs with extremely precise non-visual navigational guidance. In short, the device "sees the world" for the blind person and leverages a breakthrough human-machine interface which can guide a VIP through space more efficiently than ever before. In practice, the device is the size of a handheld flashlight and connects to a mobile phone.
MakeSense has successfully tested its first-generation device, which relies on the camera and sensors of a linked smartphone to provide the required digital technologies. The results have been highly successful. However, in critical situations VIPs expressed concerns about relying on AI alone, and were apprehensive about having their smartphones on display during journeys, worrying about theft and/or damage during collisions.
During testing, VIPs have expressed considerable interest in the device if:
* The system included the option to be guided remotely by a trusted friend or carer.
* The requirement to have their phone out was removed.
Investment is being sought to develop and test a second-generation MakeSense unit that allows the phone to remain securely in the user's pocket/bag, and to develop vital teleassistance features.
239,124
2022-07-01 to 2023-12-31
Collaborative R&D
This project brings together a team of technologists from MakeSense Technology Limited, researchers from Imperial College London and qualified end-users from Bravo Victor. We are tackling the challenges associated with blindness using a new and innovative approach.
Modern computer vision is sophisticated: augmented reality applications such as Snapchat filters or Pokémon GO can map, track and overlay onto environments in real time. However, this technology to map the world around us has not found widespread use within the visually impaired community. The problem lies in communicating useful information to someone who cannot see, or perhaps has never seen. We are developing a new capability to deliver useful spatial information derived using computer vision. If our ambitions are realised, our technology will deliver life-changing improvements to mobility and independence for millions of people internationally.
Before we can commercially deploy this technology, it needs to be rigorously evaluated by qualified end-users to ensure that it is effectively addressing their needs. As part of this project, Bravo Victor, the sister charity of Blind Veterans UK, will give early adopters the chance to test it and offer feedback. For a product which depends entirely on user experience the value of this partnership cannot be overstated. Imperial College is hosting our industrial research. We aim to finish this project with a prototype product that is ready to enter the certification process for sale as a medical device in the UK and abroad.