Ultrasound is perhaps best known as the technology behind non-invasive body scans and underwater communications, and as the tech that helps drivers park their cars. A young Norwegian startup called Sonair wants to adapt ultrasound for another use: 3D computer vision for autonomous hardware applications.
Knut Sandven, founder and CEO of Sonair, believes the company's application of ultrasound technology, a breakthrough approach to reading sound waves that detects people and objects in 3D with minimal energy and computational requirements, could form the basis of a more useful and significantly cheaper solution than the more standard approach of using LIDAR.
Sonair has now raised $6 million in funding from early-stage specialists Skyfall and RunwayFBU to roll out early access to its technology. It is initially aimed at developers of autonomous mobile robots, although the vision (heh) is for it to be used in other applications as well.
“Our go-to-market plan is to start with robots, specifically autonomous mobile robots [AMRs] that move things from A to B,” Sandven said. “We are starting indoors, a restriction to increase focus, but of course, in the longer term, we will expand to other robot categories and to automotive.”
The name Sonair is a play on the 3D capabilities of underwater sonar, but applied to sensors that read signals in the air, which is essentially what the startup has created.
Sandven is an engineer and entrepreneur whose previous startup, GasSecure, built gas sensors. (The petrochemical industry is a major part of Norway's national economy.)
After GasSecure was acquired by a German industrial safety specialist, Sandven started thinking about other uses for MEMS technology and turned his attention to the work of SINTEF, a group that works with Norway's leading universities of science and technology to commercialize research. Over the years, dozens of companies have spun out of the group's activities.
SINTEF was developing a new class of MEMS-based ultrasonic sensors that was “ready for commercialization,” he said. Sandven acquired the IP for the technology (specifically, acoustic ranging and detection), hired the researchers who created it, and Sonair was born.
Although LIDAR has become a standard part of autonomous system development in recent years, there is still room for complementary or alternative approaches. LIDAR is still considered expensive, it has limitations in range, and it faces interference from light sources and from certain surfaces and materials.
While companies like Mobileye are looking more seriously at other alternatives such as radar, Sandven believes Sonair has the potential to reduce the total cost of a sensor package by 50% to 80%, making it a viable option.
“Our mission is to replace LIDAR,” he said.
The ultrasonic sensors, and the associated software Sonair built to “read” their data, do not work in a vacuum. As with LIDAR, they work alongside cameras to perform triangulation and provide a more accurate picture to the autonomous system in question.
Sonair's ultrasound technology is based on a “beamforming” method that is also used in radar. The company says the data it collects is processed and combined using AI, specifically object recognition algorithms, to create spatial information from sound waves. Hardware using the technology, in the first instance a mobile robot, gets a 180-degree field of view with a range of five meters, using fewer sensors while addressing some of LIDAR's shortcomings.
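To give a rough sense of the general technique (this is a minimal textbook-style sketch, not Sonair's proprietary pipeline, and every parameter here is an illustrative assumption), delay-and-sum beamforming steers an array of ultrasonic receivers toward a candidate direction by delaying each channel so that echoes from that direction add up coherently; scanning directions and timing the strongest echo then yields angle and distance:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def delay_and_sum(signals, positions, fs, angle_deg):
    """Steer a linear receiver array toward angle_deg (delay-and-sum).

    signals:   (n_channels, n_samples) received waveforms
    positions: (n_channels,) receiver x-positions in meters
    fs:        sample rate in Hz
    Returns the coherently summed signal for that direction.
    """
    angle = np.deg2rad(angle_deg)
    # Extra path length per receiver for a plane wave from angle_deg
    # (far-field assumption), converted to whole-sample delays.
    delays = np.round(positions * np.sin(angle) / SPEED_OF_SOUND * fs).astype(int)
    out = np.zeros(signals.shape[1])
    for ch, d in enumerate(delays):
        # Shift each channel so echoes from the chosen direction align.
        # (np.roll wraps around; acceptable for a toy example.)
        out += np.roll(signals[ch], -d)
    return out / len(delays)

# Illustrative scan of a 180-degree field of view: pick the direction with
# the most echo energy, then convert time of flight into distance.
fs = 200_000                                   # assumed 200 kHz sampling
positions = np.arange(8) * 0.004 - 0.014       # 8 receivers, 4 mm pitch, centered
signals = np.random.randn(8, 4096) * 0.01      # placeholder for real recordings

angles = np.linspace(-90, 90, 181)
energies = [np.sum(delay_and_sum(signals, positions, fs, a) ** 2) for a in angles]
best_angle = angles[int(np.argmax(energies))]

echo = delay_and_sum(signals, positions, fs, best_angle)
t_flight = np.argmax(np.abs(echo)) / fs        # time of strongest return
distance = SPEED_OF_SOUND * t_flight / 2       # halve the out-and-back path
print(f"strongest return at {best_angle:.0f} deg, ~{distance:.2f} m away")
```

In a real system, the object recognition the company describes would sit on top of this kind of spatial output, classifying what the echoes correspond to rather than just where they came from.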
There are still some interesting ideas to consider here. The company's current focus is on improving how well autonomous systems understand the objects in front of them, but the technology is small enough that it could be applied to other form factors as well. For example, could it be used in handsets and wearables as a complement or replacement for pressure-based haptic feedback?
“What other companies are doing today is [focused on] touch sensors,” Sandven said. “After you touch something, the device measures the pressure, how hard you're touching it, how soft it is. Where we come in is the moment before you touch it. If your hand approaches something, we can already detect that with our technology. We can move towards objects very quickly, but with a very soft touch, because we can measure distances accurately. It's not a big thing, but it's something we can do.”
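As a rough illustration of that idea (a hypothetical sketch under assumed parameters, not Sonair's implementation), a controller could scale approach speed by the measured distance, so motion is fast far from an object and slows to a gentle crawl just before contact:

```python
def approach_speed(distance_m, v_max=1.0, v_touch=0.01, slow_zone_m=0.15):
    """Scale approach speed by measured distance to the object.

    Beyond slow_zone_m the effector moves at v_max; inside it, speed ramps
    down linearly toward v_touch so that contact is gentle. All values are
    illustrative assumptions.
    """
    if distance_m >= slow_zone_m:
        return v_max
    frac = max(distance_m, 0.0) / slow_zone_m
    return v_touch + (v_max - v_touch) * frac

# Fast when far away, near-zero speed right before touching.
for d in (0.50, 0.15, 0.05, 0.01, 0.0):
    print(f"{d:.2f} m -> {approach_speed(d):.3f} m/s")
```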
RunwayFBU's Sagar Chandna predicts that 200,000 autonomous mobile robots will be produced in 2024, with a market size of $1.4 billion. This gives Sonair an immediate market opportunity as a cheaper alternative to computer vision components.
“Industries from manufacturing to healthcare are poised to benefit from cost reductions in sensor technology and advances in AI in perception and decision-making,” said Skyfall Partner Preben Songe-Møller.