Ultrasound ADAR Sensors Give Robots Better, Cheaper Depth Perception

Oslo startup Sonair has launched ADAR, an acoustic detection and ranging sensor that maps 3D space with high-frequency ultrasound. Positioned as a lower-cost complement to cameras and LIDAR, it is meant to improve robot depth perception and safety in human environments. The company has raised $6M as demand from robotics and industrial-safety customers grows.

Published September 17, 2025 at 08:09 AM EDT in IoT

Sonair brings room-filling sonar to robot perception

As robots move out of cages and into shared human spaces, their safety requirements change. Sonair, an Oslo-based startup, says its ADAR (acoustic detection and ranging) sensors, which use high-frequency ultrasound, give machines a richer three-dimensional view of their surroundings, filling gaps left by cameras and offering an alternative to traditional LIDAR.

Cameras excel at classification and visual context, but they can fail in low light, glare, or occlusion. LIDAR provides depth, but as Sonair CEO Knut Sandven puts it, it is like "swiping a laser pointer": coverage is sparse. Sonair's approach is closer to shouting into a room and listening: ultrasound waves fill the space, and the returning echoes build a dense 3D picture.
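
The ranging principle itself is simple: an echo's round-trip delay converts to distance at the speed of sound. The sketch below illustrates only that arithmetic, not Sonair's implementation; recovering a full 3D image additionally requires estimating direction, for example with an array of transducers.

```python
# Minimal sketch of acoustic time-of-flight ranging (illustrative constants).
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 degrees C

def echo_to_range(round_trip_delay_s: float) -> float:
    """Convert a round-trip echo delay into a one-way distance in meters."""
    return SPEED_OF_SOUND_M_S * round_trip_delay_s / 2.0

# An obstacle about 1 m away returns an echo in roughly 5.8 ms:
print(f"{echo_to_range(0.0058):.2f} m")  # ~0.99 m
```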

Sonair structures its sensor output in industry-standard formats so the devices can plug into existing robot hardware and software stacks. The company released the sensor earlier this year and reports strong interest from robotics manufacturers and industrial safety teams who want automatic shutdowns when people enter hazardous zones.
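
The article does not name the formats, but a common integration path for 3D range sensors is a ROS 2 point-cloud topic. The sketch below assumes exactly that, with a hypothetical topic name, to show how little glue code a standards-based sensor should need:

```python
# Hypothetical consumer of a 3D point cloud; assumes the sensor publishes a
# standard ROS 2 sensor_msgs/PointCloud2 on an illustrative topic name.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import PointCloud2
from sensor_msgs_py import point_cloud2

class AdarListener(Node):
    def __init__(self):
        super().__init__("adar_listener")
        self.create_subscription(PointCloud2, "/adar/points", self.on_cloud, 10)

    def on_cloud(self, msg: PointCloud2) -> None:
        # Read (x, y, z) returns and report the nearest one.
        points = point_cloud2.read_points_list(msg, field_names=("x", "y", "z"))
        if points:
            nearest = min((p.x**2 + p.y**2 + p.z**2) ** 0.5 for p in points)
            self.get_logger().info(f"nearest return: {nearest:.2f} m")

def main() -> None:
    rclpy.init()
    rclpy.spin(AdarListener())

if __name__ == "__main__":
    main()
```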

Why ADAR matters for safety and cost

ADAR’s benefits position it as a practical complement — and in some cases an alternative — to LIDAR for real-world deployments. Key advantages include:

  • Denser spatial coverage — fills volume rather than sampling lines.
  • Robustness in adverse lighting and visual occlusion where cameras struggle.
  • Lower cost compared with many LIDAR units, improving economics for mass deployment.

Use cases are immediate and practical: factory-floor safety zones that automatically halt heavy machinery, warehouse robots that need reliable depth cues near workers, or outdoor inspection bots that must detect obstacles in dust or fog where optical systems degrade.
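
As a sketch of the first of those use cases, the check below halts machinery whenever any 3D return intrudes into a configured exclusion zone. The zone geometry, thresholds, and stop hook are all assumptions for illustration, not a certified safety function:

```python
# Illustrative safety-zone check (not a certified safety function).
from dataclasses import dataclass
from typing import Callable, Iterable, Tuple

Point = Tuple[float, float, float]  # (x, y, z) in meters, sensor frame

@dataclass
class SafetyZone:
    radius_m: float      # protective field around the machine
    min_height_m: float  # ignore floor returns below this height

    def contains(self, p: Point) -> bool:
        x, y, z = p
        return z >= self.min_height_m and (x * x + y * y) ** 0.5 <= self.radius_m

def check_and_halt(points: Iterable[Point], zone: SafetyZone,
                   stop_machine: Callable[[], None]) -> bool:
    """Trigger the stop callback if any return intrudes into the zone."""
    if any(zone.contains(p) for p in points):
        stop_machine()
        return True
    return False
```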

Market response and path forward

Sonair raised $6 million in fresh funding from investors including Scale Capital, Norway’s Investinor, and ProVenture. The company says multiple robotics firms plan to integrate ADAR into upcoming models. That interest underscores a broader trend: as robots share space with people, perception and safety become business-critical, not optional.

Some industry voices remain cautious about rapid adoption of humanoid or household robots, citing safety and security risks. Sonair sees a niche in improving depth awareness across robot classes — and ultimately hopes ADAR will be as ubiquitous as cameras on robots.

Practical considerations and limits

No single sensor solves every problem. Ultrasonic systems can be affected by soft, absorptive materials and certain acoustic noise environments. That’s why sensor fusion — combining ADAR with cameras, IMUs, and selective LIDAR — will be the sensible route for most teams.
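
One deliberately conservative fusion rule, sketched below under the assumption that each modality reports a nearest-obstacle range (or None when blinded): trust whichever sensor sees the closest obstacle, and treat a dropout as missing information rather than a clear path.

```python
# Conservative minimum-range fusion (an illustrative rule, not a prescribed method).
from typing import Optional

def fused_min_range(camera_m: Optional[float],
                    adar_m: Optional[float],
                    lidar_m: Optional[float]) -> Optional[float]:
    readings = [r for r in (camera_m, adar_m, lidar_m) if r is not None]
    return min(readings) if readings else None  # None means no sensor has coverage

# Glare blinds the camera, but the acoustic return still reports 0.8 m:
print(fused_min_range(None, 0.8, 1.4))  # 0.8
```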

Think of modern robot perception like human senses: eyes for detail, ears for spatial cues, and a brain to reconcile conflicting signals. ADAR adds an "ear" for machines — improving reliability in scenarios where vision or light-based ranging fall short.

What organizations should ask next

Before swapping sensors, product teams and safety engineers should evaluate integration complexity, response latency, false-positive/negative rates, and maintenance needs. Pilots in real operational conditions — not just labs — will reveal whether ADAR meaningfully reduces incidents and total system cost versus LIDAR-centric architectures.
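
As a sketch of how such a pilot might be scored, assuming a hypothetical log of labeled intrusion events:

```python
# Pilot-evaluation metrics over labeled events (the log schema is hypothetical).
from typing import Iterable, Optional, Tuple

Event = Tuple[bool, bool, Optional[float]]  # (person_present, sensor_triggered, latency_s)

def pilot_metrics(events: Iterable[Event]) -> dict:
    events = list(events)
    fp = sum(1 for present, trig, _ in events if trig and not present)
    fn = sum(1 for present, trig, _ in events if present and not trig)
    latencies = sorted(l for present, trig, l in events
                       if present and trig and l is not None)
    p95 = latencies[int(0.95 * (len(latencies) - 1))] if latencies else None
    return {
        "false_positives": fp,   # nuisance stops: downtime and cost
        "false_negatives": fn,   # missed intrusions: safety-critical
        "p95_latency_s": p95,    # detection-to-stop latency tail
    }
```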

Sonair’s $6M raise and early customer traction suggest acoustic ranging will be part of the conversation as robots become more present in workplaces and public spaces. The coming year should clarify whether sonar becomes a standard layer of perception across robot platforms.

For teams planning deployments, the practical question remains: how do you combine sensors to meet safety thresholds, reduce cost, and scale? That’s exactly the kind of systems-level analysis QuarkyByte brings — matching sensor choices to operational realities and measurable safety outcomes.

QuarkyByte can help robotics teams evaluate ADAR vs. LIDAR through sensor-fusion benchmarking, safety validation frameworks, and pilot deployment roadmaps. We model cost, false-positive risk, and integration paths so manufacturers and plant operators can make measurable, low-risk choices about adopting sonar-based perception.