PanoRadar: Advancing Robot Navigation with Radio Waves

November 18, 2024 | Updated February 9, 2026 | 2 min read

PanoRadar radio wave sensor displayed in a clear enclosure next to monitors showing 3D imaging data.
Today’s robots tend to use one of three imaging techniques: cameras, LIDAR, or radar. Cameras see virtually the same views we do, meaning they’re susceptible to smoke, fog, light reflections, and other visual obstacles. LIDAR pulses laser signals to map out the robot’s surroundings, but that light scatters in smoke and reflects off of windows or glass walls. And while traditional radar can “see” through solid objects, the images it provides are extremely low-resolution, meaning it still falls short.



In an effort to help robots see in tricky situations, researchers at the University of Pennsylvania are reaching for a fourth option: radio. Radio waves readily travel through smoke and glass, making them impervious to conditions that wouldn’t be safe for human eyes and to the shiny surfaces that make up many retail and office buildings. PanoRadar, the researchers’ experimental system, tests the viability of using this technology in place of more conventional techniques.

Based on how long the emitted radio waves take to bounce back—that is, how near or far each object is—PanoRadar builds a heat map. Then, to create an image useful to humans, PanoRadar feeds that heat map into a machine learning algorithm, which constructs a 3D image of the system’s surroundings. The result is a high-resolution, panoramic view of an environment that would otherwise be difficult for human eyes, LIDAR, or radar to parse.
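The core ranging idea is simple time-of-flight: an echo that arrives later came from a reflector farther away. The sketch below illustrates that relationship in Python; the function name and numbers are illustrative, not taken from the PanoRadar system itself.

```python
# Illustrative time-of-flight ranging: recover distance to a reflector
# from the round-trip time of a radio echo. This is a generic sketch of
# the principle, not PanoRadar's actual processing pipeline.

C = 299_792_458.0  # speed of light in m/s

def range_from_round_trip(t_seconds: float) -> float:
    """Distance to a reflector given the round-trip echo time."""
    # The wave travels out to the object and back, so halve the total path.
    return C * t_seconds / 2.0

# An echo from a reflector 5 m away returns after roughly 33 nanoseconds.
echo_time = 2 * 5.0 / C
print(range_from_round_trip(echo_time))  # 5.0
```

Repeating this measurement across many antenna positions and directions yields the range heat map that PanoRadar then hands to its machine learning stage.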

For the complete article CLICK HERE.


About The Author

Gene Roe - founder of Lidar News


