At UCSD (University of California San Diego), robotics researchers are developing a low-cost, low-power alternative to help autonomous mobile robots (AMRs) localize and navigate.
The concept leverages ordinary WiFi signals to help AMRs navigate. WiFi is typically required for network communication with the AMR anyway, so it is already present in the working environment. It is a novel approach to AMR localization.
Most AMR platforms leverage a variety of sensors for understanding the environment in which the robot operates. This includes sensors such as LiDAR, cameras and SONAR.
For this research, the robot navigates using “WiFi sensors” rather than light-based sensors such as cameras. Ultimately, this provides one more option for navigation and localization, which can improve accuracy. WiFi sensing may also be more economical than using expensive LiDAR units.
Repetitive environments such as long corridors and warehouses can confuse an AMR that relies on cameras or LiDAR, because there is little in the environment to differentiate one location from another. This is the core localization problem faced by camera- and LiDAR-based AMRs.
An AMR needs help when it becomes “lost”
When an AMR loses its localization, it has to be “re-localized” – often with operator interaction. It’s the equivalent of getting lost while driving in the city and not having access to a map of where you are located.
What this WiFi sensing solution can’t do is identify obstacles around the AMR or in the path of the AMR.
A team of researchers from the Wireless Communication Sensing and Networking Group, led by UCSD electrical and computer engineering professor Dinesh Bharadia, will present their work at the 2022 International Conference on Robotics and Automation (ICRA), which will take place from May 23 to 27 in Philadelphia.
“We are surrounded by wireless signals almost everywhere we go. The beauty of this work is that we can use these everyday signals to do indoor localization and mapping with robots,” said Bharadia.
“Using WiFi, we have built a new kind of sensing modality that fills in the gaps left behind by today’s light-based sensors, and it can enable robots to navigate in scenarios where they currently cannot,” added Aditya Arun, who is an electrical and computer engineering Ph.D. student in Bharadia’s UCSD lab and the first author of the study.
The UCSD robotics researchers built their prototype system using off-the-shelf hardware. The system consists of a robot that has been equipped with the WiFi sensors, which are built from commercially available WiFi transceivers. These devices transmit and receive wireless signals to and from WiFi access points in the environment. What makes these WiFi sensors special is that they use this constant back and forth communication with the WiFi access points to map the robot’s location and direction of movement.
“This two-way communication is already happening between mobile devices like your phone and WiFi access points all the time—it’s just not telling you where you are,” said Roshan Ayyalasomayajula, who is also an electrical and computer engineering Ph.D. student in Bharadia’s lab and a co-author on the study. “Our technology piggybacks on that communication to do localization and mapping in an unknown environment.”
Here’s how it works. At the start, the WiFi sensors are unaware of the robot’s location and where any of the WiFi access points are in the environment. Figuring that out is like playing a game of Marco Polo—as the robot moves, the sensors call out to the access points and listen for their replies, using them as landmarks. The key here is that every incoming and outgoing wireless signal carries its own unique physical information—an angle of arrival and direct path length to (or from) an access point—that can be used to figure out where the robot and access points are in relation to each other. Algorithms developed by Bharadia’s team enable the WiFi sensors to extract this information and make these calculations. As the call and response continues, the sensors pick up more information and can accurately locate where the robot is going.
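The paper’s actual P2SLAM method is a bearing-based graph optimization; as a toy illustration of the geometric idea described above, the sketch below shows how a series of (angle-of-arrival, path-length) measurements taken along the robot’s path can pin down an access point’s position relative to the robot. The function name, the noise-free measurements, and the simple averaging are illustrative assumptions, not the authors’ algorithm.

```python
import math

def estimate_ap_position(robot_poses, measurements):
    """Estimate an access point's (x, y) position by averaging the points
    implied by each (angle_of_arrival, path_length) measurement.

    robot_poses:  list of (x, y, heading) for the robot at each measurement,
                  heading in radians in the world frame.
    measurements: list of (aoa, dist) pairs; aoa is the signal's angle of
                  arrival relative to the robot's heading, dist is the
                  direct-path length to the access point.
    """
    xs, ys = [], []
    for (rx, ry, heading), (aoa, dist) in zip(robot_poses, measurements):
        bearing = heading + aoa  # convert AoA from robot frame to world frame
        xs.append(rx + dist * math.cos(bearing))
        ys.append(ry + dist * math.sin(bearing))
    return sum(xs) / len(xs), sum(ys) / len(ys)

# Simulated example (hypothetical numbers): AP at (5.0, 3.0), robot driving
# along the x-axis facing +x, with noise-free measurements generated from
# the true geometry.
ap = (5.0, 3.0)
poses = [(x, 0.0, 0.0) for x in (0.0, 1.0, 2.0, 3.0)]
meas = []
for rx, ry, hd in poses:
    dx, dy = ap[0] - rx, ap[1] - ry
    meas.append((math.atan2(dy, dx) - hd, math.hypot(dx, dy)))

print(estimate_ap_position(poses, meas))  # ≈ (5.0, 3.0)
```

In practice each measurement is noisy, so the real system fuses many such constraints (together with the robot’s odometry) in an optimization rather than averaging them directly; the sketch only shows why an angle plus a path length is enough to place a landmark.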
The researchers tested their technology on a floor of an office building. They placed several access points around the space and equipped a robot with the WiFi sensors, as well as a camera and a LiDAR to perform measurements for comparison. The team controlled their robot to travel several times around the floor, turning corners, going down long and narrow corridors, and passing through both bright and dimly lit spaces.
In these tests, the accuracy of localization and mapping provided by the WiFi sensors was on par with that of the commercial camera and LiDAR sensors.
“We can use WiFi signals, which are essentially free, to do robust and reliable sensing in visually challenging environments,” said Arun. “WiFi sensing could potentially replace expensive LiDARs and complement other low cost sensors such as cameras in these scenarios.”
That’s what the team is now exploring. The researchers will be combining WiFi sensors (which provide accuracy and reliability) with cameras (which provide visual and contextual information about the environment) to develop a more complete, yet inexpensive, mapping technology.
Paper title: “P2SLAM: Bearing Based WiFi SLAM for Indoor Robots.” Co-authors include William Hunter, UC San Diego.