These AI-powered guide dogs don’t just lead – they talk
Large language models help robots plan routes and guide the visually impaired
image: Scientists at Binghamton University have developed a robot guide dog system that communicates with the visually impaired and provides real-time feedback during travel.
Credit: Binghamton University, State University of New York
Guide dogs are powerful allies, leading the visually impaired safely to their destinations, but they can’t talk with their owners — until now.
Using large language models, a team of researchers at Binghamton University, State University of New York has created a talking robot guide dog system that determines an ideal route and safely guides users to their destination, offering real-time feedback along the way.
“For this work, we’re demonstrating an aspect of the robotic guide dog that is more advanced than biological guide dogs,” said Shiqi Zhang, an associate professor at the Thomas J. Watson College of Engineering and Applied Science’s School of Computing. “Real dogs can understand around 20 commands at best. But for robotic guide dogs, you can just put GPT-4 with voice commands. Then it has very strong language capabilities.”
Zhang and his team had previously trained robot guide dogs to lead the visually impaired by responding to a tug on the leash. This new system takes their work a step further, creating a spoken back-and-forth between user and dog and providing more control and situational awareness. The robot offers information about a route before departure (what the researchers call plan verbalization) and information during travel (scene verbalization).
“This is very important for visually impaired or blind people, because situational and scene awareness is relatively limited without vision,” Zhang said.
To test the system, the team recruited seven legally blind participants to navigate a large, multi-room office environment. The robot would ask the user where they wanted to go (in this experiment, a conference room) and then present possible routes along with the estimated travel time for each. Once the user selected a preferred route, the robot would guide them to the conference room, verbalizing the surroundings and obstacles along the way (such as “this is a long corridor”) until it reached the destination.
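The interaction flow described above can be sketched as a minimal loop. All function names, route data, timings, and scene descriptions below are illustrative assumptions, not the researchers' actual implementation:

```python
# Sketch of the guide-robot interaction flow: plan verbalization before
# departure, scene verbalization during travel. All data is hypothetical.

def plan_verbalization(routes):
    """Describe each candidate route and its estimated travel time."""
    return [f"Route {i + 1}: {r['summary']}, about {r['minutes']} min"
            for i, r in enumerate(routes)]

def scene_verbalization(waypoints):
    """Narrate surroundings at each waypoint while guiding the user."""
    return [f"Now passing: {w}" for w in waypoints]

routes = [
    {"summary": "main hallway, fewer turns", "minutes": 3,
     "waypoints": ["a long corridor", "a door on the left"]},
    {"summary": "shortcut past the open office", "minutes": 2,
     "waypoints": ["a narrow passage", "the open office area"]},
]

# Before departure: announce the options.
for line in plan_verbalization(routes):
    print(line)

# Stand-in for the user's spoken selection.
chosen = routes[0]

# During travel: narrate the surroundings.
for line in scene_verbalization(chosen["waypoints"]):
    print(line)
```

In the real system these spoken exchanges would be mediated by a large language model and speech interfaces; the sketch only shows the two verbalization stages the article names.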
Following the test, the users completed a questionnaire about their experience, rating the system’s helpfulness, ease of communication, and usefulness. Overall, participants preferred a combined approach that included both planning explanations and real-time narration from the robot. A simulated study of the system also showed that this approach was successful.
Going forward, the team plans to conduct more user studies, increase the system’s autonomy, and have the robots navigate longer distances, both indoors and outdoors.
The goal of this research is to help integrate robotic guide dogs into everyday life. The study participants were enthusiastic about this possibility.
“They were super excited about the technology, about the robots,” Zhang said. “They asked many questions. They really see the potential for the technology and hope to see this working.”
The paper, “From Woofs to Words: Towards Intelligent Robotic Guide Dogs with Verbal Communication,” was presented at the 40th Annual AAAI Conference on Artificial Intelligence, one of the largest academic AI conferences.
Method of Research: Experimental study
A nanoscale robotic cleaner
University of Würzburg
image: Artistic view of a nanorobot (center and inset) interacting with several bacteria of two distinct types. The dashed arrows indicate the attractive thermophoretic force the illuminated nanorobot exerts on nearby bacteria.
Credit: Jin Qin
Tiny robots – around 50 times smaller than the diameter of a human hair – open up fascinating possibilities: they enable the controlled manipulation of objects far too small for human hands. This brings us closer to a long-standing dream – the direct interaction with the microscopic world.
Particularly relevant are biological objects in aqueous environments, such as single cells or bacteria. Handling such objects in a controlled and targeted way has remained a major challenge. The nanorobots presented here demonstrate that controlled manipulation, including collection and relocation of bacteria, is already achievable.
Light-driven propulsion and control
A key challenge is how to power and steer such extremely small machines. At Julius-Maximilians-Universität Würzburg (JMU), the research group led by Professor Bert Hecht has pioneered an approach to this problem: the researchers use the recoil of individual photons to move micrometer-sized devices – so-called microdrones.
These devices incorporate up to four plasmonic nanoantennas that absorb light of a specific color and helicity and re-emit it directionally. Each redirected photon generates a recoil force – comparable to the recoil when firing a bullet. Because the microdrones have extremely small mass, this results in substantial accelerations and velocities.
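The recoil argument can be made concrete with a rough estimate. The wavelength, optical power, and robot dimensions below are assumed round numbers chosen for illustration, not values from the study:

```python
# Back-of-envelope estimate of photon-recoil acceleration.
# All input numbers are illustrative assumptions.
h = 6.626e-34        # Planck constant, J*s
c = 2.998e8          # speed of light, m/s
wavelength = 800e-9  # assumed illumination wavelength, m

p_photon = h / wavelength                 # momentum per photon, kg*m/s
power = 1e-3                              # assumed optical power on the antennas, W
photons_per_s = power * wavelength / (h * c)
force = photons_per_s * p_photon          # equals power / c: recoil for full absorption, N

# Assumed mass: a gold plate roughly 1 um x 1 um x 50 nm
# (density of gold: 19300 kg/m^3)
mass = 19300 * 1e-6 * 1e-6 * 50e-9        # ~1e-15 kg, about a picogram
accel = force / mass
print(f"force ~ {force:.1e} N, mass ~ {mass:.1e} kg, accel ~ {accel:.0f} m/s^2")
```

With these assumed numbers the recoil force is only a few piconewtons, yet the picogram-scale mass turns it into an acceleration of thousands of m/s²; in water, viscous drag then limits the robot to a modest terminal speed.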
In the current work, the team has succeeded in further miniaturizing these light-driven robots to sizes below one micrometer. A key factor was simplifying the steering mechanism without compromising the photon-based propulsion.
The researchers exploit the tendency of nanoscale antenna wires embedded in the robot to align with the polarization direction of incident light. By controlling the light polarization, they can thus steer the orientation of the nanorobot, while propulsion continues to be driven by photon recoil – a principle reminiscent of steering in macroscopic vehicles.
“Microscopic cleaners” in action
“In essence, we have built a light-driven nanorobot that can track down and collect bacteria,” says Jin Qin, lead experimental scientist of the study. “By simplifying the design, we reached a size at which these robots can operate directly in the microbial world – almost like microscopic cleaning devices.”
The nanorobots are remarkably agile: they can perform extremely rapid 90° turns, allowing them to systematically and efficiently scan large areas of a sample. In addition, they are capable of selectively capturing, transporting, and releasing significant numbers of bacteria.
This enables them to effectively “clean” microscopic environments under controlled laboratory conditions by collecting bacteria and depositing them at defined locations.
“This is a striking example of how light can be used not only to observe the microscopic world, but also to actively shape it,” adds Bert Hecht. “The idea of tiny robotic cleaners may sound futuristic, but we are already demonstrating the physical principles that make it possible.”
Even when transporting larger clusters of bacteria, the nanorobots remain fully manoeuvrable – albeit at slightly reduced speed. This robustness highlights their potential for future applications in microbiology, biomedical research, and targeted manipulation at the microscale.