By: Cassidy Delamarter, University Communications and Marketing
Robots may soon have a new way to communicate with people. Not through words or screens, but with light and images projected directly onto the world around them.
With a $411,578 award from the National Science Foundation, University of South Florida computer science Assistant Professor Zhao Han is developing technology that could transform how people interact with robots in both emergencies and everyday life.
Using projector-based augmented reality, a new form of human-robot communication, Han aims to make robots more adaptable in how they share information. The novel approach allows robots to project images and instructions onto cluttered, irregular and textured environments, such as piles of rubble, crowded lecture halls and messy living rooms.
"Traditionally, projectors only work well on flat, blank surfaces, like a movie screen. But those conditions rarely exist," Han said. "An advantage of this work is that we will identify the textures and then adjust the projected image accordingly so when people see it, they'll actually see the original image even though it is modified to work with the texture."

Students in the RARE Lab testing virtual reality
By combining computer vision, augmented reality and artificial intelligence, Han's system uses advanced techniques to sense an environment and capture its geometry, lighting and textures at the same time. With that information, it can help robots locate small, flat patches across uneven surfaces, stitch them together and adjust colors so projected images remain clear, even against patterned backdrops.
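The patch-finding step described above can be sketched in simplified form: slide a window over a depth map and accept a window as "flat" when a least-squares plane fits its depths within a tolerance. The window size, tolerance, and toy scene are illustrative assumptions, not details of Han's system.

```python
import numpy as np

def find_flat_patches(depth, patch=8, tol=0.01):
    """Scan a depth map with a non-overlapping sliding window; a window
    counts as flat if a least-squares plane fits its depths within `tol`.
    Returns the (row, col) top-left corners of accepted patches."""
    h, w = depth.shape
    flats = []
    # Design matrix for plane z = a*y + b*x + c over one window.
    ys, xs = np.mgrid[0:patch, 0:patch]
    A = np.column_stack([ys.ravel(), xs.ravel(), np.ones(patch * patch)])
    for r in range(0, h - patch + 1, patch):
        for c in range(0, w - patch + 1, patch):
            z = depth[r:r + patch, c:c + patch].ravel()
            coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
            if np.abs(A @ coeffs - z).max() < tol:
                flats.append((r, c))
    return flats

# Toy scene: a flat surface with a wavy, rubble-like bump in one quadrant.
depth = np.ones((16, 16))
depth[0:8, 0:8] += 0.2 * np.sin(np.arange(8))[:, None]
patches = find_flat_patches(depth)
```

A real system would work from camera-captured geometry, allow tilted and overlapping patches, and then stitch the accepted regions together before color correction, but the accept/reject logic follows the same planarity test.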
In a disaster, it might help rescuers locate survivors faster. "During a rescue, if you talk, people may not be able to hear you and communicate. Power lines may also be down, but if we use a robotic projector, we can avoid those challenges," Han said.
Instead of muffled audio or unreliable wireless communication in an emergency, a robot could shine a glowing path through rubble to guide rescuers to survivors. The approach makes it easier for people to understand what a robot is doing or where it wants them to go, with no special glasses or devices required.
"We have the technology, but we are training the robot to better communicate with people," Han said. "Trust is a big topic in our field. When we talk about the robot failures, we usually talk about trust, because when a robot fails, people don't trust it anymore."
Inside the Reality, Autonomy, and Robot Experience (RARE) Lab at USF, Han and his students are testing the robotic system in three staged environments: a search-and-rescue field, a lecture hall filled with rows of chairs and a simulated messy home.

Ngoc Bao Dinh
"This project helps bridge the gap between robots and people," Han said. "If robots can communicate clearly in messy, real-world environments, they can become more effective partners in everything from disaster response to daily living."
The three-year project advances the fields of human-robot interaction, augmented reality and artificial intelligence, while providing students with hands-on opportunities in robotics research.
USF sophomore Ngoc Bao Dinh aspires to work in agricultural robotics. The computer engineering major joined Han's lab to better understand robots and assist with the project by fine-tuning its mid-air fog-screen display, which could be used at search-and-rescue and construction sites.
"Engineering should benefit the people, so I want to develop solutions that can be applied anywhere in the world to improve efficiency," Dinh said. "I also learned skills that are used in the industry that wouldn't be taught in classrooms. I discovered that I love the research environment, where innovation and exploration are encouraged."
To further leverage the technology from this project, Han plans to test it with other robots, such as robotic dogs and legged humanoid robots, which are better suited to scenarios involving stairs and disaster sites.