Autonomous driving promises many benefits, ranging from improved safety to increased mobility. However, the introduction of self-driving cars also means that we have to find ways to communicate the intentions of these vehicles when there is no longer a driver to look at. Debargha Dey is one of the researchers at TU/e who is developing a new language for cars. A task that is much more difficult than it seems.
Picture this: it’s 2030. You’re sitting in your self-driving car on a wide road. While you mind your own business, the car – programmed to bring you to your destination without any interaction on your part – wants to turn left into a narrow road. However, you’re stuck: the narrow road is blocked by a car that is waiting for you, as you have right of way. In a fully automated world, with no humans ready to intervene, this could become an unending impasse.
Fortunately, so-called external Human-Machine Interfaces are coming our way that use light, sound or other signals to indicate to other road users that they can go first. Impasse solved!
eHMIs, as they are called in the world of smart mobility, are part of a relatively new and exciting research field that helps automated vehicles negotiate the outside world. One of the researchers involved in this field is Indian-born Debargha Dey, part of the Future Everyday group in the department of Industrial Design. Dey, who is just about to finish his PhD and will stay on as a postdoc at TU/e, has a very personal motivation for his involvement in this field.
“I’ve always been fascinated by cars, making sketches of them as a child, but I never expected that working with cars would one day be a full-time job. One of the reasons I decided to get involved in this field is a car accident I had some eight years ago, when I was working in the US as a software engineer. A car crashed into me at a T-junction. Although I suffered only slight injuries, the incident had a deep effect on me. Not only was I afraid to drive for some time, but I also found out that the other driver had been distracted by his satnav. This kindled in me a deep interest in the interaction between humans and cars.”
“After attending a conference on Automotive UI, I decided to apply for a PDEng research position here at TU/e with professor Jacques Terken and I never looked back. It was one of the best decisions of my life!”
Understanding the self-driving car
Until a few years ago, most researchers in smart mobility were primarily interested in the interaction between drivers, or between car and driver. Dey and his colleagues in the NWO-funded iCAVE program, on the other hand, focus on situations where the driver is no longer in control, and the person in the car no longer represents the car’s intentions.
“Of course, we are still some way from a situation where automated cars can do most of the driving (known as levels 3, 4 and 5 of driving automation), but we believe that eHMIs can play a crucial role in getting there. Once autonomous cars are able to automatically and effectively signal their intentions to other road users, drivers can finally sit back and fully enjoy the benefits of autonomous driving.”
Bumper lights and more
A good example of Dey’s work is the award-winning research he and his colleagues conducted into distance-dependent eHMIs. “Here we looked at the way people interact with a car as it approaches. In eye-tracking tests, we observed that people tend to look at a car not necessarily to make eye contact with the driver, but to assess its intentions.”