For the first time, a new study has revealed that how and when we make eye contact, not just the act itself, plays a crucial role in how we understand and respond to others, including robots.
Led by cognitive neuroscientist Dr Nathan Caruana, researchers from the HAVIC Lab at Flinders University asked 137 participants to complete a block-building task with a virtual partner.
They discovered that the most effective way to signal a request was through a specific gaze sequence: looking at an object, making eye contact, then looking back at the same object. This timing made people most likely to interpret the gaze as a call for help.
Dr Caruana says that identifying these key patterns in eye contact offers new insights into how we process social cues in face-to-face interactions, paving the way for smarter, more human-centred technology.
"We found that it's not just how often someone looks at you, or if they look at you last in a sequence of eye movements, but the context of their eye movements that makes that behaviour appear communicative and relevant," says Dr Caruana, from the College of Education, Psychology and Social Work.
"And what's fascinating is that people responded the same way whether the gaze behaviour was observed from a human or a robot.
"Our findings have helped to decode one of our most instinctive behaviours and how it can be used to build better connections whether you're talking to a teammate, a robot, or someone who communicates differently.
"It aligns with our earlier work showing that the human brain is broadly tuned to see and respond to social information and that humans are primed to effectively communicate and understand robots and virtual agents if they display the non-verbal gestures we are used to navigating in our everyday interactions with other people."
The authors say the research can directly inform how we build the social robots and virtual assistants that are becoming increasingly common in our schools, workplaces and homes, while also having broader implications beyond tech.
"Understanding how eye contact works could improve non-verbal communication training in high-pressure settings like sports, defence, and noisy workplaces," says Dr Caruana.
"It could also support people who rely heavily on visual cues, such as those who are hearing-impaired or autistic."
The team is now expanding the research to explore other factors that shape how we interpret gaze, such as the duration of eye contact, repeated looks, and our beliefs about who or what we are interacting with (human, AI, or computer-controlled).
The HAVIC Lab is currently conducting several applied studies exploring how humans perceive and interact with social robots in various settings, including education and manufacturing.
"These subtle signals are the building blocks of social connection," says Dr Caruana.
"By understanding them better, we can create technologies and training that help people connect more clearly and confidently."
The HAVIC Lab is affiliated with the Flinders Institute for Mental Health and Wellbeing and is a founding partner of the Flinders Autism Research Initiative.
The article, "The temporal context of eye contact influences perceptions of communicative intent", by Nathan Caruana (Flinders University), Friederike Charlotte Hechler (Macquarie University and Universität Potsdam, Germany), E. S. Cross (ETH Zurich, Switzerland) and Emmanuele Tidoni (University of Leeds, UK), was published in the journal Royal Society Open Science. DOI: 10.1098/rsos.250277
Acknowledgements: The authors were supported by an Experimental Psychology Society small grant.