Footage from baby-worn headcams has revealed new insights into how infants encounter faces across early development.
Cardiff Babylab at the Cardiff University Centre for Human Developmental Science (CUCHDS) used baby-worn headcams and AI face detection to investigate the everyday viewing of faces during playtime across the first years of development.
Their findings suggest that how, and how often, babies see faces during playtime depends on age, potentially changing as babies gain the ability to sit and walk independently. Importantly, while face availability and position changed with age, these changes did not follow a linear pattern.
"We know very little about what the world looks like from a baby's own point of view.
"Faces play a fundamental role in early development – shaping visual attention, social learning, and interactions with caregivers. From birth, infants show a preference for faces, which is thought to support many aspects of their cognitive and social development.
"Although face perception and social learning have been widely studied, it is crucial to understand how babies encounter faces in their everyday environments, not just in artificial lab settings.
"By using innovations in lightweight headcams, we were able to develop headgear suitable for babies to wear during playtime in their own home. This, paired with AI face detection, enabled us to capture subtle, real-world changes in face exposure across the first years of life."
The researchers designed the TinyExplorer gear – lightweight headgear that captures high-quality video and audio with a wide vertical field of view. This allowed the team to not only capture how often the babies saw faces across ages, but also where in their views the faces were – bottom, middle, or top.
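The study's analysis code is not reproduced here, but the vertical-region classification described above can be sketched in a few lines. The `FaceBox` structure and the equal-thirds split are illustrative assumptions, not the study's actual pipeline:

```python
from dataclasses import dataclass


@dataclass
class FaceBox:
    """A detected face's bounding box, in pixels (illustrative structure).

    Image coordinates: y = 0 is the top of the frame and increases downward.
    """
    top: float
    bottom: float


def vertical_region(face: FaceBox, frame_height: float) -> str:
    """Classify a face into the top, middle, or bottom third of the frame.

    Uses the vertical centre of the bounding box; the equal-thirds
    boundary is an assumption for illustration, not necessarily the
    study's exact definition.
    """
    centre = (face.top + face.bottom) / 2
    if centre < frame_height / 3:
        return "top"
    elif centre < 2 * frame_height / 3:
        return "middle"
    return "bottom"
```

For example, in a 720-pixel-tall frame, a face whose box spans pixels 100–200 has its centre at 150, placing it in the top third of the scene.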
The team asked parents and caregivers to record headcam footage during playtime in their home. The researchers mapped changes in how babies see faces as they develop motor skills – such as learning to sit, crawl, or walk – milestones that might change how, and how often, a baby views others' faces.
They used AI to analyse data from 29 infants and toddlers aged between 2 and 30 months, processing over 5.5 million video frames from the headcams.
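At this scale, per-frame face detections are typically aggregated into simple summary measures, such as the proportion of frames in which at least one face is visible. A minimal sketch of that idea, under the assumption that each frame has already been reduced to a boolean "face present" flag (this is not the study's code):

```python
def face_availability(frames_with_face: list[bool]) -> float:
    """Proportion of video frames containing at least one detected face.

    `frames_with_face` holds one boolean per analysed frame; an empty
    recording yields 0.0 rather than a division-by-zero error.
    """
    if not frames_with_face:
        return 0.0
    return sum(frames_with_face) / len(frames_with_face)
```

Comparing this proportion across recordings from infants of different ages is one straightforward way such face-exposure trajectories can be charted.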
The study found that how babies see faces changes systematically with age. Their analysis showed that:
- During early infancy, faces most often appeared in the centre of the baby's view. Across the first year, this central positioning declined sharply.
- During the second year of a baby's life, the presence of faces in the top area of the visual scene increased and then dropped, while faces in the centre showed a small rebound.
- Younger infants saw faces at more varied sizes, reflecting close, caregiver-driven interactions – this variability decreased with age.
Dr Teodor Nikolov, Cardiff University School of Psychology, said: "AI is reshaping the research landscape. Instead of relying on small datasets that take countless hours to code by hand, we can now capture much richer, more naturalistic insights into babies' everyday world."
"Our findings show that how often and where babies see faces changes with age. Very young infants see a lot of faces in the centre of their view, which fits with face-to-face play and caregivers leaning in close.
"As babies learn to sit up, how they interact with the world changes – they see fewer faces, potentially because sitting frees them to focus on manipulating the objects around them.
"Later, when they start walking, faces often appear higher up, as caregivers may stand over them. The drop in face presence in the second year may reflect growing independence.
"We hope that our findings will motivate further interest in how specific developmental milestones may contribute to changes in the visual experiences of babies," added Dr Nikolov.
"By combining headcams with automated face detection, we can now scale this research to much larger samples and even develop tools for practitioners – such as physiotherapists or speech and language therapists. This will give us new opportunities to understand the everyday experiences of infants and toddlers."
Dr D'Souza added: "This study is an important stepping stone towards our future work aimed at understanding how children with disabilities experience the world and how their experiences shape their development.
"We also see real potential for this technique to be used in clinical practice and in policy development, offering new ways to share and understand children's perspectives."
The paper, "Nonlinear changes in face availability during naturalistic playtime across the first years: Insights from head-mounted cameras and automated face detection", was published in Developmental Science.
The research was supported by a James S. McDonnell Foundation (JSMF) Opportunity Award and a UKRI Future Leaders Fellowship awarded to Dr Hana D'Souza.