Robotics Breakthrough: Dynamic Emotion Model Makes Robots More Lifelike

Abstract

Social robots have been extensively studied in recent decades, with many researchers exploring the use of modalities such as facial expressions to achieve more natural emotions in robots. Various methods have been attempted to generate and express robot emotions, including computational models that define an affect space and show dynamic emotion changes. However, the implementation of multimodal expression in previous models is ambiguous, and the generation of emotions in response to stimuli relies on heuristic methods. In this paper, we present a framework that enables robots to naturally express their emotions in a multimodal way, where the emotion can change over time based on the given stimulus values. By representing the robot's emotion as a position in the affect space of a computational emotion model, we treat the given stimulus values as driving forces that can shift the emotion position dynamically. To examine the feasibility of the proposed method, a mobile robot prototype was implemented that can recognize touch and express different emotions through facial expressions and movements. The experiment demonstrated that the emotion elicited by a given stimulus is contingent upon the robot's previous state, giving the impression that the robot possesses a distinctive emotion model. Furthermore, the Godspeed survey results indicated that our model was rated significantly higher than the baseline, which did not include a computational emotion model, in terms of anthropomorphism, animacy, and perceived intelligence. Notably, the unpredictability of emotion switching contributed to a perception of greater lifelikeness, which in turn enhanced the overall interaction experience.

Imagine a scenario in which an individual experiences a sudden tap on their shoulder. In response, their eyes may widen, or they may instinctively flinch. Such stimuli naturally provoke automatic reactions in humans, though these responses tend to diminish or evolve with repeated exposure over time. Building on this understanding of human emotional dynamics, researchers have developed an innovative robotic system capable of simulating adaptive emotional responses based on stimulus frequency and context. This advancement aims to enhance emotional engagement in social and companion robots.

Professor Hui Sung Lee and his team from the Department of Design at UNIST announced the development of this adaptive robot technology, which expresses emotions through changes in eye shape, color, and movement, with responses that evolve dynamically over time.

Shown above is a scene where the robot expresses happiness when gently touched.

The robot is capable of displaying six distinct emotions by combining variations in eye appearance, color, and movement patterns. Interactions are initiated via physical touch: either stroking (a positive stimulus) or tapping (a negative stimulus) on the robot's head. For instance, a sudden tap causes the robot's eyes to enlarge and turn blue and its body to lean backward, thereby conveying surprise. Crucially, when the same stimulus is repeated, the robot does not simply replicate its initial response; its emotional expression adapts according to its previous emotional state and the history of stimuli it has received.
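As a rough sketch of how such a touch-to-expression pipeline could be organized, the snippet below pairs emotion labels with the three expression channels mentioned in the article (eye shape, eye color, body movement) and classifies a touch event as a stroke or a tap. Every label, threshold, and value here is an illustrative assumption rather than the team's actual implementation.

```python
# Hypothetical lookup from emotion labels to the three expression channels
# described in the article: eye shape, eye color, and body movement.
# Labels and channel values are placeholders, not the robot's real API.
EXPRESSIONS = {
    "happy":     {"eye_shape": "crescent", "eye_color": "yellow", "movement": "gentle sway"},
    "surprised": {"eye_shape": "enlarged", "eye_color": "blue",   "movement": "lean backward"},
    "sad":       {"eye_shape": "drooping", "eye_color": "grey",   "movement": "slump forward"},
}

def classify_touch(duration_s: float, intensity: float) -> float:
    """Toy touch classifier: long, light contact counts as a stroke (positive
    stimulus); short, hard contact counts as a tap (negative stimulus).
    The thresholds are assumptions for illustration only."""
    if duration_s > 0.5 and intensity < 0.5:
        return +1.0   # stroke -> positive stimulus
    return -1.0       # tap -> negative stimulus

# Example: a brief, forceful tap reads as a negative stimulus,
# which the article pairs with a "surprised" display.
if classify_touch(duration_s=0.1, intensity=0.9) < 0:
    print(EXPRESSIONS["surprised"])
```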

This approach allows the robot to mimic human-like emotional dynamics. Participants in user evaluations reported that "the robot's responses vary subtly depending on the context, even with identical stimuli, making its reactions feel less mechanical and more natural," and over 80% of them described its emotional expressions as "lifelike and vibrant."

The research team modeled emotions as dynamic vectors that change over time, rather than static states. Strong stimuli rapidly increase the magnitude of the robot's emotional vector, while weaker stimuli induce more gradual changes. This methodology enables the robot to exhibit nuanced and realistic emotional behaviors.
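A minimal sketch of that idea, assuming a two-dimensional valence-arousal affect space and simple first-order dynamics (neither of which is spelled out in this summary), might treat each stimulus as a force acting on the emotion vector while a decay term slowly pulls the state back toward neutral:

```python
import numpy as np

class EmotionState:
    """Emotion modeled as a vector in an assumed 2D valence-arousal space.
    Stimuli act as driving forces that shift the vector; decay pulls it back
    toward neutral. Gains and decay rates are illustrative, not the paper's."""

    def __init__(self, gain: float = 0.8, decay: float = 0.95):
        self.position = np.zeros(2)   # (valence, arousal), neutral start
        self.gain = gain              # how strongly a stimulus pushes the state
        self.decay = decay            # passive drift back toward neutral

    def update(self, stimulus) -> np.ndarray:
        """Apply one stimulus vector for a single time step."""
        stimulus = np.asarray(stimulus, dtype=float)
        # A strong stimulus moves the state quickly; a weak one acts gradually.
        self.position = self.decay * self.position + self.gain * stimulus
        return self.position

state = EmotionState()
print(state.update([0.2, 0.1]))    # gentle stroke: gradual shift toward positive valence
print(state.update([-0.4, 0.9]))   # sudden tap: rapid jump in arousal ("surprise")
```

Because each update depends on the current position, the same stimulus applied to different prior states produces different trajectories, mirroring the context-dependent responses described above.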

Professor Lee remarked, "Unlike traditional robots that display predetermined responses, our model captures the flow of emotional change, making the robot feel more like a living entity." He added, "This has significant implications for applications such as companion robots and emotional support systems."

The study was led by Haeun Park, a doctoral student and first author of the publication. The research was accepted for presentation at the 2025 IEEE International Conference on Robotics and Automation (ICRA), the world's premier event dedicated to advancing the field of robotics, held in Atlanta, USA, on May 21, 2025. Funding was provided by the Ministry of Trade, Industry and Energy (MOTIE).

Journal Reference

Haeun Park, Jiyeon Lee, and Hui Sung Lee, "Adaptive Emotional Expression in Social Robots: A Multimodal Approach to Dynamic Emotion Modeling," 2025 IEEE International Conference on Robotics and Automation (ICRA 2025), May 2025.
