How is an animal feeling at a given moment? Humans have long recognised certain well-known behaviours, such as a cat hissing as a warning, but in many cases we've had little clue what's going on inside an animal's head.
Author
- Shelley Brady
Postdoctoral Researcher in Animal Behaviour, Assistive Technology and Epilepsy, Dublin City University
Now we have a better idea, thanks to a Milan-based researcher who has developed an AI model that he claims can detect whether animals' calls express positive or negative emotions. Stavros Ntalampiras's deep-learning model, published in Scientific Reports, can recognise emotional tones across seven species of hoofed animals, including pigs, goats and cows. The model picks up on shared features of their calls, such as pitch, frequency range and tonal quality.
The analysis showed that negative calls tended to be more mid to high frequency, while positive calls were spread more evenly across the spectrum. In pigs, high-pitched calls were especially informative, whereas in sheep and horses the mid-range carried more weight, a sign that animals share some common markers of emotion but also express them in ways that vary by species.
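To make the idea more concrete, here is a minimal, illustrative sketch of such a pipeline in Python. It is not Ntalampiras's published model: the feature set, the librosa and scikit-learn calls and the choice of a random-forest classifier are assumptions standing in for whatever the real system uses.

    # Illustrative only: summarise each call with a few spectral features
    # (tonal quality, where the energy sits in the spectrum) and train a
    # simple positive/negative classifier on labelled recordings.
    import numpy as np
    import librosa
    from sklearn.ensemble import RandomForestClassifier

    def call_features(wav_path):
        y, sr = librosa.load(wav_path, sr=None)
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)   # tonal quality
        centroid = librosa.feature.spectral_centroid(y=y, sr=sr).mean()   # mid/high vs spread-out energy
        return np.append(mfcc, centroid)

    def train_classifier(recordings, labels):
        # recordings: paths to call audio; labels: 0 = negative, 1 = positive,
        # assumed to come from an expert-annotated dataset.
        X = np.array([call_features(p) for p in recordings])
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        return clf.fit(X, labels)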
For scientists who have long tried to untangle animal signals, this discovery of emotional traits across species is the latest leap forward in a field that is being transformed by AI.
The implications are far-reaching. Farmers could receive earlier warnings of livestock stress, conservationists might monitor the emotional health of wild populations remotely, and zookeepers could respond more quickly to subtle welfare changes.
This potential for a new layer of insight into the animal world also raises ethical questions. If an algorithm can reliably detect when an animal is in distress, what responsibility do humans have to act? And how do we guard against over-generalisation, where we assume that all signs of arousal mean the same thing in every species?
Of barks and buzzes
Tools like the one devised by Ntalampiras are not being trained to "translate" animals in a human sense, but to detect behavioural and acoustic patterns too subtle for us to perceive unaided.
Similar work is underway with whales, where New York-based research organisation Project Ceti (the Cetacean Translation Initiative) is analysing patterned click sequences called codas. Long believed to encode social meaning, these are now being mapped at scale using machine learning, revealing patterns that may correspond to each whale's identity, affiliation or emotional state.
In dogs, researchers are linking facial expressions, vocalisations and tail-wagging patterns with emotional states. One study showed that subtle shifts in canine facial muscles correspond to fear or excitement. Another found that tail-wag direction varies depending on whether a dog encounters a familiar friend or a potential threat.
At Dublin City University's Insight Centre for Data Analytics, we are developing a detection collar worn by assistance dogs that are trained to recognise the onset of a seizure in people with epilepsy. The collar uses sensors to pick up the dog's trained behaviours, such as spinning, which raise the alarm that their owner is about to have a seizure.
The project, funded by Research Ireland, strives to demonstrate how AI can leverage animal communication to improve safety, support timely intervention and enhance quality of life. In future, we aim to train the model to recognise instinctive dog behaviours such as pawing, nudging or barking.
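As a rough illustration of how motion data can flag a trained behaviour, the sketch below assumes the collar streams yaw (turning) rate from a gyroscope and treats several full rotations within a few seconds as a spin. The function name, thresholds and sampling details are assumptions for illustration, not the actual collar's processing.

    # Illustrative only: flag a "spin" alert from collar motion data.
    # Assumes a gyroscope streaming yaw rate in degrees per second.
    import numpy as np

    def detect_spin(yaw_rate_dps, sample_rate_hz, window_s=3.0, turns_needed=2):
        window = int(window_s * sample_rate_hz)
        for start in range(len(yaw_rate_dps) - window + 1):
            # Integrate turning rate over the window to get degrees turned;
            # a couple of full rotations in a few seconds looks like a
            # deliberate spin rather than ordinary milling about.
            degrees_turned = abs(np.sum(yaw_rate_dps[start:start + window])) / sample_rate_hz
            if degrees_turned >= 360 * turns_needed:
                return True  # trigger the alarm for the owner or carers
        return False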
Honeybees, too, are under AI's lens. Their intricate waggle dances - figure-of-eight movements that indicate food sources - are being decoded in real time with computer vision. These models highlight how small positional shifts influence how well other bees interpret the message.
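The decoding step itself is well understood: the angle of the waggle run relative to vertical indicates the bearing of the food relative to the sun, and the run's duration scales with distance. The toy sketch below illustrates that mapping, assuming a vision system has already measured the run; the calibration constant is a rough assumption and varies between colonies.

    # Illustrative only: convert a measured waggle run into a rough target.
    def waggle_to_target(waggle_angle_deg, waggle_duration_s,
                         sun_azimuth_deg, metres_per_second=750):
        # metres_per_second is a rough, colony-dependent calibration constant.
        bearing_deg = (sun_azimuth_deg + waggle_angle_deg) % 360
        distance_m = waggle_duration_s * metres_per_second
        return bearing_deg, distance_m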
Caveats
These systems promise real gains in animal welfare and safety. A collar that senses the first signs of stress in a working dog could spare it from exhaustion. A dairy herd monitored by vision-based AI might get treatment for illness hours or days sooner than a farmer would notice.
Detecting a cry of distress is not the same as understanding what it means, however. AI can show that two whale codas often occur together, or that a pig's squeal shares features with a goat's bleat. The Milan study goes further by classifying such calls as broadly positive or negative, but even this is still pattern recognition being used to infer emotions.
Emotional classifiers risk flattening rich behaviours into crude binaries of happy/sad or calm/stressed, such as logging a dog's tail wag as contentment when it can sometimes signal stress. As Ntalampiras notes in his study, pattern recognition is not the same as understanding.
One solution is for researchers to develop models that integrate vocal data with visual cues, such as posture or facial expression, and even physiological signals such as heart rate, to build more reliable indicators of how animals are feeling. Such models will also be most reliable when their outputs are interpreted in context, alongside the knowledge of someone experienced with the species.
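In code, the simplest version of that integration is just to join the modalities into one feature vector before classification, as in the hypothetical sketch below; real systems would use more careful fusion, and the feature names here are placeholders.

    # Hypothetical sketch: fuse vocal, visual and physiological features so
    # that no single cue is over-trusted when judging an animal's state.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def fuse(vocal_feats, posture_feats, heart_rate_bpm):
        # Each modality is assumed to be extracted elsewhere, e.g. spectral
        # features from audio and pose keypoints from video.
        return np.concatenate([vocal_feats, posture_feats, [heart_rate_bpm]])

    def train_welfare_model(fused_examples, labels):
        # labels: welfare states assigned by observers experienced with the species.
        model = LogisticRegression(max_iter=1000)
        return model.fit(np.array(fused_examples), labels)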
It's also worth bearing in mind that the ecological price of listening can be high. Training and running AI models carries carbon costs that, in fragile ecosystems, risk undercutting the very conservation goals these tools claim to serve. It's therefore important that any such technologies genuinely serve animal welfare, rather than simply satisfying human curiosity.
Whether we welcome it or not, AI is here. Machines are now decoding signals that evolution honed long before us, and will continue to get better at it.
The real test, though, is not how well we listen, but what we're prepared to do with what we hear. If we burn energy decoding animal signals but only use the information to exploit them, or manage them more tightly, it's not science that falls short - it's us.
Shelley Brady does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.