From moos to oinks: How AI is cracking the code of animal emotions through sound

CO-EDP, VisionRI | Updated: 03-03-2025 12:00 IST | Created: 03-03-2025 12:00 IST

In the ever-evolving landscape of artificial intelligence, machine learning continues to push the boundaries of understanding non-human communication. A groundbreaking study published in iScience titled "Machine Learning Algorithms Can Predict Emotional Valence Across Ungulate Vocalizations" by Romain A. Lefèvre, Ciara C. R. Sypherd, and Élodie F. Briefer explores how AI can decode the emotions of ungulates (hoofed mammals) based on their vocalizations. With an impressive 89.49% accuracy, the study highlights the potential of AI-driven tools in improving animal welfare and deepening our understanding of emotional communication across species.

Machine learning and emotional expression in animals

Understanding emotions in non-human animals has long been a challenge. While physiological indicators such as heart rate and cortisol levels provide insights into emotional arousal, assessing emotional valence - whether an emotion is positive or negative - remains difficult. Vocalizations, however, offer a promising avenue. The study employs the eXtreme Gradient Boosting (XGBoost) algorithm to classify contact calls from seven ungulate species: cows, sheep, horses, Przewalski's horses, pigs, wild boars, and goats. By analyzing acoustic features such as duration, pitch, amplitude modulation, and energy quartiles, the AI model predicts whether a vocalization signals a pleasant or unpleasant emotional state.
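As a rough illustration of this kind of pipeline, the Python sketch below trains an XGBoost classifier on per-call acoustic features. The file name, column names, and hyperparameters are hypothetical placeholders, not the study's actual data or settings:

```python
# Minimal sketch: predicting emotional valence from acoustic features with XGBoost.
# "ungulate_calls.csv" and all column names are assumed for illustration only.
import pandas as pd
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

df = pd.read_csv("ungulate_calls.csv")  # one row per call: features + 0/1 valence label
features = ["duration", "f0_mean", "f0_var", "amp_mod_depth", "energy_q25", "energy_q50"]
X, y = df[features], df["valence"]      # 0 = negative, 1 = positive

# Stratified split keeps the positive/negative ratio stable across train and test sets
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

model = XGBClassifier(n_estimators=300, max_depth=4, learning_rate=0.1,
                      eval_metric="logloss")
model.fit(X_train, y_train)
print(f"Test accuracy: {model.score(X_test, y_test):.2%}")
```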

The study also used Uniform Manifold Approximation and Projection (UMAP) to visualize the separability of vocalizations by species and emotional valence. Results showed that species such as pigs and Przewalski’s horses exhibited greater separation between positive and negative calls, while goats and cows had more overlap, suggesting species-specific differences in emotional vocal expression. Additional clustering and classification methods, including k-means clustering and Naive Bayes classification, were applied to further assess the reliability of the model.
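A minimal version of that embedding step, reusing the hypothetical feature matrix X and labels y from the sketch above and assuming the umap-learn package, could look like this:

```python
# Sketch: project the acoustic features into 2D with UMAP, then cluster with k-means.
import umap  # umap-learn package
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

X_scaled = StandardScaler().fit_transform(X)  # scale features before embedding
embedding = umap.UMAP(n_components=2, random_state=42).fit_transform(X_scaled)

# Two clusters as a crude proxy for positive vs. negative valence
clusters = KMeans(n_clusters=2, n_init=10, random_state=42).fit_predict(embedding)

plt.scatter(embedding[:, 0], embedding[:, 1], c=y, cmap="coolwarm", s=5)
plt.title("UMAP projection of calls, colored by valence")
plt.show()
```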

Key findings and implications

The machine learning model achieved an overall classification accuracy of 89.49%, with a balanced accuracy of 83.90%. The study found that certain acoustic features serve as reliable indicators of emotional valence across species. For example, positive vocalizations tend to have lower amplitude modulation, shorter duration, and lower fundamental frequency variability, whereas negative vocalizations exhibit higher energy distribution in higher frequencies.
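For context, balanced accuracy is the average of per-class recall, so it corrects for any imbalance between positive and negative calls in the test set. Continuing the hypothetical sketch above, both metrics take one call each in scikit-learn:

```python
# Sketch: compare plain accuracy with balanced accuracy on the held-out test set
from sklearn.metrics import accuracy_score, balanced_accuracy_score

y_pred = model.predict(X_test)
print(f"Accuracy:          {accuracy_score(y_test, y_pred):.2%}")
print(f"Balanced accuracy: {balanced_accuracy_score(y_test, y_pred):.2%}")
```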

Interestingly, the classification accuracy varied among species, with pigs and Przewalski’s horses showing the highest accuracy rates of 99.91% and 97.78%, respectively. This suggests that some species exhibit clearer vocal distinctions between positive and negative emotions than others. The study also revealed that among the 17 acoustic features analyzed, duration and amplitude modulation depth were among the most influential in determining emotional valence. Additionally, the researchers utilized SHapley Additive exPlanations (SHAP) to interpret the model's decision-making process and confirm which acoustic features contributed most to classification accuracy.
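For tree-based models, a SHAP analysis of this kind is typically only a few lines; the sketch below continues the hypothetical model and features from the earlier snippets and is not the authors' code:

```python
# Sketch: rank acoustic features by their contribution to the model's predictions
import shap

explainer = shap.TreeExplainer(model)        # tree-aware explainer suited to XGBoost
shap_values = explainer.shap_values(X_test)  # per-call, per-feature contributions

# The summary plot orders features (e.g., duration, amplitude modulation) by impact
shap.summary_plot(shap_values, X_test)
```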

Role of AI in animal welfare

One of the most exciting implications of this research is its potential application in animal welfare monitoring. With further refinement, AI-powered acoustic analysis could be integrated into automated systems to assess the emotional well-being of farm animals, zoo inhabitants, and even wild populations. Such technology could enable early detection of stress or discomfort, allowing for timely intervention to improve living conditions. Moreover, this research provides valuable insights into the evolutionary origins of vocal emotion expression, shedding light on the biological roots of human speech and affective communication.

In practical applications, automated monitoring systems using AI-driven vocal analysis could be particularly useful in large-scale livestock management, where individual monitoring is challenging. Farmers and veterinarians could receive real-time alerts about potential distress in animals, allowing them to take action before visible signs of suffering emerge. Additionally, conservationists could employ AI acoustic monitoring in natural habitats to assess stress levels in wildlife without intrusive methods.

Future directions and challenges

Despite its success, the study acknowledges certain limitations. The dataset was limited to contact calls and did not include extreme emotional states such as distress or fear. Additionally, while the model performed well across multiple species, classification accuracy varied, particularly for species like goats and cows, whose positive and negative vocalizations overlapped more. Expanding the dataset to include a wider range of species and emotional contexts will be crucial to improving the generalizability of the findings.

Another challenge lies in the potential impact of environmental noise and recording quality on classification accuracy. Future research may focus on refining algorithms to better distinguish relevant vocal signals from background noise. Additionally, combining vocal analysis with other behavioral and physiological indicators, such as body posture and hormone levels, could provide a more comprehensive assessment of animal emotions.

The integration of AI into bioacoustics marks a significant step forward in understanding animal emotions. As technology advances, future research may explore real-time emotion recognition, multi-species comparative analysis, and integration with other behavioral and physiological indicators. With continued development, AI-driven tools could revolutionize how we interpret and respond to the emotional states of animals, bridging the gap between human and non-human communication in unprecedented ways.

FIRST PUBLISHED IN: Devdiscourse