
Measuring pig emotions and why it matters

17-03-2021
Outdoor pigs apparently enjoying rooting in the ground. What can we know about their emotions? - Photo: Jan Willem van Vliet

Humans can communicate by speaking, writing or gesturing and also have faces that usually portray how they feel. What if the emotions of farm animals could also be interpreted accurately through their communication, faces and body language? An effort to do just this is being made at Wageningen University & Research in the Netherlands.

Technology continues to expand into all facets of life, and the Internet of Things is poised to cross the boundary between humans and animals. Someone can track their pet dog or cat instantaneously with a chip and a smartphone, but it is also becoming increasingly possible to read a farm animal’s whereabouts and even emotion without any invasive procedure.

Until now, sensors and probes have been used to measure vital statistics of animals, but these have meant putting an animal through a procedure that itself causes stress. In the future, non-invasive methods will use sensors that identify individual animals by their facial features, process the data and send the results to a smartphone. When that becomes possible, a veterinarian or farmer could quickly detect the emotional state of animals.

What is emotion?

First of all, emotion itself needs defining. In science it is regarded as the way the brain of an individual or complex organism processes neuropsychological events arising from physiological, behavioural and cognitive stimuli. Such stimuli are instantly translated into a positive or negative experience for that organism. The reaction determines the appropriate response for that organism at that moment: to stay, fight or flee. Human faces are easily read for a reaction to an experience, but animal faces are considerably more challenging.

To assist in interpreting physiological changes and general welfare of an animal, certain chemical markers can be measured. For example, the cortisol level in pig saliva is used to determine changing stress levels. In cattle, a change in neurotransmitting chemicals is detected as the animal experiences various stimuli. However, once again such measurements are conducted by interfering with the animal, which causes stress, and it can take precious time to produce the findings.

A facial sensor

What if an animal’s face could be read like a human’s? A sensor capable of detecting all the variances of animal faces has not yet been developed. In the interim, a variety of sensors are used to measure various components and parameters. Such tools incorporate infrared thermal imaging, sound recordings, GPS tracking and drones. However, none of these is totally satisfactory by itself, and they all have individual shortcomings in data collection. The quest is on among scientists to find a sensor that detects the smallest facial tic. In turn, such observable movements will be matched to what the animal is feeling at the time.

The question remains how to match such facial expressions with the correct emotion. To discern an animal’s emotional state, a tremendous amount of gathered data has to be analysed and the correct computer algorithm used to develop proper software. The approach with humans is to utilise machine learning that essentially has a computer react similarly to how a human brain would under certain conditions.

A major barrier for this development has been the area of fuzzy logic, which looks at responses beyond a computer’s usual yes and no answers; in other words, the need to create a computerised neural network that resembles human reactions beyond just the facts. As an example, one person reacts to a stubbed toe differently from another, which a computer would have trouble differentiating. At some level, the same barrier exists for an animal algorithm, which must be further delineated by species and environment. Do wild animals have different emotional responses from farm animals?
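To make the fuzzy-logic idea concrete, the toy sketch below (plain Python, not taken from the study) grades a stress score between 0 and 1 into overlapping "low", "moderate" and "high" categories with partial degrees of membership, rather than forcing a single yes/no verdict; the breakpoints are invented purely for illustration.

```python
# Toy illustration of fuzzy logic (not from the study): instead of a hard
# yes/no answer, a stress score belongs to overlapping categories to a degree
# between 0 and 1. The breakpoints below are invented for illustration only.

def triangular(x: float, left: float, peak: float, right: float) -> float:
    """Degree of membership in a triangular fuzzy set."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

def stress_memberships(score: float) -> dict[str, float]:
    """Map a 0-1 stress score to partial memberships in three fuzzy sets."""
    return {
        "low":      triangular(score, -0.01, 0.0, 0.5),
        "moderate": triangular(score, 0.2, 0.5, 0.8),
        "high":     triangular(score, 0.5, 1.0, 1.01),
    }

print(stress_memberships(0.62))
# roughly {'low': 0.0, 'moderate': 0.6, 'high': 0.24}: partly "moderate" and
# partly "high", rather than a binary "stressed / not stressed" verdict.
```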

Accurate facial recognition is mandatory because, like humans, animals have their individual levels of stress and must be evaluated separately and by species. So far, results have come from animals in controlled environments, such as farms, without a comparative measurement or benchmark from free-range or wild animals. Some kind of baseline is needed to measure animal emotions by species in their natural environment, but such data is currently unavailable owing to a lack of suitable monitoring equipment.

Dairy cows striking a pose at a farm in the Netherlands. - Photo: Herbert Wiggerman

Managing facial recognition data

Business economics has produced mega-farms containing large numbers of animals, making the task of identifying each animal arduous, not to mention expensive when multiple sensors are used. A single sensor capable of collecting the essential data would be a milestone: it would be more affordable, would help increase the health and productivity of livestock, and would reduce stress levels by being as simple as having a picture taken. The use of edge computing, in which data is interpreted on or near the sensor device itself rather than in a central data centre, needs exploring so that the devices can produce findings on the spot and reduce the bandwidth required.

That would be especially helpful on farms with a poor internet connection. Computer programmes such as WUR Wolf, developed by the Farmworx group at Wageningen University & Research in the Netherlands, analyse animal facial features. The programme recognises and evaluates 14 combinations of facial features and seven emotional states of cows and pigs. For the study, images and videos of several thousand pigs and dairy cows were evaluated using You Only Look Once (YOLO) real-time object detection. The corresponding data was processed in the Python programming language, using the PyCharm development environment. The WUR Wolf deep learning model was dedicated to identifying facial expressions of these farm animals, successfully identifying 86% of the animals and their emotional states. A spin-off from this could be its many security applications on the farm and elsewhere.
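The underlying pipeline can be pictured in a few lines of code. The sketch below is not the WUR Wolf software; it only illustrates, using the off-the-shelf ultralytics YOLO package, how a trained detector could be run over a single camera frame and its facial-feature detections tallied. The weights file, class names and image path are hypothetical.

```python
# Minimal sketch (not the WUR Wolf code): run a YOLO model over one camera
# frame and tally the facial-feature classes it detects. The weights file,
# class names and image path are hypothetical placeholders.
from collections import Counter

from ultralytics import YOLO  # stand-in for the YOLO setup used in the study

model = YOLO("pig_face_features.pt")                 # hypothetical trained weights
results = model("barn_camera_frame.jpg", conf=0.5)   # detect in a single frame

feature_counts = Counter()
for result in results:                # one Results object per input image
    for box in result.boxes:          # one box per detected facial feature
        label = model.names[int(box.cls[0])]     # e.g. "ears_back" (hypothetical)
        confidence = float(box.conf[0])
        feature_counts[label] += 1
        print(f"{label}: {confidence:.2f} at {box.xyxy[0].tolist()}")

print(feature_counts)  # tallies that downstream logic could map to an emotion
```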

Importantly, such detection is performed humanely, without the animals being aware, and provides an unfettered result in real time without any probes being surgically inserted. The benefit is early detection of any illness or disease so that treatment or confinement can be swiftly provided. Considerably reducing, if not eliminating, the spread of any contagious ailment throughout a farm also protects valuable livestock.

WUR Wolf identifies animal emotions based on four principal facial expressions: neutral, aggression, happiness and fear. To build a database, the test sample of pigs was used to determine the correct algorithm. A number of artificial intelligence (AI) algorithms and camera and infrared imaging systems were used to gather data, such as eye retinal detection and the complex simulation of a neural network, to produce an automated emotion evaluation from what might be called a thinking computer. Such technology has previously been used to produce interactive robots as human aids, in the advertising industry to determine consumer preferences and as an education tool, to name a few applications.
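How such feature detections might be combined into one of the four classes can be sketched very simply. The example below is a deliberately naive scoring scheme, not the WUR Wolf classifier; the feature names and weights are invented for illustration only.

```python
# Simplified, hypothetical sketch: score a set of detected facial features
# against the four emotion classes named in the text. The feature names and
# weights are invented for illustration and are not the WUR Wolf rules.

EMOTION_WEIGHTS = {
    "neutral":    {"ears_neutral": 1.0, "eyes_half_open": 0.5},
    "aggression": {"ears_back": 1.0, "snout_tightening": 0.8, "mouth_open": 0.5},
    "happiness":  {"ears_forward": 1.0, "relaxed_eyelids": 0.7},
    "fear":       {"ears_back": 0.6, "eyes_wide": 1.0, "head_low": 0.5},
}

def score_emotions(detected_features: set[str]) -> dict[str, float]:
    """Sum the weights of detected features for each candidate emotion."""
    return {
        emotion: sum(w for feat, w in weights.items() if feat in detected_features)
        for emotion, weights in EMOTION_WEIGHTS.items()
    }

detections = {"ears_back", "eyes_wide"}   # e.g. output of the feature detector
scores = score_emotions(detections)
print(scores)   # {'neutral': 0.0, 'aggression': 1.0, 'happiness': 0.0, 'fear': 1.6}
print(max(scores, key=scores.get))        # -> 'fear'
```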

Difficulties ahead

The quantum leap of applying such technology to animals is in its infancy. This early scientific work basically breaks an animal’s emotions down into positive and negative. The areas of fuzzy logic, baselines and species-specific stress are still largely unexplored. A framework for how animals feel must take the areas of affect (emotion), feelings and mood into account.

To define the complexity of these areas: affect (emotion) is a reaction to an initial stimulus, whereas feelings span shorter or longer periods of time, while mood occurs in the background and colours an emotion as either positive or negative overall. More than 65 emotions have been attributed to humans but, as already mentioned, our ease of communication makes that complex task much less onerous.

How many emotions animals have is the crux of the ongoing study. The input data includes such things as the appearance of the eyes, ear position and posture, age, orbital, cheek or snout tightening, nose bulge, eyelid movement and the animal’s body and tail postures, all of which a computer programme will need to weigh before giving an analysis of emotion.
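As a rough picture of what such an input might look like, the sketch below bundles the cues listed above into a single record that a programme could pass to an emotion model; the field names and scales are assumptions for illustration, not the study’s actual data schema.

```python
# Rough sketch of an observation record bundling the cues listed above before
# they are passed to an emotion model. Field names and scales are assumptions,
# not the schema used in the study.
from dataclasses import dataclass

@dataclass
class FacialObservation:
    animal_id: str
    age_months: int
    eye_appearance: str        # e.g. "wide", "half-closed"
    ear_posture: str           # e.g. "forward", "back", "drooping"
    orbital_tightening: float  # 0.0 (relaxed) to 1.0 (tight)
    snout_tightening: float    # 0.0 to 1.0
    nose_bulge: bool
    eyelid_movement: str       # e.g. "blinking", "still"
    body_posture: str          # e.g. "standing", "lying", "hunched"
    tail_posture: str          # e.g. "curled", "hanging", "tucked"

obs = FacialObservation(
    animal_id="pig_0042", age_months=6, eye_appearance="wide",
    ear_posture="back", orbital_tightening=0.7, snout_tightening=0.4,
    nose_bulge=False, eyelid_movement="still",
    body_posture="hunched", tail_posture="tucked",
)
print(obs)
```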

Further refinement of programmes like WUR Wolf shows promise in identifying stress in farm animals. In the meantime, efforts are under way to develop an economical and user-friendly sensing platform for emotional check-ups of farm animals. Farmers possessing this tool could manage better through continuous computer monitoring; illnesses would be identified and treated more quickly, which, in turn, would increase production levels and make a business more profitable. The old saying that a content cow is a happy cow could never be truer than with facial recognition technology. Farmers and business owners would also be grinning with such an animal emotional health tool.

Author:

Dr Suresh Neethirajan, associate professor, Wageningen University & Research, the Netherlands

References available upon request. The author can be reached at suresh.neethirajan@wur.nl.
