26 Mar 2019
Project from Scotland’s Rural College and UWE could alert farmers to animal health problems.
The health of pigs is currently monitored to an extent using RFID tags, but pigs are known to be among the most expressive farm animals, communicating with each other through a range of facial expressions.
The ability of humans to read porcine faces has been limited, but a collaboration between Scotland's Rural College (SRUC) and machine vision experts at UWE in Bristol has now developed a possible route to monitoring and understanding these facial expressions more fully, using machine learning and facial recognition technology.
"By focusing on the pig's face, we hope to deliver a truly animal-centric welfare assessment technique, where the animal can 'tell' us how it feels about its own individual experiences and environment," said Emma Baxter of SRUC. "This allows insight into both short-term emotional reactions and long-term individual 'moods' of animals under our care."
The workflow developed by the project involves SRUC capturing 3D and 2D facial images of its breeding sow population in various situations likely to produce different emotional states, such as the discomfort of lameness and its subsequent treatment. Detecting a pain-free and positive mood is trickier, but the team believes that this too reveals itself in facial recognition scans.
Images are processed at UWE's Centre for Machine Vision, part of the Bristol Robotics Laboratory, where machine learning techniques automatically identify different emotions conveyed by particular facial expressions. After validating these techniques, the team plan to develop the technology for on-farm use with commercial partners, where individual sows in large herds will be monitored continuously.
The current project builds on previous work at the two institutions on the use of convolutional neural networks (CNN) as a non-invasive biometric technique to analyze pig facial expressions, and potentially those of other animals too.
In trials on an initial sample of 10 animals, SRUC developed a CNN able to analyze standard webcam images and discriminate between individual pigs by comparing the snout and the wrinkled region above it; the top of the head, where markings are most prevalent; and the eye regions. According to a 2018 project paper, the CNN achieved an accuracy of 96.7 percent on 1553 images of ten pigs, said to outperform at least one standard technique frequently used in human face recognition.
The other strand of research is to use the facial expression as a potential indication of emotion and intention, an area of behavioral study where concrete data has been harder to come by. Last year the project published results of a study into pigs' facial expression prior to the occurrence of aggression, during aggression, and during retreat from being attacked, and related facial metrics such as ear angle, contraction of the snout and openness of the eyes to the different behaviours. It concluded that facial expressions could indeed be a signal of intent as well as emotional state.
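Metrics of the kind the study names, such as ear angle and openness of the eyes, can be computed geometrically once facial landmarks have been located. The following is a minimal sketch under assumed landmark conventions (2D image coordinates, hypothetical point names); the project's actual definitions may differ.

```python
import math

def ear_angle(ear_base, ear_tip):
    """Angle of the ear axis relative to horizontal, in degrees.

    Points are (x, y) landmark coordinates in the image plane.
    """
    dx = ear_tip[0] - ear_base[0]
    dy = ear_tip[1] - ear_base[1]
    return math.degrees(math.atan2(dy, dx))

def eye_openness(upper_lid, lower_lid, left_corner, right_corner):
    """Eye aspect ratio: lid-to-lid distance normalised by eye width."""
    height = math.dist(upper_lid, lower_lid)
    width = math.dist(left_corner, right_corner)
    return height / width

# Hypothetical landmarks: an ear raised at 45 degrees, a half-open eye.
angle = ear_angle((0.0, 0.0), (1.0, 1.0))                      # 45.0
openness = eye_openness((0, 1), (0, -1), (-2, 0), (2, 0))      # 0.5
```

Tracking how such ratios and angles change before, during, and after an aggressive encounter is one plausible way to turn the qualitative observations into the kind of concrete data the passage describes.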
Having proven the principle, the project intends to investigate the impact of different camera viewpoints and the effects of more changeable aspects of the pig's appearance, such as age and cleanliness.
"Machine vision offers the potential to realize a low-cost, non-intrusive and practical means to biometrically identify individual animals on the farm," commented Melvyn Smith of UWE. "Our next step will be, for the first time, to explore the potential for using machine vision to automatically recognize facial expressions that are linked with core emotion states, such as happiness or distress, in the identified pigs."