
Facial recognition is advancing to Emotion recognition

Facial recognition is a category of biometric software that maps an individual’s facial features mathematically and stores the data as a faceprint. The software uses deep learning algorithms to compare a live capture or digital image to the stored faceprint in order to verify an individual’s identity. Facial recognition technology has matured considerably in recent years, and its use has been growing rapidly both in commercial products and in law enforcement.
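The verification step described above can be sketched in a few lines: a deep network reduces each face to a fixed-length embedding (the faceprint), and matching becomes a similarity threshold. The 128-dimensional vectors and the 0.8 threshold below are illustrative assumptions, not any vendor’s actual parameters:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(stored_faceprint, live_embedding, threshold=0.8):
    """Declare a match when embedding similarity clears the threshold."""
    return cosine_similarity(stored_faceprint, live_embedding) >= threshold

# Invented 128-d embeddings standing in for a face network's output
rng = np.random.default_rng(0)
enrolled = rng.normal(size=128)                           # stored faceprint
same_person = enrolled + rng.normal(scale=0.1, size=128)  # new capture, small variation
stranger = rng.normal(size=128)                           # unrelated face

print(verify(enrolled, same_person))  # True: nearly parallel vectors
print(verify(enrolled, stranger))     # False: near-orthogonal vectors
```

The interesting design choice is the threshold: lowering it admits more impostors (false accepts), raising it rejects more genuine users, which is why deployed systems tune it against a target error rate.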

In 2014, Facebook launched DeepFace, which was able to determine whether two pictures were of the same person with 97.25% accuracy (humans score 97.53% on the same benchmark). Apple was not far behind: it filed a patent in 2014 for AI technology that could analyze and identify mood based on facial expressions, and in early 2016 it acquired Emotient, a facial analysis and emotion recognition software company.

Facial recognition technology is now advancing to emotion recognition. Emotion-recognition technologies – which track facial expressions of anger, sadness, happiness and boredom, along with other biometric data – are supposedly able to infer a person’s feelings from traits such as facial muscle movements, vocal tone and body movements. This goes beyond facial recognition, which simply compares faces to determine a match.

But, like facial recognition, it involves the mass collection of sensitive personal data to track, monitor and profile people, and it uses machine learning to analyse expressions and other cues.

“Machine learning technology is getting really good at recognizing the content of images—of deciphering what kind of object it is,” said Tor Wager, senior author of a 2019 study discussed below, who worked on the research while a professor of psychology and neuroscience at CU Boulder. “We wanted to ask: Could it do the same with emotions? The answer is yes.”

Microsoft debuted its facial recognition software under Project Oxford in 2015. Its Face API has since been used in Windows Hello, Microsoft’s equivalent to Apple’s Face ID, and the company even offers a free online demo of its emotion recognition technology. Google also debuted its facial recognition technology, FaceNet, in 2015, scoring 99.63% accuracy on the Labeled Faces in the Wild benchmark, and has continued to add face detection, recognition and emotional content analysis to its Cloud Vision API services.

Amazon’s computer vision division offers Rekognition, which not only performs facial recognition and emotion analysis but can also recognize up to 100 people in a single image and match them against a database of tens of millions. Snapchat filed a patent for emotion recognition AI in early 2018, likely drawing on the trove of face data gathered from over 300 million monthly active users trying on rainbow-vomit and dog-ear filters.

The industry is booming in China, where since at least 2012, figures including President Xi Jinping have emphasised the creation of “positive energy” as part of an ideological campaign to encourage certain kinds of expression and limit others.

Beijing-based Megvii Technology’s Face++ offers a menu of facial analysis services including face search, eye tracking, emotion recognition, skin health analysis, skeleton detection and beauty scoring, among many others. China has now added emotional surveillance to its wide network of facial recognition and internet censorship. The South China Morning Post reported that employees’ brain activity and emotions are being monitored in factories, state-owned enterprises and the military across China.

The main office of Taigusys is tucked behind a few low-rise office buildings in Shenzhen. Visitors are greeted at the doorway by a series of cameras capturing their images on a big screen that displays body temperature, age estimates and other statistics. Chen, a general manager at the company, says the doorway system is the company’s bestseller at the moment because of high demand during the coronavirus pandemic. Chen hails emotion recognition as a way to predict dangerous behaviour by prisoners, detect potential criminals at police checkpoints, identify problem pupils in schools and spot elderly people experiencing dementia in care homes.

Taigusys systems are installed in about 300 prisons, detention centres and remand facilities around China, connecting 60,000 cameras. “Violence and suicide are very common in detention centres,” says Chen. “Even if police nowadays don’t beat prisoners, they often try to wear them down by not allowing them to fall asleep. As a result, some prisoners will have a mental breakdown and seek to kill themselves. And our system will help prevent that from happening.”

Chen says that since prisoners know they are monitored by this system – 24 hours a day, in real time – they are made more docile, which for authorities is a positive on many fronts. “Because they know what the system does, they won’t consciously try to violate certain rules,” he says.

Critics say the technology is based on a pseudo-science of stereotypes, and an increasing number of researchers, lawyers and rights activists believe it has serious implications for human rights, privacy and freedom of expression. With the global industry forecast to be worth nearly $36bn by 2023, growing at nearly 30% a year, rights groups say action needs to be taken now.

Scientists in China claim to have developed AI that measures people’s loyalty to the Chinese Communist Party (reported July 2022)

Scientists at China’s Comprehensive National Science Center in Hefei claim to have developed “mind-reading” artificial intelligence that measures the loyalty of Chinese Communist Party members. Chinese researchers claimed in a now-deleted post on Weibo that the AI software can measure reactions of party members to “thought and political education” by analysing their facial expressions and brain waves.

“This equipment is a kind of smart ideology, using AI technology to extract and integrate facial expressions, EEG readings and skin conductivity,” according to a translation of the posts by Radio Free Asia. Chinese scientists claimed the technology made it possible to “ascertain the levels of concentration, recognition and mastery of ideological and political education so as to better understand its effectiveness”.

In a now-deleted video, the institute claimed the “mind-reading” software could be used on party members to “further solidify their determination to be grateful to the party, listen to the party and follow the party”. However, the post was taken down following outcry from citizens on the internet. The equipment was reportedly tested by reading the brain waves and by conducting facial scans of party members as they read articles about the Communist Party, measures that were then converted to a loyalty “score.”

Emotion recognition technology

A 2017 paper by a collaboration of scientists at USC, Carnegie Mellon University and Cincinnati Children’s Hospital Medical Center claims to have identified some of the biomarkers that differentiate depressed and suicidal patients. During interviews with these groups, the researchers recorded gestures including smiling, frowning, eyebrow raising and head motion. The data was then fed into a machine-learning algorithm that looked for correlations between different gestures, alone or in combination, and the patient groups.
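The correlation step can be illustrated with a toy computation: take a per-interview gesture feature for each group and measure how strongly it separates the groups. The feature values below are invented for illustration; they are not the paper’s data:

```python
import numpy as np

# Invented per-interview feature: fraction of each participant's smiles
# that did not involve the eye muscles (so-called non-Duchenne smiles).
depressed_group = np.array([0.30, 0.25, 0.40, 0.35, 0.28])
suicidal_group = np.array([0.70, 0.65, 0.80, 0.60, 0.75])

# Pool the feature and encode group membership as 0/1, then measure the
# point-biserial correlation between the gesture and the group label.
feature = np.concatenate([depressed_group, suicidal_group])
label = np.array([0] * len(depressed_group) + [1] * len(suicidal_group))
r = float(np.corrcoef(feature, label)[0, 1])
print(f"gesture/group correlation: r = {r:.2f}")
```

A strong correlation like this flags the gesture as a candidate biomarker; a real analysis would repeat it across many gestures and combinations, with held-out data to guard against chance findings.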

Specifically, Duchenne smiles versus non-Duchenne smiles held the key to differentiating the groups. A Duchenne smile involves contraction of the muscles surrounding the eyes, while a non-Duchenne smile does not. People displaying non-Duchenne smiles were far more likely to report suicidal ideation than those displaying Duchenne smiles.

A paper, published in August 2019 in the journal Science Advances, marks an important step forward in the application of “neural networks”—computer systems modeled after the human brain—to the study of emotion. It also sheds a new, different light on how and where images are represented in the human brain, suggesting that what we see—even briefly—could have a greater, more swift impact on our emotions than we might assume.

“A lot of people assume that humans evaluate their environment in a certain way and emotions follow from specific, ancestrally older brain systems like the limbic system,” said lead author Philip Kragel, a postdoctoral research associate at the Institute of Cognitive Science. “We found that the visual cortex itself also plays an important role in the processing and perception of emotion.”

For the study, Kragel started with an existing neural network, called AlexNet, which enables computers to recognize objects. Using prior research that identified stereotypical emotional responses to images, he retooled the network to predict how a person would feel when they see a certain image.

He then “showed” the new network, dubbed EmoNet, 25,000 images ranging from erotic photos to nature scenes and asked it to categorize them into 20 categories such as craving, sexual desire, horror, awe and surprise. EmoNet could accurately and consistently categorize 11 of the emotion types. But it was better at recognizing some than others. For instance, it identified photos that evoke craving or sexual desire with more than 95 percent accuracy. But it had a harder time with more nuanced emotions like confusion, awe and surprise.
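Per-category results like these come from scoring accuracy separately for each emotion class rather than overall, which is what reveals that some emotions are easier than others. A toy version with hypothetical labels:

```python
import numpy as np

# Invented predictions vs. ground truth; the integer categories stand in
# for emotion labels (e.g. 0 = craving, 1 = awe, 2 = surprise).
y_true = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])
y_pred = np.array([0, 0, 0, 1, 1, 0, 2, 1, 0])

# Overall accuracy hides which categories the model actually gets right,
# so compute accuracy class by class.
per_class = {int(c): float((y_pred[y_true == c] == c).mean())
             for c in np.unique(y_true)}
print(per_class)  # class 0 is reliable; classes 1 and 2 are harder
```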

Even a simple color elicited a prediction of an emotion: when EmoNet saw a black screen, it registered anxiety. Red conjured craving. Puppies evoked amusement; if there were two of them, it picked romance. EmoNet was also able to reliably rate the intensity of images, identifying not only the emotion an image might elicit but how strong it might be. When the researchers showed EmoNet brief movie clips and asked it to categorize them as romantic comedies, action films or horror movies, it got it right three-quarters of the time.

To further test and refine EmoNet, the researchers then brought in 18 human subjects. As a functional magnetic resonance imaging (fMRI) machine measured their brain activity, they were shown 4-second flashes of 112 images. EmoNet saw the same pictures, essentially serving as the 19th subject. When activity in the neural network was compared to that in the subjects’ brains, the patterns matched up.
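Comparing “activity in the neural network” to brain activity is, at heart, a pattern-similarity analysis: do images that look alike to the model also look alike to the brain? A toy sketch using randomly generated stand-ins for voxel patterns and network-unit activations (the study’s actual methods are more involved):

```python
import numpy as np

rng = np.random.default_rng(42)
n_images, n_voxels, n_units = 112, 50, 50

# Invented stand-ins: one brain pattern and one model pattern per image.
# The model pattern is built as a noisy linear readout of the brain
# pattern, so some shared structure exists for the analysis to find.
brain = rng.normal(size=(n_images, n_voxels))
readout = rng.normal(size=(n_voxels, n_units))
model = brain @ readout * 0.1 + rng.normal(size=(n_images, n_units))

def rdm(patterns):
    """Dissimilarity between image pairs: 1 - correlation of patterns."""
    return 1 - np.corrcoef(patterns)

# Correlate the two image-by-image dissimilarity structures
match = float(np.corrcoef(rdm(brain).ravel(), rdm(model).ravel())[0, 1])
print(f"brain/model representational similarity: {match:.2f}")
```

A positive match means the model groups images the way the brain does; a score near zero would mean the two representations are unrelated.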

“We found a correspondence between patterns of brain activity in the occipital lobe and units in EmoNet that code for specific emotions. This means that EmoNet learned to represent emotions in a way that is biologically plausible, even though we did not explicitly train it to do so,” said Kragel.

The brain imaging itself also yielded some surprising findings. Even a brief, basic image – an object or a face – could ignite emotion-related activity in the visual cortex of the brain. And different kinds of emotions lit up different regions.

“This shows that emotions are not just add-ons that happen later in different areas of the brain,” said Wager, now a professor at Dartmouth College. “Our brains are recognizing them, categorizing them and responding to them very early on.”

Ultimately, the researchers say, neural networks like EmoNet could be used in technologies to help people digitally screen out negative images or find positive ones. It could also be applied to improve computer-human interactions and help advance emotion research. The takeaway for now, says Kragel: “What you see and what your surroundings are can make a big difference in your emotional life.”

References and resources also include:

https://uk.finance.yahoo.com/news/scientists-china-claim-developed-ai-124012725.html

https://www.theguardian.com/global-development/2021/mar/03/china-positive-energy-emotion-surveillance-recognition-tech

About Rajesh Uppal
