Challenging AI Emotion Recognition: A trial run with the Uyghur community


Xinjiang is home to an estimated 12 million Uyghurs, most of whom are Muslim.

 

Citizens of the Xinjiang region are under daily surveillance. The area is home to "vocational education and training centres", which human rights groups describe as high-security detention camps, where it is estimated that more than one million people have been detained.

 

Beijing has consistently argued that surveillance in the region is necessary because separatists have killed hundreds of people in terrorist attacks.

 

A software engineer who worked on facial recognition technology used for surveillance agreed to speak to the BBC's Panorama programme on condition of anonymity because he fears for his safety. He showed Panorama five photographs of Uyghur detainees on whom, he says, the emotion recognition system had been tested.

 

"The Chinese government uses Uyghurs as guinea pigs for various experiments, just as rats are used in laboratories," says the engineer.

 

He then described his role in installing cameras in police stations in the region: "We placed the emotion detection camera 3 metres from the subject. It is similar to a lie detector but uses far more advanced technology."

 

He provided evidence of how the artificial intelligence system is trained to detect and analyse even the slightest changes in facial expressions and skin pores.

 

According to his claims, the software creates a pie chart, with the red segment representing a negative or anxious mental state.

 

He said the software was intended to prejudge people "without any credible evidence."

 

The Chinese Embassy in London did not respond to questions about the use of emotion recognition software in the region, but said: "Political, economic and social rights and freedom of religious belief in all ethnic groups in Xinjiang are fully guaranteed. People live in harmony regardless of their ethnic backgrounds and enjoy a stable and peaceful life without restrictions on personal freedom."

 

The evidence was shown to Sophie Richardson, China director of Human Rights Watch.

 

"It is shocking material. It's not just that people are being reduced to a pie chart, these are people who are in highly coercive circumstances, under enormous pressure, they're understandably nervous, and this is taken as an indication of guilt, and I think that's deeply problematic." says Richardson. 

 

Suspicious behaviour

According to Darren Byler, of the University of Colorado, Uyghurs must regularly provide DNA samples to local officials, submit to digital scans, and most must download a government phone app, which collects data, including contact lists and text messages.

 

"Uyghurs' lives are now dedicated to generating data," he said.

 

Most of the data is fed into a computer system called the Integrated Joint Operations Platform, which Human Rights Watch says flags allegedly suspicious behaviour.

 

"The system is collecting information on dozens of different kinds of perfectly legal behaviours, such using the back door instead of the front door, or if someone gets gas in a car that doesn't belong to them," Richardson said.

 

"Authorities now place QR codes outside the doors of people's homes so they can easily know who should be there and who shouldn't."

Daily surveillance 

 

China is estimated to host half of the world's nearly 800 million surveillance cameras.

 

It also has a large number of smart cities, such as Chongqing, where artificial intelligence is built into the foundation of the urban environment.

 

Hu Liu, an investigative journalist from Chongqing, told Panorama about his own experience: "Once you leave your house and step into the elevator, you are captured by a camera. There are cameras everywhere."

 

"When I leave the house to go somewhere, I call a cab, the cab company uploads the data to the government. I can then go to a bar to meet some friends and the authorities know my location through the camera in the bar.

 

"There have been occasions when I have met some friends and soon after someone from the government contacted me. They warned me, 'Don't see that person, don't do this and that.'

 

"With artificial intelligence we have nowhere to hide," he said.

 

Source: https://www.bbc.com/news/technology-57101248
