Incident 249: Government Deployed Extreme Surveillance Technologies to Monitor and Target Muslim Minorities in Xinjiang

Description: A suite of AI-powered digital surveillance systems involving facial recognition and analysis of biometric data was deployed by the Chinese government in Xinjiang to monitor and discriminate against local Uyghurs and other Turkic Muslims.
Alleged: Chinese government developed and deployed an AI system, which harmed Uyghur people and Turkic Muslim ethnic groups.

Suggested citation format

Dickinson, Ingrid. (2016-10-01) Incident Number 249. In Lam, K. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID: 249
Report Count: 2
Incident Date: 2016-10-01
Editors: Khoa Lam

Incident Reports

Since late 2016, the Chinese government has subjected the 13 million ethnic Uyghurs and other Turkic Muslims in Xinjiang to mass arbitrary detention, forced political indoctrination, restrictions on movement, and religious oppression. Credible estimates indicate that under this heightened repression, up to one million people are being held in “political education” camps. The government’s “Strike Hard Campaign against Violent Terrorism” (Strike Hard Campaign, 严厉打击暴力恐怖活动专项行动) has turned Xinjiang into one of China’s major centers for using innovative technologies for social control.

This report provides a detailed description and analysis of a mobile app that police and other officials use to communicate with the Integrated Joint Operations Platform (IJOP, 一体化联合作战平台), one of the main systems Chinese authorities use for mass surveillance in Xinjiang. Human Rights Watch first reported on the IJOP in February 2018, noting the policing program aggregates data about people and flags to officials those it deems potentially threatening; some of those targeted are detained and sent to political education camps and other facilities. But by “reverse engineering” this mobile app, we now know specifically the kinds of behaviors and people this mass surveillance system targets.

The findings have broader significance, providing an unprecedented window into how mass surveillance actually works in Xinjiang, because the IJOP system is central to a larger ecosystem of social monitoring and control in the region. They also shed light on how mass surveillance functions in China. While Xinjiang’s systems are particularly intrusive, their basic designs are similar to those the police are planning and implementing throughout China. 

Many—perhaps all—of the mass surveillance practices described in this report appear to be contrary to Chinese law. They violate the internationally guaranteed rights to privacy, to be presumed innocent until proven guilty, and to freedom of association and movement. Their impact on other rights, such as freedom of expression and religion, is profound.

Human Rights Watch finds that officials use the IJOP app to fulfill three broad functions: collecting personal information, reporting on activities or circumstances deemed suspicious, and prompting investigations of people the system flags as problematic.

Analysis of the IJOP app reveals that authorities are collecting massive amounts of personal information—from the color of a person’s car to their height down to the precise centimeter—and feeding it into the IJOP central system, linking that data to the person’s national identification card number. Our analysis also shows that Xinjiang authorities consider many forms of lawful, everyday, non-violent behavior—such as “not socializing with neighbors, often avoiding using the front door”—as suspicious. The app also labels the use of 51 network tools as suspicious, including many Virtual Private Networks (VPNs) and encrypted communication tools, such as WhatsApp and Viber.

The IJOP app demonstrates that Chinese authorities consider certain peaceful religious activities suspicious, such as donating to mosques or preaching the Quran without authorization. But most of the other behaviors the app considers problematic are ethnic- and religion-neutral. Our findings suggest the IJOP system surveils and collects data on everyone in Xinjiang. The system is tracking the movement of people by monitoring the “trajectory” and location data of their phones, ID cards, and vehicles; it is also monitoring the electricity use and gas station visits of everybody in the region. This is consistent with Xinjiang local government statements that emphasize officials must collect data for the IJOP system in a “comprehensive manner” from “everyone in every household.”

When the IJOP system detects irregularities or deviations from what it considers normal, such as when people are using a phone that is not registered to them, when they use more electricity than “normal,” or when they leave the area in which they are registered to live without police permission, the system flags these “micro-clues” to the authorities as suspicious and prompts an investigation. 
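To make this flagging logic concrete, here is a minimal Python sketch of the kind of rule-based “micro-clue” detection described above. It is an illustration only: every field name, threshold, and rule is an assumption made for the example, not something recovered from the IJOP app itself.

```python
# Hypothetical sketch of rule-based "micro-clue" flagging, based only on
# the behaviors this report describes. All field names, thresholds, and
# rules are illustrative assumptions, not taken from the IJOP.
from dataclasses import dataclass

@dataclass
class Record:
    national_id: str            # data is keyed to the national ID number
    phone_owner_id: str         # registered owner of the phone in use
    monthly_kwh: float          # metered electricity use
    baseline_kwh: float         # what the system considers "normal"
    left_registered_area: bool
    has_police_permission: bool

def micro_clues(r: Record) -> list[str]:
    """Return the deviations a rule engine of this kind might flag."""
    clues = []
    if r.phone_owner_id != r.national_id:
        clues.append("using a phone not registered to them")
    if r.monthly_kwh > 1.5 * r.baseline_kwh:  # arbitrary threshold
        clues.append('electricity use above "normal"')
    if r.left_registered_area and not r.has_police_permission:
        clues.append("left registered area without permission")
    return clues

record = Record("650100...", "650199...", 320.0, 180.0, True, False)
for clue in micro_clues(record):
    print("flag for investigation:", clue)
```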

Another key element of the IJOP system is the monitoring of personal relationships. Authorities seem to consider some of these relationships inherently suspicious. For example, the IJOP app instructs officers to investigate people who are related to someone who has obtained a new phone number or who has foreign links.
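Relationship-based flagging of this kind can be pictured as a one-hop walk over a relations graph. The following hypothetical sketch queues for investigation everyone directly connected to an already-flagged person; the graph, the names, and the one-hop rule are assumptions made purely for illustration.

```python
# Hypothetical sketch of relationship-based flagging: anyone directly
# connected to a person the system already marks "suspicious" (e.g. for
# a new phone number or foreign links) is queued for investigation.
relations = {
    "person_a": ["person_b", "person_c"],
    "person_b": ["person_a"],
    "person_c": ["person_a", "person_d"],
}
flagged = {"person_c"}  # e.g. obtained a new phone number

to_investigate = {
    relative
    for person in flagged
    for relative in relations.get(person, [])
} - flagged
print(to_investigate)  # {'person_a', 'person_d'} (order may vary)
```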

The authorities have sought to justify mass surveillance in Xinjiang as a means to fight terrorism. While the app instructs officials to check for “terrorism” and “violent audio-visual content” when conducting phone and software checks, these terms are broadly defined under Chinese law. It also instructs officials to watch out for “adherents of Wahhabism,” a term suggesting an ultra-conservative form of Islamic belief, and “families of those…who detonated [devices] and killed themselves.” But many—if not most—behaviors the IJOP system pays special attention to have no clear relationship to terrorism or extremism. Our analysis of the IJOP system suggests that gathering information to counter genuine terrorism or extremist violence is not a central goal of the system.

The app also scores government officials on their performance in fulfilling tasks and is a tool for higher-level supervisors to assign tasks to, and keep tabs on the performance of, lower-level officials. The IJOP app, in part, aims to control government officials to ensure that they are efficiently carrying out the government’s repressive orders.

In creating the IJOP system, the Chinese government has benefited from Chinese companies that provide it with technologies. While the Chinese government has primary responsibility for the human rights violations taking place in Xinjiang, these companies also have a responsibility under international law to respect human rights, avoid complicity in abuses, and adequately remedy them when they occur.

As detailed below, the IJOP system and some of the region’s checkpoints work together to form a series of invisible or virtual fences. Authorities describe them as a series of “filters” or “sieves” throughout the region, sifting out undesirable elements. Depending on the level of threat authorities perceive, determined by factors programmed into the IJOP system, individuals’ freedom of movement is restricted to different degrees. Some are held captive in Xinjiang’s prisons and political education camps; others are subjected to house arrest, not allowed to leave their registered locales, not allowed to enter public places, or not allowed to leave China.
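One way to picture this graded control is as a lookup from a perceived-threat level to a restriction. The sketch below is purely illustrative: the report does not disclose the system’s actual scale, so the levels and labels here are assumptions drawn only from the restrictions listed above.

```python
# Illustrative sketch of "graded" movement control: a perceived-threat
# level mapped to increasingly severe restrictions. The levels and their
# labels are assumptions; the actual scale is not public.
RESTRICTIONS = {
    0: "free movement",
    1: "barred from leaving China",
    2: "barred from entering public places",
    3: "confined to registered locale",
    4: "house arrest",
    5: "detention in prison or political education camp",
}

def restriction_for(threat_level: int) -> str:
    """Clamp the score to the known levels and return the restriction."""
    level = max(0, min(threat_level, max(RESTRICTIONS)))
    return RESTRICTIONS[level]

print(restriction_for(3))  # confined to registered locale
```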

Government control over movement in Xinjiang today bears similarities to the Mao Zedong era (1949-1976), when people were restricted to where they were registered to live and police could detain anyone for venturing outside their locales. After economic liberalization was launched in 1979, most of these controls became largely obsolete. However, Xinjiang’s modern police state—which uses a combination of technological systems and administrative controls—empowers the authorities to reimpose a Mao-era degree of control, but in a graded manner that also meets the economy’s demands for largely free movement of labor.

The intrusive, massive collection of personal information through the IJOP app helps explain reports by Turkic Muslims in Xinjiang that government officials have asked them or their family members a bewildering array of personal questions. When government agents conduct intrusive visits to Muslims’ homes and offices, for example, they typically ask whether the residents own exercise equipment and how they communicate with families who live abroad; it appears that such officials are fulfilling requirements sent to them through apps such as the IJOP app. The IJOP app does not require government officials to inform the people whose daily lives are pored over and logged of the purpose of such intrusive data collection, or of how their information is being used or stored, much less to obtain consent for it.

The Strike Hard Campaign has shown complete disregard for the rights of Turkic Muslims to be presumed innocent until proven guilty. In Xinjiang, authorities have created a system that considers individuals suspicious based on broad and dubious criteria, and then generates lists of people to be evaluated by officials for detention. Official documents state that individuals “who ought to be taken, should be taken,” suggesting the goal is to maximize the number of people they find “untrustworthy” in detention. Such people are then subjected to police interrogation without basic procedural protections. They have no right to legal counsel, and some are subjected to torture and mistreatment, for which they have no effective redress, as we have documented in our September 2018 report. The result is Chinese authorities, bolstered by technology, arbitrarily and indefinitely detaining Turkic Muslims in Xinjiang en masse for actions and behavior that are not crimes under Chinese law.

And yet Chinese authorities continue to make wildly inaccurate claims that their “sophisticated” systems are keeping Xinjiang safe by “targeting” terrorists “with precision.” In China, the lack of an independent judiciary and free press, coupled with fierce government hostility to independent civil society organizations, means there is no way to hold the government or participating businesses accountable for their actions, including for the devastating consequences these systems inflict on people’s lives.

The Chinese government should immediately shut down the IJOP and delete all the data it has collected from individuals in Xinjiang. It should cease the Strike Hard Campaign, including all compulsory programs aimed at surveilling and controlling Turkic Muslims. All those held in political education camps should be unconditionally released and the camps shut down. The government should also investigate Party Secretary Chen Quanguo and other senior officials implicated in human rights abuses, including violating privacy rights, and grant access to Xinjiang, as requested by the Office of the United Nations High Commissioner for Human Rights and UN human rights experts.

Concerned foreign governments should impose targeted sanctions, such as those available under the US Global Magnitsky Act, including visa bans and asset freezes, against Party Secretary Chen and other senior officials linked to abuses in the Strike Hard Campaign. They should also impose appropriate export control mechanisms to prevent the Chinese government from obtaining technologies used to violate basic rights.

The remainder of the report is available at https://www.hrw.org/report/2019/05/01/chinas-algorithms-repression/reverse-engineering-xinjiang-police-mass

China’s Algorithms of Repression

We used to worry about Terminator-type artificial intelligence robots dominating the human race, but what we are moving toward is more the opposite: humans are being turned into automatons with little freedom to decide what we do.

Across the world, we are seeing a rise of sensory systems monitoring us en masse and round the clock in public and private spaces, whether automatic license plate readers, facial recognition cameras, cellphones tracking location data, or voice assistants in our homes. Each of these “smart” systems promises benefits—less traffic, better security, better maps, better services.

Some of these systems are used in countries with strong human rights protections. In Sweden, for example, the EU’s privacy law, the General Data Protection Regulation, provides some—albeit limited—protections on how personal data is gathered. These countries also enjoy freedom of expression and of the press, and public forums where these issues are freely debated.

Most people live in countries with fewer privacy protections, however. Even the US lacks national consumer privacy legislation, while the government exercises broad surveillance over citizens and foreigners alike. But the US is also a democracy where increasing awareness about privacy has led states and cities to enact laws like the California Consumer Privacy Act.

Inequalities in human rights protections replicate themselves in privacy protections, with deeply repressive governments—from Zimbabwe to China—actively seeking out new technologies to deepen their assault on rights. In China, not only are there no effective privacy protections, there is also no civil society, free press, or elections. The Communist Party is above the law, maintains a chokehold on the Internet, and, under Xi Jinping, is increasingly intolerant of dissent.

At the extreme end of this privacy spectrum lies Xinjiang, a region in northwest China with 13 million Uyghurs and other Turkic Muslim minorities. Under the “Strike Hard Campaign against Violent Terrorism,” the Chinese government has used technologies to bolster its repression of the Muslim minorities in Xinjiang by tracking virtually their every move, subjecting them to mass arbitrary detention, forced political indoctrination, restrictions on movement, and religious oppression. Credible estimates indicate that one million people are being held in the region’s “political education” camps.

Governments’ impulse for surveillance is hardly new, but the Chinese government is presenting a new model of social control that, if we do not act now, may become the future for much of humanity.

What does life feel like for Xinjiang’s Muslims? Yueming Zhou, a young, college-educated woman, is—like many people reading this article—cosmopolitan and used to many freedoms. She was born in Xinjiang but grew up in a Western country. During a summer break, she went back to Xinjiang to visit family. She had used a virtual private network to circumvent China’s Great Firewall so she could access her school’s website and sign up for classes.

Police soon arrived and took Yueming Zhou to the local police station. They did not tell her what she was accused of, though later she learned that the authorities had detected her use of the private network, as they “monitor everything on our phones.” The officers took away her passport, handcuffed her, put her into a car, and drove for hours to another Xinjiang city. There, they took her biometric data—including DNA samples, facial images, and fingerprints. They then took her to the local “political education” camp.

Yueming Zhou soon found herself down a rabbit hole. The rights she used to have were no more: “I was pushed into a room, where there are six beds in bunk. They searched everything on me, and they made me change into detainee clothes. They then locked the iron door… There was a camera which had a 360-view of the room, a speaker. We were monitored 24/7… When we whispered, we could only speak in Mandarin. We weren’t allowed to do anything religious, or say anything that’s ‘not good’ for the government…When we went from our room to the ‘classroom,’ we had to report our number…”

“There were cameras on the corridors and guards with guns. In the classroom, between the ‘teacher’ –who stood on the podium—and the ‘students,’ there was a fence and two or three police officers were also in the classroom. To eat, the cooking staff put rice through a little window; we had to sit on stools and ate with food on our lap.”

For months Yueming Zhou had to “learn” Mandarin—she was already a fluent speaker—sing the Chinese national anthem, and learn patriotic slogans. She was not to challenge her captors. “The more questions you asked, the longer you’d be staying,” she said her captors told her.

Yueming Zhou was released after five months, but her nightmare continued. She was not allowed to leave her hometown. Every week the police would question her; she had to attend the national flag-raising ceremony on Mondays and go to night “school” on Thursdays.

A few months into her “release,” Yueming Zhou found the courage to venture out to a movie. There was a security checkpoint. When she swiped her ID, the machines made a sound to alert the police, who came and checked her identity. At roads and crossings, cameras scanned Yueming Zhou’s and other pedestrians’ faces. In the police station, she saw computer screens monitoring those crossing the streets, with little red squares on people’s faces, most likely singling out individuals for further investigation.

The big data system used for monitoring in Xinjiang is the Integrated Joint Operations Platform (IJOP). It acts like a central nervous system for the region’s mass surveillance systems, tracking phones, vehicles, and ID cards, and keeping tabs on electricity and gas station use. It treats many ordinary and lawful activities, even using “too much” electricity, as indicators of suspicious behavior. Some people are singled out for further interrogation and, like Yueming Zhou, are detained or imprisoned.

The system also restricts freedom of movement depending on the level of threat authorities perceive someone poses. The IJOP is connected to the “data doors” installed at some of the region’s ubiquitous checkpoints, which send warnings about “problematic” individuals like Yueming Zhou. Together, the high-tech surveillance systems form invisible or virtual fences. This innovative system allows the government to achieve pervasive social control in a region a third the size of Western Europe, while allowing mobility to those deemed “safe” by the authorities, ensuring the provision of pliant labor for the region’s economy.

After being held in Xinjiang for 2.5 years without charge or trial, Yueming Zhou got her passport back and left the region.

Xinjiang, while extreme, illustrates how privacy rights are “gateway” rights. When we have no privacy, we risk losing all freedoms. People I interviewed like Yueming Zhou told me how fearful they were and how they had to censor their entire existence. Every facial expression, every piece of clothing and hairstyle, every word they utter, every person they speak to, everything they do—is put under the microscope by their human monitors. But also, quietly and automatically, they are surveilled by the machine sensory systems in their surroundings.

The realities of Xinjiang might be closer than we think. Even for people living in a society with stringent privacy legislation, the laws are not foolproof; protecting their rights depends on the larger socio-political environment. As we have seen, societies—including those in the West—can succumb to authoritarian impulses, and a succession of one or two governments with such impulses can flip a democratic society into an authoritarian one.

But unlike past authoritarian states, repressive governments today have at their disposal powerful digital surveillance systems. Companies—including those assisting the Chinese authorities in maintaining an iron grip over Xinjiang—are selling these wares globally, and at affordable prices, from Kyrgyzstan to Venezuela. Even in the US, Amazon’s home surveillance technology, Ring, partners with hundreds of police departments.

We urgently need robust regulatory frameworks that meaningfully restrict the collection, use, and storage of biometric data by both governments and private companies.

Mass DNA collection and analysis—which can reveal not only sensitive information about us but also about the people we are related to, and even what we look like—is one of the Chinese government’s tools in Xinjiang. The Chinese government now has the world’s largest DNA database, with over 80 million samples, many obtained without informed consent from people across the country who are unconnected to any crime. In the US, the Justice Department proposed in October to collect DNA samples from detained immigrants. This type of mass collection and indefinite retention of genetic data is a serious intrusion on privacy that should be stopped.

Global alarm about the ubiquity of facial recognition technology for monitoring and identification has been rising, given that it is difficult to alter or obscure one’s facial features. While the police—and the companies that supply them—contend that these systems keep us safe, the evidence is mixed. Projects adopting the “needle in a haystack” approach—scanning the general public for suspects—have had disappointing results, but those that specifically look for missing or trafficked children in orphanages and online sex ads have had some success. Human Rights Watch has urged governments to impose a moratorium on the use of facial recognition until there is sufficient information and debate to decide whether to restrict, or even ban, its use.

Increasingly, companies selling surveillance systems to governments are touting “multi-factor authentication” technologies to “improve accuracy,” meaning they are no longer content to identify us by our faces or voices alone but by a combination of data. In Xinjiang, data doors at some checkpoints not only require people to swipe their smart IDs and scan their faces but also covertly collect identifying information from their phones.
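To illustrate why such multi-factor checks are hard to evade, here is a hypothetical sketch of a checkpoint “data door” cross-checking an ID swipe, a face capture, and a covertly read phone identifier against a single registry entry. All names, fields, and matching logic are assumptions made for the example, not a description of the deployed systems.

```python
# Hypothetical sketch of multi-modal checkpoint identification: three
# factors are checked against one registry entry, so swapping any single
# credential still raises an alert. All fields are illustrative.
from dataclasses import dataclass

@dataclass
class RegistryEntry:
    face_template: str   # stand-in for a stored biometric template
    phone_imei: str      # identifier collected from the phone

def check_data_door(id_swiped: str, face_seen: str, imei_read: str,
                    registry: dict[str, RegistryEntry]) -> str:
    entry = registry.get(id_swiped)
    if entry is None:
        return "unknown ID: alert"
    mismatches = [name for name, ok in [
        ("face", face_seen == entry.face_template),
        ("phone", imei_read == entry.phone_imei),
    ] if not ok]
    return "pass" if not mismatches else f"alert: mismatch on {mismatches}"

registry = {"650100...": RegistryEntry("face#1", "imei#1")}
print(check_data_door("650100...", "face#1", "imei#2", registry))
# alert: mismatch on ['phone']
```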

These multi-modal identification systems are particularly dangerous because they are designed to be impossible to circumvent, and, as especially invasive and coercive measures, they can be justified only in rare circumstances. There should also be restrictions on how governments—and companies—can aggregate different sources of data, which can enable them to draw conclusions about people’s lives and to manipulate their behavior.

Governments should also re-evaluate “Smart City” projects, which claim to make urban environments more efficient and sustainable but divulge information on people’s identities, movements, and habits either to companies out for profit or to agencies that can use the data for ends quite apart from the advertised purposes. As Cory Doctorow points out, alternative models are possible that place human rights at the center of their design and ensure adequate oversight over data collection and use.

New technologies are often used before society has a chance to understand and deliberate over their costs and benefits. In the late 1960s, the London government began permanently installing surveillance cameras, saying they were effective crime-fighting tools. But we now see how this helped to normalize ubiquitous public surveillance, a door that, once opened, led to the dramatically more potent and interconnected systems that are obliterating human freedoms in Xinjiang some 50 years later. We should stop their spread before it is too late.

The Robots are Watching Us