'An invisible cage': How China polices the future

New Chinese technologies extend the boundaries of social and political controls and integrate them deeper into people’s lives. They justify suffocating surveillance, violate privacy and risk automating systemic discrimination and political repression
BEIJING: The more than 1.4 billion people living in China are constantly watched. They are recorded by police cameras that are everywhere, on street corners and subway ceilings, in hotel lobbies and apartment buildings. Their phones are tracked, their purchases are monitored, and their online chats are censored. Now, even their future is under surveillance.

The latest generation of technology digs through the vast amounts of data collected on citizens’ daily activities to find patterns and aberrations, promising to predict crimes or protests before they happen. The systems target people the Chinese government considers potential troublemakers — not only those with a criminal past but also vulnerable groups, including ethnic minorities, migrant workers and those with a history of mental illness.

They can warn police if a victim of fraud tries to travel to Beijing to petition the government for payment, or if a drug user makes too many calls to the same number. They can signal officers each time a person with a history of mental illness gets near a school.

While largely unproven, the new Chinese technologies, detailed in procurement and other documents reviewed by The New York Times, further extend the boundaries of social and political controls and integrate them ever deeper into people’s lives. At their most basic, they justify suffocating surveillance and violate privacy, while in the extreme they risk automating systemic discrimination and political repression.

For the government, social stability is paramount and any threat to it must be eliminated. During his decade as China’s top leader, Xi Jinping has hardened and centralised the security state, unleashing techno-authoritarian policies to quell ethnic unrest in the western region of Xinjiang and enforce some of the world’s most severe coronavirus lockdowns. The space for dissent, always limited, is rapidly disappearing.

The details of these emerging security technologies are described in police research papers, surveillance contractor patents and presentations, as well as hundreds of public procurement documents reviewed and confirmed by the Times. Many of the procurement documents were shared by ChinaFile, an online magazine published by the Asia Society, which has systematically gathered years of records on government websites. Another set, describing software bought by authorities in the port city of Tianjin to stop petitioners from going to neighbouring Beijing, was provided by IPVM, a surveillance industry publication.

China’s Ministry of Public Security did not respond to requests for comment faxed to its headquarters in Beijing and six local departments across the country. The new approach to surveillance is partly based on data-driven policing software from the United States and Europe, technology that rights groups say has encoded racism into decisions like which neighbourhoods are most heavily policed and which prisoners get parole. China takes it to the extreme, tapping nationwide reservoirs of data that allow police to operate with opacity and impunity.

Often people don’t know they’re being watched. Police face little outside scrutiny of the technology’s effectiveness or of the actions it prompts. Chinese authorities require no warrants to collect personal information. At the bleeding edge, the systems raise a perennial science fiction conundrum: how can anyone know the future was accurately predicted if police intervene before it happens?

Even when the software fails to deduce human behaviour, it can be considered successful since the surveillance itself inhibits unrest and crime, experts say. “This is an invisible cage of technology imposed on society,” said Maya Wang, a senior China researcher with Human Rights Watch, “the disproportionate brunt of it being felt by groups of people that are already severely discriminated against in Chinese society.”

In 2017, one of China’s best-known entrepreneurs had a bold vision for the future: a computer system that could predict crimes. The entrepreneur, Yin Qi, who founded Megvii, an artificial intelligence startup, told Chinese state media that the surveillance system could give police a search engine for crime, analysing huge amounts of video footage to intuit patterns and warn authorities about suspicious behaviour. He explained that if cameras detected a person spending too much time at a train station, the system could flag a possible pickpocket.

“It would be scary if there were actually people watching behind the camera, but behind it is a system,” Yin said. “It’s like the search engine we use every day to surf the internet — it’s very neutral. It’s supposed to be a benevolent thing.” He added that with such surveillance, “the bad guys have nowhere to hide.” Five years later, his vision is slowly becoming reality. Internal Megvii presentations reviewed by the Times show how the startup’s products assemble full digital dossiers for police.

“Build a multidimensional database that stores faces, photos, cars, cases and incident records,” reads a description of one product, called “intelligent search.” The software analyses the data to “dig out ordinary people who seem innocent” to “stifle illegal acts in the cradle.” A Megvii spokesperson said in an emailed statement that the company was committed to the responsible development of artificial intelligence, that it was focused on making life safer and more convenient, and that its work was “not about monitoring any particular group or individual.”

Similar technologies are already being put into use. In 2022, police in Tianjin bought software made by a Megvii competitor, Hikvision, that aims to predict protests. The system collects data on legions of Chinese petitioners, a general term in China that describes people who try to file complaints about local officials with higher authorities.

It then scores petitioners on the likelihood that they will travel to Beijing. In the future, the data will be used to train machine-learning models, according to a procurement document. Local officials want to prevent such trips to avoid political embarrassment or exposure of wrongdoing. And the central government doesn’t want groups of disgruntled citizens gathering in the capital.

The writers are journalists with The New York Times. ©2022
