This week, China announced that it would loosen its harsh “zero-COVID” policies after weeks of public protests. It was a rare victory for dissent in a country where surveillance and censorship are ubiquitous.
China’s efforts to control its population range from the crude “Great Firewall,” which blocks access to websites containing banned information, to sophisticated “data doors” that vacuum up information from the devices of Muslims under heavy surveillance in Xinjiang.
Critics and their families are so likely to be detained and censored that even protesters outside of China fear being identified if they have relatives in the country. At a recent protest at Harvard University, “almost everyone wore masks, and some also wore sunglasses, largely to protect their identity,” reported Jiayang Fan in The New Yorker.
China’s actions matter, not only to its own residents but also to a world where autocrats learn repressive techniques from one another and often share surveillance and spyware technology.
But it’s not easy to get a clear look at the surveillance regime in China, since many members of the foreign press have been expelled or denied entry to the country. To understand it, I turned to two Wall Street Journal reporters, Josh Chin and Liza Lin, who recently published the book “Surveillance State: Inside China’s Quest to Launch a New Era of Social Control.”
Chin is the Seoul-based deputy bureau chief for The Wall Street Journal’s China bureau, and Lin is a China correspondent for The Wall Street Journal based in Singapore. Their coverage of Chinese surveillance, along with that of their colleagues, won the Gerald Loeb Award for international reporting in 2018. Chin, who was expelled from China in 2020, is also a recipient of the Don Bolles Medal, awarded to investigative journalists who exhibit courage in the face of intimidation.
Our conversation, edited for brevity and clarity, is below.
Angwin: China is often described as the ultimate surveillance state. Is it? And what are the big components of its surveillance apparatus?
Chin: China is undoubtedly the biggest and most advanced surveillance state right now. There are a couple of elements to that. One is scale—the sheer volume of data they’re collecting. The country has more than 400 million surveillance cameras, and the government has the ability to access somewhere around a billion smartphones. They have all sorts of other sensors and data collection methods that they’re constantly using to hoover up information.
What’s interesting in China in particular is that the government has singular access to data, whereas in the United States the government can get access to data, but it’s really fractured.
In China there’s a centralized ID database that has biometric photos attached to everyone’s name and information, which makes it really easy to track people. Additionally, the apps in China collect a much broader cross section of data than they do even in the United States. In the U.S., if you want to know what people spend money on, you go to Amazon. If you want to know what they’re searching for, you go to Google. If you want to know what they’re talking about, maybe you go to Facebook. In China, there’s one really huge app that everyone uses called WeChat, and it has all of that information in one place.
Angwin: Can you talk about how surveillance ramped up during the pandemic and what this means for Chinese citizens?
Lin: In prepandemic China, if you thought about real-time surveillance and the groups of people who were under it, you would think about persons of interest to the Chinese police and the Ministry of Public Security. They had well-defined categories of who counted as a person of interest. If you were a regular Chinese person, it was highly unlikely that you would be tracked around the clock.
Postpandemic, though, China took the chance to roll out its digital surveillance system across the country. Now, it doesn’t matter if you are a person of interest to the police or not, because everyone has a health code, which is a barcode that you install on your phone that tracks your movements over the previous two weeks. Typically, the state telecom companies are tracking this data, and they’re sharing it with the health authorities to ensure that someone has not been in an area with a huge COVID outbreak. If you were in an area with a huge COVID outbreak, your health code would turn red, and that would bar you from taking subways and leaving your home for 14 days. Indirectly, this gave the government the capability to track almost 1.4 billion citizens, which is something that they had never done in the past.
At the start, people were generally happy with the surveillance, but what we’ve been seeing this year is the abuse of some of it. This summer there were protests in the city of Zhengzhou, where local banks had frozen depositors’ assets after funds were misappropriated. People panicked, and depositors gathered at the banks to protest. The first time they protested, the authorities were caught off guard; the second time, the authorities turned the protesters’ health codes red. That meant these people couldn’t enter the city, and they were herded off to quarantine centers.
Angwin: It seems that the most egregious surveillance is being deployed in the oppression of the Uyghurs?
Chin: In Xinjiang, China is running the world’s most ambitious experiment in predicting human behavior, and it actually grew out of the war on terror. Previously, tensions in the region were seen as a product of ethnic conflict between the Uyghurs and Han Chinese, but after the war on terror began, the government increasingly framed them as a problem of radical Islam.
The theory behind the surveillance rollout in Xinjiang is that if the government could collect enough data about Uyghurs and other Turkic Muslims, they could predict which individuals were likely to pose a security threat in the future. They were collecting all kinds of information—how often people pray, where they pray, if they have a Quran, if they had a religious education, but also things like gasoline consumption. If you go to a gas station in Xinjiang, you have to scan your face to get into the gas station, and you have to scan your ID card to buy the gas.
They collected all these measures of behavior that they felt could be used to extrapolate ill intent. The problem with this approach is that it’s almost impossible to use these techniques to predict who will become a terrorist. We don’t have good information about what makes a terrorist, or at least not good enough to predict who will become one. People have gotten swept up in these systems, labeled as threats, and sent to internment camps. Many of them are intellectuals or people who have friends abroad. Basically, in Xinjiang, the concept of a threat has expanded to include almost anything seen as a challenge to the Communist Party’s view of how the region should be managed—not just terrorist threats but intellectual, cultural, and religious ones. Anyone who fits that model is now being flagged by the system.
Angwin: Let’s talk about something that gets a lot of attention in the media: the social credit system.
Chin: The social credit system is by far the most recognizable piece of the Chinese surveillance state globally, and it’s also by far the most misunderstood.
What it actually began as was a regular financial credit rating effort that was trying to address a real problem in China, which is a lack of trust. At one point, officials had the ambition of expanding it to rate a wide variety of nonfinancial behavior as well. What it has become instead is a much more modest effort to essentially enforce existing laws. If you’re found misbehaving or breaking rules, the social credit system tracks that and punishes you, not just legally but in other ways. For example, not only are you racking up more fines, but you aren’t allowed to buy plane tickets, or you aren’t allowed to stay in luxury hotels.
Lin: Because there wasn’t a legal template or legal basis for it, you’re really seeing cities come up with a patchwork of different social credit systems, none of which talk to one another. The whole system requires a ton of data collection and storage, which is very expensive and labor intensive for under-resourced local authorities.
The social credit system has evolved to become more of an afterthought. It’s a good example of how China’s surveillance state is like a panopticon, where the idea that you’re being tracked is more effective than the actual tracking mechanisms.
Angwin: Everyone in D.C. seems obsessed with the AI race with China. What do you make of it?
Chin: China does lead the world in computer vision and other surveillance-focused areas of AI. If you go to AI conferences—there’s a big computer vision conference every year called CVPR—Chinese researchers and companies dominate the papers and patent filings for computer-vision-related AI.
However, the U.S. has imposed export restrictions on American companies and citizens that make it incredibly difficult for China or any Chinese entity to acquire advanced chip technology, because American firms dominate so many phases of advanced chip manufacturing. Depending on how strictly the U.S. enforces those restrictions, they could have extremely negative effects on China’s AI industry, which needs those chips to do research and train algorithms.
As always, thanks for reading.
(Additional Hello World research by Eve Zelickson.)