In his darkly comic 2010 novel Super Sad True Love Story, Gary Shteyngart imagines a Yelpified America in which people are judged not by the content of their character but by their streamed credit scores and crowdsourced “hotness” points. Social relations of even the most intimate variety are governed by online rating systems.
A sanitized if more insidious version of Shteyngart’s big-data dystopia is taking shape in China today. At its core is the government’s “Social Credit System,” a centrally managed data-analysis program that, using facial-recognition software, mobile apps, and other digital tools, collects exhaustive information on people’s behavior and, running the data through an evaluative algorithm, assigns each person a “social trustworthiness” score. If you run a red light or fail to pick up your dog’s poop, your score goes down. If you shovel snow off a sidewalk or exhibit good posture in riding your bicycle, your score goes up. People with high scores get a variety of benefits, from better seats on trains to easier credit at banks. People with low scores suffer various financial and social penalties.
As Kai Strittmatter reports in a Süddeutsche Zeitung article, the Social Credit System is already operating in three dozen test cities in China, including Shanghai, and the government’s goal is to have everyone in the country enrolled by 2020:
Each company and person in China is to take part in it. Everyone will be continuously assessed at all times and accorded a rating. In [the test cities], each participant starts with 1000 points, and then their score either improves or worsens. You can be a triple-A citizen (“Role Model of Honesty,” with more than 1050 points), or a double-A (“Outstanding Honesty”). But if you’ve messed up often enough, you can drop down to a C, with fewer than 849 points (“Warning Level”), or even a D (“Dishonest”) with 599 points or less. In the latter case, your name is added to a black list, the general public is informed, and you become an “object of significant surveillance.”
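To make the mechanics of that rating ladder concrete, here is a minimal, purely illustrative sketch in Python of how a point-to-rating mapping like the one Strittmatter describes might look. It uses only the thresholds quoted above (a 1000-point starting balance, AAA above 1050, C below 849, D at 599 or less); the cut-offs for the intermediate AA, A, and B bands are not given in the article, so they are left unspecified and flagged as assumptions in the comments:

```python
# Illustrative only: a toy classifier built from the point thresholds
# Strittmatter reports for the pilot cities. The boundaries separating
# the AA, A, and B bands are not stated in the article and are NOT
# invented here.

def credit_tier(points: int) -> str:
    """Map a participant's point total to a rating label (illustrative)."""
    if points > 1050:
        return "AAA - Role Model of Honesty"
    if points <= 599:
        return "D - Dishonest (blacklisted, publicly named)"
    if points < 849:
        return "C - Warning Level"
    # The 849-1050 range presumably covers the AA/A/B bands, but the
    # article does not spell out those cut-offs.
    return "AA/A/B - boundaries unspecified in the article"

# Every participant starts with 1000 points; behavior moves it up or down.
print(credit_tier(1000))  # falls in the unspecified middle band
print(credit_tier(1075))  # AAA
print(credit_tier(550))   # D
```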
As Strittmatter points out, the Chinese government has long monitored its citizenry. But while the internet-based Social Credit System may be nothing new from a policy standpoint, it allows a depth and immediacy of behavioral monitoring and correction that go far beyond anything that was possible before:
The Social Credit System’s heart and soul is the algorithm that gathers information without pause, and then processes, structures and evaluates it. The “Accelerate Punishment Software” section of the system guidelines describes the aim: “automatic verification, automatic interception, automatic supervision, and automatic punishment” of each breach of trust, in real time, everywhere. If all goes as planned, there will no longer be any loopholes anywhere.
The government officials that Strittmatter talked to were eager to discuss the program and to emphasize how it would encourage citizens to act more responsibly, leading to a happier, more harmonious society. As one planning document puts it, “the system will stamp out ‘lies and deception’ [and] increase ‘the nation’s honesty and quality.’” Those sound like worthy goals, and the rhetoric is not so different from that used in the U.S. and U.K. to promote governmental and commercial programs that employ online data collection and automated “nudge” systems to encourage good behavior and social harmony. I recall something Mark Zuckerberg wrote in his recent “Building Global Community” manifesto: “Looking ahead, one of our greatest opportunities to keep people safe is building artificial intelligence to understand more quickly and accurately what is happening across our community.” I’m not suggesting any equivalence. I am suggesting that when it comes to using automated behavioral monitoring and control systems for “beneficial” ends, the boundaries can get fuzzy fast. “Of all tyrannies,” wrote C. S. Lewis in God in the Dock, “a tyranny sincerely exercised for the good of its victims may be the most oppressive.”
What’s particularly worrisome about behavior-modification systems that employ publicly posted numerical ratings is that they encourage citizens to serve as their own tyrants. Using peer pressure, competition, and status-establishing prizes to shape behavior, the systems raise the specter of a “gamification” of tyranny. Nobody wants the stigma of a low score, particularly when it’s out there on the net for everyone to see. We’ll strive for Status Credits just as we strive for Likes or, to return to Shteyngart’s world, Hotness Points. “Our aim is to standardize people’s behavior,” a Communist Party Secretary tells Strittmatter. “If everyone behaves according to standard, society is automatically stable and harmonious. This makes my work much easier.”