China will use AI-based technology to monitor citizens and punish bad behavior

China is planning to implement a nationwide “social credit” system that will analyze big data to reward good behavior, such as volunteer work or giving blood, and punish bad behavior, such as jaywalking, posting anti-government messages on social media, or smoking in non-smoking zones.

While the system is intended to combat fraud and immoral behavior, and might seem like a good idea on the surface, the implications are, frankly, downright terrifying, and uncomfortably reminiscent of the dang’an files the Communist Party has long kept on its citizens.  But it’s okay, because the benign aim of the system is “to allow the trustworthy to roam everywhere under heaven while making it hard for the discredited to take a single step.”  If you’ve ever read Les Misérables, you’ll know that societies don’t always get these judgments right.

Consequences of algorithmic decisions impact every part of daily life

The social credit system has been piloted since 2014 in certain areas of China, such as Hangzhou, near Shanghai.  For those deemed untrustworthy, “everywhere is limited, and it is difficult to move, so that those who violate the law and lose the trust will pay a heavy price.”  The system is slated to go nationwide by 2020.

The behaviors under scrutiny cover as many aspects of a person’s social, political, commercial and moral life as possible.  The Independent reports that a person’s social credit score is expected to affect countless parts of their life, including the ability to book transportation and accommodation, get a mortgage, access good schools, or apply for certain jobs, as well as internet speed and even the right to own a pet.  Serious offenders have a message automatically added to their ringtone that warns anyone who calls them about their immoral comportment, and offenders are further shamed by having their names displayed on giant lists in public places to “form a pattern of distrust and punishment.”

Over 15 million “untrustworthy” individuals have already been barred from booking planes and high-speed trains.  The blacklist includes Chinese journalist Liu Hu, who was accused of defamation last year over social media posts alleging corruption by government officials.  Liu Hu was deemed insincere in his apologies and, as a result, was placed on the social credit “blacklist.”  The list, aimed primarily at limiting “high consumption,” restricts offenders’ ability to travel by high-speed train, enroll their children in a private school or get a loan for a house.

AI increasingly used by the state to control and monitor Chinese citizens

Chinese citizens already willingly hand over a wealth of data to anyone who asks, and the state is prepared to capitalize on the population’s apparent indifference toward snoopy smartphone apps and surveillance in public spaces.  Facial recognition software from companies such as SenseTime is already being used in many public spaces in China to fine jaywalkers and otherwise maintain the peace.  Starting in 2019, the surveillance state plans to make it mandatory for drivers to install RFID chips in their vehicles, providing even more detailed information to the authorities.

Social credit algorithms are the logical next step in China’s surveillance state evolution

China has been criticized in the past for its totalitarian cyber tactics.  Following the Great Firewall and the Great Cannon, the state is now making moves to implement a grid system to monitor the daily lives of people in different neighborhoods.

The social credit rating system is already being piloted by several private organizations, including Sesame Credit, an affiliate of Alibaba.  The name evokes the tale of Ali Baba and the Forty Thieves, in which the magic words “Open Sesame!” open a cave full of treasure.  According to Wired’s Mara Hvistendahl, Sesame’s welcome screen informs users that “[Sesame] Credit is the embodiment of personal credit.  It uses big data to conduct an objective assessment. The higher the score, the better your credit.”  Seems simple enough.

On the surface, Sesame Credit is indistinguishable from any other credit-rating website.  But according to The Guardian, the company uses a secret algorithm to assign its users a score from 350 to 950 based on data about their interpersonal relationships and consumer habits.  Behaviors like being friends with low-rated people or buying video games will lower your score, while buying diapers will earn you points.  A good score will get you preferential treatment on the Baihe online dating service, deposit-free car rentals and fast-tracked visa applications.  As Hvistendahl describes, the amount of information the apps take into account is terrifying, especially the data about a person’s social connections and how those connections factor into one’s own credit rating.  Unfortunately, exactly how the algorithms work is a mystery, because transparency is not part of the system.
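To make the idea concrete: the reported behaviors suggest a model in which a base score is adjusted by weighted behavioral signals and pulled toward the scores of one’s social connections.  The sketch below is purely hypothetical; every weight, feature name and formula is an illustrative assumption, since Sesame Credit’s real algorithm has never been disclosed.

```python
# Purely hypothetical sketch of a behavioral scoring model of the kind
# described above. All weights and feature names are assumptions for
# illustration; the real Sesame Credit algorithm is not public.

SCORE_MIN, SCORE_MAX = 350, 950  # score range reported by The Guardian

# Assumed behavioral weights (positive raises the score, negative lowers it)
BEHAVIOR_WEIGHTS = {
    "bought_diapers": +15,
    "bought_video_games": -20,
    "hours_volunteered": +5,      # per hour
    "jaywalking_incidents": -30,  # per incident
}

def hypothetical_score(behaviors, friend_scores, base=650):
    """Toy score: base + weighted behaviors + a pull toward friends' average."""
    score = base
    for behavior, count in behaviors.items():
        score += BEHAVIOR_WEIGHTS.get(behavior, 0) * count
    if friend_scores:
        # Social-graph effect: connections to low-rated people drag the score
        # down (an assumed 10% pull toward the friends' average score).
        friend_avg = sum(friend_scores) / len(friend_scores)
        score += 0.1 * (friend_avg - score)
    return max(SCORE_MIN, min(SCORE_MAX, round(score)))

# Example: a frequent video-game buyer with low-rated contacts
print(hypothetical_score(
    behaviors={"bought_video_games": 3, "bought_diapers": 1},
    friend_scores=[420, 510, 480],
))
```

Even this toy version shows why the opacity matters: a score can drop because of who you know, not just what you do, and without access to the weights there is no way to contest the result.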

Read more about what China's social credit system means for the West.