The Chinese government has long used a system in which an individual's 'social score' determines how much the government trusts them to act on their own. Similar to a universal credit score, the system is designed to punish those who display behavior that goes against the government's wishes and reward those who behave along prescribed lines. While this seems like an authoritarian overreach to most, there is a possibility that such a system could be implemented in Western societies as well.
Private companies have already started using social media as a basis for determining things like insurance premiums. Additionally, commercial biometric monitors that track consumer behavior have shown up in bars. These tools can be useful in stopping trouble before it starts, but they raise serious moral concerns about how far technology should be allowed to pass judgment on the individual.
Uber, Airbnb, and WhatsApp are Facilitators
Airbnb noted that it has over six million users in its database, so if the company decides to blacklist a traveler, it can severely limit their options for affordable stays. That might be acceptable if the company had a clearly outlined set of infractions to avoid, but there is no such document, and the company can ban a user without warning. WhatsApp behaves similarly: if too many users block an individual, that person can be prohibited from using the app altogether. Uber, meanwhile, lets drivers rate passengers the same way passengers rate drivers; too many low ratings and you could be booted from the system.
The Moral Implications of Social Credit
Such a system is still a long way from becoming ingrained in Western society, but dependence on crowd-sourced tech already has us at the mercy of other people's opinions. Those judged poorly are given no chance to defend themselves or present evidence to the contrary. Social credit could thus lead to abuse of power by majorities. While these rating systems are legal, they set a dangerous precedent for society to follow.