China to Require AI and VR Aided Videos to Be Labeled from 2020

China is one of the world’s most technologically advanced societies. Even so, it has recognized a growing problem with the creation of “deepfake” videos. These videos can be made to look almost real, calling into question the integrity of video evidence. China has now decided that, from the start of next year, any video created with VR or AI must be appropriately labeled, or its creators could face severe repercussions from the government.

Deepfake Technology Readily Available

Deepfakes are a new application of AI technology that can take existing video and convincingly alter it to make it appear as though the recorded individual is saying something different. China’s leading legislative body has already stated that it is considering making deepfake technology illegal. A recent app called Zao, released to the Chinese market in September 2019, used deepfake technology to let users swap their faces and voices with those of celebrities. The app was downloaded en masse before concerns were raised about the privacy terms it imposed on its users.

Deepfakes A Concern in International Elections as Well

In June, the US Congress held a hearing to discuss deepfakes ahead of the 2020 US presidential election. Disinformation campaigns have abounded since the last US election, polarizing the electorate, and deepfakes, if used strategically, would push voters even further into separate camps. OpenAI policy director Jack Clark urged large tech platforms to make their deepfake detection algorithms available to other agencies.

An Ideal Tool for Misinformation

Given how readily society accepts video recordings as evidence, deepfake technology can be a powerful means of swaying popular opinion. In an authoritarian state such as China, the government is keen to ensure that its legitimacy isn’t undermined by new technology. Whether other countries will follow the Chinese model in dealing with deepfake technology, and how they will seek to punish its abuse, remains uncertain.