With every forthcoming breakthrough in technology lingers the threat of outpacing the human capacity to comprehend and consume information.
Limitless possibilities lie afloat; what one dismisses as impossible, technology often proves otherwise. 2017 saw the emergence of ‘deepfakes’ in the West, artificial-intelligence software that uses synthetic algorithms to superimpose someone else’s voice, face, or both over a subject, making it appear that they said something in real time. In 2018, comedian Jordan Peele released a remarkably believable video of former President of the United States Barack Obama delivering a PSA, which had been doctored by the comedian himself. The purpose was to convey just how deceptive deepfakes can be and how easily they can be created and shared.
Deepfake video of Barack Obama, taken from the Good Morning America YouTube channel.
The software was initially used in the pornography business to generate revenue. Images of celebrities and other well-known faces were easily accessible online, making them easy targets as subjects of such content. Lately, the technology has been doing the rounds as a vehicle for fake news as it seeps into politics.
A day ahead of the Delhi elections, videos of Bharatiya Janata Party (BJP) MP Manoj Tiwari criticising the Kejriwal-led government surfaced in multiple languages on WhatsApp and other social media platforms, aimed at targeting prospective voters. The singer-actor turned politician can be seen speaking in fluent English and Haryanvi, appealing to the janata to vote for his party, when in reality the original video showed him speaking on the passage of the Citizenship Amendment Bill (now Act) in Parliament.
Video of Manoj Tiwari speaking in English and Haryanvi, taken from Vice.
The negative repercussions of deepfakes grow in direct proportion to how easily the technology can be accessed. Politicians want to reach more target voters than their opponents in the most authentic-seeming ways possible, and people who consumed Manoj Tiwari’s hoaxed video felt more attached to him when he spoke in their language. A major threat accompanying such videos is that denial becomes far easier for people caught in wrongdoing through citizen journalism: they can immediately call out the footage as falsified. Even worse, anyone can be made to appear to say things they never did, and that video can be shared extensively.
In a country like India, which is vulnerable to riots, such videos will serve as very effective bait to disturb peace and harmony.
Actual video of Manoj Tiwari, taken from Vice.
Pratik Sinha of Alt News, a fact-checking website, said in an interview with Vice, “At this point in time it’s impossible to fact-check or verify something that you don’t recognise is doctored.”
It has become a whole lot easier to deceive people, especially those oblivious to the technological jargon involved.
A critical approach to understanding these videos would be the ideal scenario, but in reality we are a nation that believes in hoaxes about the United Nations bestowing honours on Indian festivals and the national anthem. How can we be expected to decipher the Face2Face algorithms and other techniques incorporated in manufacturing such lies? A New York-based software company is working on detecting deepfake videos, and it is some relief that the ills of technology are being countered by technology itself. Nevertheless, it is imperative that we exercise extreme caution before believing any content: fact-check it while we can and run it through multiple sources before sharing, because separating the act of consuming information from the act of sharing it can slow the harm that this new technology carries as its potential.
Suggested fact-checking websites: altnews.in, indiaspend.com, boomlive.com, www.hoaxslayer.net, mediavigil.com
Feature Image Credits: m.phys
Feature Image Caption: AI-enabled technology superimposed actor Nicolas Cage’s face on Tesla CEO Elon Musk.
Umaima Khanam