
Imagine a new movie that has just been released, starring Robin Williams, who died years ago. How is that possible? It may sound crazy, but it is not far-fetched. With a new technology called deepfake, humans can digitally bring deceased actors back to “life”. That is just one of the many uses of this technology, but what happens when it is used for political speeches? Deepfakes threaten democracy by supercharging fake news with manipulated images and sounds. This article delves into deepfake technology and the cultural implications it imposes on society.

What is Deepfake technology?

Deepfake technology uses artificial intelligence to create fake audio and video recordings that look and sound like real footage. It is built on generative adversarial networks (GANs), in which two machine learning models are trained against each other: a generator produces forgeries based on the data it analyzes, while a discriminator tries to detect those forgeries. The generator keeps improving until the discriminator can no longer tell real from fake, and a convincing deepfake is the result. By examining hundreds of photographs, movements, and sounds from an individual, the technology can replicate their likeness as if it were real. This is why actors and political leaders are among the first people to be replicated: they have the most recorded footage of themselves, so the AI can learn the expressions, vocalizations, and body movements that make up each person’s mannerisms.
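The adversarial loop described above can be sketched in a few lines of code. This is a toy illustration only, not a real deepfake pipeline: the “real footage” here is just a one-dimensional Gaussian, and both models are deliberately tiny (an affine generator and a logistic-regression discriminator). Every variable name and parameter below is invented for illustration.

```python
import numpy as np

# Toy GAN sketch (hypothetical example, not a real deepfake model).
# The "real data" is a 1-D Gaussian centered at 4.0; the generator
# learns to forge samples that resemble it, while the discriminator
# learns to tell real samples from forgeries.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator: affine map from noise z to a sample, parameters (a, b)
gen = {"a": 0.1, "b": 0.0}
# Discriminator: logistic regression on one feature, parameters (w, c)
disc = {"w": 0.0, "c": 0.0}

lr = 0.05
for step in range(2000):
    z = rng.normal(size=64)                          # noise input
    fake = gen["a"] * z + gen["b"]                   # forged samples
    real = rng.normal(loc=4.0, scale=1.0, size=64)   # "real footage"

    # Discriminator update: push D(real) toward 1, D(fake) toward 0
    for x, label in ((real, 1.0), (fake, 0.0)):
        p = sigmoid(disc["w"] * x + disc["c"])
        grad = p - label                  # dLoss/dlogit for cross-entropy
        disc["w"] -= lr * np.mean(grad * x)
        disc["c"] -= lr * np.mean(grad)

    # Generator update: push D(fake) toward 1, i.e. fool the discriminator
    p = sigmoid(disc["w"] * fake + disc["c"])
    grad = (p - 1.0) * disc["w"]          # chain rule through D's logit
    gen["a"] -= lr * np.mean(grad * z)
    gen["b"] -= lr * np.mean(grad)

# After training, the generator's offset b has drifted toward the
# real data's mean, because forgeries near 4.0 fool the discriminator.
print(gen["b"])
```

Real deepfake systems apply the same tug-of-war, but with deep convolutional networks over images and audio instead of a one-parameter logistic model.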

What are the primary uses of deepfake technology?

Mika Westerlund, an associate professor at Carleton University, points out some of the benefits of this technology. According to Westerlund, deepfakes can break language barriers by translating speech in real time and adjusting a speaker’s facial expressions to match the new audio. There is also the potential to reminisce with loved ones who have passed away, to let transgender individuals see themselves as the gender they identify with before surgery, or even to digitally recreate limbs that have been lost. The technology could also enable people to virtually try on clothes before purchasing them online. Used correctly, it could be truly transformative, but is it worth the risk of introducing a surge of fake media?

What are some of the risks?

It is about more than comedic reels of celebrities face-swapped with the average Joe, or fake celebrity porn scandals. Deepfake technology can manipulate political speeches, commit financial fraud, spread terrorist propaganda, destroy marriages through revenge porn, and create fake depictions of violence, all of which poses a serious threat to people’s livelihoods and to modern democracy. It takes only one viral video to spread a surge of misinformation to the general public. If vetting reliable sources was not challenging enough before, this technology will make it even harder for the general public to trust what they see online.


The biggest question on my mind is: how will deepfake technologies be monitored or controlled? The truth is, any effective control may itself challenge democracy. Without total surveillance, or blockchain technology to verify whether images are real or fake, the surge of fake news will keep rising, and it may become nearly impossible to know whether what someone is saying online is even real. Would you trust this type of technology?





Kirstin Corbett

Kirstin Corbett is a professional communicator dedicated to promoting new sustainable initiatives and technologies. Her background in Anthropology and Communications explains her passion for writing about social injustice, as well as the cultural impact of new technologies. Her goal is to educate and enlighten others through the content she creates.
