
Next challenge in PR: deepfake news

#ConfidentInsight, EN, News 24.08.2020

Crisis communication is reaching another level with a phenomenon that appears in the news more and more often: the deepfake. Essentially, anyone can be swapped into videos or audio files of someone else, creating confusion and generating news. The phenomenon was born in 2017, when a Reddit user of the same name posted doctored porn clips on the site, but it quickly escalated to politics, business and entertainment.

Deepfakes are fake videos or audio recordings that look and sound just like the real thing. The technique emerged from Hollywood special-effects studios and from intelligence agencies producing propaganda, but today several software packages can be freely downloaded and used to create convincing fake videos and content.

A count cited by The Guardian found 15,000 deepfake videos online in September 2019, nearly double the figure of nine months earlier. 96% were pornographic, and 99% of those mapped the faces of female celebrities onto porn stars. But the remaining 4% means more trouble than ever for communication specialists, since most of those clips target politicians or businesspeople around the globe. To give just a few examples: Barack Obama calling Donald Trump a “complete dipshit”, Mark Zuckerberg bragging about having “total control of billions of people’s stolen data”, and Jon Snow’s moving apology for the dismal ending of Game of Thrones are among the most widely circulated deepfakes to appear in the media recently.

Yet deepfake video can help in some cases. Hollywood, for example, has transposed real or fictional faces onto other actors, bringing Peter Cushing back to life in 2016’s Rogue One: A Star Wars Story, but that technique relied on complex, expensive pipelines and face-mounted cameras.

How can this emerge in a communication crisis?

So far, deepfakes have largely been limited to amateurs putting celebrities’ faces on porn stars’ bodies and making politicians say funny things. Using the same programs, however, it is quite easy to create a deepfake of an emergency alert. Essentially, any public figure can end up in a deepfaker’s spotlight, from corporate executives to politicians, and even the private life of anyone popular can become a target.

How does it work?

According to CSO online information, “Deepfakes exploit this human tendency using generative adversarial networks (GANs), in which two machine learning (ML) models duke it out. One ML model trains on a data set and then creates video forgeries, while the other attempts to detect the forgeries. The forger creates fakes until the other ML model can’t detect the forgery. The larger the set of training data, the easier it is for the forger to create a believable deepfake. This is why videos of former presidents and Hollywood celebrities have been frequently used in this early, first generation of deepfakes — there’s a ton of publicly available video footage to train the forger.”
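The dueling loop the quote describes can be sketched on toy one-dimensional data. The sketch below is an illustrative assumption, not production GAN code: the “forger” is a single linear function, the “detector” a logistic unit, the target distribution, learning rate and step count are all made up for demonstration, and gradients are written out by hand.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real" data: samples from N(4, 1). The forger must learn to imitate this.
def real_batch(n=64):
    return rng.normal(4.0, 1.0, n)

a, b = 1.0, 0.0   # forger (generator): g(z) = a*z + b, z ~ N(0, 1)
w, c = 0.1, 0.0   # detector (discriminator): d(x) = sigmoid(w*x + c)

lr = 0.05
for step in range(2000):
    x_real = real_batch()
    z = rng.normal(0, 1, 64)
    x_fake = a * z + b

    # --- detector update: push d(real) toward 1 and d(fake) toward 0 ---
    p_real = sigmoid(w * x_real + c)
    p_fake = sigmoid(w * x_fake + c)
    w += lr * (np.mean((1 - p_real) * x_real) - np.mean(p_fake * x_fake))
    c += lr * (np.mean(1 - p_real) - np.mean(p_fake))

    # --- forger update: push d(fake) toward 1, i.e. fool the detector ---
    p_fake = sigmoid(w * x_fake + c)
    a += lr * np.mean((1 - p_fake) * w * z)   # chain rule through g(z)
    b += lr * np.mean((1 - p_fake) * w)

print(f"forged samples are now centred near {b:.1f} (real data is centred at 4.0)")
```

After training, the forger’s output mean drifts toward the real mean of 4, which is the adversarial dynamic in miniature: the detector’s gradient is the only signal the forger ever sees.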

Making a face-swap video takes a few steps and a powerful computer. First, you run thousands of face shots of the two people through an AI algorithm called an encoder. The encoder finds and learns similarities between the two faces and reduces them to their shared common features, compressing the images in the process. Another AI algorithm, a decoder, is then taught to recover the faces from the compressed images. Because the faces are different, you train one decoder to recover the first person’s face and another decoder to recover the second person’s face. To perform the face swap, you simply feed the encoded images into the “wrong” decoder.
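As a rough illustration of the shared-encoder, two-decoder idea above, the sketch below stands in synthetic vectors for face images, PCA for the shared encoder, and linear least squares for the per-person decoders. Every name, dimension and parameter is an assumption made for demonstration; real deepfake tools use deep neural networks, not linear maps.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, CODE, N = 16, 4, 200

# Synthetic "faces": large shared pose/expression variation on the first
# four axes, plus a per-person identity offset on axis 4.
def make_faces(identity_offset):
    faces = rng.normal(0, 0.1, (N, DIM))
    faces[:, :4] += rng.normal(0, 3.0, (N, 4))  # shared features
    faces[:, 4] += identity_offset              # who the person is
    return faces

faces_a = make_faces(+2.0)
faces_b = make_faces(-2.0)

# Shared "encoder": PCA over both people's images, keeping CODE components.
both = np.vstack([faces_a, faces_b])
mu = both.mean(axis=0)
_, _, vt = np.linalg.svd(both - mu, full_matrices=False)
V = vt[:CODE].T                                 # compresses to shared features

def encode(x):
    return (x - mu) @ V

# One "decoder" per person: affine least-squares from code back to that face.
def fit_decoder(codes, faces):
    Z = np.hstack([codes, np.ones((len(codes), 1))])
    W, *_ = np.linalg.lstsq(Z, faces, rcond=None)
    return W

def decode(codes, W):
    return np.hstack([codes, np.ones((len(codes), 1))]) @ W

dec_a = fit_decoder(encode(faces_a), faces_a)
dec_b = fit_decoder(encode(faces_b), faces_b)

# The swap: run person B's encoded faces through the "wrong" decoder (A's).
swapped = decode(encode(faces_b), dec_a)
print("identity axis of B's inputs:", round(faces_b[:, 4].mean(), 2))
print("identity axis after swap:  ", round(swapped[:, 4].mean(), 2))
```

The shared code captures pose and expression, while each decoder’s bias absorbs its person’s identity; decoding B’s code with A’s decoder therefore keeps B’s pose but paints A’s identity onto it, which is exactly the swap described above.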

Faking news has been practiced since the 1920s

In the early days of cinema, faking news footage in order to dramatize the real news was par for the course. At a time when film could take weeks to cross an ocean, filmmakers would recreate earthquakes or fires with tiny sets to make the news more lifelike. That is what happened on September 1, 1923, when a magnitude 7.9 earthquake hit Japan and killed over 140,000 people. News reached newspapers around the world by the next day, but there was no way to get film footage from Japan to the United States that quickly. Yet, somehow, they did.

Photo credit: geralt via pixabay.com
