Next challenge in PR: deepfake news
#ConfidentInsight, EN, News · 24.08.2020
Crisis communication is reaching another level with a phenomenon that appears in the news more and more often: the deepfake. Essentially, anyone can be swapped into videos or audio files they never appeared in, just to create confusion and generate news. The phenomenon was born in 2017, when a Reddit user of that name posted doctored porn clips on the site, but it quickly escalated into politics, business and entertainment.
A deepfake is a fake video or audio recording that looks and sounds just like the real thing. The technique emerged from Hollywood special-effects studios and from intelligence agencies producing propaganda. Today, several software tools that create convincing fake videos and content can be downloaded for free.
Statistics cited by The Guardian counted 15,000 deepfake videos online in September 2019, a near doubling over nine months. The vast majority were pornographic, and 99% of those mapped the faces of female celebrities onto porn stars. The remaining 4% may sound small, but it means more trouble than ever for communication specialists, since most of those videos target politicians or businesspeople around the globe. Just a few examples of the most powerful deepfakes to hit the media recently: Barack Obama appearing to insult Donald Trump, Mark Zuckerberg appearing to brag about his control of users' data, and Jon Snow apologising for the dismal ending of Game of Thrones.
Yet deepfake video can also help in some cases. Hollywood, for example, has transposed real or fictional faces onto other actors, such as bringing Peter Cushing back to life in 2016's Rogue One: A Star Wars Story, but the technique relied on complex, expensive pipelines and face-mounted cameras.
How can this play into a communication crisis?
So far, deepfakes have mostly been limited to amateurs putting celebrities' faces onto porn stars' bodies and making politicians say funny things. However, with these same programs it is quite easy to create a deepfake of, say, an emergency alert. Basically, any public figure can end up in the deepfakers' spotlight, from corporate leaders to politicians, and even the private life of any popular person can be targeted.
How does it work?
According to CSO Online, “Deepfakes exploit this human tendency using generative adversarial networks (GANs), in which two machine learning (ML) models duke it out. One ML model trains on a data set and then creates video forgeries, while the other attempts to detect the forgeries. The forger creates fakes until the other ML model can’t detect the forgery. The larger the set of training data, the easier it is for the forger to create a believable deepfake. This is why videos of former presidents and Hollywood celebrities have been frequently used in this early, first generation of deepfakes — there’s a ton of publicly available video footage to train the forger.”
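The forger-versus-detector duel described above can be illustrated with a deliberately tiny sketch. This is not a real GAN (real ones use deep networks and backpropagate through the detector); here the "footage" is just numbers drawn around a mean, the detector is a logistic classifier, and the forger uses simple hill climbing. The names and parameters are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

real_mean = 4.0   # "real footage" lives around this value
mu = 0.0          # the forger's starting guess

for rnd in range(120):
    real = rng.normal(real_mean, 1.0, 256)
    fake = rng.normal(mu, 1.0, 256)

    # Detector: a fresh logistic classifier (real=1, fake=0),
    # trained each round by plain gradient descent.
    w = c = 0.0
    for _ in range(200):
        pr, pf = sigmoid(w * real + c), sigmoid(w * fake + c)
        gw = np.mean((pr - 1) * real) + np.mean(pf * fake)
        gc = np.mean(pr - 1) + np.mean(pf)
        w -= 0.1 * gw
        c -= 0.1 * gc

    # Forger: nudge mu in whichever direction the detector rates
    # as more "real" (hill climbing stands in for backprop).
    up = sigmoid(w * rng.normal(mu + 0.1, 1.0, 256) + c).mean()
    down = sigmoid(w * rng.normal(mu - 0.1, 1.0, 256) + c).mean()
    mu += 0.1 if up > down else -0.1

print("forger's mean after training:", round(mu, 2))
```

After enough rounds the forger's output drifts toward the real distribution, at which point the detector can no longer tell the two apart — exactly the stalemate the CSO quote describes.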
Making a face-swap video takes a powerful computer and a few steps. You run thousands of face shots of the two people through an AI algorithm called an encoder. The encoder finds and learns similarities between the two faces and reduces them to their shared common features, compressing the images in the process. A second AI algorithm, a decoder, is then taught to recover the faces from the compressed images. Because the faces are different, you train one decoder to recover the first person’s face and another decoder to recover the second person’s. To perform the face swap, you simply feed the encoded images into the “wrong” decoder.
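The shared-encoder, two-decoder trick above can be sketched in miniature. Assume each "face" is just a small vector lying on a person-specific subspace (a stand-in for a person's appearance), the encoder is one shared linear projection, and each decoder is a least-squares map learned for one person — all of these stand-ins are assumptions for illustration, not the actual deepfake pipeline, which uses deep autoencoders trained on images.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for face images: each "face" is an 8-dim vector on a
# person-specific 3-dim subspace (the matrix M plays "appearance").
latent_dim, image_dim, n = 3, 8, 200
M_a = rng.normal(size=(image_dim, latent_dim))  # person A's appearance
M_b = rng.normal(size=(image_dim, latent_dim))  # person B's appearance
faces_a = M_a @ rng.normal(size=(latent_dim, n))  # A's training shots
faces_b = M_b @ rng.normal(size=(latent_dim, n))  # B's training shots

# Shared encoder: one fixed projection compresses every face to a code.
W = rng.normal(size=(latent_dim, image_dim))
codes_a, codes_b = W @ faces_a, W @ faces_b

# Per-person decoders: least-squares maps from code back to that face.
D_a, *_ = np.linalg.lstsq(codes_a.T, faces_a.T, rcond=None)
D_b, *_ = np.linalg.lstsq(codes_b.T, faces_b.T, rcond=None)

# Sanity check: each decoder reconstructs its own person's faces.
recon_a = (codes_a.T @ D_a).T
print("A reconstruction error:", np.max(np.abs(recon_a - faces_a)))

# The swap: encode B's face, decode with A's decoder ->
# B's pose/expression rendered with A's appearance.
swapped = (codes_b.T @ D_a).T
```

The last line is the whole trick: because both people pass through the same encoder, a code carries pose and expression, and whichever decoder you feed it to paints that pose in its own person's appearance.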
Faking the news has been practiced since the 1920s
Just after the invention of cinema, faking news footage to dramatize real events was par for the course. At a time when film could take weeks to cross an ocean, filmmakers would restage earthquakes or fires with miniature sets to make the news more lifelike. That is what happened on September 1, 1923, when a magnitude-7.9 earthquake hit Japan and killed over 140,000 people. The news reached newspapers around the world by the next day, but there was no way to get film footage from Japan to the United States that quickly. Yet, somehow, they did.
Photo credit: geralt via pixabay.com