
Opinion: The Zelensky deepfake is a warning for Corporate America

Opinion by Matthew F. Ferraro for CNN Business Perspectives

Last month, a false video circulated online that seemed to show Ukrainian President Volodymyr Zelensky telling his soldiers to surrender to the Russian invasion. The video — a mediocre lip-sync of Zelensky’s face and voice — was the product of artificial intelligence manipulation, commonly known as a deepfake. It is unclear who made the deepfake, but the Ukrainian government had been warning for weeks that Russia might push manipulated media.

The video quickly spread on social media. Zelensky promptly responded with a video of his own from the streets of Kyiv, proclaiming that he would continue to defend Ukraine and that the country’s citizens would lay down their arms only when they were victorious against Russia. And, the same day, major social media companies removed the deepfake for violating their policies on misinformation and manipulated media.

The Zelensky deepfake failed to unleash confusion and mayhem, as seems to have been the intention. As Sam Gregory, the program director for the human rights group WITNESS, observed, it was a “best-case scenario” for a situation like this. The Ukrainian government had engaged in extensive “prebunking” — warning of possible manipulated videos before they emerged. The deepfake itself was of poor quality. A credible person (Zelensky) quickly rebutted its message to his millions of social media followers. And the video clearly violated social media platform policies against manipulated content, allowing it to be removed, Gregory said.

The ease with which deepfakes are made and distributed illuminates a growing risk not only to governments, but also to the private sector — where believable manipulated media could be used to confuse customers, partners and employees and damage corporate brands and valuations. But the experience of the Zelensky deepfake provides some clues for what businesses can do to meet the threat.

Consider how falsified media could harm businesses. For example, an “undercover” video of a CEO seeming to use a racial epithet could circulate on the eve of a major product launch, torching the leader’s reputation and throwing the launch into doubt. Or the leaders of two competitors could appear in a video shaking hands to announce a (nonexistent) merger, goosing stocks to the benefit of anyone prepared for the jump. And, with many employees continuing to work remotely, a falsified video of a chief information security officer or chief financial officer could trick unsuspecting employees into providing IT credentials to an outsider or wiring funds to a thief, in a kind of cyber-impersonation campaign that the FBI calls “business identity compromise.”

Such deepfakes could be spread by any number of bad actors: industry competitors, disgruntled former employees, trolls motivated by spite, profiteers seeking a payday through fraud, or foreign states that see strategic advantage in damaging the business community.

What can Corporate America do in response?

First, much as President Zelensky and his government removed the deepfake’s sting by warning ahead of time about the prospect of manipulated media, the C-suite should prepare employees for growing cyber threats, and deepfakes in particular. They should revise their cyber incident response plans so that the entire company understands how to identify and respond to disinformation and deceptive media if they arise. That way, employees are less likely to be tricked into abetting fraud.

Companies should also consider integrating technology into corporate videos that reduces the likelihood that media can later be covertly manipulated. This approach is called image provenance, and it relies on blockchain technology to provide verifiable signatures attesting that media has not been secretly altered. In the event a suspected deepfake of an executive appears, viewers could check the metadata to see whether the media has been tampered with.
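To make the idea concrete, here is a minimal sketch of how such a verification step might look, assuming the company publishes a detached Ed25519 signature over each official video’s hash. The file paths, the signing scheme and the function name are illustrative assumptions, not a specific provenance standard; real provenance systems typically embed signed metadata directly inside the media file rather than distributing a separate signature.

```python
# A simplified provenance check: confirm that a video's hash matches a
# signature published by the company. Hypothetical workflow for illustration.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey


def verify_video_provenance(video_path: str, signature_path: str,
                            publisher_public_key_bytes: bytes) -> bool:
    """Return True if the video's hash matches the publisher's signature."""
    # Hash the video exactly as it was published.
    digest = hashlib.sha256()
    with open(video_path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)

    # Load the detached signature distributed alongside the video.
    with open(signature_path, "rb") as f:
        signature = f.read()

    public_key = Ed25519PublicKey.from_public_bytes(publisher_public_key_bytes)
    try:
        # verify() raises InvalidSignature if the content was altered
        # after the publisher signed it.
        public_key.verify(signature, digest.digest())
        return True
    except InvalidSignature:
        return False
```

If the check fails, the video either was not produced by the company or was altered after release — a quick, verifiable basis for a public rebuttal.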

Second, Zelensky benefited from having both a massive online megaphone and an established, authentic voice. A video announcing surrender simply didn’t seem like something he would say. While business leaders may not be able to command as substantial a following as a heroic wartime leader, they should maintain verified presences on social media platforms, develop a genuine voice, and communicate frequently, at least with key customers, partners and shareholders. They can reduce the power of a potential fake by establishing a real persona.

Third, if a deepfake targeting a company circulates, the business should move quickly to respond to it with legitimate content. The company should also work with its lawyers and with social media platforms to have the deepfake removed if it violates their policies, and partner with allies trusted by the public (like business titans, former government officials or celebrities) who can validate and amplify the business’s rebuttal.

The growing popularity of deepfakes and the ingenuity of bad actors who adapt old swindles to new technology make it imperative that the corporate world prepare for these burgeoning risks.
