For those who fear a future in which videos of real people are indistinguishable from computer-generated fakes, two recent developments that attracted audiences of millions might have been alarming.
First, a visual effects artist worked with a Tom Cruise impersonator to create strikingly accurate videos mimicking the actor. The videos, which were created using machine learning techniques and are known as deepfakes, drew millions of views on TikTok, Twitter and other social networks in late February.
Days later, MyHeritage, a genealogy website best known for its DNA testing kits, offered a tool that digitally animates old photos of loved ones, producing short, looping videos in which people can be seen moving their heads and even smiling. As of Monday, more than 26 million images had been animated using the tool, called Deep Nostalgia.
The videos underscored the potential of synthetic media, which could bring significant advances to the advertising and entertainment industries. But the technology could also be used to cast doubt on legitimate videos and to insert people, including children, into pornographic images.
The creators of the viral Tom Cruise videos said the expertise the technology requires makes abuse much more difficult, and the company behind the photo animation tool said it had safeguards in place to prevent misuse. Experts say neither example is especially alarming on its own, but both raise questions about the future of the technology that should be considered while it is still in its infancy.
“While Deep Nostalgia itself is harmless, it belongs to a class of tools that are potentially very threatening,” said Sam Gregory, an expert on artificial intelligence and program director of Witness, a nonprofit focused on the ethical use of video.
Digitally mimicking Mr. Cruise was no easy task. Chris Ume, the Belgian visual effects artist who created the videos, said in an interview that producing them required extensive expertise and time.
Most of what appears in the videos is the body and voice of Miles Fisher, a Tom Cruise impersonator who is already fluent in the actor’s mannerisms and speech and bears a strong resemblance to him even without manipulation. Only the face, from forehead to chin, is digitally altered to look like the real Tom Cruise.
Mr. Ume spent two months training his computer model to recreate Mr. Cruise’s facial expressions, first feeding it random faces before focusing on Mr. Cruise. He then spent approximately 24 hours in production for every minute-long video, tweaking details like the alignment of the eyes.
Even as the technology improves, videos like his will require extensive manual labor and a skilled impersonator, he said.
“It’s like a little Hollywood studio with the two of us,” he said. “It’s not something you can do with the push of a button on a home computer.”
The Deep Nostalgia tool was developed for MyHeritage by D-ID, an artificial intelligence company based in Tel Aviv. Gil Perry, D-ID’s chief executive, said the company works only with partners it trusts not to abuse the technology, and that it has a four-year relationship with MyHeritage.
Videos created with the tool carry watermarks indicating they are fake, and they contain no audio, a decision that Mr. Perry said makes the tool harder to use for unsavory purposes.
He said the technology that powers Deep Nostalgia is “just the tip of the iceberg of what we are capable of.”
“The potential for the good part of this technology is endless,” he said.
When optimists talk about the technology’s potential, they often point to its use in advocacy, where it can put a human face on an issue and forge deeper emotional connections.
A nongovernmental organization created a video of Javier Arturo Valdez Cárdenas, a Mexican journalist murdered in 2017, appearing to demand justice for his own killing. The parents of Joaquin Oliver, a 17-year-old killed in the 2018 mass shooting at a high school in Parkland, Florida, digitally resurrected him for a video promoting gun legislation. Police in the Australian state of Victoria digitally recreated an officer who died by suicide in 2012 to deliver a message about mental health support.
And “Welcome to Chechnya,” a documentary released last year about the purges of gays and lesbians in Chechnya, used the technology to protect the identities of vulnerable Chechens.
The effects could also be used in Hollywood to age or de-age actors more convincingly, or to improve the dubbing of movies and TV shows into other languages by aligning lip movements with the dubbed dialogue. Executives of international companies could also appear more natural when addressing employees who speak different languages.
However, critics fear that as the technology improves, it will continue to be misused, particularly to create pornography that superimposes one person’s face on another person’s body.
Nina Schick, the author of “Deepfakes: The Coming Infocalypse,” said the earliest deepfaked pornography required hours of source footage to produce, so celebrities were the typical targets. But as the technology advances, less material is needed to create the videos, putting more women and children at risk.
According to BuzzFeed News, a tool in the Telegram messaging app that allowed users to create simulated nude images from a single uploaded photo has been used hundreds of thousands of times.
“This is becoming a problem that can affect everyone, especially those who do not have the resources to protect themselves,” said Ms. Schick.
The technology could also have a destabilizing effect on global affairs as politicians dismiss videos, including authentic ones, as fake in order to gain what the law professors Robert Chesney and Danielle Citron have called the “liar’s dividend.”
In Gabon, opposition leaders argued that a video of President Ali Bongo Ondimba delivering a New Year’s address in 2019 had been faked to cover up health problems. Last year, a Republican candidate for a House seat in the St. Louis area claimed that the video of George Floyd’s death in police custody had been digitally staged.
As the technology advances, it will be used more widely, Mr. Gregory said, but its effects are already pronounced.
“People always focus on the perfect deepfake, when one isn’t necessary for either the harmful or the beneficial uses,” he said.
When it launched the Deep Nostalgia tool, MyHeritage addressed the issue of consent, encouraging users to “use this feature on your own historical photos, not photos of living people without their permission.” Mr. Ume, who created the Tom Cruise deepfakes, said he had no contact with the actor or his agents.
Of course, people who have died cannot consent to appearing in videos. And that matters as the dead, especially celebrities, are digitally resurrected, as when the artist Bob Ross was recreated to sell Mountain Dew, or when Kanye West gave his wife, Kim Kardashian West, a hologram of her late father, Robert Kardashian, last year.
Henry Ajder, a deepfakes researcher, envisioned a future in which our voices could live on in assistants like Amazon’s Alexa, letting loved ones stay connected with us after we die. Or, as depicted in an episode of “Black Mirror,” entire aspects of our personalities could be simulated after death, trained on our social media histories.
But that raises a tricky question: In what cases do we need the deceased’s consent to resurrect them?
“These questions make you feel uncomfortable, like something is a little off or worrying, but it’s difficult to know whether that’s just because it’s new or whether it points to a deeper intuition that something is problematic,” Mr. Ajder said.