Making the world a better place with deepfakes

In a recent video, Brett Kavanaugh speaks to members of the U.S. Congress. “It’s time to correct the record,” he begins. Over the next few minutes, the Supreme Court justice admits he may have committed sexual assault and expresses remorse for the way he responded in his testimony to Christine Blasey Ford’s allegations. “I take responsibility for that and apologize.”

The thing is, this scene isn’t real. The footage is fabricated, and Kavanaugh never actually said those things.

In reality, Kavanaugh denied and deflected the allegations and cast himself as the victim. The video described above is from a series of fake clips envisioning a future in which divisive public figures like Kavanaugh, Alex Jones, and Mark Zuckerberg take responsibility for their past transgressions.

The series, titled Deep Reckonings, is the brainchild of Stephanie Lepp, an artist who wants to drive positive change by using deepfake technology to help people see and imagine better versions of themselves.

It’s a lofty and somewhat abstract goal, but Lepp is not alone in her endeavor. She is part of a growing league of developers who want to use deepfake technology to do good.

Deepfake it till you make it

Deepfakes have had a controversial journey so far. The technology has been used extensively for nefarious purposes such as nonconsensual pornography and disinformation campaigns, putting it under scrutiny from both governments and tech companies that fear its potential for weaponization.

“Given that the vast majority of deepfakes are nefarious in nature, it’s understandable that we have focused on their weaponization,” says Lepp. “However, this focus has prevented us from realizing their prosocial potential. In particular, deepfakes can be used for education, health, and social change.”


She argues that, much as virtual reality has been used to help patients recover from brain injuries by letting them interact with virtual memories, deepfakes could be used for psychological healing in trauma victims. Imagine, for example, a scenario in which doctors could script deepfakes of an addict’s sober future self and use them to encourage the patient along the path to recovery.

The concept is at least theoretically sound. Jonathan Gratch, director of virtual human research at the University of Southern California’s Institute for Creative Technologies, has found that seeing yourself in VR can be highly motivating, and that the same concept could easily be applied to deepfake footage. He suggests that if a patient’s face were subtly blended into their doctor’s face, the patient would be more likely to follow the doctor’s advice.

More than memes and misinformation

Although negative uses of deepfakes tend to get more attention, positive uses like Lepp’s are on the rise. In the past few years, the technology has made its way into storytelling, prosocial projects, and more.

For example, the ALS Association’s Project Revoice enables amyotrophic lateral sclerosis patients who have lost the ability to speak to continue using their voice. How? By using deepfake technology to create personalized synthetic voice tracks that can be played back through a soundboard whenever they are needed.

In a separate project by the nonprofit anti-malaria organization Malaria Must Die, soccer star David Beckham delivered a message in nine different languages (and voices), thanks to deepfaked audio and video that matched his lip movements to the words.

In a particularly eye-catching campaign in early 2020, the Massachusetts Institute of Technology’s Center for Advanced Virtuality sought to educate the public about misinformation by releasing a deepfake of former U.S. President Richard M. Nixon delivering the contingency speech written in 1969 in case the Apollo 11 crew was unable to return from the moon.

These kinds of public service announcements and awareness campaigns are just the tip of the iceberg. Deepfake tools have also helped simplify processes in the entertainment industry that would otherwise require high-end equipment and time-consuming resources, such as de-aging, voice cloning, and more. For example, every face in a recent music video by The Strokes was deepfaked so that the band’s 40-year-old members looked like they were 20.

Ohad Fried, a computer science lecturer at the Interdisciplinary Center Herzliya in Israel, says that thanks to deepfakes, “what used to take years of artist time can now be achieved by small, independent studios. This is good news for the variety and quality of the media we consume.”

Tipping the scales

However, the potential of deepfake technology to wreak havoc, especially as it becomes more accessible, remains a concern. Aviv Ovadya, founder of the Thoughtful Technology Project, agrees that the ability to create synthetic media “can have numerous positive effects, for storytelling, for people with disabilities, and for more seamless communication across languages.” At the same time, however, he warns that there is still a lot of room for harm once the technology goes mainstream, and that a lot of work needs to be done to minimize those risks.

“Even these positive use cases can inadvertently lead to real and significant harm,” he told Digital Trends. “Excerpts from a work of art that aims to build empathy can also be taken out of context and misused.”

“The goal should be to develop this technology in such a way that those negative effects are minimized as much as possible.”

Experts have repeatedly sounded the alarm, calling for more resources to be funneled into detection programs and official ethical guidelines, even though such legal intervention could end up impinging on freedom of expression. But no one is quite sure which direction deepfakes will ultimately take. As with any new technology, there will come a point where deepfakes strike an equilibrium, and the responsibility will lie with tech companies, policymakers, and developers to make sure the scales stay tipped toward the good side.

Ovadya also suggests limiting the accessibility of deepfake tools to the masses until researchers can “complete some of the fortifications we need to protect our society from their potential negative effects. The goal should be to develop this technology in such a way that those negative effects are minimized as much as possible.”

For now, Lepp is focusing her time on her next deepfake protagonist: Donald Trump and his concession speech.
