Recognizing and resisting deepfakes
Picture prime minister Rutte calling for a meat tax, or even uttering the words "climate crisis" at all. Or take the following sentence: "Corona was a toddler-sized crisis compared to what lies ahead for us due to climate change." You can hardly imagine it, and yet our prime minister says those exact words in this clip. You have probably already seen it: the deepfake from De Correspondent (an independent journalism platform) in which they finally have the prime minister show real climate leadership.
Deepfakes are on the rise. Using artificial-intelligence software, anyone can now create entirely new images and audio clips that are virtually indistinguishable from the real thing. The software analyzes a person's voice, movements, facial expressions, skin texture, and hair color, and then uses that data to generate new material. With just 30 seconds of existing video and/or audio, you can already make someone appear to do and say whatever you want.
Deepfake for positive purposes
Deepfakes can be used for a variety of purposes. Take the example above: a playful way to wake up contemporary politics and put climate change on the map. Or consider David Beckham's video, in which he calls attention to the dangers of malaria in nine different languages. The software is also useful in education. In future history classes, for example, Martin Luther King himself could explain in a video how his life unfolded and what he accomplished.
Ethical issue
The police also use deepfakes. On May 22, 2022, they released a deepfake video of Sedar Soares, who was shot dead at the age of 13. Twenty years after the fact, his bereaved family are still left with questions about who is responsible. Specialists created the video from photos of the boy, showing him playing soccer and talking about his dreams, while a voice asks who exactly the perpetrator of this sweet boy's murder is.
In this case, Sedar's parents gave permission for the video to be made. But what if that permission is missing? This raises ethical questions. If someone has not consented to a deepfake being made of them, it is easy to imagine this causing distressing situations. It could even make individuals digitally immortal.
Deepfake as manipulation
Unfortunately, deepfakes can also be used in manipulative ways. There are several cases, for example, where the technology was used to make an 18+ film featuring someone who never appeared in it at all. Deepfakes are also used to spread fake news, such as a video that circulated in Belgium in which Donald Trump supposedly proclaims that Belgium should withdraw from the Paris climate agreement. That video eventually turned out to have been made by a Belgian political party.
Deepfakes are also used to trigger large financial transactions. In 2020, criminals made off with 30 million euros by mimicking the voice of a CEO. This "CEO" asked whether money could be transferred quickly. Using this "deepvoice," the criminals persuaded the people on the other end of the line to carry out the transaction.
Can everyone use it?
Even though artificial intelligence is fundamentally complex, the technology is easy to use yourself. With the advent of new apps, anyone can apply deepfake techniques. One example is FaceApp, which lets you create an older or younger version of yourself, or even change gender. Audio can also be manipulated easily, though apps for this are less accessible.
Future of deepfake
Scientists believe deepfakes will strongly influence the future of "truth." What is real and what is not will no longer be so clear, say researchers at Tilburg University (Van der Sloot et al., 2021). They believe that within five years, 90% of all online content will be manipulated by deepfakes. Technology that can recognize and filter deepfake content already exists and currently catches some 65% of it, although this percentage is expected to decrease in the future.
Politicians have also responded to deepfake developments; however, the legal system currently only prohibits the use of deepfakes for improper purposes. Developing and offering deepfake technology is not yet banned. Van der Sloot of Tilburg University considers this a sham solution. "The harm is often already done once a deepfake has been developed," he says.
How to make your organization resilient
Deepfakes are a real danger to the reliability of organizations' information, so it is important to focus on awareness. In addition to knowing that deepfakes exist, it is also important to be able to recognize them.
With deepfake video, pay attention to:
- Jerky movements
- Strange blinking of the eyes
- Changes in lighting
- Changes in skin color
With deepvoice, pay attention to:
- A crackling voice
- Poorly expressed emotions
- A monotone voice
- Vague voice sounds
Maybe you were already familiar with deepfake technologies, but that does not necessarily apply to all your colleagues. To this end, we created the free-to-use Deepfake Challenge in collaboration with municipalities. In it, we explain what a deepfake is and test whether you can recognize one. Click this link to try the Deepfake Challenge now!