Deceptively real, but all lies: 63 percent are afraid of deepfakes
- One third of Germans have never heard of deepfakes
- 81 percent believe they cannot recognise deepfakes
- Half see possible applications in art and culture
Berlin, 24 July 2023 - The Pope seemingly wearing a white down jacket, the supposed arrest of Trump or Franziska Giffey's conversation with a fake Klitschko - what looks deceptively real at first glance turns out on closer inspection to be a so-called deepfake. These are images, audio recordings or videos that have been manipulated or fabricated in a deceptively realistic way. In videos, for example, a person can be made to appear to say or do something they never said or did in reality. A slight majority of Germans (60 percent) have already heard or read about such deepfakes. However, only 15 percent can explain well what deepfakes are. 23 percent know at least roughly what the term means, while 22 percent have read or heard of it but do not know exactly what it is. A third (33 percent), on the other hand, have never read or heard of deepfakes. These are the results of a representative survey of 1,002 people in Germany aged 16 and over commissioned by Bitkom.
A large majority of Germans (81 percent) say they would not recognise a deepfake. 44 percent say they have already been fooled by one. 70 percent are of the opinion that photos and videos can no longer be trusted, and 63 percent even say that deepfakes scare them. 60 percent see deepfakes as a danger to our democracy. On the other hand, more than half also see positive uses for the technology: 55 percent think deepfakes could be put to good use, for example in cinema or in art. "Retouching is as old as photography itself, and audio and video have been edited and altered since time immemorial. In the past, such interventions were reserved for highly specialised experts; today, they can be created with a few clicks, even without any prior training. This makes it all the more important to raise awareness of this phenomenon and to sensitise people to it," says Dr Bernhard Rohleder, CEO of Bitkom.
The vast majority have encountered deepfakes in news coverage: 63 percent say they have seen deepfakes in reports on the topic. Only 2 percent have recognised deepfakes on the internet that were not labelled as such, while 8 percent have come across deepfakes that were labelled as such. And 3 percent have themselves tried out software that can be used to create deepfakes.
A broad majority (84 percent) call for compulsory labelling of deepfakes, and 60 percent say they should be banned altogether. Rohleder: "Even if there were a labelling obligation or even a ban, the very actors we want to protect ourselves from, such as cybercriminals or the troll factories of hostile states, would not comply with it. Education and media literacy are indispensable in the fight against deepfakes. Each and every individual should carefully check whether a text, picture or video is authentic before liking or sharing it on social media, for example."