Babiš's fake apology and spectacular profits. Most Czechs don't know what deepfakes are

Fake videos created by artificial intelligence, known as deepfakes, are spreading ever more widely on social media. For example, after the uproar over an e-mail in which former Prime Minister Andrej Babiš requested sensitive information about Foreign Minister Jan Lipavský, a supposed apology from Babiš spread across the internet. At first glance it looks genuine, but it is a fake. Mimicking someone's face or voice can take just a few minutes. Yet according to a recent survey, most Czechs either do not know that such technology exists or do not know what it is.

The leader of the ANO movement also appears in many fake videos, which mostly aim to lure people into investments. In them, Babiš typically touts "fairytale" investment opportunities, but it is a scam.

Babiš recently spoke out against this. "You often write to me that I am recommending some financial product or strategy. Please do not believe it, it is a scam. Scammers are using artificial intelligence to misuse my name, photo and sometimes my voice and likeness. If you see this, report it to the operator of the network and to others. Be careful. I will be grateful if you share this," the leader of the ANO movement said on his Facebook profile.

Half the people don't know what it is

But the phenomenon of deepfake videos is still largely unknown among Czechs. A deepfake is a seemingly real, AI-generated image, sound or video that is not based on reality. Most of us, however, do not know this, according to a survey conducted for the Central European Digital Media Observatory (CEDMO) at Charles University.

Two out of five respondents correctly answered the question of what a deepfake is. About half chose the answer "I don't know", and less than a tenth answered incorrectly. At the same time, more than a third of people, 35 percent, said they had encountered deepfake content in recent months.


"Among the respondents, the correct answer was statistically significantly more frequent among men than among women," said CEDMO data analyst Kudil. According to him, university graduates, students and the self-employed have the most knowledge, and by region, residents of Prague and Central Bohemia. "In terms of political preferences, these are mostly voters of the Pirates and STAN coalition, but also voters of the Tricolour, Svobodní and Soukromníci coalition, which is interesting, because these are voters of two coalitions whose electorates usually do not overlap," Kudil said.

The survey also suggests that most of Czech society does not use artificial intelligence tools: three-quarters of respondents said they do not use them at all. Only one percent of respondents use them once or more a day, and five percent said they use them several times a week. The younger generation works with these tools more.

In January, a third of those surveyed had seen a fake video of Interior Minister Vít Rakušan (STAN) that circulated on social networks, in which he insults the citizens of Karviná. Seventeen percent of respondents thought it was real.

A moment is enough

Artificial intelligence (AI) can now imitate human speech so well that it is almost impossible to tell the difference. With images and video it is not yet perfect, but the technology will catch up in time, says Barbara Zitová of the Institute of Information Theory and Automation of the Czech Academy of Sciences.


"AI technology is advancing rapidly: computing power is growing, experts are collecting more and more data, and because the field is becoming ever more attractive, more people are taking up research in it," Zitová said.

According to the scientist, a deepfake can be created very quickly; about two minutes of footage of the target person is usually enough. Today's software often includes a safeguard in the form of requiring the person's consent, but software that ignores such safeguards certainly exists. "For audio, even a one-minute recording is enough, and from the apps I have tested I know that there are no restrictions or consent checks," Zitová added.

According to Zitová, even an ordinary person can create a deepfake. "The technology is available to the general public. Some companies are starting to charge for their services, but that does not stop those who are trying to gain something through such fraud," the scientist said.

You can verify the identity of the person on the other end of the line by asking for information or a password known only to the person in question. Another option is to think critically about the content of the message and, ideally, call the person back. "It is realistic that mobile phones will have an app that checks whether there is artificial intelligence or a person on the other end," Zitová noted. In her view, banks should move away from voice-based identity verification.
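The shared-password check the expert describes can be done without ever saying the secret aloud on a possibly spoofed call. The following is a minimal illustrative sketch, not something from the article: the function names and the idea of answering a fresh challenge with a keyed hash of the pre-agreed secret are my own illustration of the principle.

```python
import hmac
import hashlib

def make_response(secret: str, challenge: str) -> str:
    """Derive a short response from the pre-agreed family secret and a
    fresh challenge, so the secret itself is never spoken on the call."""
    return hmac.new(secret.encode(), challenge.encode(), hashlib.sha256).hexdigest()[:8]

def verify_caller(secret: str, challenge: str, response: str) -> bool:
    """Check the caller's response in constant time; a voice clone that
    does not know the secret cannot produce the right answer."""
    return hmac.compare_digest(make_response(secret, challenge), response)
```

For example, the called party invents a new challenge phrase each time ("blue door 7"), and only someone who knows the shared secret can answer correctly; replaying an old answer fails because the challenge changes.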


The police are seeing it too

According to the police, this is an increasingly common form of fraud. "Recently, we have noticed a significant increase in cases where criminals try to lure victims into fraudulent investments using fake videos, so-called deepfakes. These are fabricated speeches altered by advanced technology. Typically they are videos of well-known people recommending certain investment sites. In this way, fraudsters abuse the trust people place in well-known figures, with the aim of luring victims to the fraudulent site and getting them to invest," police spokeswoman Violeta Chirishova told Echo24.

Deceptive advertisements promising high profits from ČEZ shares are a perennial favourite. Cyber fraudsters often advertise on Google or Facebook, misusing the names of famous companies and politicians, whether President Petr Pavel or Prime Minister Petr Fiala.

In general, the best protection against deepfakes is caution: do not believe everything you see on the internet. If someone tries to sell you a financial product or strategy supposedly recommended by a famous person, check that person's official websites.

"The perpetrator, if caught and found guilty, faces up to five years in prison," the police warned.
