Over the past month, the face of Ukrainian President Volodymyr Zelensky has become more recognizable than ever, and that visibility has a downside: attackers are creating deepfakes of him, putting words in his mouth that he never said. Last week, for example, a deepfake video began to spread online in which Zelensky allegedly called on the Ukrainian army to lay down its arms and surrender. The Ukrainian president himself dismissed it as a “childish provocation.”
Although the video was not particularly convincing, experts fear that more advanced deepfakes could cause real chaos in the future.
“We may encounter other deepfakes, more impressive and more sophisticated, that are very difficult to distinguish from the real thing,” Abby MacDonald, a security and defense policy specialist at the Canadian Institute of International Relations, told the Canadian Broadcasting Corporation (CBC).
Deepfake technologies are not new, and users are already familiar with them. As MacDonald explained, they range from cheap, low-quality fakes made with primitive software to sophisticated, highly realistic deepfakes built with artificial intelligence and cutting-edge techniques.
Deepfakes have recently come to the fore, so the appearance of a fake video featuring Zelensky was only to be expected under the current circumstances.
“I’m sure it was to be expected. I don’t know whether it was created by a government, commissioned by one, or made by someone on the Internet simply to deceive people,” said Alyssa Demus, a senior analyst at the research organization.
However, the Ukrainian president is not the only target of deepfakes. The same can be said of Vladimir Putin: a video recently circulated on social media in which the Russian president allegedly announced a truce.
However, the existence of deepfakes is one thing; protecting against them is quite another.
“I think the main question is whether there will be more of them (deepfakes – ed.) around the world. That’s what’s really scary. The deepfakes we have seen so far have been fairly easy to debunk, and it has been fairly easy to show where they came from. But I think that will change over time,” said Eliot Borenstein, a professor at New York University.