“40,000 accounts tried to use my face.” How AI-generated images of Ukrainian women were used to glorify China and Russia


The acronym AI — artificial intelligence — was the most popular word of 2023, according to the Collins English online dictionary. And the more popular artificial intelligence becomes around the world, the more surprises it brings. Imagine waking up one day to see your own face online, speaking with your voice — but in Chinese. This is what happened to Ukrainian woman Olga Loiek. As experts say, it can happen to anyone.

Videos with the face of a Ukrainian woman generated by artificial intelligence appeared on Chinese social networks.

“My face speaks another language with my voice”

Olga is twenty. She is from Ukraine, studies cognitive science at a university in the USA, is doing an internship in Germany, and shares her thoughts on what she has learned on her YouTube blog.

She launched the blog in November 2023, and within a month she saw her own face on Chinese social networks. Except there she is called Natasha or April — and she speaks Chinese!

“I was shocked because I saw my face speaking another language with my voice,” says the Ukrainian woman.

Olya learned about the fake videos from her subscribers, who, after hearing Chinese coming from the girl’s mouth on these platforms, started sending her the deepfakes — videos created by artificial intelligence using her image. She was in for a surprise when she began to look into the situation.

“I started following and archiving all the videos I saw. At first five accounts, then ten, then thirty. I kept finding more accounts with my face,” says the blogger.

The glorification of Russia and China is the most popular topic

The girl found more than 40 clones on Bilibili, one of China’s largest video sites, and on Little Red Book, a platform often described as an alternative to Instagram. But what surprised Olya most were the topics discussed in these videos.

“A lot of the accounts try to position me as a Russian woman. They all promote warm relations between China and Russia. My clone says, for example: ‘Only China helps us, only China buys Russian products. So let’s support each other, because we are so wonderful,’” says Olga Loiek.

YouTube Screenshot / Olga Loiek

Some of these pages had close to half a million followers and were apparently already making money from selling these Russian products, using Olya’s face and voice.

“It’s an interesting twist, because both then and now, to be honest, I haven’t earned anything from YouTube. With hundreds of thousands of subscribers like that, the chance that someone will decide to buy these products is quite high. So I was a little uncomfortable, of course,” Olya shares.

The girl also found clones of many other women, including the well-known Swedish blogger Lana Blakely. Outraged by what she saw, Olya recorded a YouTube video about the hundreds of her clones.

“As a Ukrainian, this obviously outraged me, because my family is forced to hide during air raids in Ukraine,” she said.

“About 40,000 accounts tried to use my face”

Thanks to publicity and the help of subscribers, Olya got at least 15 fake accounts blocked and even contacted the company that produced these deepfakes, since some of the videos carried its logo.

Olga Loiek: “They reported that 5,000 videos with my face had been created on their platform.”

Having noticed the suspicious activity, the platform blocked all accounts from using Olya’s image — this is how company representatives responded to the girl.

“They blocked my face on December 26. Since then (the date of the conversation — March 18 — ed.), approximately 40,000 accounts have tried to use my face and create deep fakes,” says the Ukrainian woman.

It turns out that the company producing these videos did not ask permission to use a person’s image when the deepfakes were generated automatically rather than created manually. The company now assures that it has fixed this flaw.

How to act in such situations?

Rijul Gupta, the head of a company that creates deepfake videos and checks videos for the use of artificial intelligence, explains that videos like these are an easy and cheap way to promote narratives and influence people.

“It is no longer necessary to hire a person willing to speak against their own country,” he says.

More often than not, the producers of such videos look for people who meet certain criteria, Rijul suggests: “Probably popular people. Probably attractive women who are new to the network.”

But Rijul stresses: anyone can become a victim of such fabricated videos.

“If you have even one 15-second video online, your face and voice are now available for anyone to deepfake.”

Rijul is convinced that to protect against such cases, state institutions should cooperate with technology companies. In the meantime, the best thing to do is to view any video online critically — and to speak openly about the fact that anyone can become a deepfake victim, as Olga does. Despite this story, she continues her blog.

“I’m not afraid of it — this story has already happened to me. I want people to know what my real values are, not the values promoted by all the Natashas and all the Annas on Chinese social networks. I would really like people to know the real me.”

