The faceless cowards

Taylor is 22 years old. She’s an ordinary college student who lives in a small New England town. One day, a friend sends her a link to a video on Pornhub. “Sorry,” he writes, “but I think you should look at this.” She hesitates for a moment, then clicks on the link. The video starts. She doesn’t recognize this woman’s body, completely naked. But that face… there is no doubt, it’s hers.

Taylor is in shock. She does not understand. This woman in the video is her, but it’s not her.

It’s a nightmare that more and more women are living through, all over the world, with the rise of artificial intelligence. Many celebrities are targeted, but so are ordinary women like Taylor. In fact, Taylor is the alias the student chose to tell her story in the moving documentary Another Body.1

A few days ago, online forgers targeted another Taylor – a real one, this time. Deepfaked images of Taylor Swift were viewed tens of millions of times on X. Faced with global outrage, X was quick to clean up. “Our teams actively remove all identified images and take appropriate action against the accounts responsible for their publication,” the company assured.

SCREENSHOT FROM X, ASSOCIATED PRESS

Social network X temporarily blocked searches for Taylor Swift after pornographic deepfake images of the singer circulated on the platform.

Good for Taylor Swift. Unfortunately, not all women get such a prompt reaction. In both the United States and Canada, victims are often left to fend for themselves in the Wild West of the World Wide Web. A lawless world, where the technological tools for creating fake pornographic images are within anyone’s reach. Any bozo can now download an app on their iPhone to quickly make a porn video, completely fake but hyper-realistic, featuring the object of their fantasies.

Their favorite actress. Their ex-girlfriend. Their colleague. Their neighbor. Their classmate…

No one is immune. Worse, the victims are exposed like never before.

It only takes a few clicks to come across “deepfake porn” or “fake nudes” sites. Incredibly, Google and other search engines direct Internet users to these sites, which traffic in porn videos of women who never consented to their images being used. One of these sites draws 14 million views per month. At this rate, these videos may soon become banal, perhaps even… the norm. “Hey, did you see the latest deepfake of (insert your favorite singer here)?”

Governments must legislate, and quickly, to curb this disastrous trend.

There is much concern about advances in artificial intelligence and their impact on elections and, more broadly, on democracy. In 2023, the Canadian Security Intelligence Service declared that deepfakes were “a real threat to Canada’s future.”2 We got a taste of what’s to come when a robocall imitating Joe Biden’s voice urged voters not to vote in the New Hampshire primary.

All of this is certainly very worrying, but we must remember that no less than 98% of all deepfakes circulating on the web are pornographic, according to a report published in 2023.3 Not surprisingly, 99% of those targeted are women.

Women who are too often told: sorry, there is nothing we can do. Taylor, the student, was shocked to discover several other fake videos of herself on Pornhub, accompanied by her (real) name, city and school. She was tracked down and harassed. “The detective assigned to the case told me that what happened was disgusting, but that the person (who created the videos) had the right to do it. They hadn’t broken any law,” she says in the documentary. The police closed the file.

It can’t go on like this. The images may be fake, but the trauma inflicted on the people targeted is all too real. Photos of a graduation, a family trip, a happy event posted innocently on Facebook are copied and used to illustrate degrading, often violent pornographic scenes. This is more than image theft, more than an invasion of privacy. Several women have described it as feeling like rape.

And women are not the only ones in danger. In Quebec, Judge Benoit Gagnon warned last year of “a new era of cybercrime” after sentencing pedophile Steven Larouche, who used deepfake technology to alter videos of abused children. “The use of deepfake technology by criminal hands is chilling,” the judge warned. “This type of software makes it possible to commit crimes that could involve virtually all the children in our communities.”

This pedophile may have been convicted, but no law specifically prohibits deepfakes in Canada. Despite the damage it causes, the technology has not yet made its way into the Criminal Code.

The high-profile Taylor Swift affair could help shake things up. The media are talking about it. Politicians are calling for laws. The White House has promised to legislate. Canada must follow the same path. Let us punish those who profit from deepfakes: the websites that host these fake images, the Internet users who consume them, the search engines and social networks that facilitate their sharing as if it were nothing serious, as if it didn’t destroy lives.

But above all, let us punish the faceless cowards who, well hidden behind their screens, give themselves the right to steal the faces of others to create these abject videos.


Source: www.lapresse.ca
