A beheading video was on YouTube for hours, raising questions about why it wasn’t removed sooner.

A graphic video of a Pennsylvania man accused of beheading his father that circulated for hours on YouTube has once again highlighted gaps in social media companies’ ability to prevent horrific posts from spreading across the web.

Police said Wednesday they charged Justin Mohn, 32, with first-degree murder and abuse of a corpse after he beheaded his father, Michael, in their Bucks County home and posted a 14-minute YouTube video of it that anyone, anywhere could watch.

News of the incident, which drew comparisons to the beheading videos posted online by Islamic State militants at the height of their prominence nearly a decade ago, came as the CEOs of Meta, TikTok and other social media companies testified before federal lawmakers frustrated by what they see as a lack of progress on child safety online. YouTube, which is owned by Google, did not attend the hearing despite its status as one of the most popular platforms among teenagers.

The disturbing video from Pennsylvania follows other horrific clips that have circulated on social media in recent years, including mass shootings livestreamed from Louisville, Kentucky; Memphis, Tennessee; and Buffalo, New York, as well as killings filmed abroad in Christchurch, New Zealand, and the German city of Halle.

Middletown Township Police Capt. Pete Feeney said the video in Pennsylvania was posted around 10 p.m. Tuesday and was online for about five hours, a delay that raises questions about social media platforms’ moderation practices, which may be needed more than ever amid the wars in Gaza and Ukraine and an extremely contentious presidential election in the United States.

“It’s another example of the blatant failure of these companies to protect us,” said Alix Fraser, director of the Council for Responsible Social Media at the nonprofit Issue One. “We can’t trust them to grade their own homework.”

A YouTube spokesperson said the company removed the video, terminated Mohn’s channel and was tracking and removing any re-uploads that might appear. The video-sharing site says it uses a combination of artificial intelligence and human moderators to monitor its platform, but it did not respond to questions about how the video was caught or why it wasn’t removed sooner.

Major social media companies moderate content with the help of powerful automated systems, which can often detect banned content before a human can. But that technology can sometimes fall short when a video is violent and graphic in a new or unusual way, as it was in this case, said Brian Fishman, co-founder of trust and safety technology startup Cinder.

That’s when human moderators are “really critical,” he said. “AI is getting better, but it’s not there yet.”

About 40 minutes after midnight Eastern time on Wednesday, the Global Internet Forum to Counter Terrorism, a group set up by technology companies to prevent such videos from spreading online, said it alerted its members about the video. GIFCT allows the platform with the original footage to submit a “hash” (a digital fingerprint corresponding to a video) and notifies nearly two dozen other member companies so they can restrict it on their platforms.
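To illustrate roughly how that kind of hash sharing works, here is a minimal sketch in Python. It uses an exact-match SHA-256 digest and a hypothetical one-digest-per-line blocklist file for simplicity; in practice, member platforms match videos with perceptual hashes so that re-encoded or lightly edited copies are still caught. None of the function names or file formats below come from GIFCT itself.

```python
import hashlib
from pathlib import Path

def fingerprint(video_path: str) -> str:
    """Return a hex digest of the video file's raw bytes.

    SHA-256 only matches byte-identical copies; real systems use
    perceptual hashes that survive re-encoding and cropping.
    """
    digest = hashlib.sha256()
    with open(video_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB chunks
            digest.update(chunk)
    return digest.hexdigest()

def load_shared_hashes(path: str) -> set[str]:
    """Load shared hashes, one hex digest per line (hypothetical format)."""
    return {line.strip() for line in Path(path).read_text().splitlines() if line.strip()}

def should_block(video_path: str, shared_hashes: set[str]) -> bool:
    """Flag an upload whose fingerprint matches a shared hash."""
    return fingerprint(video_path) in shared_hashes
```

Each member platform would keep its own copy of the shared list and check new uploads against it; the exact-match digest is the main simplification here, since edited copies would slip past it, which is why production systems favor perceptual hashing.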

But by Wednesday morning, the video had already spread to X, where a graphic clip of Mohn holding his father’s head remained on the platform for at least seven hours and received 20,000 views. The company, formerly known as Twitter, did not respond to a request for comment.

Radicalization experts say social media and the Internet have lowered the barrier to entry for people to explore extremist groups and ideologies, allowing anyone who may be predisposed to violence to find a community that reinforces those ideas.

In the video, which was posted after the killing, Mohn described his father as a 20-year federal employee, espoused a variety of conspiracy theories and railed against the government.

Most social platforms have policies for removing violent and extremist content. But they can’t catch everything, and the emergence of many newer, less closely moderated sites has allowed hateful ideas to fester unchecked, said Michael Jensen, senior researcher at the University of Maryland-based Consortium for the Study of Terrorism and Responses to Terrorism, or START.

Despite the hurdles, social media companies need to be more vigilant about regulating violent content, said Jacob Ware, a researcher at the Council on Foreign Relations.

“The reality is that social media has become the front line of extremism and terrorism,” Ware said. “That will require more serious and committed efforts to fight back.”

Nora Benavidez, senior adviser at the media advocacy group Free Press, said that among the technology reforms she would like to see are more transparency about which types of employees are affected by layoffs and more investment in trust and safety workers.

Google, which owns YouTube, this month laid off hundreds of employees working on its hardware, voice assistance and engineering teams. Last year, the company said it cut 12,000 workers “across Alphabet, product areas, functions, levels and regions,” without offering additional details.


——


AP writers Beatrice Dupuy and Mike Balsamo in New York, and Mike Catalini in Levittown, Pennsylvania, contributed to this report.
