A video of a Hamas gunman firing his assault rifle at a car full of Israeli civilians has been viewed more than one million times on X, the platform formerly known as Twitter, since it was uploaded Sunday.
A photograph of dead Israeli civilians, strewn across the side of a road in an Israeli kibbutz near the Gaza Strip, has been shared more than 20,000 times on X.
And an audio recording of a young Israeli woman’s desperate cries for help as she was being kidnapped from her home has been shared nearly 50,000 times on the platform.
Since Hamas launched a deadly cross-border attack into Israel over the weekend, violent videos and graphic images have flooded social media. Many of the posts have been seeded by Hamas to terrorize civilians and take advantage of the lack of content moderation on some social media sites — particularly X and Telegram — according to a Hamas official and social media experts interviewed by The New York Times.
The strategy mirrors efforts by extremist groups like the Islamic State and Al Qaeda, which took advantage of the lack of guardrails at social media companies years ago to upload graphic footage to the internet. Social media companies reacted then by removing and banning accounts tied to those groups.
The issue has sprouted anew in the past week, particularly on X, where safety and content moderation teams have been largely disbanded under Elon Musk’s ownership, and on Telegram, the messaging platform, which does virtually no content moderation.
Israeli groups that monitor social media for hate speech and disinformation said graphic imagery often starts on Telegram. It then moves to X before finding its way to other social media sites.
“Twitter, or X as they are now called, has become a war zone with no ethics,” said Achiya Schatz, director of FakeReporter, an Israeli organization that monitors disinformation and hate speech. “In the information war being fought, it is now a place where you just go and do whatever you want.”
In the past, his group reported fake accounts or violent content to X, which removed the post if it violated its rules, Mr. Schatz said. Now, he added, there is no one at the company to talk to.
“Everyone we once worked with is gone. There is no one to reach at that company,” he said. “The information war on Twitter is gone, lost. There is nothing left to fight there.”
He added that platforms like Facebook, YouTube and TikTok had been responsive about removing graphic images and misinformation, although the companies were being inundated with requests.
Telegram and X did not respond to a request for comment. Over the weekend, X’s safety team posted an update to its policies, stating that it was removing Hamas-affiliated accounts and had taken action on tens of thousands of posts.
Nora Benavidez, senior counsel at Free Press, a media advocacy group, said the state of discourse on X during the conflict was “the terrible but natural consequence of 11 months of misguided Musk decisions.”
She cited the rollback of policies against toxic content, cuts in staff and the priority given to subscription accounts, which “now allows, even begs for, controversial and incendiary content to thrive.”
Some of those subscription accounts have also been posting fake or doctored images, said Alex Goldenberg, the lead intelligence analyst at the Network Contagion Research Institute at Rutgers University.
Researchers have identified images from video games that were posted on TikTok and passed off as actual footage. Old images from the civil war in Syria and a propaganda video from Hezbollah, the Lebanese Shiite militant organization, have been circulated as new.
“It’s a problem across social media,” Mr. Goldenberg said.
Mr. Schatz said his organization on Sunday identified a video of children in cages that had been viewed millions of times on X, amid claims that the children were Israeli hostages of Hamas. While the origins of the video aren’t clear, Mr. Schatz found versions posted weeks ago on TikTok, and other researchers have discovered versions of the video on YouTube and Instagram claiming it was from Afghanistan, Syria and Yemen.
“We reported that the video was fake, and definitely not a current video from Gaza, but nobody at X responded,” Mr. Schatz said. “The real videos are bad enough without people sharing these fake ones.”
The effect of the videos has been stark. Some Israelis have begun avoiding social media for fear of seeing missing loved ones featured in graphic footage.
Dr. Sol Adelsky, an American-born child psychiatrist who has been living in Israel since 2018, said many parents had been advised to keep their children off social media apps.
“We are really trying to limit how much stuff they are seeing,” he said. “Schools are also giving guidance for kids to be off certain social media apps.” Some schools in the United States have also encouraged parents to tell their children to delete the apps.
Dr. Adelsky added that even with the guidance, a lot of unverified claims and frightening messages had made their way to people through messaging apps like WhatsApp, which are popular among Israelis.
The fear and confusion are part of the strategy, according to a Hamas official who would speak only on the condition of anonymity.
The official, who used to be responsible for creating social media content for Hamas on Twitter and other platforms, said the group wanted to establish its own narratives and seek support from allies through social media.
When ISIS published videos of beheadings on social media, he said, the footage served as a rallying cry for extremists to join its cause, and as psychological warfare on its targets. While he stopped short of saying Hamas was following a playbook laid out by ISIS, he called its social media strategy successful.