- cross-posted to:
- [email protected]
X claims it is erasing ‘illegal’ Hamas content after EU ultimatum::The platform follows X and Meta in being urged to step up efforts to stop the spread of disinformation.
Why don’t any of the articles mention what the “illegal content” consists of?
Maybe because it varies by country? I’d think specific, violent threats would be illegal basically everywhere but hate speech and antisemitism are illegal in much of Europe. In America, they’d have to take down any Hamas videos that have a copyrighted song in the background.
It’s probably just misinformation/disinformation
Been seeing a lot of concerning things on the Gaza Now TG channel, like referring to the concertgoers and civilian girls as Israeli military.
I’m also seeing people refer to everyone in Gaza as a part of Hamas.
And who deems what is misinformation/disinformation?
This is a fair question. Everybody has an agenda.
Anything that is pro-Palestine or anti-Israel.
For France, and maybe the EU too, it is “apology for terrorism”.
Who watches the watchers?
Removed by mod
*Tiktok
This is the best summary I could come up with:
It urged CEO Shou Zi Chew in a letter to “urgently step up” efforts, and spell out “within the next 24 hours” how it is complying with European law.
The firm’s chief executive Linda Yaccarino responded by telling the bloc it had removed or flagged “tens of thousands of pieces of content” since Hamas attacked Israel.
The EU declined to comment on whether it had received a response from Meta, but a European Commission spokesperson said “contacts are ongoing” with the company’s compliance teams.
A Meta spokesperson told the BBC: “After the terrorist attacks by Hamas on Israel on Saturday, we quickly established a special operations centre staffed with experts, including fluent Hebrew and Arabic speakers, to closely monitor and respond to this rapidly evolving situation.”
“Our teams are working around the clock to keep our platforms safe, take action on content that violates our policies or local law, and coordinate with third-party fact checkers in the region to limit the spread of misinformation.”
The Digital Services Act (DSA) requires so-called “very large online platforms” to proactively remove “illegal content”, and show they have taken measures to do so if requested.
The original article contains 606 words, the summary contains 191 words. Saved 68%. I’m a bot and I’m open source!