While the media posted by the influencer has been removed, numerous text interactions with the deleted posts from his followers are still on the platform. Some of those replies mention that a child depicted in the photos was as young as one and a half years old.
To make matters worse, the image appears to have been on the platform for several days before being removed. Lucre even described the image in detail in a separate tweet, noting that it had been taken from a video. The video in question involved the abuse of three children, one of whom was reportedly strangled to death after the filming.
The CSAM was left up for FOUR DAYS (July 22–26) before he was even suspended. Then they let him “delete” it…and reinstated him. People commented during those 4 days DESCRIBING THE IMAGES.
What the FUCK. Please tell me this is worth a visit from the FBI, getting removed from the App Store, some massive GDPR violation, fucking something. How is this story not bigger news?
To put this in perspective: an account with 500,000 followers posted 4 graphic CSAM images, and they were left up on Twitter for 4 days. The person who posted them was suspended for less than a day.
Content warning: I deliberately avoid providing much more detail than “it was clearly CSAM” but I do mention the overall tweet contents and pretext.
I remember this from when it happened and unfortunately did see the text portion and thumbnail from the original tweet.
He did it under the pretext of reporting on the arrest of a person involved in the video and in large-scale CSAM production. It started in the standard news-report style, listing the name, age, and arrest details of someone taken into custody. Initially it looked like the usual alt-right tweet about “look at how paedophilia is rampant and the world is sinful!”.
The guy describes himself as “chief trumpster”, a “breaker of narratives”, and a journalist. He claimed the details of the CSAM were provided by the Dutch police. He then described the title and detailed events of a CSAM video in the tweet. Unfortunately for me, those details were below the tweet fold, so I had no idea it was going there until I expanded it.
The tweet’s image attachment, or link-unfurl thumbnail, was a frame from the video itself. It was an otherwise-SFW image of the adult abuser who was being discussed. Unfortunately I didn’t realise what the thumbnail was until after I had expanded the tweet text. I actually thought it was an OpenGraph error at first.
Even in the context of “reporting shocking content”, the tweet was way over the line and went from 0 to 100 in a few words. I did not need the details of the CSAM; nobody except the police and courts does. The video title alone was over the line.
Musk phrasing this as another “I was told” decision is just him knowingly deflecting responsibility.
Thank you for the additional context. I’m sorry you had to be exposed to that.
Thanks, it was partly my fault for taking a curiosity tour of Twitter to see whether there was a noticeable right-wing shift from a few months earlier.
Hopefully I can at least spare someone from having to go looking for the context themselves and finding the full tweet, because it was awful even as text.
I assumed there would be “they took it out of context!” apologia when it was inevitably reported on, but the actual context didn’t improve anything or absolve anyone of responsibility.
My heart goes out to the human moderators at Twitter who had to see more of it and didn’t have the choice to bail before learning more. And obviously also to the victims of one of the most heinous acts I’ve ever heard about.
Thanks for this. With this explained, the whole thing makes much more sense. Not in a good way, but I now understand. Edit: fixed typo
For what it’s worth, I had no idea whether it was the absolute worst journalistic judgement I have ever seen, a way for him to find more CSAM, or some bizarre combination. That is something for the FBI to find out. But I do know the decision to unban him is beyond wild, even for someone trying to bankrupt a social media company. Text almost never makes me physically recoil.
Yeah, even if we give them the benefit of the doubt that it was a bad attempt at journalism, there’s no reason to defend them. Also bizarre that Elon would even get involved in it.
Do I want to know what CSAM stands for?
Child sexual abuse material, unfortunately.
Yep. Didn’t want to know that.
Unfortunately, it’s a term everyone should know. It replaces the label “child porn”, because while it’s universally known as horrible, it’s not “porn.” It’s evidence of child sexual abuse. Hence “child sexual abuse material.”
Right. Porn implies acting and consent.
deleted by creator
I didn’t use the term porn at all…
deleted by creator
You spent half your response “correcting” me for using the term porn, which I did not use. Respond to the OP for that. That’s all I’m saying.
Is it pronounced KaZAM!?
Because it was one image and it wasn’t showing anything explicit. That’s how I understood it from reading more about this story.
deleted by creator
Dude, he posted it to highlight how evil it is. Maybe it was a stupid thing to do, but that’s the extent of it.
deleted by creator
You’re evil