[French media] said the investigation was focused on a lack of moderators on Telegram, and that police considered that this situation allowed criminal activity to go on undeterred on the messaging app.
Europe defending its citizens against the tech giants, I’m sure.
There’s a lot of really, really dark shit on Telegram, that’s for sure, and it’s not like Signal, where they’re just a provider. They do have control over the content.
Removed by mod
I don’t recall CP/gore being readily available on those platforms; it gets reported/removed pretty quickly.
Removed by mod
Riiight
Edit: Is Telegram really an encrypted messaging app? (Spoiler: no.) Get off your high horse defending exiled Russian oligarchs in the name of encryption.
Removed by mod
In your head it confirms what you want, because you’re biased. You just don’t know what “readily available” means. Can’t help you there. Your entire article makes my point…
The content on Telegram is there almost indefinitely, and readily available. What you’re sharing shows almost instant bans, and also includes reports of links to suspected activity, not the content directly.
Removed by mod
Removed by mod
You’re not using the right search terms?
Readily available means you don’t need to search. Y’all are on another level searching for this shit lmao.
You’re probably just not tapped into any of the informal networks that are spreading CP on those platforms.
You’re young. It really was a thing. It never stayed up long, and they found ways to make removal essentially instantaneous, but there was a time when it was easy to find very unpleasant things on Facebook, whether you wanted to or not. Gore specifically was easy to run across at one point. As for CP, it was more offers to sell it.
They fixed it, and it isn’t like that now, but it was a problem in the first year or two.
Removed by mod
Haha, young? I wish. But go on making stuff up.
So now it’s not that it’s readily available, it’s that it was in the beginning. So everyone is allowed to let CP go in the first years of their platform? Is that what you’re going with? Eww.
The fuck are you smoking?
Damn, I hope there’s no upper limit to block lists
I guess he just wants links.
So you don’t see the difference between platforms that actually have measures in place to try and prevent it and platforms that intentionally don’t have such measures in place?
Man, Lemmings must be even dumber than Redditors or something
If they similarly go unmoderated, then action should be taken.
Safe-harbour-equivalent rules should apply, no? That is, the platform should not be held liable as long as it does not permit illegal activities on the platform, offers a proper reporting mechanism, and has documented workflows to investigate and act against reported activity.
It feels like a slippery slope to arrest people on suspicion (until proven otherwise) of a lack of moderation.
Telegram does moderate political content they don’t like.
Also, Telegram does have the means to control whatever they want.
And sometimes they also hide certain content from select regions.
Thus, if they make such decisions, then apparently CP and such are in their interest. Maybe it’s to collect information for blackmail by some special services (Durov went to France from Baku, Azerbaijan is friendly with Israel, and Mossad is even suspected of being connected to the Epstein operation), maybe it’s just for profit.
Do you have any links/sources about this? I’m not saying you’re wrong, I’m just interested
No, but they do sometimes delete channels for gore and such. I remember a few Azeri channels being banned for this during/after the 2020 war.
About having the means: well, with everything stored server-side unencrypted, it’s not a question.
About hiding channels per region on governmental request: I’ve heard about that on Lemmy.
Where did you get that the data on the servers are not encrypted?
You are, ahem, not decrypting it when getting history and not encrypting it when uploading files. That should be sufficient.
Anyway, look at TG Desktop sources. They are crap, but in general it’s clear what happens there. At least that’s how I remember it.
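A toy sketch of the difference that behaviour points at (this is not Telegram’s actual code or protocol, and every name in it is made up for illustration): if chats were end-to-end encrypted, the client would need a decrypt step with a key the server never holds, whereas with cloud-style storage the server can read what it hands back.

```python
# Toy model only: none of this is Telegram's real code or MTProto.
# The class and function names are invented for illustration.
import secrets


class Server:
    """Stores whatever bytes arrive after transport encryption is stripped."""

    def __init__(self):
        self.storage = {}

    def store(self, chat_id, blob):
        self.storage[chat_id] = blob

    def fetch(self, chat_id):
        return self.storage[chat_id]


def cloud_chat_flow(server, message):
    # Cloud-chat style: the client uploads data the server can read, and the
    # "history" it gets back needs no client-side decryption step.
    server.store("cloud", message)
    return server.fetch("cloud")          # already plaintext for the operator too


def end_to_end_flow(server, message):
    # E2E style: the key never leaves the clients, so the server stores only
    # ciphertext and the client must decrypt whatever it fetches.
    key = secrets.token_bytes(len(message))            # stays on the devices
    ciphertext = bytes(m ^ k for m, k in zip(message, key))
    server.store("secret", ciphertext)                 # unreadable to the server
    fetched = server.fetch("secret")
    return bytes(c ^ k for c, k in zip(fetched, key))  # client-side decrypt


if __name__ == "__main__":
    s = Server()
    print(cloud_chat_flow(s, b"meet at 6pm"))   # server saw the plaintext
    print(end_to_end_flow(s, b"meet at 6pm"))   # server only saw ciphertext
```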
Thank you, really appreciate it!
https://blog.cryptographyengineering.com/2024/08/25/telegram-is-not-really-an-encrypted-messaging-app/
Thing is, Telegram doesn’t do shit about it.
I don’t know how they manage their platform (I don’t use it, so it’s irrelevant to me personally). Was this proven anywhere in a court of law?