TikTok has to face a lawsuit from the mother of 10-year-old Nylah Anderson, who “unintentionally hanged herself” after watching videos of the so-called blackout challenge on her algorithmically curated For You Page (FYP). The “challenge,” according to the suit, encouraged viewers to “choke themselves until passing out.”

TikTok’s algorithmic recommendations on the FYP constitute the platform’s own speech, according to the Third Circuit Court of Appeals. That means it’s something TikTok can be held accountable for in court. Tech platforms are typically protected by a legal shield known as Section 230, which prevents them from being sued over their users’ posts, and a lower court had initially dismissed the suit on those grounds.

  • t3rmit3@beehaw.org · 55 points · 4 months ago

    I am generally very skeptical of lawsuits making social media and other Internet companies liable for their users’ content, because that’s usually a route to censoring whatever the government deems “harmful”, but I think this case actually makes perfect sense by attacking the algorithmic “curation” that they do. Imo social media should go back to being a purely chronological feed, curated by the users themselves, and cut corporate influence out of the equation.
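The distinction being drawn here, a user-curated chronological feed versus platform-driven engagement ranking, can be sketched in a few lines of Python (the post fields and function names are illustrative assumptions, not any platform's real API):

```python
# Toy post records; fields are hypothetical, for illustration only.
posts = [
    {"id": 1, "followed": True,  "ts": 100, "engagement": 2},
    {"id": 2, "followed": True,  "ts": 300, "engagement": 50},
    {"id": 3, "followed": False, "ts": 200, "engagement": 900},  # viral, unfollowed account
]

def chronological_feed(posts):
    """User-curated: only accounts you follow, newest first."""
    return sorted((p for p in posts if p["followed"]),
                  key=lambda p: p["ts"], reverse=True)

def engagement_feed(posts):
    """Platform-curated: everything, ranked by engagement signals."""
    return sorted(posts, key=lambda p: p["engagement"], reverse=True)

print([p["id"] for p in chronological_feed(posts)])  # [2, 1]
print([p["id"] for p in engagement_feed(posts)])     # [3, 2, 1]
```

The engagement-ranked version is what surfaces a viral post from an unfollowed account at the top of the feed; the chronological one never shows it at all.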

    • Chahk@beehaw.org · 33 points · 4 months ago

      social media should go back to being a purely chronological feed, curated by the users themselves, and cut corporate influence out of the equation.

      But then how would they make money if they can’t keep users doomscrolling forever to keep serving them ads? Won’t someone think of the shareholders?!

      • t3rmit3@beehaw.org · 14 points · 4 months ago (edited)

        Yes, but that is not the entirety or even majority of the problem with algorithmic feed curation by corporations. Reducing visibility of those dumb challenges is one of many benefits.

      • schnurrito@discuss.tchncs.de · 5 points · 4 months ago

        No it wouldn’t, but people would only see them if they were part of a preexisting community where such things are posted or they specifically looked for them.

        On the Internet, censorship happens through abundance: there is far more information than our limited time and attention span can take in, so going after recommendation algorithms will work.

  • some_guy@lemmy.sdf.org · 30 points · 4 months ago

    I’m gonna take the side that TikTok is potentially liable on the algorithm argument, but these parents also failed their children. Teaching your kids not to replicate unsafe internet content should be just as fundamental as teaching them to look both ways before crossing the road.

    • Kissaki@beehaw.org · 7 points · 4 months ago

      As a society, we’re responsible for all our children. The point of child protection laws, and population protection in general, is to support and protect them, because oftentimes parents are incapable of doing so, or the social dynamics involved are ones most parents can’t really understand, follow, or teach about.

      Yes, parents should teach and protect their children. But we should also create an environment where that is possible, and where children of less fortunate and of less able parents are not victims of their environment.

      I don’t think demanding and requiring big social platforms to moderate and regulate at least to the degree where children are not regularly exposed to life-threatening trends is a bad idea.

      That stuff can still be elsewhere if you want it. But social platforms have a social dynamic, more so than an informative one.

  • stardust@lemmy.ca · 26 points · 4 months ago (edited)

    I remember reading that China’s version of TikTok promotes more stuff like science to kids. Everyone else gets degeneracy like stealing Kias, licking grocery store items, and now blackout challenges.

    It would be interesting if how the algorithm is tuned for China versus the rest of the world were available. Makes me wonder if it’s intentional to try to make society a worse place with inventive uses of pushing certain trends on international versions of tiktok instead of filtering them out.

    Stuff like Facebook and Twitter is insane too, so it’s all self-sabotage at this point, but TikTok seems to have become the trendsetter.

    • viking@infosec.pub · 14 points · 4 months ago

      Yeah Douyin is pushing educational content and is very fast to censor harmful stuff. Still full of garbage and racism though, just the sanctioned kind against people the government doesn’t like.

    • LukeZaz@beehaw.org · 13 points · 4 months ago

      Makes me wonder if it’s intentional to try to make society a worse place with inventive uses of pushing certain trends on international versions of tiktok instead of filtering them out.

      Good lord, this is a massive reach. A much simpler explanation is that algorithmic garbage is profitable, and China’s government does not care about negative ramifications that occur outside China itself, so it does not regulate them.

      China’s run by a terrible government, not an MCU villain.

      • stardust@lemmy.ca · 4 points · 4 months ago

        Uhhh… I don’t think you got my point, which is why I also included Facebook and Twitter at the end as examples of domestic companies willingly allowing harmful societal trends.

        Money being a reason doesn’t absolve companies or provide a convenient out that lets them do whatever they want without consequence or criticism. I put them all in the camp of willingly selling out society for profit, and whether a country sees that as a win for them or not doesn’t change that.

        • Yoruio@lemmy.ca · 6 points · 4 months ago (edited)

          This is just how capitalism works: you have to appeal to your audience more than your competition does, and guess which kind of content teenagers want to watch more. Hell, even adults want fun content as opposed to educational content.

          They’re not willingly selling a worse society for profit; that’s just the only way to stay competitive.

          Any platform that pushes educational content in North America would simply not get any customers and would go bankrupt.

          edit: there are plenty of educational video platforms out there, like Khan Academy. Try to get your kids to scroll through that during their free time instead; I bet they won’t.

          • stardust@lemmy.ca · 5 points · 4 months ago

            I know how capitalism works… I was just sharing my thoughts on a company knowingly adjusting the algorithm in a positive direction for one demographic but a negative one for another, which shows clear awareness of the impact. Not sure why you are so worked up about TikTok getting criticized too. Whatever.

            • Yoruio@lemmy.ca · 2 points · 4 months ago

              In the US, publicly traded companies have a legal obligation to make as much money for their shareholders as legally possible (see Ford getting sued by shareholders after giving workers raises). It would be borderline illegal for a company to adjust its algorithm in a way that makes it less competitive.

              This needs to be regulated by government, not the companies themselves. That way, the companies would all be forced to change their algorithms at the same time, without impacting their competitiveness.

              So the government going after TikTok is a good first step, IF it does the same thing to Facebook/Instagram/YouTube/Snapchat. But I’m betting it won’t, because those companies spend an absurd amount of money on lobbying.

              • t3rmit3@beehaw.org · 3 points · 4 months ago (edited)

                This is a false narrative that stock traders push. The fiduciary duty is just one of several duties that executives have, and it does not outweigh their duty to the company’s health or to employees. Obviously shareholders will try to argue otherwise, or even sue to get their way, because they only care about their own interests, but they won’t prevail in most cases if there was a legitimate business interest and justification for the actions.

  • DeltaTangoLima@reddrefuge.com · 16 points · 4 months ago

    Shit like this is why I intend to keep my (currently) 9yo as far away from social media as I can, for as long as I can. This fucking terrifies me, as it should any parent.

    • DdCno1@beehaw.org · 16 points · 4 months ago

      Educating your kid about the many possible pitfalls of social media is even more important. They will eventually experience it, and are likely already exposed to it to some degree through their friends’ devices. Don’t make the mistake of turning social media into some kind of forbidden fruit; instead, provide them with the tools to deal with it responsibly.

      That said, I would still not allow this Chinese psy-ops tool on any device in my household. Other social media is already terrible enough, but TikTok seems to be engineered to cause nothing but damage.

      • BCsven@lemmy.ca · 7 points · 4 months ago

        I know some amazing parents who have super open communication and excellent teaching moments with their kids, and their kids still fell into the social media morass… because friends (and the teenage brain) are a heavy influence even with a safe, supportive home.

        • LukeZaz@beehaw.org · 2 points · 4 months ago

          This is why I think monitored access is a better idea than total withholding. Kids are going to end up on social media, either as they grow up and eventually become adults, or as a result of peers providing access and pressure. Best to let them on, but ensure they are safe, know how to be safe, and know why to be safe.

        • DdCno1@beehaw.org · 1 point · 4 months ago

          That’s a universal truth about parenting though and not limited to just social media.

          • BCsven@lemmy.ca · 1 point · 4 months ago

            Right, but I was commenting on educating your kids about the pitfalls of social media, like you said. My adult children are teachers, and they see social media destroying kids even with education about it… their brains can’t stop even if they know the consequences, especially because it is psychologically tailored to engage them more and more.

      • DeltaTangoLima@reddrefuge.com · 6 points · 4 months ago

        My own belief is that all social media is a cancer, and to be avoided entirely. I’m able to do that for myself, but I’m also realistic about the chances of keeping my kids away from it. So, I focus my energy on trying to equip them with the mental skills to neutralise the toxic aspects of social media.

        For my 9yo, that means teaching her to employ natural skepticism and critical thinking. I’m also trying to drum into her the understanding that social media is inherently untrustworthy and unreliable, and exists solely for the benefit of the corporations that run it.

        That said, I’ve blocked TikTok on my home network, much to the older kids’ chagrin. They have to use mobile data if they want to access that shit on their phones.

        • null@slrpnk.net · 7 points · 4 months ago

          My own belief is that all social media is a cancer, and to be avoided entirely. I’m able to do that for myself

          You just posted this to a social media site…

          • Scary le Poo@beehaw.org · 3 points · 4 months ago

            Come on dude, you know exactly what he meant. Social media is a broad category, but when someone mentions it in this context, it’s very clear what they mean.

            • null@slrpnk.net · 4 points · 4 months ago

              I disagree. I don’t think it’s clear at all what he considers dangerous about social media if he’s excluding things like Lemmy, Reddit, and other message boards.

  • coyotino [he/him]@beehaw.org · 9 points · 4 months ago

    Ah shit, my friends and I used to do this, pre-social media. I remember one time at middle school recess, going out to the farthest corner of the playground with my friends, and we all took turns holding our breath while someone else squeezed our chest. I remember blacking out, hearing the Pokémon theme in pitch darkness, and then waking up on the ground.

    I don’t think we did it more than once (at least I didn’t). But of course, the crucial difference was that I was with my dumbass friends, so at least there was someone to run for help if someone didn’t wake up.