TL;DR if you don’t wanna watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches all recommended shorts without skipping. They repeat this five times, each time changing their location to a random US city.

Below is the number of shorts after which alt-right content was recommended. Left-wing/liberal content was never recommended first.

  1. Houston: 88 shorts
  2. Chicago: 98 shorts
  3. Atlanta: 109 shorts
  4. NYC: 247 shorts
  5. San Francisco: never (Benaminute stopped after 250 shorts)

There was, however, a certain pattern to this. First, non-political shorts were recommended. After that, AI Jesus shorts started to be recommended (with either AI Jesus talking to you, or an AI narrator narrating verses from the Bible). After this, non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.) started to be recommended. Finally, explicitly alt-right shorts started to be recommended.

What I personally found both disturbing and kinda hilarious was the case of Chicago. The non-political content in the beginning was a lot of Gen Alpha brainrot. Benaminute said this seemed to be the norm for Chicago, as they had observed it in another, similar experiment (which dealt with long-form content instead of shorts). After some shorts, there came one where an AI Gru (the main character from Despicable Me) was telling you to vote for Trump. He went on about how voting for “Kamilia” would lose you “10000 rizz”, while voting for Trump would get you “1 million rizz”.

In the end, Benaminute and Miniminuteman propose a hypothesis to explain this phenomenon: alt-right content might incite stronger emotions and therefore rank higher in the algorithm. They argue the algorithm isn’t necessarily left-wing or right-wing, but that alt-right creators have better understood the methodology of how to capture and grow an audience.
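That hypothesis can be illustrated with a toy model: a ranker that scores purely on predicted engagement, with no notion of political leaning, will still surface whichever content provokes the strongest reactions. This is just a sketch I made up for illustration; the scoring function, titles, and numbers are all hypothetical, and this is not YouTube’s actual algorithm.

```python
# Toy engagement-maximizing recommender (illustrative only, not YouTube's
# real system). Each video has a predicted watch-through rate and an
# "emotional intensity" score; ranking is purely by predicted engagement.

def predicted_engagement(watch_rate: float, emotional_intensity: float) -> float:
    """Hypothetical score: emotionally charged videos get watched,
    rewatched, and commented on more, boosting total engagement."""
    return watch_rate * (1.0 + emotional_intensity)

videos = [
    # (title, predicted watch rate, emotional intensity 0..1)
    ("calm explainer", 0.60, 0.10),
    ("outrage clip", 0.50, 0.90),
    ("cute animals", 0.70, 0.20),
]

# Sort purely by engagement score; ideology never enters the model.
ranked = sorted(videos, key=lambda v: predicted_engagement(v[1], v[2]),
                reverse=True)
for title, w, e in ranked:
    print(title, round(predicted_engagement(w, e), 2))
# The outrage clip tops the list (0.95) despite having the lowest
# raw watch rate, purely because it provokes the strongest reaction.
```

The point of the sketch is the same one the video makes: nothing in the objective is “left” or “right”, yet content optimized for emotional intensity wins the ranking.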

  • socialmedia@lemmy.world · 11 hours ago

    I realized a while back that social media is trying to radicalize everyone, and it might not even be entirely the fault of the oligarchs who control it.

    The algorithm was written with one thing in mind: maximizing engagement time. The longer you stay on the page, the more ads you watch, the more money they make.

    This is pervasive, and even if educated adults tune it out, there are always children, who get MrBeast and thousands of others trying to trick them into “like, subscribe, and follow”.

    This is something governments should be looking into controlling. Propaganda created for the sole purpose of making money is still propaganda. At this point, I think sites that use an algorithm to personalize a content feed for each user are all compromised.

    • ayyy@sh.itjust.works · 59 minutes ago

      This discussion existed before computers. Before that it was TV and before that it was radio. The core problem is ads. They ruined the internet, TV, radio, the press. Probably stone tablets somehow. Fuck ads.

    • whoisearth@lemmy.ca · 9 hours ago

      The problem is education. It’s a fool’s game to try to control human nature, which tends toward the commodification of everything; you will always have commercials and propaganda.

      What is within our means is to strengthen education in how to think critically and understand your environment. This is where we have failed, and I’ll argue there are people actively destroying this for their own gain.

      Educated people are dangerous people.

      It’s not 1984. It’s Brave New World. Aldous Huxley was right.

      • Dark Arc@social.packetloss.gg · 8 hours ago

        I think we need to do better than just say “get an education.”

        There are educated people who still vote for Trump. Making it sound like liberalism is some result of going to college is part of why so many colleges are under attack.

        From their perspective, I get it: many Trump voters didn’t go to college, so when they hear that, they just assume brainwashing.

        We need to find a way to teach people to sort out information, to put their immediate emotions on pause and search for information, etc. Not just the kind of “education” where you regurgitate talking points from teachers, the TV, or the radio as if they’re matters of fact, and the whole education system is pretty tuned around regurgitation, even at the college level. A lot of the culture of exploration surrounding college (outside of the classroom) is likely more where the liberal viewpoints come from, and we’d be ill advised to assume the right can’t destroy that.

        • CircuitGuy@lemmy.world · 8 hours ago

          We need to find a way to teach people to sort out information, to put their immediate emotions on pause and search for information

          This entire comment and @[email protected]’s comments are so powerful.

          I think people have two modes of getting information: digging into a newspaper article to figure out what’s going on, and glancing at a lurid headline in the tabloid rack. Most people do both ends of that spectrum and a lot in between. Modern technology lends itself to serving tabloid-like content while we’re waiting in line for a minute. This is why TikTok is concerned about being removed from the app store, even though installing the app yourself is easy, easier than signing up for a newspaper delivery subscription ever was. TikTok is more like a lurid tabloid that most people would not go two steps out of their way to find, but might read while waiting in a slow line. I’m hopeful that people will learn to manage the new technology and not keep being influenced by tabloid entertainment.

    • trashboat@midwest.social · 9 hours ago

      sites feeding content that use an algorithm to personalize feeds for each user are all compromised.

      Not arguing against this at all because you’re completely correct, but this feels like a key example of governments being too slow (and perhaps too out of touch?) to properly regulate tech. People clearly like having an algorithm, but algorithms in their current form are a great excuse for tech companies to throw their hands up in the air and claim no foul play because of how opaque they are. “It only shows you what you tell it you want to see!” is easy for them to say, but until consumers are given the right to know exactly how each one works, almost like nutrition facts on food packaging, we’ll never know whether they’re telling the truth. The ability of a tech company to have near unlimited control and no oversight over what millions of people look at day after day is clearly a major factor in what got us here in the first place.

      Not that there’s any hope for new consumer protections under this US administration, but it’s just something I had been thinking about for a while.