YouTube Video Recommendations Lead to More Extremist Content for Right-Leaning Users, Researchers Suggest

New research shows that a person’s ideological leaning might affect which videos YouTube’s algorithms recommend to them. For right-leaning users, video recommendations are more likely to come from channels that share politically extreme content, conspiracy theories, and other problematic material.

  • vexikron · 1 year ago

    They optimize recommendations, to a large degree, to induce anger and rage, because those emotions are the most effective drivers of platform engagement.

    Facebook does the same.
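
    To illustrate the incentive being described in rough terms: if a recommender ranks candidates purely on predicted engagement signals, provocative content that reliably draws clicks and comments floats to the top. The sketch below is a hypothetical toy, not YouTube’s actual system; every name, weight, and probability is invented for illustration.

    ```python
    # Hypothetical sketch of engagement-driven ranking, NOT YouTube's real system.
    # Each candidate carries model-predicted probabilities of engagement actions;
    # all field names and weights here are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Candidate:
        video_id: str
        p_click: float       # predicted click-through probability
        p_long_watch: float  # predicted probability of a long watch session
        p_comment: float     # predicted probability the viewer comments

    # If the objective is raw engagement, emotionally provocative videos that
    # reliably draw clicks and comments score highly regardless of quality.
    def engagement_score(c: Candidate) -> float:
        return 1.0 * c.p_click + 2.0 * c.p_long_watch + 3.0 * c.p_comment

    def rank(candidates: list[Candidate]) -> list[Candidate]:
        return sorted(candidates, key=engagement_score, reverse=True)

    if __name__ == "__main__":
        pool = [
            Candidate("calm-explainer", p_click=0.10, p_long_watch=0.30, p_comment=0.01),
            Candidate("outrage-bait",   p_click=0.35, p_long_watch=0.20, p_comment=0.15),
        ]
        for c in rank(pool):
            print(c.video_id, round(engagement_score(c), 2))  # outrage-bait ranks first
    ```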

    • PoliticalAgitator@lemm.ee · 1 year ago

      We also have no idea what measures, if any, they take to stop the system from being manipulated.

      The far-right could be working to ensure they’re recommended as often as possible, and if that activity just shows up as “engagement” or “impressions” in YouTube’s stats, the company is unlikely to fight it with much enthusiasm.
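
      To see why manipulated engagement could pass unnoticed: if engagement probabilities are estimated from raw interaction counts, coordinated activity inflates the estimates and looks identical to organic interest in aggregate stats. The sketch below is hypothetical; the counts are invented for illustration.

      ```python
      # Hypothetical sketch: engagement estimated from raw counts cannot
      # distinguish coordinated manipulation from organic interest.
      def estimated_p_comment(comments: int, impressions: int) -> float:
          # Naive empirical rate; invented for illustration, not a real system.
          return comments / impressions if impressions else 0.0

      organic = estimated_p_comment(comments=50, impressions=10_000)
      # A coordinated group adds 450 comments across 500 extra impressions.
      brigaded = estimated_p_comment(comments=50 + 450, impressions=10_000 + 500)
      print(round(organic, 4), round(brigaded, 4))  # 0.005 vs ~0.0476: ~9x "more engaging"
      ```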