• NounsAndWords@lemmy.world · 1 year ago

          Yes, yes, but this isn’t harmful speech military AI, this is simply commercial AI sold at scale. Clearly different, and if not, our lawyers will argue that it is until after we’ve made all our money.

        • bobman@unilem.org · 1 year ago

          Hm. So it’s okay for the US government to spy on its citizens but no one else?

          Not sure why people like you ever bother talking about privacy. You clearly don’t care about it or understand it.

            • bobman@unilem.org · 1 year ago

              No it’s not, but at least the shit doesn’t get sold to whoever pays the most

              Do you have evidence of this happening in China?

            • bobman@unilem.org · 1 year ago

              And I care very much about privacy.

              Then you should know it’s not a ‘lesser evil’ scenario. Either you have privacy, or you don’t.

              Choosing who you give up your privacy for is not being private.

  • AutoTL;DR@lemmings.world · 1 year ago

    This is the best summary I could come up with:


    IBM has returned to the facial recognition market — just three years after announcing it was abandoning work on the technology due to concerns about racial profiling, mass surveillance, and other human rights violations.

    In June 2020, as Black Lives Matter protests swept the US after George Floyd’s murder, IBM chief executive Arvind Krishna wrote a letter to Congress announcing that the company would no longer offer “general purpose” facial recognition technology.

    “IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and Principles of Trust and Transparency.” Later that year, the company redoubled its commitment, calling for US export controls to address concerns that facial recognition could be used overseas “to suppress dissent, to infringe on the rights of minorities, or to erase basic expectations of privacy.”

    Kojo Kyerewaa of Black Lives Matter UK said: “IBM has shown itself willing to step over the body and memory of George Floyd to chase a Home Office contract.”

    In 2019, an independent report on the London Metropolitan Police Service’s use of live facial recognition found there was no “explicit legal basis” for the force’s use of the technology and raised concerns that it may have breached human rights law.

    In August of the following year, the UK’s Court of Appeal ruled that South Wales Police’s use of facial recognition technology breached privacy rights and broke equality laws.


    The original article contains 860 words, the summary contains 257 words. Saved 70%. I’m a bot and I’m open source!

  • FooBarrington@lemmy.world · 1 year ago

    If the Holocaust happened today, I don’t doubt for a second that IBM would throw every consultant they have at the Nazis to increase efficiency.

  • kool_newt@lemm.ee · 1 year ago

    I’m an engineer who found myself working in this space, and I quit my job because I could not reconcile it with my conscience. We should be holding the engineers (as well as the corporate leaders and all high-ranking employees) building these dark technologies responsible for the massive harm they are doing, and planning to do, to humanity.

    • CeeBee@lemmy.world · 1 year ago

      building these dark technologies responsible

      Before you even consider doing that, you should be advocating holding responsible the creators of an already existing technology that has already caused immeasurable harm: cellphones. The tracking and data logging done through cellphones (precise location and time, websites visited, apps used, text conversations, phone conversations, etc.) are more far-reaching and intrusive than anything else we could devise.

      With FR systems, you have to go to the place where they’re set up. A cellphone is listening and watching at all times, including when you’re at home and when you’re sleeping.

      I lose hope every time someone advocates regulating or getting rid of the “dangerous and evil” FR systems, as though we haven’t already completely and UTTERLY lost the fight for privacy because the tracking device lets you watch TikTok videos and cat memes.

  • Rentlar@lemmy.ca · 1 year ago

    Government surveillance is totally different from “general purpose” surveillance, I guess…