• ArbiterXero@lemmy.world

      Could be another "AI is just cheap labor from India" scam.

      But more importantly, at what cost? How much processing power and engineering time does it take… just to try to detect bike thefts?

      Wouldn’t it be cheaper just to buy them a new bike?

    • Possibly linux (OP)

      I doubt it. Facial recognition is here, and it is scary as all get out.

  • kbal@fedia.io

    Are some people likely to object [to being constantly watched by computers that analyze their behaviour and report any detected anomalies to the cops]?

    Typically, no, but there is no accounting for some people.

    A fitting epitaph for the human race?

  • Destide@feddit.uk

    I have the results


    ._. ._. ._. ._. /| ._. ._. ._. /| ._. ._. ._. ._. /| ._. ._. ._. /| ._. ._. ._. ._. /| ._. ._. ._. /| ._. ._. ._. ._. /| ._. ._. ._. /| ._. ._. ._. ._. /| ._. ._. ._. /| ._. ._. ._. ._. /| ._. ._. ._. /| ._.


  • YaDownWitCPP@lemmy.world

    Isn’t it the government’s job to put a stop to crime and ensure a happy community? Nothing to worry about if you’re an upstanding citizen.

    • Possibly linux (OP)

      The problem with that is twofold. The first issue is that surveillance equals power: governments have power over people when they can arrest you for just about anything. It might start out as just a safety measure, but it ends with people getting arrested for exercising basic rights.

      The other issue is that these systems end up racist, sexist, and discriminatory, because the training data is always going to be flawed in some way.