• The University of Waterloo is expected to remove smart vending machines from its campus.
  • A student discovered an error code that suggested the machines used facial-recognition technology.
  • Vending Services said the technology didn’t take or store customers’ photos.
  • Admiral Patrick@dubvee.org · +36/-1 · 9 months ago

    Is there any rational reason why vending machines need to be that complicated?

    Card readers / contactless payment were easy enough to “bolt on” to existing models (they had them when I was in college back in the stone age). So that’s not a sufficient reason.

    There are some “new” features I find useful, such as detecting when an item fails to vend. But those are pretty much just IR “tripwires” that detect the falling product; if it doesn’t trip, then you get refunded or can make another selection.
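
    That tripwire logic is simple enough to sketch. A minimal version, with hypothetical names, assuming the controller just polls the beam sensor for a short window after the vend motor runs:

```python
import time

VEND_TIMEOUT_S = 3.0  # how long to wait for the product to fall

def vend_detected(beam_broken, timeout_s=VEND_TIMEOUT_S, clock=time.monotonic):
    # Poll the IR beam for up to timeout_s after the motor runs; a
    # falling product momentarily breaks the beam (beam_broken() -> True).
    deadline = clock() + timeout_s
    while clock() < deadline:
        if beam_broken():
            return True   # product fell: complete the sale
    return False          # nothing fell: refund or offer another selection
```

    None of that needs a camera, a network connection, or anything smarter than a microcontroller.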

    I just cannot fathom why vending machines need any of this extra crap.

    Feel free to enlighten me if you’re in the know.

    • girsaysdoom@sh.itjust.works · +16/-1 · 9 months ago

      It might be advertisements. Some newer gas stations and grocery stores have ads playing on the doors of the refrigerated section, and when a door detects someone in front of it, it goes clear or shows a picture of what’s behind it.

      It’s completely ridiculous but it’s where things are going now.

      • HobbitFoot @thelemmy.club · +2/-1 · 9 months ago

        Yes. I saw a video about Japanese vending machines and they talked about these cameras being used to help with targeted advertising, either by knowing the person or identifying their demographics and selling to that.

    • fuckwit_mcbumcrumble@lemmy.world · +7/-2 · 9 months ago

      At this point it costs more money to make a “dumb” vending machine. The cost of an SBC is nothing, and it has more than enough horsepower to process transactions. All that extra horsepower plus a 10-cent camera could be used to generate more money with facial recognition, so naturally they’re gonna spend the extra 10 cents plus the cost of drilling a hole to do it. It’s practically free.

    • betterdeadthanreddit@lemmy.world · +4/-1 · 9 months ago

      MathNews reported that Invenda Group’s FAQ list said that “only the final data, namely presence of a person, estimated age and estimated gender, is collected without any association with an individual.”

      It makes sense that the machine owner would seek this information, either for their own use or to sell to others in related fields. Combined with records of product sales (vs. a very literal form of window-shopping), it could help identify the areas most likely to bring consistent returns on the investment of placing, stocking, and servicing machines, based on the age/gender statistics of nearby population centers.

      Doesn’t mean I’d want to be part of their dataset or would be comfortable allowing their installation in a facility where that decision was up to me though.
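
      As a sketch of the kind of analysis that FAQ language enables (the locations and numbers here are invented for illustration), the “final data” is just anonymous tallies per machine location:

```python
from collections import Counter, defaultdict

# Each observation is (location, estimated_age_band, estimated_gender):
# exactly the "final data" described, with no identity attached.
observations = [
    ("library", "18-24", "F"),
    ("library", "18-24", "M"),
    ("gym",     "25-34", "M"),
    ("library", "18-24", "F"),
]

def demographics_by_location(obs):
    # Tally (age_band, gender) counts per machine location.
    stats = defaultdict(Counter)
    for location, age, gender in obs:
        stats[location][(age, gender)] += 1
    return stats

stats = demographics_by_location(observations)
# Foot-traffic totals alone already rank locations for machine placement.
busiest = max(stats, key=lambda loc: sum(stats[loc].values()))
```

      Harmless-looking on its own, which is exactly why it sells.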

    • SchmidtGenetics@lemmy.world · +3/-1 · 9 months ago

      Accounts tied to a school ID card? That way you can’t steal someone’s card and use it; the machine just polls a database and matches your picture against your ID image or something.

      About the only use case I can think of for a school.

      • otacon239@feddit.de · +3 · 9 months ago

        This definitely couldn’t backfire. Can’t think of a single reason in recent memory why someone’s face wouldn’t be visible… 🤔

      • betterdeadthanreddit@lemmy.world · +2/-2 · 9 months ago

        At least in the case covered by the article, they don’t appear to be doing that:

        … the director of technology services for Adaria Vending Services[1] told MathNews that “an individual person cannot be identified using the technology in the machines.”

        Still possible if they’re being less than perfectly honest in that statement, if they invest more into the technology, or with another machine/company somewhere else.

        [1] … the smart vending machines… [are] provided by Adaria Vending Services and manufactured by Invenda Group.

        • Deceptichum@kbin.social · +3 · edited · 9 months ago

          That statement sounds weasely as fuck.

          That the technology in that specific machine cannot identify a user does not mean the machine doesn’t store or transmit the footage to be processed on another machine or system that can.

          • betterdeadthanreddit@lemmy.world · +1/-1 · 9 months ago

            “It does not engage in storage, communication, or transmission of any imagery or personally identifiable information,”…

            The linked article includes this statement from Invenda, the manufacturer of the machines. Still have to rely on their truthfulness but they do address that specific point.

        • SchmidtGenetics@lemmy.world · +2 · 9 months ago

          Oh, it definitely isn’t; it’s just an example of how it can be used. I’ve seen it used in plants to dispense safety gear so people don’t go through a dozen pairs of gloves a week, even though the gear is free and provided.

    • LifeInMultipleChoice@lemmy.world · +1/-2 · edited · 9 months ago

      I can tell you why, as no one here seems to have used these. These are rooms of food items with no workers: say ten fridges, plus other items like apples, bananas, coffee machines, and microwaves sitting out. You walk up, grab what you want, use what you want, and then check yourself out. It would be wrong to sell any user data, but they did this because they were losing money trying to staff lunch rooms. It costs less to ditch warm/cooled food and instead balance the food loss against customers who know they are being filmed and would likely be expelled from the college for stealing. An example would be Gulf Coast State College.

      Instead of thinking vending machine, think food court with self checkout. (Saw this both at a FL college and at a private company in TN.)

  • I_Has_A_Hat@lemmy.world · +31/-1 · 9 months ago

    I hope the company, Vending Services, loses more from the PR blowback than whatever it gained by incorporating this BS into its machines. One university is a start; where else does this company operate?

  • DevCat@lemmy.world · +16/-1 · 9 months ago

    “What’s most important to understand is that the machines do not take or store any photos or images, and an individual person cannot be identified using the technology in the machines,” the statement said. “The technology acts as a motion sensor that detects faces, so the machine knows when to activate the purchasing interface — never taking or storing images of customers.”

    The statement said that the machines are “fully GDPR compliant,” referring to the European Union’s General Data Protection Regulation. The regulation is part of the EU’s privacy legislation that determines how corporations can collect citizens’ data.
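
    Taken at face value, the claimed behavior is nothing more than a presence-triggered wake-up, something like this sketch (the detector is stubbed out as a yes/no input; all names here are invented):

```python
import time

IDLE_TIMEOUT_S = 30.0  # screen sleeps this long after the last face seen

class PurchasingInterface:
    def __init__(self, clock=time.monotonic):
        self.clock = clock
        self.awake = False
        self.last_seen = float("-inf")

    def on_frame(self, face_present):
        # face_present is the detector's only output, a boolean;
        # per the statement, no image is taken or stored.
        now = self.clock()
        if face_present:
            self.last_seen = now
            self.awake = True     # someone's there: wake the purchasing UI
        elif now - self.last_seen > IDLE_TIMEOUT_S:
            self.awake = False    # nobody around: back to sleep
        return self.awake
```

    Whether the detector ever outputs more than that boolean is, of course, the whole question.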

    • quicksand@lemm.ee · +5 · 9 months ago

      So facial recognition in this case means that it can recognize that a face exists? No particular details, just a face? That’s a lot less egregious than I assumed from the headline. With all the AI stuff going on these days, I assumed it was some kind of data mining operation.

      • WindyRebel@lemmy.world · +2 · 9 months ago

        But then, why? Is there a problem with birds and dogs making purchases at these machines that they need to identify a face?

      • brianorca@lemmy.world · +1/-1 · 9 months ago

        It does also estimate age and gender, so there’s some potential for data mining. But not much.

    • pdxfed@lemmy.world · +5 · 9 months ago

      You gotta read through the carefully worded, line-toeing bullshit though; fingerprint readers on time clocks no longer store fingerprint images, but they can sure as fuck identify your finger. If these operate similarly, the machine likely stores your face as a unique value and can track unique purchases. Importantly, that value can be sent to other machines to identify customers anywhere, and of course, if the same technology is used in other machines, well shoot, any purchase you make at a vending machine is tracked to your face.

      The statement says purchases cannot be tracked to an individual, but that could be wordsmithing that implies anonymity while in actuality you’re tracked by a value. Also, if it were true, why was this done so surreptitiously? They will claim in court that an individual can’t be identified, but once the technology is commonplace it would be only too easy for corps to join a data-broker orgy and find out who’s who.

      I don’t know what’s in the GDPR related to the above, but cameras and storing pictures are old hat. The carefully worded shit could be because it’s not technically a picture of you, and also because they’ll delete it upon request.
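
      The value-not-image tracking described above fits in a few lines (the embedding and quantization here are invented stand-ins, not anyone’s actual method): no photo is ever stored, yet a stable value derived from a face can still key a purchase history.

```python
import hashlib

def face_key(embedding, precision=1):
    # Quantize a face embedding so the same person lands in the same
    # bucket across visits, then hash it. No image is retained, but
    # the digest is a stable pseudonymous identifier.
    quantized = tuple(round(x, precision) for x in embedding)
    return hashlib.sha256(repr(quantized).encode()).hexdigest()[:16]

purchases = {}

def record_purchase(embedding, item):
    purchases.setdefault(face_key(embedding), []).append(item)

# Two visits by the "same face" (embeddings equal up to quantization
# noise) link up across machines, though no photo was ever stored.
record_purchase([0.11, 0.52, 0.93], "cola")
record_purchase([0.12, 0.51, 0.94], "chips")
```

      That digest is exactly the kind of value a data broker could later join against other datasets.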