• NuXCOM_90Percent

    For low-contrast greyscale security cameras? Sure.

    For any modern color camera, even SD, in a decently lit scenario? Bullshit. It is just that most of this tech is usually trained/debugged on the developers and their friends and families and… yeah.

    I always love to tell the story of, maybe a decade and a half ago, evaluating various facial recognition software. White people never had any problems. Even the various AAPI folk in the group would be hit or miss (except for one project out of Taiwan that was ridiculously accurate). And we weren’t able to find a single package that consistently identified even the same black person.

    And even professional shills like MKBHD will talk around this problem during his review ads (the Apple Vision video being particularly funny).

    • conciselyverbose@sh.itjust.works

      For any scenario short of studio lighting, there is objectively much less information.

      You’re also dramatically underestimating how truly fucking awful phone camera sensors actually are without the crazy amount of processing phones do to make them functional.

      • NuXCOM_90Percent

        No. I have worked with phone camera sensors quite a bit (see above regarding evaluating facial recognition software…).

        Yes, the computation is a Thing. A bigger Thing is just accessing the databases to match the faces. That is why this gets offloaded to a server farm somewhere.

        But the actual computer vision and source image? You can get more than enough contours and features from dark skin no matter how much you desperately try to talk about how “difficult” black skin is without dropping an n-word. You just have to put a bit of effort in to actually check for those rather than do what a bunch of white grad students did twenty years ago (or just do what a bunch of multicultural grad students did five or six years ago but…).

        • conciselyverbose@sh.itjust.works

          It’s not racist to understand physics.

          It’s exactly the same reason phone cameras do terribly in low light unless they do obscenely long exposures (which can’t resolve detail in anything moving). The information is not captured at sufficient resolution.
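          For what it’s worth, the photon argument here can be made concrete with a back-of-the-envelope shot-noise calculation. This is a sketch, and the per-pixel photon counts are made-up illustrative numbers, not measurements from any real sensor:

```python
import math

# Photon arrival is a Poisson process: collect N photons and the noise is
# ~sqrt(N), so the signal-to-noise ratio is N / sqrt(N) = sqrt(N).
def snr_db(photons: float) -> float:
    """SNR in decibels for a pixel that collects `photons` photons."""
    return 20 * math.log10(math.sqrt(photons))

# Illustrative (hypothetical) photon counts for the same exposure time:
bright_pixel = 10_000  # well-lit subject
dark_pixel = 100       # ~100x less light reaching the sensor

print(snr_db(bright_pixel))  # 40.0 dB
print(snr_db(dark_pixel))    # 20.0 dB
```

          The point being: 100x fewer photons doesn’t just make the image darker, it cuts the SNR by a factor of 10, which is exactly the gap long exposures (or heavy denoising) try to paper over.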

          • NuXCOM_90Percent

            Rhetorical question (because we clearly can infer the answer) but… have you ever seen a black person?

            A bit of melanin does not make you into some giant void that breaks all cameras. Black folk aren’t doing long exposure shots for selfies or group photos. Believe it or not, RDCWorld doesn’t need to use nightvision cameras to film a skit.

            • conciselyverbose@sh.itjust.works

              You can keep hand waving away the statement of fact that lower precision input is lower precision input.

              And yes, for actual photography (where people are deliberately still for long enough to offset the longer exposure required), you do actually need different lighting and different camera settings to get the same quality results. But real cameras are also capable of capturing far more dynamic range without guessing heavily on postprocessing.

              • xor@lemmy.blahaj.zone

                And you can keep hand waving away the fact that lower precision because of less light is not the primary cause of racial bias in facial recognition systems - it’s the fact that the datasets used for training are racially biased.

                • conciselyverbose@sh.itjust.works

                  Yes, it is. The idea that giant corporations “aren’t trying” is laughable, and it’s a literal guarantee that massively lower quality, noisier inputs will result in a lower quality model with lower quality outputs.

                  Fewer photons hitting the sensor matters. A lot.

    • fartsparkles@sh.itjust.works

      You’re not wrong. Research into models trained on racially balanced datasets has shown better recognition performance with reduced bias. That work used limited, GAN-generated faces, so it still needs to be replicated with real-world data, but it shows promise that balancing training data should reduce bias.
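      For anyone wondering what “balancing the training data” actually looks like in practice, one common trick is oversampling under-represented groups until every group has equal weight. A minimal sketch (the group labels and dataset here are entirely hypothetical, and real pipelines do this inside the data loader):

```python
import random
from collections import defaultdict

def balance_by_group(samples, group_of, seed=0):
    """Oversample each group (with replacement) up to the largest group's size."""
    rng = random.Random(seed)
    groups = defaultdict(list)
    for s in samples:
        groups[group_of(s)].append(s)
    target = max(len(members) for members in groups.values())
    balanced = []
    for members in groups.values():
        balanced.extend(members)  # keep every original sample
        # Pad the group with random repeats until it matches the largest group.
        balanced.extend(rng.choice(members) for _ in range(target - len(members)))
    return balanced

# Hypothetical skewed dataset: 90 samples from group "a", 10 from group "b".
data = [("a", i) for i in range(90)] + [("b", i) for i in range(10)]
balanced = balance_by_group(data, group_of=lambda s: s[0])
counts = {g: sum(1 for s in balanced if s[0] == g) for g in ("a", "b")}
print(counts)  # {'a': 90, 'b': 90}
```

      Oversampling doesn’t add any new information about the minority group, which is why collecting genuinely diverse data beats resampling a skewed set.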

      • NuXCOM_90Percent

        Yeah, but this is (basically) reddit and clearly it isn’t racism and is just a problem of multi-megapixel cameras not being sufficient to properly handle the needs of phrenology.
        There is definitely some truth to needing to tweak how feature points (?) are computed and the like. But yeah, training data goes a long way, and this is why there was a really big push to get better training datasets out there… until we all realized those would predominantly be used by corporations, and that people don’t really want to be the next Lenna because they let some kid take a picture of them for extra credit during an undergrad course.
        There is definitely some truth to needing to tweak how feature points (?) are computed and the like. But yeah, training data goes a long way and this is why there was a really big push to get better training data sets out there… until we all realized those would predominantly be used by corporations and that people don’t really want to be the next Lenna because they let some kid take a picture of them for extra credit during an undergrad course.