• Blackmist@feddit.uk · 7 months ago

      I’ve been a software developer for nearly 25 years now, and I can tell you this.

      No cunt reads anything.

      Something pops up over the top of what they want, they’ll click OK.

      • AnAngryAlpaca@feddit.de · 7 months ago

        With dark patterns you can “guide” the user towards a particular button, for example by making “accept” a large, bright, stand-out colored button and “reject” a low-contrast, small, disabled-looking one.

        This won’t stop people from clicking reject, but it shifts the ratio of accept to reject clicks in the website’s favor.
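
        As a rough illustration of that asymmetry, here is a hypothetical TypeScript/DOM sketch; the function name, colors, and sizes are invented for illustration and not taken from any real consent dialog.

        ```typescript
        // Hypothetical sketch of the dark-pattern styling described above:
        // “Accept” is big and high-contrast; “Reject” is small and washed out.
        function makeConsentButtons(): { accept: HTMLButtonElement; reject: HTMLButtonElement } {
          const accept = document.createElement("button");
          accept.textContent = "Accept all";
          Object.assign(accept.style, {
            background: "#1a73e8",   // bright, saturated fill draws the eye
            color: "#ffffff",
            fontSize: "18px",
            fontWeight: "bold",
            padding: "12px 32px",
            border: "none",
            borderRadius: "6px",
            cursor: "pointer",
          });

          const reject = document.createElement("button");
          reject.textContent = "Reject all";
          Object.assign(reject.style, {
            background: "transparent",
            color: "#b5b5b5",        // low-contrast grey reads as “disabled”
            fontSize: "12px",
            padding: "4px 8px",
            border: "none",
            cursor: "pointer",
          });

          return { accept, reject };
        }
        ```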

      • Demdaru@lemmy.world · 7 months ago

        I’m just a guy who knows shit about computers, and my family knows it.

        The amount of stuff I’ve had to remove after people next-next-next’d through an adware installer’s agreement while installing something else…

      • saze@feddit.uk · 7 months ago

        Users not reading shit I can understand, but it makes my blood boil when it’s your own bloody colleagues.

    • Crozekiel · 7 months ago

      Being as I’m forced to use Outlook for work… at least it’s just my work persona they’re tracking and selling? That guy is wild.

    • jimbo@lemmy.world · 7 months ago

      Me. I legitimately don’t care and I haven’t yet had anyone explain to me over the last few decades what the big bad is that should make me care. Oh noes, some companies are going to analyze my data to scam each other for marketing dollars with generally worthless statistical data.

        • jimbo@lemmy.world · 7 months ago

          Did you even bother reading that, or were you just jumping on the chance to use the word “metadata” like you were actually making a point? The “metadata” in question was phone location info, which every carrier already has; they don’t need access to your phone or your Outlook emails to get it. I’m also going to go out on a really sturdy limb and say that the CIA/NSA/whoever doesn’t care whether you clicked “Accept All” or “Reject All” when they’re hoovering up “metadata”.

      • crackajack@reddthat.com · 7 months ago

        It’s understandable that the consequences of losing digital privacy are so nebulous and conceptual that many people don’t give them much thought. But to put things into perspective: your data goes to data brokers, and anyone can buy it from them. There is a case of a domestic abuse victim who escaped her partner, only for him to track her down by buying her data from a broker. Insurance companies could also buy your data and discriminate against you based on your pre-existing health conditions.

        Let that sink in, because you never know when your data might be used for malicious purposes. Even if you don’t think your personal information will be processed maliciously, you’re inadvertently part of the collective consent that erodes the right to privacy (in my experience, most people don’t care about privacy). We know that if not enough people complain, the powers that be see it as consent. You and others may not see privacy as a big deal, but what about those who will be affected by the lack of it?

        I think people will only start complaining once their personal details are breached, and by then it might be too late. As we speak, AI could be trained and developed on other people’s likenesses and data without their consent. Your childhood picture might be used for something else…