• Buffalox@lemmy.world · 3 days ago

      I suppose it can, but just calling it bits is extremely misleading. It’s like saying something takes 10 seconds, but only if you are traveling at 90% of the speed of light.
      It’s such extremely poor terminology, and maybe the article is at fault rather than the study, but it is presented in a way that is moronic.

      Using this thermodynamics definition is not generally relevant to how thought processes work.
      And using a word to mean something different from what it usually means, BEFORE pointing that out, is very poor terminology.
      And in this case it made them look like idiots.

      It’s really too bad, because if they had simply stated that we can only handle about 10 concepts per second, that would have been an entirely different matter, and one I actually agree is probably right. But that’s not bad IMO; it’s actually quite impressive! The exact contrary of what the headline indicates.
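The gap between “10 concepts per second” and “10 bits per second” comes down to how many distinct concepts each choice is drawn from. A minimal sketch of that conversion; the repertoire sizes here are purely illustrative assumptions, not figures from the study:

```python
import math

def info_rate(concepts_per_second, repertoire_size):
    # Bits per second, assuming each concept is an independent,
    # equally likely pick from a fixed repertoire (a simplification).
    return concepts_per_second * math.log2(repertoire_size)

# 10 concepts/s drawn from a binary choice is exactly 10 bits/s...
print(info_rate(10, 2))       # 10.0
# ...but drawn from ~1,000 possible concepts it is nearly 100 bits/s.
print(info_rate(10, 1000))
```

So “10 bits/s” only equals “10 concepts/s” if every concept is a coin-flip between two alternatives.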

      • Aatube@kbin.melroy.org · 3 days ago

        I get your argument now. Do note that this entropy is about information theory and not thermodynamics, so I concur that the Techspot article is at fault here.

          • Buffalox@lemmy.world · 3 days ago

          > I get your argument now.

          Thanks. ;)

          > Do note that this entropy is about information theory and not thermodynamics

          https://en.wikipedia.org/wiki/Information_theory

          > A key measure in information theory is entropy.

          Meaning it’s based on thermodynamics.

          And incidentally I disagree with both. Information theory assumes the universe is a closed system, which is a requirement for thermodynamics to work, and which AFAIK is not a proven fact regarding the universe, and unlikely IMO.

          The 2nd law of thermodynamics (entropy) is not a law but a statistical likelihood; the early universe does not comply, and the existence of life is also a contradiction of the 2nd law of thermodynamics.

          I have no idea how these ideas became so popular outside their scope.

            • Aatube@kbin.melroy.org · 3 days ago

            Information theory is an accepted field. The entropy in information theory is analogous to and named after entropy in thermodynamics, but it isn’t actually thermodynamics; it’s its own area of study. I know this because of all the debate around that correcthorsebatterystaple xkcd.
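For what it’s worth, the entropy Aatube is referring to is defined purely from probabilities, with no thermodynamic quantities involved. A minimal sketch of Shannon entropy:

```python
import math

def shannon_entropy(probs):
    # Shannon entropy in bits of a discrete probability distribution.
    # Defined purely statistically: H = -sum(p * log2(p)).
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit (fair coin)
print(shannon_entropy([0.25] * 4))   # 2.0 bits (four equal outcomes)
print(shannon_entropy([0.9, 0.1]))   # ≈ 0.47 bits (biased coin)
```

Less uncertainty (the biased coin) means fewer bits, which is the sense of “entropy” the study and the password debates use.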

              • Buffalox@lemmy.world · 3 days ago

              I’m not sure if you are making a joke, or also making a point. But boy, that XKCD is spot on. 😋 👍
              I think thermodynamics works within its own field, but it’s so widely abused outside that field that I’ve become sick of hearing about it from people who just parrot it.
              I have not seen anything useful from information theory, mostly just nonsense about information not being able to get lost in black holes, and exaggerated interpretations of entropy.
              So my interest in information theory is near zero, because I discarded it as rubbish decades ago.

                • Aatube@kbin.melroy.org · 3 days ago

                For one, password security theory that actually works (instead of just “use a special character”) is based on information theory and its concept of entropy.
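The xkcd-style arithmetic behind this is short; the word-list and symbol-set sizes below are the commonly cited illustrative values, not authoritative figures:

```python
import math

def entropy_bits(choices, length):
    # log2 of the search-space size, assuming each element is
    # picked uniformly at random and independently.
    return length * math.log2(choices)

# Four random words from a 2048-word list (the xkcd 936 passphrase):
print(entropy_bits(2048, 4))   # 44.0 bits

# Eight random characters from the 94 printable ASCII symbols:
print(entropy_bits(94, 8))     # ≈ 52.4 bits
```

The point of the xkcd debate is that these figures only hold when the components really are chosen at random; a human-chosen “Tr0ub4dor&3” has far less entropy than its character count suggests.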

                  • Buffalox@lemmy.world · 3 days ago

                  OK, I don’t think information theory is actually needed for that. Just a bit of above-average intelligence, apparently.
                  Yes, it’s true some use the term entropy instead of just the statistical number of combinations. And obviously forcing a special character, instead of just having it as an option, makes the number of possibilities lower, decreasing the uncertainty, which they then choose to call entropy. Which, counterintuitively IMO, is described as increased entropy.