I’m trying to make Minesweeper using Rust and Bevy, but it feels like my code is bloated (a lot of for loops, segments that seem to repeat themselves, etc.).

When I look at other people’s code, they are using functions that I don’t really understand (map, zip, etc.) that seem to make their code faster and cleaner.

I know that I should look up the functions that I don’t understand, but I was wondering where you would learn stuff like that in the first place. I want to learn how to find functions that would be useful for optimizing my code.

  • Akrenion@programming.dev · 12 points · 16 days ago

    Map, Filter, Reduce. Those are the big three for data. More important than those, however, are mindset and patience with oneself. Writing code that works is the first and most impressive step. Optimizations are fun to think about, but unless your computations are sluggish and repeat a lot of unnecessary steps, they are rarely a priority. (There’s a small Rust sketch of those three at the end of this comment.)

    Build something and shelve it. Trust me, in but a few months you will look back in bewilderment and realize how much you’ve grown.
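
    A minimal sketch of what those three look like with Rust iterators (Rust spells reduce as fold); the 1-D row of mine flags here is made up purely for illustration:

    ```rust
    fn main() {
        // Hypothetical 1-D Minesweeper row: true = mine.
        let row = vec![false, true, true, false, true];

        // map: transform each element (mine -> 1, empty -> 0).
        let counts: Vec<u32> = row.iter().map(|&m| if m { 1 } else { 0 }).collect();

        // filter: keep only the indices that actually hold a mine.
        let mine_indices: Vec<usize> = row
            .iter()
            .enumerate()
            .filter(|&(_, &m)| m)
            .map(|(i, _)| i)
            .collect();

        // reduce (fold in Rust): collapse the whole row into one value, the mine total.
        let total: u32 = counts.iter().fold(0, |acc, &c| acc + c);

        println!("{counts:?} {mine_indices:?} {total}");
    }
    ```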

    • grue@lemmy.world · 6 points · 16 days ago

      but unless your computations are sluggish and repeat a lot of unnecessary steps

      In other words, the kind of optimizing that’s worth it is choosing a better algorithm to reduce its big-O complexity class.
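
      Not from the game itself, but a toy Rust illustration of that kind of win, swapping a repeated linear scan for a hash lookup; the helper and its data are hypothetical:

      ```rust
      use std::collections::HashSet;

      // Asking "is this cell a mine?" for every cell on the board.
      // Scanning a Vec is O(m) per query, so O(n * m) overall; building a
      // HashSet once makes each query O(1) on average, i.e. O(n + m) total.
      fn count_safe_cells(all_cells: &[(i32, i32)], mines: &[(i32, i32)]) -> usize {
          let mine_set: HashSet<(i32, i32)> = mines.iter().copied().collect();
          all_cells
              .iter()
              .filter(|cell| !mine_set.contains(*cell))
              .count()
      }
      ```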

      • Akrenion@programming.dev · 4 points · 15 days ago

        I was thinking about caches and evaluating what calculations I want to do.

        I fixed a project for someone simulating a machine; a single run took them almost 9 minutes. Simply replacing the part where they initialised a solver and used it to find a root of a quadratic function with a single call to that initialiser got it down to a minute.

        You should have seen their faces when we put the quadratic formula in and it took 28 seconds.
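
        Not their actual code, but the shape of that last change in Rust: closed-form roots from the quadratic formula instead of spinning up a numeric solver on every call.

        ```rust
        // Roots of a*x^2 + b*x + c = 0 via the quadratic formula (assumes a != 0).
        // No solver object, no iterations; returns None if there is no real root.
        fn quadratic_roots(a: f64, b: f64, c: f64) -> Option<(f64, f64)> {
            let disc = b * b - 4.0 * a * c;
            if disc < 0.0 {
                return None;
            }
            let sqrt_disc = disc.sqrt();
            Some(((-b - sqrt_disc) / (2.0 * a), (-b + sqrt_disc) / (2.0 * a)))
        }
        ```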

      • Baldur Nil@programming.dev · 3 points · edited · 15 days ago

        The mental model I have about performance is that wins at a higher level of abstraction usually beat wins at a lower level.

        So in that sense, well-architected software with proper caching, multithreading where it matters, etc. will beat badly architected software (e.g. one that brute-forces everything). Then, that being equal, good algorithms and solutions beat bad ones. Only then do faster runtimes make much of a difference, and at the bottom, things like more efficient processor architectures, more efficient compilers, etc. beat slower ones. (There’s a toy caching sketch at the end of this comment.)

        A good example is Lemmy itself, which as far as I know was made in Rust to be super fast, but at the beginning it was effectively DDoSed quite easily because of the way the database was designed and lots of queries were very slow. Once they fixed that, Lemmy actually became usable.
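
        Nothing to do with Lemmy’s actual fix, but a toy Rust sketch of the "cache it instead of brute-forcing it" idea from above; expensive() is just a stand-in for real work:

        ```rust
        use std::collections::HashMap;

        // Stand-in for some pure but costly computation.
        fn expensive(n: u64) -> u64 {
            (0..n).map(|i| i.wrapping_mul(i)).fold(0, u64::wrapping_add)
        }

        // Memoized wrapper: compute each input once, then serve it from the cache.
        fn cached(n: u64, memo: &mut HashMap<u64, u64>) -> u64 {
            *memo.entry(n).or_insert_with(|| expensive(n))
        }

        fn main() {
            let mut memo = HashMap::new();
            let first = cached(10_000_000, &mut memo);  // does the work
            let second = cached(10_000_000, &mut memo); // cache hit, effectively free
            assert_eq!(first, second);
        }
        ```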