  • If you made memory access lines twice as wide, they’d take up more space. More space means (a) chips run slower, because it takes longer for signals to travel the extra distance, and (b) they’d be bigger and more expensive.

    The main problem with 32-bit, as others have noted, is that it can only address about 4 GB of RAM, which isn’t much these days. CPUs do addition and subtraction the way we were taught at school - ‘carry the one’ - and they’ve an overflow bit that’s set when the sum doesn’t fit in the columns. On 8-bit CPUs we were forever checking that carry bit when adding up large numbers. On 64-bit CPUs we can handle truly massive numbers in one go, and since they’re so quick at doing sums and are usually waiting on memory anyway, it’s barely a hassle.
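    To illustrate the ‘carry the one’ point, here’s a rough Python sketch (purely illustrative - no real CPU is programmed like this) of what an 8-bit machine has to do to add numbers that don’t fit in a single register:

```python
# A toy model of multi-word addition on a machine with 8-bit registers.
WORD_MASK = 0xFF  # an 8-bit 'register' can only hold 0..255

def add_multiword(a_words, b_words):
    """Add two little-endian lists of 8-bit words, propagating the carry
    from each column into the next - 'carry the one', just wider."""
    result, carry = [], 0
    for a, b in zip(a_words, b_words):
        total = a + b + carry
        result.append(total & WORD_MASK)  # the part that fits in the register
        carry = total >> 8                # the carry/overflow bit for the next column
    return result, carry                  # a final carry of 1 means the sum overflowed

# 500 + 500 takes two adds plus carry handling on an '8-bit' machine:
print(add_multiword([0xF4, 0x01], [0xF4, 0x01]))  # ([232, 3], 0) -> 0x03E8 = 1000
```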

    Moving to 128-bit would give us a truly minuscule, probably unmeasurable, benefit in exchange for significant downsides. We could make them, but it would be pointless.








  • emerges from a brand you’ve probably never heard of

    Writing this on a Tuxedo Pulse 14 / gen 3 as we speak. Great little laptop. I’d wanted something with a few more pixels than my previous machine, and there’s a massive jump in price from bog-standard 1080p to extremely expensive 4K screens. This one has a three-megapixel screen at a premium-but-not-insane price, compiles code like a champion, does an extremely competent job of 3D gaming, came with Linux and runs it all perfectly.

    “Tuxedo Linux”, which is their in-house distro, is Ubuntu + KDE Plasma. Seemed absolutely fine, although I replaced it with Arch btw, since that’s more my style. Presumably they’re using Debian for the ARM support on this new one? This one runs pretty cool most of the time, but you definitely know you’ve got a 54W processor in a very thin mobile device when you try e.g. playing simulation games - it gets a bit warm on the knees. “Not x64” would be a deal-breaker for my work, but for most uses the added battery life would be more valuable than the inconvenience.




  • Any decent conductor is going to vary the beat based on how long it takes for sound to fill the venue in question. Beethoven’s choices for the music halls in Vienna might have made sense then, but not so much today.

    One of the things that’s always annoyed the conductors I’ve worked with is how readily we ignore the dynamics in his music. Beethoven’s markings are expressive and subtle, and we always play his stuff louder than indicated.


  • Agreed. JSON has a lot going for it:

    • it solves the ‘versioning’ problem, where the data fields change after an update - that’s a nightmare with packed binary, where you need to write so much code to handle it
    • it makes debugging persistence issues easy for developers
    • very fast libraries exist for reading and writing it
    • it actually compresses pretty damn well, and you can hand the compress + write off to a background thread once the fast serialisation is done anyway

    For saving games, JSON+gzip is such a good combination that I’d probably never consider anything else.
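    A minimal sketch of that combination in Python (the modules are standard library; the save format and the background-thread choice are just illustrative, not from any particular engine):

```python
import gzip
import json
import threading

def save_game(state: dict, path: str) -> None:
    # The fast part: serialise the game state to text on the game thread.
    payload = json.dumps(state).encode("utf-8")

    # The slow part: compress and write on a background thread so the
    # game doesn't hitch while the file is being written.
    def compress_and_write():
        with gzip.open(path, "wb") as f:
            f.write(payload)

    threading.Thread(target=compress_and_write, daemon=True).start()

def load_game(path: str) -> dict:
    with gzip.open(path, "rb") as f:
        return json.loads(f.read().decode("utf-8"))

# A 'version' field in the JSON is what keeps the versioning problem tractable.
save_game({"version": 2, "player": {"hp": 73, "pos": [12.5, -3.0]}}, "slot1.json.gz")
```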




  • Stephen King’s books tend to be very long and full of internal monologue, which is very much not film-friendly. “Faithful” adaptations tend to drag and have a lot of tell-don’t-show, which makes for a “terrible” film. Unfaithful ones tend to change and cut a lot, which makes them “terrible” adaptations. For instance, “The Shining” film has very little to do with the book, but is an absolutely phenomenal movie. King hated it.

    “IT”, the Tim Curry version, has Tim Curry in it, who was absolutely fantastic. A lot of material from the book was cut - I’d guess 80% or more. That includes the scene where the children have a gang bang in the sewer, which comes out of nowhere, with no foreshadowing, and is never mentioned again if I remember correctly. That might make it a “terrible”, unfaithful adaptation, but you know something? I’m alright without seeing that.




  • One of the things that got me to change my gaming desktop from Mint to Arch was that you get the cutting-edge version of everything - the kernel and amdgpu being the most important, but getting the latest version of Lutris and the like is nice too. It brought me from “usually about 50 fps outdoors in Elden Ring” to “usually about 60 fps” on the same machine.

    It makes sense for a gaming machine to only include the services you actually want, which Arch enables. It supports my hardware better too - my audio gear works perfectly in Pipewire but is ropey in ALSA, so rather than “install Mint -> install Pipewire -> remove ALSA -> hope ALSA is gone”, the sequence is “install Arch -> install Pipewire”, which makes more sense.

    Other cutting-edge rolling release distros are available, of course, but once you learn Arch, it makes a lot of sense for gaming.


  • We’ve a few rescue cats - we got them all when they were about three or four years old. We kept them inside for the first six weeks or so, to make sure they’d got used to living in the new house before we let them outside.

    The one which had been abandoned and had been living outside for a few weeks (a boy) stopped using his litter tray completely, as soon as he was allowed outside again.

    The other two, both girls who’d had a ‘smooth’ changeover, took a bit more time to get used to being outside. One transitioned off her litter tray by herself after a couple of months; the other took more like four months, and she was a bit of a fair-weather pooper for a while as well.

    My take-home message would be that cats generally prefer to do their business as far away from where they live as possible. My only bit of advice would be to wait until the weather’s getting better, in case your cats dislike the wind and the rain. I believe forest cats love the frosty weather anyway, though?


  • Yeah.

    There are a couple of ways of looking at it. General-purpose computers generally implement ‘soft’ real-time functionality, which is usually a requirement for music and video production: if you want to keep to a steady 60 fps, then you need to update the screen and the audio buffer absolutely every 16 ms. To achieve that, the AV thread runs at a higher priority than any other thread, and the real-time scheduler won’t let a lower-priority thread run while a higher-priority one still wants the CPU. Normally that means worse performance overall, and in some cases it can softlock the system - if the AV thread gets stuck in a loop, your computer won’t even respond to keyboard input.
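    As a concrete sketch of the soft real-time bit: on Linux a program can ask for exactly that behaviour for its AV thread. This is a minimal illustration - it assumes the process has CAP_SYS_NICE or runs as root, and the priority value is an arbitrary choice:

```python
import os

def make_current_thread_realtime(priority: int = 80) -> None:
    # SCHED_FIFO: this thread pre-empts all normal threads and keeps the CPU
    # until it blocks - which is also how a stuck loop can softlock the machine.
    # pid 0 means 'the calling thread/process'.
    os.sched_setscheduler(0, os.SCHED_FIFO, os.sched_param(priority))

# Called from inside the AV thread, before entering the ~16 ms update loop:
# make_current_thread_realtime()
```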

    Soft real-time is appropriate when no-one will die if a timeslot is missed - a video stutter won’t kill you. Hard real-time is for things like industrial control. If the anti-lock brakes in your car are meant to evaluate your wheels one hundred times a second, that’s a budget of 10 ms per check, and taking 11 ms is a complete system failure, even if the answer is correct. Note that it doesn’t matter whether it gets the right answer in 1 ms or 9 ms, as long as it never, ever takes more than 10. Hard real-time performance does not mean good performance, it means predictable performance.

    When we program up PLCs in industrial settings, we’ll use processor interrupts for our ‘critical sections’, so that we know our code will absolutely run in time. We use specialised languages as well - no loops, no recursion - that don’t let you do things whose upper time bound can’t be checked. Lots of finite state machines! But when we’re done, we know we’ve got code that won’t miss a time slot in the next twenty years of operation.
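    Here’s a toy example of that finite-state-machine shape (in Python purely for readability - real PLC code would be in ladder logic or structured text, and the states here are made up). Each scan does a fixed, bounded amount of work, so the worst-case execution time is easy to reason about:

```python
STOPPED, STARTING, RUNNING, FAULT = range(4)

def scan(state: int, start_button: bool, fault_sensor: bool) -> int:
    """One scan cycle: no loops, no recursion, just a bounded decision tree."""
    if fault_sensor:
        return FAULT
    if state == STOPPED and start_button:
        return STARTING
    if state == STARTING:
        return RUNNING
    return state

# The runtime calls scan() once per fixed time slot (say every 10 ms), so the
# worst-case time is simply the longest path through the function.
```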

    That does mean, ironically, that my old Amiga was a better music computer than my current desktop, despite being millions of times less powerful. OctaMED could take over the whole CPU whenever it liked, whereas a modern desktop might have to respond to a USB device or a hard drive at any moment, leading to a potential stutter at any time. Tiny probability, but not an acceptable one.