I am currently using an old laptop (circa 2015) with a 250 GB SSD and 4 GB of RAM. It runs Fedora 39 Server and only hosts a Jellyfin instance through Docker right now (though I want to add Nextcloud later too). There is only 15 GB of storage left on it, and the CPU is constantly overloaded (due to forced transcoding). I happen to have a lot of 500 GB 3.5" HDDs lying around, and I want to use them in RAID 5. What hardware would be good for holding 4 HDDs and running Jellyfin and Nextcloud in Docker? I’m okay with either just a 4-bay NAS (as long as it can handle transcoding MKV 480p -> MP4) or a 4-bay NAS plus a separate server/computer/NUC. I only have a budget of CAD$900 (USD$658 as of writing), but I am willing to go to CAD$1000 if absolutely necessary.
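
For a rough sense of the storage side: four 500 GB drives in RAID 5 give about (n - 1) x drive size of usable space, so roughly 1.5 TB. A quick, hypothetical Python sketch of that arithmetic (the function name and figures are only illustrative):

```python
# Rough RAID 5 capacity estimate: usable space is (n - 1) drives' worth,
# since roughly one drive's worth of capacity goes to parity.
def raid5_usable_gb(num_drives: int, drive_size_gb: int) -> int:
    return (num_drives - 1) * drive_size_gb

print(raid5_usable_gb(4, 500))  # 1500 GB (~1.5 TB) usable from 4x 500 GB drives
```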

  • foggy@lemmy.world · 6 months ago

    If price is the deciding factor, then just build one.

    Get an old i7 for dirt cheap and cram the thing with RAM and storage to suit your budget.

    Run something lightweight like Ubuntu Server.

    • Possibly linux · 6 months ago

      Keep in mind this could drive up your energy bill.

      • merthyr1831@lemmy.world · 6 months ago

        Shouldn’t be that bad. My Raspberry Pi 4B can run Jellyfin and Nextcloud without pushing 15 W at full load.

        x86 is inefficient, especially older models, but you’ll likely only push anything over 10 W when actually streaming something that requires transcoding. Most of the time your home server is gonna sit idle or run some tiny cron job that won’t really blast the CPU at all.
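
        As a ballpark (a hypothetical Python sketch; the 5 W idle, 15 W transcoding, 2 h/day, and $0.15/kWh figures are assumptions, not measurements):

        ```python
        # Duty-cycle cost estimate for a low-power home server.
        # All inputs are assumed figures for illustration only.
        IDLE_WATTS = 5            # assumed idle draw
        LOAD_WATTS = 15           # assumed draw while transcoding
        LOAD_HOURS_PER_DAY = 2    # assumed daily streaming/transcoding time
        RATE_PER_KWH = 0.15       # assumed electricity rate in $/kWh

        daily_wh = IDLE_WATTS * (24 - LOAD_HOURS_PER_DAY) + LOAD_WATTS * LOAD_HOURS_PER_DAY
        monthly_kwh = daily_wh * 30 / 1000
        print(f"{monthly_kwh:.1f} kWh/month, about ${monthly_kwh * RATE_PER_KWH:.2f}/month")
        ```

        That works out to roughly 4 kWh and well under a dollar a month under those assumptions.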

      • foggy@lemmy.world · 6 months ago

        It totally doesn’t.

        I’m running a 14th-gen i9 with a 4080. It’s a power-hungry boy: 1500 W power supply, generally drawing about 600-800 W.

        Running this 24/7 costs me <$10/month in electricity.

        The old Compaq Presario with a Pentium II that probably pulled down 100 W running Ubuntu Server, as described here, made no statistically significant change in my electric bill. That is to say, it’s about as much change as being good or bad at turning off your lights when you’re not using them. It’s negligible.

        • ReasonablePea@sh.itjust.works · 6 months ago

          At 600 watts running continuously wouldn’t that be 432 kWh a month?

          Assuming you didn’t mean you were running your gaming computer as the server.

          At 100 watts that comes out to 72 kWh. In CT, where I live, rates are waaaay higher than what I calculate your rate to be (around 2.5 cents per kWh).

          For me, a 100 watt server is roughly $22 a month to run.

          Are you sure you’re paying 2.5 cents per kWh?
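
          The arithmetic, as a hypothetical Python sketch (the 600 W and 100 W figures come from this thread; the ~30 cents/kWh rate is an assumption that roughly matches the $22/month figure):

          ```python
          # Continuous-load electricity cost: watts -> kWh per month -> dollars per month.
          def monthly_kwh(watts: float, hours_per_month: float = 24 * 30) -> float:
              return watts * hours_per_month / 1000

          def monthly_cost(watts: float, rate_per_kwh: float) -> float:
              return monthly_kwh(watts) * rate_per_kwh

          print(monthly_kwh(600))          # 432.0 kWh/month at a constant 600 W
          print(monthly_kwh(100))          # 72.0 kWh/month at a constant 100 W
          print(monthly_cost(100, 0.30))   # ~$21.60/month at an assumed ~30 cents/kWh
          print(monthly_cost(600, 0.025))  # ~$10.80/month only if power cost ~2.5 cents/kWh
          ```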

          • foggy@lemmy.world · 6 months ago

            Idk what I’m paying per kWh; I’m just going off my monthly bills.

            There are other power fluctuations, I’m sure. I pay it no mind; I just look at the bill. 🤷‍♂️

            So far no bill has arrived that made me change behavior.

            Edit: I’ve also never measured what my machine actually pulls down continuously or when idle. I just know that its components demand that range, and that I need the headroom in my power supply for spikes.

        • Possibly linux · 6 months ago

          That’s great if it works for you. However, a lot of us don’t want the bigger power bill. It also has the problem of heating everything up.

          I like CPUs with lower TDPs

          • foggy@lemmy.world · 6 months ago

            No “old i7” of the kind I suggested is going to meaningfully increase the temperature of your room if it has any cooling solution in place.

            Your stubbornness about a perfectly practical solution is absurd. I won’t bother trying to convince you further; it’s the obvious, cost-effective solution.

            • Possibly linux · 6 months ago

              The problem is it isn’t cost-effective once you factor in electricity. You can pick up a CPU that is more efficient.

              I’m not saying you’re wrong, but what you’re describing is not great for some people, myself included.

              You’re not wrong, but there are also trade-offs.

              • foggy@lemmy.world · 6 months ago

                It is still by far the most cost-effective option.

                Your argument amounts to nothing.