• idefix@sh.itjust.works · 1 month ago

    I know it’s a joke, and I hate to be that guy, but this meme feels old and obsolete now. I can’t remember the last time I had to tweak my Linux install. The fun is gone.

    • De_Narm@lemmy.world · 1 month ago

      For real. I recently had to swap my window manager to xmonad just to feel something again.

    • RadicalEagle@lemmy.world · 1 month ago

      Cool! Maybe I can challenge you. Can you help me figure out how I can get my Hyprland session back on my Arch install? I have a Radeon 7700 XT and I recently installed an RTX 4070 to assist with some compute tasks. With both cards installed GDM doesn’t populate the Hyprland option. If I remove the 4070 everything goes back to normal.

      (This is also a joke, you don’t need to help me troubleshoot this.)

      (Unless you actually know how in which case I can pay you $20 for your time)

        • RadicalEagle@lemmy.world · 1 month ago

          Haha, I was hoping that because all my monitors are plugged into my AMD card that it wouldn’t cause as many issues, but I was mistaken.

          I’m looking at it as an opportunity to learn more about the Linux kernel, the order in which certain modules get loaded, and environment variables.

          • Refurbished Refurbisher@lemmy.sdf.org · 1 month ago

            You should consider passing your Nvidia GPU through to a virtual machine and doing your compute tasks there; that way, your host machine won’t be infected with proprietary Nvidia drivers (I’m assuming you need CUDA for your compute tasks). The only performance differences you’ll notice are less available system RAM (you will still have access to all of your VRAM) and very slightly less CPU performance from running two operating systems at the same time (barely even noticeable, TBH). This is the option I would personally recommend.

            If you want to try a super hacky solution which might not work for everything you need, you can try using the open source, recently released ZLUDA translation layer to perform CUDA tasks on your AMD GPU.

            https://github.com/vosen/ZLUDA

            The reason Hyprland doesn’t work with proprietary Nvidia drivers is due to Nvidia refusing to implement the accepted Wayland standard in favor of their own, home-rolled solution which is incompatible. AFAIK, only GNOME and KDE implement that standard.
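            If you try ZLUDA, my understanding (which may be out of date, since the project moves quickly) is that on Linux you mostly point the dynamic loader at ZLUDA’s CUDA replacement library before launching the application. The install path below is a made-up placeholder:

            ```shell
            # Hedged sketch: running an unmodified CUDA binary through ZLUDA on an
            # AMD GPU. /opt/zluda is a placeholder install path; check the ZLUDA
            # README for the exact setup for your version.
            LD_LIBRARY_PATH="/opt/zluda:$LD_LIBRARY_PATH" ./my_cuda_app
            ```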

            • nexussapphire@lemm.ee · 1 month ago

              Hyprland works fine on Nvidia; I’ve been using it for about a year now. It’s only going to improve now that Nvidia has hired people from the Nouveau team to work on Nouveau and is making the open kernel drivers the default in version 560. Can’t wait for the 555 drivers, which they’ve been working on with the Wayland team and most of the major desktops to implement explicit sync and so on.

              One option would be to install only the CUDA toolkit without the drivers, but distros like Ubuntu just don’t support that. You could also switch display managers to SDDM, since Hyprland recommends it; that might work better. Hyprland prints diagnostic information to the TTY if you launch it there with the Hyprland command. I’m just thinking it’s GDM being weird, TBH.
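              If you do try launching from a TTY, something like this will at least capture a log you can dig through (a hedged sketch; the environment variables are the ones commonly suggested for proprietary Nvidia setups, so treat them as starting points rather than gospel):

              ```shell
              # Launch Hyprland directly from a TTY and keep its output for debugging.
              export LIBVA_DRIVER_NAME=nvidia            # commonly suggested for Nvidia
              export __GLX_VENDOR_LIBRARY_NAME=nvidia
              Hyprland > ~/hyprland.log 2>&1             # inspect the log if it dies
              ```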

            • RadicalEagle@lemmy.world · 1 month ago

              Ah, I like this solution. Thanks for the suggestion! I set up GPU passthrough for a VM on a build years ago with QEMU. I’m sure I’ll be able to figure that out again.

    • MentalEdge@sopuli.xyz · 1 month ago

      Buy an Index and get into VR gaming on Linux. We livin on the edge over here. Shit breaks every day and there’s a wonky Python script you have to use if you wanna be able to put the base stations into sleep mode 👍

    • 1984@lemmy.today · 1 month ago

      Yeah, it just works now. Sometimes I miss the days when we had to troubleshoot sound drivers, because it made us learn so much. Even if we didn’t manage to fix the problem, we learned about how sound works in Linux.

    • UFODivebomb@programming.dev · 1 month ago

      Can I interest you in the deep customization of NixOS?

      Jokes aside, I don’t really use the deep patching Nix enables. The customization I actually want, the look and feel of applications, isn’t really doable. Desktops are just different ways to launch a web browser T_T

    • JustEnoughDucks@feddit.nl · 1 month ago

      Last week for me lol.

      An AMD DRM bug in the kernel broke certain 3D rendering or something like it. Most games through WINE/Proton were broken, and I had to downgrade the kernel.

      Wouldn’t call that fun, as it cost me one of the very few days per month I get to play games with some of my friends.
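      For anyone hitting the same thing, the downgrade on an Arch-style system looks roughly like this (a hedged sketch; the version string is a placeholder -- use whatever older build is actually in your package cache):

      ```shell
      # Roll back to a previously installed kernel from pacman's cache.
      ls /var/cache/pacman/pkg/ | grep '^linux-6'   # see which builds you still have
      sudo pacman -U /var/cache/pacman/pkg/linux-6.9.3.arch1-1-x86_64.pkg.tar.zst
      ```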

      • idefix@sh.itjust.works · 1 month ago

        It’s interesting to read about people’s issues on Linux. It seems almost all of them come from the graphics stack and gaming. Using an Intel card, I haven’t seen an issue in forever.

  • rsuri@lemmy.world · 1 month ago

    Do you have multiple monitors?
    Yes - Don’t buy a mac
    No - Still don’t buy a mac

      • lud@lemm.ee · 1 month ago

        I don’t think it’s even possible to use more than two monitors on an M-series computer (except maybe if you pay extra for the Max edition).

        • KoalaUnknown@lemmy.world · 1 month ago

          That is only the case on the base model chips. The Pro, Max, and Ultra chips all support multiple monitors.

              • lud@lemm.ee · 1 month ago

                It’s still ridiculous to limit it.

                Pretty much any modern computer should be able to output to more monitors than that.

                • becausechemistry@lemm.ee · 1 month ago

                  limit it

                  There isn’t some software limitation here. It’s more that they only put two display controllers in the base level M-series chips. The vast, vast majority of users will have at most two displays. Putting more display controllers would add (minimal, but real) cost and complexity that most people won’t benefit from at all.

                  On the current gen base level chips, you can have one external display plus the onboard one, or close the laptop and have two externals. Seems like plenty to me for the cheapest option.

                • areyouevenreal@lemm.ee · 1 month ago

                  Not really. There is a compromise between output resolution, refresh rate, bit depth (think HDR), number of displays, and the overall system performance. Another computer might technically have more monitor output, but they probably sacrificed something to get there like resolution, HDR, power consumption or cost. Apple is doing 5K output with HDR on their lowest end chips. Think about that for a minute.

                  A lot of people like to blame AMD for high idle power usage when they are running multi-monitor setups with different refresh rates and resolutions. Likewise, I have seen Intel systems struggle to run a single 4K monitor because they were in single-channel mode. Apple probably wanted to avoid those issues on their lower-end chips, which have much less bandwidth to play with.
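                  To put rough numbers on that bandwidth argument (my own back-of-the-envelope arithmetic, not Apple’s figures): a single uncompressed 5K stream at 60 Hz with 30-bit HDR color already needs on the order of 26 Gbit/s.

                  ```shell
                  # Back-of-the-envelope display bandwidth: pixels/frame * refresh * bits/pixel,
                  # ignoring blanking intervals and Display Stream Compression.
                  echo "$(( 5120 * 2880 * 60 * 30 / 1000000000 )) Gbit/s"   # ~26 Gbit/s raw
                  ```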

    • acockworkorange@mander.xyz · 1 month ago

      I mean, yeah, don’t ever buy a Mac, but what’s up with the multiple monitors? Do they struggle with it?

      • efstajas@lemmy.world · 1 month ago

        macOS out of the box fucking sucks for monitor scaling with third party monitors. It’s honestly laughable for a modern OS. You can install some third party software that fixes it completely, but it really shouldn’t be necessary. I use an (admittedly pretty strange) LG DualUp monitor as a secondary, and out of the box macOS can only make everything either extremely tiny, extremely large, or blurry.

        Other than that, I’ve had no problems at all, and the window scaling between different DPI monitors is a lot smoother than it was with Windows previously.

      • KoalaUnknown@lemmy.world · 1 month ago

        The base model chips only support two monitors. The Pro, Max, and Ultra chips all support more.

        • ditty@lemm.ee · 1 month ago

          The base model chips only support 1 monitor.

          Apple artificially limits the base model chips to only support 1 monitor FTFY

          EDIT: revised statement based on what I learned about framebuffers below:

          Apple intentionally builds base-level MacBooks without adequate frame-buffers to force users to buy upgraded and more expensive products.

          • areyouevenreal@lemm.ee · 1 month ago

            They all support two monitors (one internal and one external for MacBooks, two external for desktops). It’s not an artificial restriction. Each additional monitor needs a framebuffer, an actual circuit that has to be present in the chip.

            • Honytawk · 1 month ago

              So they cheaped out on what is supposed to be a premium brand, gotcha

              • becausechemistry@lemm.ee · 1 month ago

                What percentage of people who buy the least expensive MacBook do you think are going to hook it up to more than two displays? Or should they add more display controllers that won’t ever be used and charge more for them? I feel like either way people who would never buy one will complain on behalf of people who are fine with them.

                • Zangoose@lemmy.one · 1 month ago

                  The least expensive MacBook is still $1000, closer to $1500 if you spec it with reasonable storage/ram. It really isn’t that much of a stretch to add $100-300 for a 1080/1440p monitor or two at a desk.

              • areyouevenreal@lemm.ee · 1 month ago

                Not necessarily. The base machines aren’t that expensive, and this chip is also used in iPads. They support high-resolution HDR output. The more monitors, resolution, bit depth, and refresh rate you want, the more bandwidth display output requires and the more complex and expensive the framebuffers become. Another system might support 3 or 4 monitors but not the 5K output the MacBooks do. I’ve seen Intel systems struggle to even drive a single 4K 60 FPS output until I added another RAM stick to make them dual channel; Apple does 5K. Sure, other machines might technically support more monitors in theory, but in practice you will run into limitations if those monitors require too much bandwidth.

                Oh yeah, and these systems also need to share bandwidth between the framebuffers, CPU, and GPU. It’s no wonder they didn’t put three or more very-high-resolution buffers into the lower-end chips, which have less bandwidth than the higher-end ones. Even if it did work, the performance impact probably isn’t worth it for a small number of users.

            • ditty@lemm.ee · 1 month ago

              TIL, thanks! 🌝

              I use a Plugable docking station with DisplayLink with a base-level M1 MacBook Air and it handles multiple (3x 1080p) displays perfectly. My (limited) understanding is that they do that just using a driver. So at a basic level, couldn’t Apple include driver support for multiple monitors natively, seeing as it has adequate bandwidth in practice?

              • areyouevenreal@lemm.ee · 1 month ago

                Sigh. It’s not just a fricking driver. It’s an entire framebuffer you plug into a USB or Thunderbolt port. That’s why those docks are more expensive, and why they even need a driver.

                A 1080p monitor has one quarter of the pixels of a 4K monitor, and the necessary bandwidth scales with the pixels. Apple chooses to spend the bandwidth they have on supporting two 5K or 6K monitors instead of supporting, say, 8 or 10 1080p monitors. That’s a design decision they presumably thought made sense for the product they wanted to build, and honestly I mostly agree with it. Most people don’t run 8 monitors, very few have even 3, and those who do can buy the higher-end model or get an adapter like you did. If you are the kind of person who uses 3 monitors, you probably also want the extra performance.

                • ditty@lemm.ee · 1 month ago

                  Thank you for taking the time to reply, and for further sharing your expertise to our conversation! I understand different resolutions, that the docking station has its own chipset, and why the Plugable is more expensive than other docking stations as a result. I now have a more nuanced understanding of frame-buffers and how DisplayLink interfaces with an OS like MacOS.

                  Allow me to clarify the point I tried to make (and admittedly, I didn’t do a good job of expressing it previously). Rather than focusing on the technical specs, I had intended to have a more general conversation about design decisions and Apple’s philosophy. They know consumers will want to hook a base-tier MacBook Air up to two external displays, and they intentionally chose not to build in an additional framebuffer, to push users to spend more. I sincerely doubt there’s any cost saving for the customer in Apple leaving that out of the box.

                  Apple’s philosophy has always been that they know what’s best for their users. If a 2020 M1 MacBook Air supports both the internal 2K display and a single external 6K display, that suggests to me it should have the horsepower to drive two external 1080p displays (that’s just a feeling I have, not a known fact). And I’ll acknowledge that Apple has improved this limitation for the newer MBAs, which allow you to disable the built-in display and use two external displays.

                  My broader point is that Apple “knows what’s best” for their users: they want customers to buy an Apple display rather than to just stick with the 1080p LCDs they already own, because they’re not Retina®. Which do you honestly think is a more common use-case for a MacBook Air user: wanting to connect to two monitors (home office, University classroom system, numerous board room settings I’ve worked in, etc), or to connect their $1200 MBA to a $1600-$2300+ Studio Display? For that, anyone with an iota of common sense would be using a MBP etc since they’re likely a creative professional who would want the additional compute and graphics power for photo/video-editing, etc.

                  I don’t disagree with your explanation of the thought process behind why Apple may have made this hardware decision for MBAs, but it is effectively an arbitrary, non-cost-saving decision that will certainly impede customers who expect two displays to just work, since they can do that on their 10-year-old Toshiba Satellite or whatever.

                  Thanks, and have a great day

      • rsuri@lemmy.world · 1 month ago

        For me it’s that, compared to Windows and Linux, handling multiple windows across screens is always problematic. It’s made worse by alt-tab bringing up all of an application’s windows, which means they pop up on the other monitors too, which is usually not what I want. Maximizing is not as straightforward as one would hope, and the dock moves to whichever screen you leave your pointer at the bottom of, which gets annoying fast. As some point out, there’s third-party software that fixes these issues, but that’s not an option for me: I use a locked-down Mac for work and can’t install third-party software, so I’m stuck with the annoying default behavior.

  • rtxn@lemmy.world · 1 month ago
    $ pacman -Si god
    error: package 'god' was not found
    

    Take that, theists!

  • justme@lemmy.dbzer0.com · 1 month ago

    I could be as rich as God and I still wouldn’t go for Windows or Apple. I would rather invest the money in good FOSS development.

    • Honytawk · 1 month ago

      Probably the reason you aren’t rich in the first place.

    • chatokun@lemmy.dbzer0.com · 1 month ago

      Being a support person, if I were rich enough to frivolously buy systems, I’d have at least one of each as a reference system. Yes, I know, VMs, but those are for saving money/space. For Mac especially I’d want some real hardware, though definitely not as a main system. I currently have a broken Mac and a cheap Chromebook for that reason; being broken, the Mac is rather useless now. When it worked, I often used it to help test and troubleshoot customer stuff.

    • fruitycoder@sh.itjust.works · 1 month ago

      For real. I’ve put a fraction of what I would have spent on unfree software into supporting open source, and I’ve gotten way more out of it.

      I see it as the difference between owning and renting.

  • BCsven@lemmy.ca · 1 month ago

    I have tried TempleOS. It is amazing that one guy built all that. It feels like it needs training sessions to make better use of it, and also it is wacky as hell

    • MrSoup · 1 month ago

      It feels like it needs training sessions to make better use of it, and also it is wacky as hell

      Sounds like a description of MS Office.

  • TheDemonBuer@lemmy.world · 1 month ago

    Everyone always forgets the “it just works,” easy, normie distributions like Fedora. I guess people figure if you’re looking for an OS like that, you might as well just use Windows, but I’d rather not.

    • sunstoned@lemmus.org · 1 month ago

      There’s something to practicing with the operating system family that most big commercial outfits use. Plus SELinux is neat, and there’s no Canonical ads.

      I use Fedora with home-manager, btw. After using Arch and Debian for years I really think Fedora (or adjacent like Nobara) is on its way to being the de facto starter distro.

      • Admetus@sopuli.xyz · 1 month ago

        I’ve seen Fedora lauded for being responsive, and I’d probably go for it over super-bloated Ubuntu. I know Fedora isn’t fully debloated either, but that’s preferable to dealing with Arch, which needs a lot of tinkering time that eats into my work time.

    • TalesOfTrees@lemm.ee · 1 month ago

      Oh man, I’m so disappointed. The defaults don’t even port over any of the “really good” MLP themes. Might be because DE versions have changed, but I remember there being a fairly well done (for a stupid gag anyways) Rainbow Dash theme for KDE4.

      I mean, if I’m going to embrace the inner brony, why would I want the same generic looking “dark” theme every other offering has…

  • flop_leash_973@lemmy.world · 1 month ago

    Haha, it amuses me to no end that ever since I watched a “Down the Rabbit Hole” video on YouTube about TempleOS a few years back, I have seen it crop up in various places from time to time, whereas I don’t remember ever seeing anything about it before.

    Makes me wonder if it was always there and I just didn’t notice it until I was familiar with it.

  • cordlesslamp@lemmy.today · 1 month ago

    Let’s say I want to try Linux but I want to keep my Windows OS intact (for now), and I only have one SSD in my PC.

    Is there a solution that I can just partition the drive, install Linux, switch between OS by just restarting without affecting the other, AND later on remove one OS without wiping the SSD?

    • bitfucker@programming.dev · 1 month ago

      Yes. First you need to resize the Windows partition to make room for the new OS; 40-60 GB is usually enough for a minimal Linux installation if you don’t do any gaming or run other massive applications. The resizing can be done in Windows using the Disk Management utility baked into Windows, or some other partition manager (EaseUS, magic tools, etc.). After that, Linux can be safely installed into the free space as a single partition.

      Now, sometimes the bootloader gets fucked, but it is usually easy to fix. In fact, if you use GRUB, it normally runs os-prober for you to check for any other OS, so fixing it can be as simple as re-running grub-mkconfig. Other times it is not as simple; what to do varies depending on what happened and is too long to list here. The Arch Wiki covers a lot of this, especially under the topic of the boot sequence.

      Lastly, if you need to move the partition, the data already inside will need to be moved too. This can take time depending on the size, but it is doable and safe.

      If, later down the road, you want to remove either OS, you can simply delete its partition after moving your data off it first. Linux can mount NTFS natively, so no problem there. On Windows, there is a program called ext4 explorer or something along those lines to browse and copy from a Linux filesystem (which is usually ext4). Don’t forget to remove the boot entry too after you’re done removing the partition.

      There is also the suggestion to use a live environment, but I didn’t recommend it since the experience can be lacking and it is more hassle in and of itself.
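      The GRUB fix in the easy case looks something like this (a hedged sketch; file paths differ per distro, e.g. Fedora uses /boot/grub2/grub.cfg):

      ```shell
      # Regenerate the GRUB menu so os-prober can add the Windows entry.
      # Assumes a GRUB-based distro; paths are the common defaults.
      echo 'GRUB_DISABLE_OS_PROBER=false' | sudo tee -a /etc/default/grub
      sudo grub-mkconfig -o /boot/grub/grub.cfg   # rescans disks for other OSes
      ```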

      • Cethin · 1 month ago

        I want to add that Windows sometimes has its own ideas and decides it owns the disk. I had a dual boot with Windows and Linux, and a Windows update fucked up the file system. I was able to recover almost everything without much issue, though it did require some extra tools and some knowledge. The boot partition I never recovered, though. (I was able to fix it enough to boot into the Linux install again, but not Windows, no matter what I tried.)

        This was about a year ago, maybe a bit more. The reason I was still dual booting was gaming, but by that point gaming on Linux was perfectly fine for me to ditch Windows, so I just grabbed all the files I needed to keep and set the drive up fresh with a clean install.

        • lud@lemm.ee · 1 month ago

          In general, dual booting Windows and Linux on the same disk is risky.

      • cordlesslamp@lemmy.today · 1 month ago

        Thanks, but on second thought I don’t want to risk anything, as I’m not quite the “technical” kind. I don’t even know how to dual boot two different Windows versions, and I don’t think I’d be able to fix it if anything broke.

        So I’ll buy another cheap SSD and put Linux on it with my old SSD unplugged. Then I’ll choose the boot drive from the firmware boot menu.

        I’m damn sick of Windows BS. I hope this’ll work out.

        • bitfucker@programming.dev · 1 month ago

          Yeah, that’s fair. But I will still recommend that anyone trying out Linux and hitting a problem consult the Arch Wiki when they can. What they have there is amazing, and it will also increase your technical understanding of how your system works over time. If you don’t hit any problems while running Linux, that is good too; it just means Linux for the masses is coming closer.

          For some distro recommendations: if you love to tinker, I’d say go Arch. You will learn a lot about your computer; it’s how I learned about mine and got the know-how for a lot of things. If you don’t have the time to tinker, I’d recommend Bazzite. I’ve read their documentation and came to the conclusion that if anything goes wrong it is easy to recover from, it has a great community, and it is based on a solid distro.

    • smnwcj@fedia.io · 1 month ago

      I’ll add that I’d highly recommend making a backup before doing anything. You can also more safely try out Linux in a virtual machine.

    • Zink@programming.dev · 1 month ago

      I just installed Linux Mint into a dual boot setup recently. Unsurprisingly, their install process made it pretty easy to partition the drive and have everything play nice together.

    • mexicancartel@lemmy.dbzer0.com · 1 month ago

      Yes, I did this less than a week ago.

      I shrank the main Windows partition (the C: drive) to only about 70 GB, since I don’t want to use it much at all, then made a live USB and went with custom partition selection in the installer. You have to assign certain partitions for Linux: the /boot/efi mount point should be set to the Windows boot partition so both OSes show up in the bootloader.

      Then you have to create at least a root and a swap partition, and you can have a separate home partition if you want to install a different Linux distro later without losing the data from the first one.

    • celeste@lemmy.world · 1 month ago

      Yes. Other options for trying Linux while keeping Windows are the Windows Subsystem for Linux (WSL) or booting a live system from a USB stick.

      • cordlesslamp@lemmy.today · 1 month ago

        Thanks. What I want to try out is the gaming capabilities, and I don’t know if a VM or a live USB can do that reliably.

        I heard that AMD GPUs are better with Linux, right?

        • Land_Strider@lemmy.world · 1 month ago

          I’ve been trying out Mint (Cinnamon) for some months now. I have an AMD Ryzen 5 3600 CPU and an AMD Radeon 6700 XT graphics card, both of which work splendidly on Mint out of the box. This installation is my first ever attempt at using Linux, with dual booting on top of it (on the same SSD, with partitioning), but I’d say it set up more nicely than any Windows install I’ve ever done over the years. Writing the .iso file to a USB drive was a bit different from what I’m used to with Rufus on Windows, but Rufus can write it.
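          If you’re writing the USB from an existing Linux machine instead of Rufus, the standard route is dd (a hedged sketch; /dev/sdX is a placeholder, so confirm the device with lsblk first, because dd overwrites the target without asking):

          ```shell
          # Write a Linux ISO to a USB stick. Double-check the target device!
          lsblk                                  # identify the USB stick, e.g. /dev/sdX
          sudo dd if=linuxmint.iso of=/dev/sdX bs=4M status=progress conv=fsync
          ```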

          Mint (Cinnamon) is based on Ubuntu, which is itself a heavily modified Debian that still keeps good surface-level compatibility with it.

          While Arch is great and all, if you are looking for a lifeline after years of being a Windows user, having finally decided not to move on to the next Windows version because of all the stuff they keep breaking and all the ad and data mining they do, Mint is a great starting distro. It installs with all the hardware drivers present, at least for AMD hardware, though Nvidia should work too. There’s no need to install anything extra to get a modern working environment; as long as the OS installation goes correctly and it boots up, you are good to go.

          As for regular stuff:

          1. Libre Office is pre installed, and I find it pretty good even tho I had quite the dislike for it before. Select a theme and a layout preset for the toolbar, you are right in your element as if you are continuing to use MS Office.

          2. Gaming with Steam is just turning on one setting in Steam settings, the compatibility tab (Proton), and that’s it. Most games work out of the box. For others, check ProtonDB for what people say about the game. They usually work, or there is a little basic fiddling required at best. I can play Hunt: Showdown with Easy Anti Cheat without a hassle on it. Just another little Proton file installed, that’s all.

          3. For Windows-only programs, you can use Wine. Wine works in the background, and when properly installed, it allows you to just double click any .exes and run them. Programs can be a bit slower than using them on Windows, but most of them work on Linux with Wine if it is what matters to switch from Windows. You can play a lot of non-Steam games through that, too.

          4. Mint has a Microsoft Store-like program repository where you can install programs and their dependencies with one click. This works well most of the time, but the Flatpak versions of programs can sometimes be problematic. I had Steam, Discord and Wine installed through it, and each had problems to some extent. For those, I switched to grabbing .deb installation files from their own websites, or in the case of Wine, followed the instructions on its website using a few terminal commands, which is no more complicated than using the Registry Editor or Group Policy Editor on Windows.
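          As a sketch, installing a downloaded .deb from the terminal goes like this (the file name is a made-up example):

          ```shell
          # Install a downloaded .deb, pulling in its dependencies automatically.
          # The ./ prefix matters: it tells apt this is a local file, not a package name.
          sudo apt install ./discord-0.0.1.deb

          # Or the older two-step way:
          sudo dpkg -i ./discord-0.0.1.deb
          sudo apt -f install   # fix up any dependencies dpkg couldn't resolve
          ```
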

          5. Most other common stuff has good alternatives, each with its own upsides and downsides: MPC to VLC, Photoshop to GIMP, MS Office to LibreOffice, etc. Internet forums have many detailed answers on these, or you can always ask for thoughts yourself. There usually is an alternative.

          One thing to keep in mind: as Mint Cinnamon is based on Ubuntu, you can use answers written for Ubuntu most of the time. While doing so, keep the following as a cheatsheet when troubleshooting or implementing things:

          Mint (Cinnamon) 21 and above are based on Ubuntu 22.04 LTS, called Jammy, not Ubuntu 20.04 LTS, called Focal. Almost all answers for 22.04 LTS will work on Mint Cinnamon, and all repositories and programs for it will work on Mint, too. Answers for 20.04 LTS, or the newer 24.04 LTS, will often still apply, but they are not exactly what you are using.

          Mint’s default desktop environment is Cinnamon itself, a fork of GNOME 3, not KDE, so keep that in mind when looking for answers: GNOME-oriented answers often carry over, KDE-oriented ones usually don’t. It also uses X11 (Xorg) by default for display, not Wayland.

        • Cethin
          link
          fedilink
          English
          arrow-up
          1
          ·
          edit-2
          1 month ago

          Both of those will have worse performance, but I don’t see why they wouldn’t work. Whenever the game needs to grab more data, it’ll have to go to the USB to get it, which is slow. Running a game that’s already stored on the internal disk would be better (though that takes more effort and knowledge than just installing Steam and letting it install the game locally on your Linux drive), but the system data will still be slow. If you have a lot of RAM, the disk cache will reduce how often data has to be re-read, so the issues will ease off after boot.

    • lengau@midwest.social
      link
      fedilink
      arrow-up
      1
      ·
      1 month ago

      One of the most important things to recognise before I start: Don’t try to make something permanent right now. None of this needs to be written in stone. Choose what’s going to be best for you right now and know that in a few weeks or months you might want to change it. With that in mind:

      What do you want out of Linux right now? A development system? Are you looking to see what it would be like to move away from Windows? Something else?

      Let’s start with the development system. Let’s say you’re comfortable on Windows and just want to do a few things that are easier or more convenient on Linux. In that case, you probably want Windows Subsystem for Linux. This will get you a bunch of things, including the ability to quickly and easily try out a bunch of distributions. Of course, this is going to be primarily a command line experience. You’re not going to get the “full experience” with a desktop environment, etc. But if you just “need Linux for a couple of things,” this is a great intro.
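      If you go the WSL route, getting started from an administrator PowerShell or Command Prompt looks like this (the distro name is just an example):

      ```shell
      # Install WSL with the default distribution (Ubuntu):
      wsl --install

      # See which distributions are available to install:
      wsl --list --online

      # Install a specific one, e.g. Debian:
      wsl --install -d Debian

      # List installed distributions and which WSL version each uses:
      wsl -l -v
      ```

      After a reboot, the installed distro shows up as its own entry in Windows Terminal, so trying several side by side is cheap.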

      Next, let’s say you want to try Linux out, see what the desktop is like, etc. This is a great opportunity to try a virtual machine. You’ll have limitations (less hardware access, maybe not as smooth a desktop as if it were on the hardware directly), but it’s a great way to play with distributions, especially if you want to explore multiple distros. (I’ll get to distros below)

      Got a distro you want and want to try it as your “main environment” for a while? Other folks have mentioned how to dual boot. Here, the most critical part in my opinion is to put your important data onto a third partition that’s easily accessible to both. On Linux, I’d suggest bind mounting directories from that partition in your home directory. If you want to wipe an OS later it’ll be a bit rough, but you can do it. You’ll just need to boot from a live USB to do it, and of course be very careful about what partitions you delete.
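      As a sketch of that bind-mount approach (the device name and paths here are assumptions; check yours with `lsblk`):

      ```shell
      # Mount the shared data partition somewhere stable (device name is an example):
      sudo mkdir -p /mnt/data
      sudo mount /dev/sda3 /mnt/data

      # Bind a directory from the shared partition into your home directory:
      mkdir -p ~/Documents
      sudo mount --bind /mnt/data/Documents ~/Documents

      # To make both permanent, add lines like these to /etc/fstab
      # (use ntfs-3g if the shared partition must also be readable from Windows):
      #   /dev/sda3            /mnt/data            ntfs-3g  defaults  0  2
      #   /mnt/data/Documents  /home/you/Documents  none     bind      0  0
      ```
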

      Now, for distros:

      Everyone is going to recommend their pet distro, and to that end I recommend [REDACTED]. But! Here’s my actual guide for selecting a distro:

      1. Got a friend who’s willing to spend a decent amount of time helping you? Go with whatever they suggest, at least for now. It’s okay if it’s not where you’ll be eventually. What they’re familiar with right now will speed up their ability to help you, which will speed up your learning. What they use may well not be where you end up and that’s okay. I do however have two exceptions to this: first, if they suggest Gentoo or NixOS as your intro distro, find someone else. Gentoo and NixOS are both fantastic, but they are very much not beginner distros. In 6 months or a year though, they might be something you want to play with if you’re interested in doing a deep dive into Linux. Second, have them with you while you’re doing the install. You want to be doing the install, but they should be there to guide you and answer questions.
      2. Doing this on your own? Go with a beginner friendly distro. The main recommendations I have here are Ubuntu spins or Fedora spins. There may well be people who reply to my comment spewing hate about one or both of those recommendations, and while there’s controversy about both of these, at the end of the day they’re both great. (Conflict of interest declaration: I work for the company that makes one of those distros, and the other one is some of our biggest competition. I applied for this job in part because I thought that one of the things the community loves to hate about one of these was Great, Actually™, but I wanted to improve some of the things that I think are actually valid criticisms.)

      If internet randoms tell you “X is garbage, don’t use it,” feel free to disregard them. Most Linux distros are great. They all have smart, dedicated people working on them, and they each have their own vision of how they want it done. These ideas conflict sometimes, but that’s okay.

      And one final thing… Don’t fight against your distro’s way of doing something. At least not now. Most people telling you to do something that works against the distro are doing so for ideological, not practical, reasons. You don’t need to get involved in ideological wars - enjoy Linux for its positives!

      • AnUnusualRelic@lemmy.world
        link
        fedilink
        arrow-up
        2
        ·
        1 month ago

        Those are wise words.

        Remember that in the end, all the distributions end up doing and installing pretty much the same thing (from the user’s pov). It doesn’t matter all that much what you use. Most of the major ones work just fine.

      • Cethin
        link
        fedilink
        English
        arrow-up
        1
        ·
        edit-2
        1 month ago

        This comment is good, but it’s very much the “scared of change” comment. It recommends the smallest amount of change possible, which might be good for some people but just diving in will probably be a better introduction.

        You don’t learn how to swim by sitting in a bath tub. You have to get into the water. Maybe wear some safety gear (dual boot or other options), but if you’re reasonably confident and/or competent you’ll be fine getting into Linux as long as you’re using one of the major distros.

        I assume almost everyone who has made it to Lemmy is competent enough with a computer to handle the transition to Linux. It really isn’t all that hard if you know how to use a search engine.