Those who don’t have the time or appetite to tweak/modify/troubleshoot their computers: What is your setup for a reliable and low-maintenance system?

Context:

I switched to Linux a couple of years ago (Debian 11/12). It took me a little while to learn new software and get things set up how I wanted, but I did, and it was fine.

I’ve had to replace my laptop, though, and install a distro with a newer kernel (Fedora 41) to make it work, but even so I’ve had to fix a number of issues. This has coincided with my having a lot less free time and being less interested in crafting my system, and more interested in using it efficiently for tasks and creativity. I believe Debian 13 will have a new enough kernel to support my hardware out of the box, and although it will still be a hassle to reinstall my OS again, I like the idea of getting it over with: starting again with something thoroughly tested and then not really having to touch anything for a couple of years. I don’t need the latest software at all times.

I know there are others here who have similar priorities, whether due to time constraints, age, etc.

Do you have any other recommendations?

  • EarlGrey@discuss.tchncs.de · +3 · 2 hours ago

    Debian XFCE or Xubuntu LTS.

    XFCE is stubbornly slow at introducing new features, but it is absolutely rock-solid. Hell, I don’t think they’ve changed their icon set in some 20 years.

    Debian and *buntu LTS are likewise slow to add features, focusing on stability instead.

  • oldfart@lemm.ee · +3 · edited · 5 hours ago

    Xubuntu LTS. I’ve been meaning to switch to Debian Stable when something breaks, but it’s my third LTS on the desktop and fifth on the laptop, and there has just been no opportunity. I’ve also learned to avoid PPAs and other third-party repos, and to just use AppImages when possible.

    You can run a kernel from Testing or even Sid, I believe, but yeah, that’s exactly what we want to avoid: tweaking.

    LTS is released every 2 years, for reference.

  • merthyr1831@lemmy.ml · +10 · 12 hours ago

    Fedora has been this for me. Maybe some tweaking every now and then to fix whatever edge cases I’ve run into, but it’s the least painful distro I’ve used so far.

  • Magiilaro@feddit.org · +4 · 10 hours ago

    My Arch Linux setups on my desktop and my servers are low-maintenance. I update my servers every month or so (unless a security issue is announced, which gets patched right away) and my desktop a few times a week.

    Nearly anything can be low-maintenance with the proper care and consideration.

    For your constraints I would just use Debian, AlmaLinux, or Linux Mint and stick with the official packages, Flathub, and the default configuration at the system level. Those are low-maintenance out of the box in general.

  • cerement@slrpnk.net · +22 · 1 day ago

    • yet another vote for Debian Stable
    • seconding the other comment: if you need a newer kernel for hardware reasons, use backports
    • Xfce
    • stick to Flatpaks when you want to try out a new program (if you like it, then decide whether to install it with apt)
    • don’t confuse “hasn’t been updated” with “hasn’t needed to be updated”
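
    The “try it as a Flatpak first” workflow above boils down to a couple of commands (the app ID here is just an example):

    ```shell
    # Search Flathub for the program you want to try
    flatpak search inkscape

    # Install it from Flathub; it stays sandboxed and off the apt package set
    flatpak install flathub org.inkscape.Inkscape

    # If it doesn't earn its keep, remove it and clean up unused runtimes
    flatpak uninstall org.inkscape.Inkscape
    flatpak uninstall --unused
    ```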
    • dino@discuss.tchncs.de · +3/-12 · 13 hours ago

      Such a bad comment. What does tinkering even mean? Not using any software besides the defaults? So only browsing and text apps? facepalm

      • Magiilaro@feddit.org · +8 · 10 hours ago

        Tinkering, by my personal definition, would mean installing third-party repositories for the package manager (or something like the AUR on Arch) or making configuration changes at the system level… Staying away from the root user (including su/sudo) as much as possible is good general advice, I would say.

    • umbrella@lemmy.ml · +9 · 1 day ago

      I want to try a distro other than Ubuntu, but the damn thing isn’t giving me a single excuse to format my system. It doesn’t break if you don’t fuck with it.

    • d00phy@lemmy.world · +19 · 1 day ago

      This really is the answer. The more services you add, the more of your attention they will require. Granted, for most services already integrated into the distro’s repo the added admin overhead will likely be minimal, but it can add up. That’s not to say the overhead can’t be managed; that’s why scripting and cron jobs, among other utilities, exist!
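
      As a sketch of that kind of automation (file path and schedule are just examples), a small weekly cron script can surface pending updates without any interactive tinkering:

      ```shell
      #!/bin/sh
      # /etc/cron.weekly/pending-upgrades (illustrative): refresh package
      # metadata and log what would be upgraded, for a quick weekly review
      apt-get update -qq
      apt list --upgradable 2>/dev/null >> /var/log/pending-upgrades.log
      ```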

      • umbrella@lemmy.ml · +2 · edited · 1 day ago

        I think it’s more about modifying system behavior, especially on desktop OSes. I have many local services running on my server, and if they’re set up right, it’s pretty much no maintenance at all.

  • lordnikon@lemmy.world · +22 · 1 day ago

    If you like Debian and just need a newer kernel, you can add the backports repository to your Debian install and then pull the kernel from there.
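
    On Debian 12 (“bookworm”), that backports route looks roughly like this, assuming the stock repository layout:

    ```shell
    # Enable the backports repository for the current stable release
    echo 'deb http://deb.debian.org/debian bookworm-backports main contrib non-free-firmware' \
      | sudo tee /etc/apt/sources.list.d/backports.list
    sudo apt update

    # Install the newer kernel (and firmware) from backports only when asked
    sudo apt install -t bookworm-backports linux-image-amd64 firmware-linux
    ```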

      • asap@lemmy.world · +2/-1 · edited · 9 hours ago

        It’s just Fedora CoreOS with some QoL packages added at build time. Not niche at all. The very minor changes made are all transparent on GitHub.

        Choose CoreOS if you prefer, it’s equally zero maintenance.

    • moontorchy@lemmy.world · +3 · 13 hours ago

      Yeah, sure. I was running Bluefin-DX. One day the image maintainers decided to replace something and things broke. uBlue is an amazing project and the team is trying hard, but it’s definitely not zero maintenance. I fear they are chasing too many uBlue flavours (recently an LTS one based on CoreOS) and spreading themselves thin.

      • j0rge@lemmy.ml · +3/-1 · edited · 4 hours ago

        If you depend on third-party modules you’ll end up with third-party maintenance. We didn’t purposely decide to break this; we don’t work at Nvidia.

        • moontorchy@lemmy.world · +2 · 2 hours ago

          Jorge, OP asked about “not having to really touch anything for a couple of years”. I am just sharing my experience. Big fan of containers and really appreciate your efforts of pulling containers tech into Linux desktop. Thank you!

          I don’t understand the answer though. Maybe I’m missing something here. There’s an official Bluefin-DX-Nvidia ISO, and nvidia-container-toolkit was part of that ISO.

          On a separate note, I liked the idea of the GTS edition, but a few weeks ago the ISO became unavailable pending some fix. At the same time I see loads of buzz about a new LTS edition, which is still in alpha. I feel confused.

          • j0rge@lemmy.ml · +1/-1 · edited · 1 hour ago

            I don’t understand the answer though.

            The answer is: if you’re depending on software that is closed and out of your control (i.e., you have an Nvidia card), then you should set your support expectations around that hardware and Linux accordingly.

            There are no GTS ISOs because we don’t have a reliable way to make ISOs (the ones we have now are workarounds) but that should be finished soon.

      • asap@lemmy.world · +3/-1 · edited · 9 hours ago

        🤷 I’ve been running Aurora and uCore for over a year and have yet to do any maintenance.

        You can roll back to the previous working build by simply restarting; it’s pretty much the easiest fix ever, and still zero maintenance (since you didn’t have to reconfigure or troubleshoot anything, just restart).
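
        On rpm-ostree-based systems like these, that rollback can also be done explicitly rather than from the boot menu; a sketch:

        ```shell
        # List deployments: the current image plus the previous known-good one
        rpm-ostree status

        # Make the previous deployment the default and reboot into it
        rpm-ostree rollback --reboot
        ```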

    • trevor@lemmy.blahaj.zone · +16/-1 · 1 day ago

      This is the way. The uBlue derivatives benefit from the most shared knowledge and problem-solving skills being delivered directly to users.

      Between that and using a declarative distrobox config, I get an actually reliable system with packages from any distro I want.
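
      As an illustration of that approach (container name, image, and packages are examples), a distrobox container can be recreated from a single command, which is what makes the setup effectively declarative:

      ```shell
      # Recreate a disposable toolbox with the needed packages baked in
      distrobox create --name deb-tools \
        --image docker.io/library/debian:stable \
        --additional-packages "git build-essential"

      # Enter it; your home directory is shared with the host
      distrobox enter deb-tools
      ```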

    • JustEnoughDucks@feddit.nl · +8/-1 · 1 day ago

      Doesn’t ucore also have to restart to apply updates?

      Unexpected, frequent restarts, as opposed to in-place updates, are not ideal for a server in terms of maintenance and uptime, unless your startup is completely automated and your drives are decrypted with an on-device keyfile; that probably only fits some threat models for security.

      The desktop versions are great!

      • Axum@lemmy.blahaj.zone · +13/-4 · edited · 1 day ago

        Not super ideal for a server as far as maintenance and uptime to have unexpected, frequent restarts

        This is such a weird take, given that 99.9% of people here are just running this on their home servers, which aren’t bound by an SLA, so it’s not like people need to worry about reboots. Just reboot once a month unless there’s some odd CVE you need to patch sooner rather than later.

        • dino@discuss.tchncs.de · +2 · 13 hours ago

          So why would somebody run that on their homeserver compared to tried and true staples with tons of documentation? 🍿

          • asap@lemmy.world · +2/-1 · edited · 10 hours ago

            It’s just Fedora CoreOS with some small quality-of-life packages added to the build.

            There’s tons of documentation for CoreOS and it’s been around for more than a decade.

            If you’re running a container workload, it can’t be beat in my opinion. All the security and configuration issues are handled for you, which is especially ideal for a home user who is generally not a security expert.

        • JustEnoughDucks@feddit.nl · +4 · 1 day ago

          That is very fair!!

          But on the other hand, 99.9% of users don’t read all of the change notes for their packages and don’t have notifications for CVEs. In that case, in my opinion just doing updates as they come would be easier and safer.

      • asap@lemmy.world · +7/-1 · edited · 1 day ago

        They won’t apply unexpectedly, so you can reboot at a time that suits. Unless there’s a specific security risk there’s no need to apply them frequently. Total downtime is the length of a restart, which is also nice and easy.

        It won’t fit every use-case, but if you’re looking for a zero-maintenance containerized-workload option, it can’t be beat.

      • notfromhere@lemmy.one · +1/-1 · 1 day ago

        Run k3s on top and run your stateless services on a lightweight Kubernetes; then you won’t care that you have to reboot your hosts to apply updates.

    • Churbleyimyam@lemm.ee (OP) · +3/-1 · 1 day ago

      I had problems with waking from sleep/hibernate, audio issues (total dropouts as well as distortion in screen-recording apps), choppy video playback and refusal to enter fullscreen, wonky cursor scaling, apps not working as expected or not running at all. I’ve managed to fix most of these or find temporary workarounds (grateful for flatpaks for once!) or alternative applications. But the experience was not fun, particularly as there was only a 2 week return window for the laptop and I needed to be sure the problems weren’t hardware design/choice related. And I’m finding it 50/50 whether an app actually works when I install it from the repo. There’s a lot less documentation for manually installing things as well and DNF is slow compared to apt…

      I don’t want to say for certain that Fedora as a distro is to blame but I suspect that it is. I miss my Debian days.

      • Domi@lemmy.secnd.me · +4 · 1 day ago

        (grateful for flatpaks for once!)

        That’s how I run my system right now. Fedora KDE + pretty much everything as Flatpak.

        Gives me a recent enough kernel and KDE version, so I don’t have to worry when I get new hardware or when new features drop, but it also restricts major updates to new Fedora versions, so I can hold those back for a few weeks.

        I made a similar switch to yours, but from Ubuntu to Fedora, because of outdated firmware and kernel.

      • ReversalHatchery@beehaw.org · +1 · edited · 1 day ago

        I had problems with waking from sleep/hibernate

        What graphics do you have? Don’t expect that to go away with Nvidia. No such issues on AMD, though, and Intel should be fine too.

    • dbkblk@lemmy.world · +1 · edited · 1 day ago

      This! Debian with GNOME (or another desktop) is the answer. Take an afternoon to make it yours, then forget it. You can use backported kernels on Debian to support newer hardware. Try that, or upgrade to Debian 13 right now by changing the sources file to trixie instead of bookworm. Note: if you use GNOME, let gnome-software handle updates for you (there’s an equivalent for KDE). If you use something else, configure unattended-upgrades for automatic updates.
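
      For the non-GNOME case, setting up unattended-upgrades on Debian is roughly this (the dpkg-reconfigure step writes the standard /etc/apt/apt.conf.d/20auto-upgrades file):

      ```shell
      sudo apt install unattended-upgrades

      # Enable the daily automatic security-update timer
      sudo dpkg-reconfigure -plow unattended-upgrades

      # Dry-run to confirm which pending updates it would apply
      sudo unattended-upgrade --dry-run --debug
      ```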

  • remer@lemmy.world · +23 · 1 day ago

    I’ve been distro hopping for decades and got exhausted with things constantly breaking. I’ve been using Mint for the past six months with zero issues. It’s so refreshing that everything just works.

    • Diplomjodler@lemmy.world · +8 · 1 day ago

      I second Mint. I’ve installed it on my laptop with zero issues, although that thing is pretty old, so your mileage may vary on newer hardware. But Mint comes with pretty up-to-date kernels these days, so it’s definitely worth a try.

  • GustavoM@lemmy.world · +10/-4 · 1 day ago

    You simply don’t do any maintenance whatsoever.

    t. Got an Arch Linux install on which I (rarely) run “sudo pacman -Syu --noconfirm”, and it works like a champ.

    • F04118F@feddit.nl · +5/-3 · edited · 1 day ago

      I used to lose my keys all the time. I don’t want to spend so much time looking for my keys, nowadays I mostly just leave them in the front door, I rarely lock it and it works like a champ.

      • GustavoM@lemmy.world · +2/-5 · 1 day ago

        Comparing PC maintenance to leaving your keys in the front door is overly dramatic, to say the least…

        …unless you work at NASA and/or your PC holds something so valuable/sensitive/high-priority that others would want to hack it “that badly”, which I (highly) doubt.

        • F04118F@feddit.nl · +3/-2 · 1 day ago

          No, it is.

          https://www.pandasecurity.com/en/mediacenter/consequences-not-applying-patches/

          And:

          You’re allowing for more attack vectors that would not be there if the system were to be patched. Depending on the severity of the vulnerability, this can result in something like crashes or something as bad as remote code execution, which means attackers can essentially do whatever they want with the pwned machine, such as dropping malware and such. If you wanna try this in action, just spin up a old EOL Windows machine and throw a bunch of metasploit payloads at it and see what you can get.

          While nothing sensitive may be going to or on the machine (which may seem to be the case but rarely is the case), this acts as an initial foothold in your environment and can be used as a jumpbox of sorts for the attacker to enumerate the rest of your network.

          And:

          Not having vulnerability fixes that are already public. Once a patch/update is released, it inherently exposes to a wider audience that a vulnerability exists (assuming we’re only talking about security updates). That then sets a target on all devices running that software that they are vulnerable until updated.

          There’s a reason after windows Patch Tuesday there is Exploit Wednesday.

          Yes, a computer with vulnerabilities can allow access to others on the network. That’s what it means to step through a network. If computer A is compromised, computer B doesn’t know that so it will still have the same permissions as pre-compromise. If computer A was allowed admin access to computer B, now there are 2 compromised computers.

          From https://www.reddit.com/r/cybersecurity/comments/18nt1o2/for_individuals_what_are_the_actual_security/

          • unhrpetby@sh.itjust.works · +1 · 18 hours ago

            Depends on the environment surrounding the door, as well as the environment surrounding the computer.

            Some people simply care less about their computer security. The debate stops there. Security operates on a foundation of what you want to secure.

            By comparing two environments of someone’s life you know little about, you are commenting from ignorance.

            • F04118F@feddit.nl · +1 · 12 hours ago

              If they don’t keep any private data on any computer that trusts their home network/wifi and don’t do taxes or banking on those, there’s no problem.

              But if they do, I maintain that the analogy is correct: their unpatched machine is an easy way to digitally get access to their home, just like an unlocked door is to a physical home.

          • GustavoM@lemmy.world · +1/-4 · 24 hours ago

            Nice cherry-picking/moving the goalposts, but that is not how refuting works. A PC at NASA has a much higher “threat level” than my Orange Pi Zero 3, just chilling in the background. Which means a potential “security hole” may prove harmful for those PCs… but it will definitely not hurt me in the slightest.

            And before you parrot more links and/or excuses… yes, I’m not denying their existence. I’m just saying they are there… but, well… “who cares”? If anything, it’s much faster to set my distro back up “like nothing ever happened” than to perform any “maintenance” whatsoever. Again, “common sense antivirus” reigns supreme here: know what you are doing, and none of these things will matter.

            • F04118F@feddit.nl · +2 · edited · 13 hours ago

              You keep using the word “maintenance”. All I’m worried about is not installing any security patches for months.

              The problem that I tried to highlight with my “cherry picking” is:

              • Running a machine with open vulnerabilities for which patches exist also “paints a target on your back”: even if your data is worthless, you are essentially offering free cloud compute.
              • But mostly, a single compromised machine can be an entrypoint towards your entire home network.

              So unless you have separated this Orange Pi into its own VLAN or done some other advanced router magic, it can reach, and thus more easily attack, all your other devices on the network.

              Unless you treat your entire home network as untrusted and have everything shut off on the computers where you do keep private data, the Orange Pi will still be a security risk to your entire home network, regardless of what can be found on the little machine itself.

    • Daniel Quinn@lemmy.ca · +9 · 1 day ago

      Ubuntu is literally just Debian unstable with a bunch of patches. Literally every time I’ve been forced to use it, it’s been broken in at least a few obvious places.

        • Naich@lemmings.world · +3 · edited · 1 day ago

          Ubuntu comes with non-free drivers, which can make it easier to set up and use. I use Debian on my server and Ubuntu on my laptops, and both have been pretty reliable for me. LTS versions of Ubuntu are pretty bug-free but have older versions of software. I’d guess that Daniel was using a non-LTS release, which is a bit more bleeding edge. The LTS ones strike a good balance between modernity and stability.

        • Daniel Quinn@lemmy.ca · +3/-1 · 1 day ago

          Absolutely. I’ve been running Debian for literally decades both personally & professionally (on servers) and it’s rock-solid.

          On the desktop it’s also very stable, but holy fuck is it old. I’m happy to accept the occasional bug in exchange for modern software, though, so I use Arch (btw) on the desktop.

    • spaghettiwestern@sh.itjust.works · +1 · 1 day ago

      I am currently using a recent Ubuntu live USB for backups, and a “serious” error window pops up every time I boot it. Same experience with Ubuntu installations. For me at least, Ubuntu isn’t anything close to stable.

  • 9488fcea02a9@sh.itjust.works · +7 · 1 day ago

    My desktop has been running debian for 5 years no problem including 2 major debian version upgrades, and a new(er) GPU.

    I had an old laptop that ran the same debian install for 8 years. All upgrades in place, no reinstalls.

    Boring, and it works. Stable plus backports should cover the majority of people who need support for new hardware.