I’m posting this as more of a “fun thought” than anything else.

It’s generally considered a fact that Linux and many other open-source software projects are more efficient than their proprietary closed-source counterparts, specifically in terms of the code they execute.

There are numerous reasons for this, but a large contributing factor is that open-source, generally speaking, incentivises developers to write better code.

Currently, it can be argued that Linux is often less power-efficient than its closed-source counterparts, such as Windows and OSX. However, the reason for this lies not in the operating system itself, but rather in the lack of built-in support for certain hardware on Linux. Yes, it’s possible to make Linux more power-efficient by configuring things differently or optimizing certain features of your operating system, but it’s not uncommon to see posts from newer Linux laptop users reporting decreased battery life for exactly these reasons.

Taking a step back from this, though, and looking at a hypothetical world where Linux, or possibly other open-source operating systems and software, holds the majority market share globally, I find it an interesting thought: how much more power-efficient would the world be as a whole?

Of course, computing does not account for the majority of electricity and energy consumption, and I’m not claiming that we’d see radical power usage changes across the world; I’m talking specifically about computing. If hardware was built for Linux, and computers came pre-installed with optimizations and fixes targeted at their specific hardware, how much energy would we save each year?

Nanny Cath watching her YouTube videos, or Jonny scrolling through his Instagram feed, would be doing so in a much more energy-efficient manner.

I suppose I’m not really arguing much, just posting as an interesting thought.

  • magic_lobster_party@kbin.run

    It’s generally considered a fact that Linux and many other open-source software projects are more efficient than their proprietary closed-source counterparts

    This is not necessarily true. Linux had trouble with Nvidia Optimus, which is a GPU technology that seamlessly switches between power modes. Well, that is, if it works properly, which it didn’t on Linux. I haven’t heard about it in a while, so I assume it’s not a problem anymore.

    But it was a big problem where Linux laptops drained batteries much faster because they were using the GPUs at max capacity at all times.

    What I’m saying is that the efficiency of Linux depends on access to hardware features, and that might depend on the vendors of the drivers.

    Also, like it or not, if there’s one thing I envy about Macs, it’s their power efficiency. They usually last really long on one charge.

    • LalSalaamComrade@lemmy.ml

      I remember my laptop draining in under an hour, thanks to the crappy Nvidia Optimus on X11 and the unbearable screen tearing. This was during the time when I was using Fedora. I moved over to Wayland, without the nonfree drivers, because that was way more energy-efficient, and had my battery lasting for quite a long time.

    • DNAmaster10@lemmy.sdf.org (OP)

      Yes, I probably should have phrased that as “are often more efficient” rather than implying that this is always the case. I do think, though (and I mentioned this somewhere else), that it’s quite a hard comparison to make. I’d probably make the argument that if the driver itself was the issue, open-sourcing the driver would likely (and that’s a “likely” going off an assumption which I can’t back up) make it more efficient.

      Generally speaking, my point does still apply for fully open-source software which has been developed specifically for Linux. Unfortunately, we won’t be seeing much mainstream Linux-bespoke software for a while, at least not until the year of the Linux desktop finally arrives.

      I completely agree with what you’re saying, though.

  • ashaman2007@lemm.ee

    I wonder how this calculus changes with the dawn of AI built into the OS… will a Linux system that avoids all that nonsense end up being more energy efficient?

  • MonkderVierte@lemmy.ml

    My ThinkPad, which would run 8 hours on Windows 7, runs 10 hours with Linux, despite the battery getting old.

  • Dave.@aussie.zone

    Energy efficiency can be offset by extra computational ability though.

    E.g. Linux has a plethora of CPU and I/O schedulers and allows you to tune the system to maximise performance for your particular workload. Getting more performance than with the generic CPU and I/O schedulers provided in other OSes generally means more power consumption, unless you do some sort of “performance per watt” calculation to take that into account.
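
    To make the “performance per watt” idea concrete, here’s a minimal Python sketch (the sysfs path is the standard Linux one, but the function names and benchmark numbers are made up for illustration): it prints the I/O scheduler each block device is using, then turns a throughput figure plus an average power reading into work per joule.

    ```python
    # Minimal sketch: inspect I/O schedulers and compute a crude performance-per-watt figure.
    from pathlib import Path

    def list_io_schedulers() -> None:
        """Print the available/selected I/O scheduler for each block device (Linux sysfs)."""
        for sched in Path("/sys/block").glob("*/queue/scheduler"):
            device = sched.parts[3]  # /sys/block/<device>/queue/scheduler
            print(f"{device}: {sched.read_text().strip()}")

    def perf_per_watt(ops: float, seconds: float, avg_watts: float) -> float:
        """Operations per joule: throughput divided by average power draw."""
        return (ops / seconds) / avg_watts

    if __name__ == "__main__":
        list_io_schedulers()
        # Hypothetical benchmark result: 1.2 million operations in 300 s at an average of 18 W.
        print(f"{perf_per_watt(1_200_000, 300, 18):.1f} ops per joule")
    ```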

  • LarmyOfLone@lemm.ee

    Maybe the EU should pass some regulations that require hardware manufacturers to open-source their drivers for power saving on Linux?

    How is the situation for modern desktop hardware like the Intel 12000 series or AMD? Are there any problems there? I’d like to build a low-power desktop PC with Linux that can still game.

    • DNAmaster10@lemmy.sdf.org (OP)

      Open sourcing drivers would definitely go miles in helping to improve Linux’s optimization and power efficiency as a whole. Unfortunately, though, until the majority of software is written to be bespoke to Linux, we’re always going to be at a disadvantage. One day…

    • 9488fcea02a9@sh.itjust.works

      I have to use windoze for my work laptop…

      15% CPU constantly being used by bullshit telemetry processes. Gigs of RAM also being eaten up by bullshit processes, and “try copilot!!” popups every time I accidentally make the wrong gesture on the touchpad.

      My 9-year-old laptop with only 8GB of RAM running Debian is blazing fast in comparison.

  • AndrewZabar@lemmy.world

    One large factory or hospital uses more power in a day than most people will use over the course of a year or maybe more. In many cases more than all Linux users combined lol. It’s astonishing. Same goes for waste production, pollution etc. It is those places where energy consumption and waste need to change drastically.

    • DNAmaster10@lemmy.sdf.org (OP)

      You mention waste, and actually that’s another interesting point. It’s no secret that Linux works wonders on older hardware, precisely due to its high level of optimization and low storage space requirements. Therefore, it could be argued that using Linux and other FOSS would quite literally reduce the amount of e-waste produced each year, since people would be able to use the same computer for longer.

      • AndrewZabar@lemmy.world

        Absolutely! I’ve set up Linux on older machines and people love it. They don’t have to buy a new computer as they thought they would need to. They’re usually astonished at the difference! Plus, people on a very low budget can buy something for like $50 and get the exact same - or better - experience as someone who just spent $1,000 on a new Windows bloatware machine. For pretty much any needs other than gaming or heavy photo or video editing, they’re going to be the same speed.

        I was running a 13-year-old computer for a while and it was absolutely instant for everything. Music, video playback, web browsing, email, casual games, etc. all ran flawlessly, and my desktop was absolute eye candy with KDE and all kinds of custom stuff. Meanwhile Winblows 11 users have fucking anxiety attacks with all the shit that gets shoved in their faces, constantly being interrupted or inconvenienced. For what? Oh, and I was also able to use a printer immediately upon plugging in the USB, whereas Windows 11 refused to recognize the device no matter what drivers we tried.

        lol it’s astonishing and most people have no idea they’re being absolutely ass-raped by Microsoft with their toxic waste OS.

        • DNAmaster10@lemmy.sdf.org (OP)

          I think that’s the big thing. The vast majority of computer users need little more than a bit of word processing, YouTube, maybe some online banking. Beyond that? Nothing at all.

          These tasks require such a ridiculously small amount of computing power when compared to other tasks, such as gaming and video editing, that 90% of the power their computer has is just not needed, and is instead being consumed by Windows.

          • AndrewZabar@lemmy.world

            Yup! That’s why Windows and other software has to keep getting more and more bloated. Otherwise nobody would buy newer hardware all the time. I mean there’s always a need for new hardware, but driving the consumerism aspect of it, getting people to waste money en masse, requires creating this false need. It’s absurd, wasteful, despicable and only exists because companies aren’t satisfied with millions when they can get billions.

            It also relies on people being complete braindead morons. Hence its success.

      • DNAmaster10@lemmy.sdf.org (OP)

        I agree, and I did mention at the end of my post that I’m not saying we’d see any radical changes in energy consumption. At the end of the day, manufacturing, agriculture, transportation etc. use far more power than any computing activity currently does, and although that could change in the future, I do still think it’s an interesting thought to have.

      • AndrewZabar@lemmy.world

        Doesn’t matter. I’m not saying we shouldn’t bother - of course we should. I’m just saying the world on the whole is only truly going to be saved when we force these colossi to change.

  • IsoSpandy@lemm.ee

    My laptop came with Win11 preinstalled… I used it like that for about 4 months. I can very confidently say installing Linux increased the battery life of my laptop by about 20%.

  • IsoKiero@sopuli.xyz

    Interesting thought indeed, but I highly doubt that the difference is anything you could measure, and there’s a ton of contributing factors, like what kind of services are running on a given host. So, in order to get a reasonable comparison, you would need to run multiple different pieces of software with pretty much identical usage patterns on both operating systems to get any kind of comparable results.

    Also, the hardware support plays a big part. A laptop with dual GPUs and “perfect” driver support on Windows would absolutely wipe the floor with a Linux install which couldn’t switch GPUs on the fly (I don’t know how well that scenario is supported on Linux today). Same with multicore CPUs and their efficient usage, but I think the operating system plays a much smaller role there.

    However, changes in hardware, like ARM CPUs, would make a huge difference globally, and at least traditionally that’s the area where Linux shines on compatibility and why Macs run on batteries for longer. But in reality, if we could squeeze more out of our CPU cycles globally to do stuff more efficiently, we’d just throw more stuff at them and still consume more power.

    Back when cellphones (and other rechargeable things) became mainstream, their chargers were so inefficient that unplugging them actually made sense, but today our USB bricks consume next to nothing when they’re idle, so it doesn’t really matter.

    • DNAmaster10@lemmy.sdf.org (OP)

      Yes, massively. At least with current data, I don’t imagine it would even be possible to measure this on a large scale, especially given the variation in what a computer is actually trying to do. I think it’s made even harder by the fact that software is often targeted at Windows or OSX rather than Linux, so even benchmarking software is near impossible unless you’re writing software which is able to leverage the specific unique features of Linux which make it more optimized.

      • IsoKiero@sopuli.xyz

        Linux, so even benchmarking software is near impossible unless you’re writing software which is able to leverage the specific unique features of Linux which make it more optimized.

        True. I have no doubt that you could set up a Linux system to calculate pi to 10 million digits (or something similar) more power-efficiently than a Windows-based system, but that would involve compiling your own kernel leaving out everything unnecessary for that particular system, shutting down a ton of daemons that are commonly run on a typical desktop, and so on, and you’d waste far more power on testing than you could ever save. And it might not even be faster, just less power-hungry, but no matter what, that would be far, far away from any real-world scenario and instead be a competition to build hardware and software to do that very specific thing with as little power as possible.

        • DNAmaster10@lemmy.sdf.org (OP)

          I think the other difficulty would be the requirement of knowing both Linux and Windows through and through to ensure the code you’re writing is leveraging all the OS-specific advantages. But yes, it’s definitely an interesting hypothetical.

    • ReveredOxygen@sh.itjust.works

      At least with AMD on Wayland, GPU offloading works seamlessly. But I’m not sure if the GPU is actually powered off when I’m not using it; my use case is an eGPU rather than a dual-GPU laptop, so I don’t notice the battery impact from it. I don’t know what the situation is with Nvidia or Xorg.

  • astro_ray@lemdro.id

    I think LF Energy published a report on how open source is more sustainable. Although I don’t quite remember the details, the report was more focused on sustainable projects than on Linux as such. If you are interested, you can find more studies that explore the idea more quantitatively.

  • lurch (he/him)@sh.itjust.works

    Let’s just say, if you compile a lot just for yourself (looking at you, LFS and Gentoo users), you get spikes of inefficiency, and it will take a while to even those out with your hopefully superior configuration and compiler flags.

  • BaumGeist@lemmy.ml

    I’m a big fan of the idea of efficient computing, and I think the bigger power savings for end users would come from hardware. I don’t need an intel i9-nteen50 and a GeForce 4090 to mindlessly ingest videos or browse Lemmy. In fact, I could get away with that using less power than my phone uses; we really should move to the ARM model of low-power cores suitable for most tasks and performance cores that only turn on when necessary. Pair that with less bloatware and you’re getting maximum performance per instruction run.

    SoCs also have the benefit of power efficient GPU and memory, while standardizing hardware so programmers can optimize to the platform again instead of getting lost in APIs and driver bloat.

    The only downside is the difficulty of upgrading hardware, but CPUs (and GPUs) are basically black boxes to the end user already, and no one complains about not being able to upgrade just the L1 cache (or VRAM).

    Imagine a future where most end-user MOBOs are essentially just a socket for a socketed-SoC standard, some M.2 ports, and of course the PCI slots (with the usual hardwired ports for peripherals). Desktops/laptops would generate less waste heat, computers would use less electricity, graphical software development would be less of a fustercluck (imagine the man-hours saved), there’d be less e-waste (imagine not needing a new MOBO for the new chipset if you want to upgrade your CPU after 5 years), and you’d be able to upgrade laptop PUs.

    Of course the actual implementation of such a standard would necessarily get fuckered by competing interests and people who only want to see the numbers go up (both profit-wise and performance-wise) and we’d be back where we are now… But a gal can dream.

  • Fisch@discuss.tchncs.de

    Couldn’t you just compare the energy usage of laptops or desktop PCs with native support running Linux with their energy usage when running Windows? I have a PC with an AMD GPU and CPU, so my hardware is fully supported; I could actually test it. I think a laptop would be better to test on tho, since a desktop PC might not be trying to use as little power as possible in the first place.
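
    For anyone who wants to try, here’s roughly how the Linux side could be logged (a minimal sketch; it assumes the battery shows up at /sys/class/power_supply/BAT0 with energy_now in µWh — some machines expose charge_now in µAh instead, and the Windows side would need its own tooling):

    ```python
    # Minimal sketch: report how many watt-hours the battery dropped over a fixed interval.
    import time
    from pathlib import Path

    ENERGY = Path("/sys/class/power_supply/BAT0/energy_now")  # assumed path, value in microwatt-hours

    def read_wh() -> float:
        """Remaining battery energy in watt-hours."""
        return int(ENERGY.read_text()) / 1_000_000  # µWh -> Wh

    def drained_over(minutes: float) -> float:
        """Run the usual workload (unplugged) while this sleeps, then report Wh consumed."""
        start = read_wh()
        time.sleep(minutes * 60)
        return start - read_wh()

    if __name__ == "__main__":
        print(f"Drained {drained_over(30):.2f} Wh in 30 minutes")
    ```

    Running the same workload for the same length of time on both operating systems and comparing the two numbers would at least give a ballpark figure.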

    • DNAmaster10@lemmy.sdf.org (OP)

      You probably could, but realistically there’s not enough data out there to do this.

      Still, I’ll mention that even with an AMD CPU and GPU, Linux does often lack support or configuration out of the box, to massively varying degrees. The well-known example of this is Nvidia’s proprietary GPU drivers, which historically have been a massive issue, and will probably continue to be for a while even with Nvidia exposing more of its GPU driver source code.

      The kind of support which I’m referring to, though, extends beyond this in many ways. One thing I didn’t mention, for example, is software support for Linux. Many Linux ports fail to leverage the full potential of Linux, either because the developers don’t know how to, or because they don’t care to. I recently read a Factorio dev blog relating to this issue. The developers described a very specific optimization which can be applied on Linux when saving games and which, in short, allows games to be saved concurrently, improving performance. Using this feature requires programming specifically for Linux. While Proton offers incredible gaming support on Linux today, this sort of thing is not something which Proton can magically make work on its own.
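
      As I understand the blog post, the trick relies on fork(): the forked child process gets a copy-on-write snapshot of the game state and writes the save file while the parent keeps simulating. Here’s a rough Python illustration of that general idea (Factorio itself is C++ and the details differ; the names below are made up):

      ```python
      # Rough illustration of fork-based concurrent saving (Unix/Linux only, not Factorio's code).
      import json
      import os

      def save_game_async(state: dict, path: str) -> int:
          pid = os.fork()              # child gets a copy-on-write snapshot of `state`
          if pid == 0:                 # child: serialise the frozen snapshot and exit
              with open(path, "w") as f:
                  json.dump(state, f)
              os._exit(0)              # exit without running parent-level cleanup
          return pid                   # parent: returns immediately, the game loop continues

      if __name__ == "__main__":
          game_state = {"tick": 123456, "entities": ["assembler", "inserter"]}
          child = save_game_async(game_state, "/tmp/save.json")
          # ... the simulation would keep running here while the child writes the save ...
          os.waitpid(child, 0)         # reap the child eventually to avoid a zombie
      ```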

      The same sort of idea often extends out into other areas of software and hardware. Applications which have been directly ported to Linux without much consideration often fail to implement these sorts of additional features and optimizations.

      The issue of hardware is, indeed, slightly different. One key thing which is often overlooked when assessing this sort of thing is the optimizations and tweaks applied by the hardware manufacturers and vendors themselves. These tweaks are often highly specific to the hardware they’re used for, and usually the vendors will only apply them to Windows, or whatever operating system the laptop or computer ships with. Going back to the driver issue, the same thing applies: GPU manufacturers will often release high-quality drivers aimed specifically at Windows, offering optimizations which specifically benefit Windows. There’s almost zero incentive for these companies to release the same, or on-par, drivers for Linux, due to its smaller market share.

      What this means is that a much larger amount of work needs to be done by the Linux community to create or improve drivers for specific hardware. Drivers which work out of the box on Windows will not work at all on Linux, and companies which offer Linux alternatives for their drivers often invest significantly more time in their Windows counterparts. This is only complicated by the fact that many hardware manufacturers keep their driver source code highly secretive, so writing a new driver or altering an existing one for Linux is significantly more difficult.

      AMD, as you mentioned, is often much better than alternatives such as Nvidia when it comes to releasing these “secrets” or source code, which makes developing AMD drivers for Linux significantly easier, allowing driver developers to apply many more optimizations than they would otherwise be able to.

      In conclusion, then, the only way this can truly be fixed is if these companies choose to support Linux as much as they do Windows, which unfortunately won’t happen until there’s some sort of monetary incentive (i.e. Linux having a majority market share).