A while back there was some debate about the Linux kernel dropping support for some very old GPUs (I can’t remember the exact models, but they were roughly from the late ’90s).

It spurred a lot of discussion on how many years of hardware support is reasonable to expect.

I would like to hear y’all’s views on this. What do you think is reasonable?

The fact that some people were mad that their 25 year old GPU wouldn’t be officially supported by the latest Linux kernel seemed pretty silly to me. At that point, the machine is a vintage piece of tech history. Valuable in its own right, and very cool to keep alive, but I don’t think it’s unreasonable for the devs to drop it after two and a half decades.

I think for me, a 10 year minimum seems reasonable.

And obviously, much of this work is for little to no pay, so love and gratitude to all the devs that help keep this incredible community and ecosystem alive!

And don’t forget to pay for your free software!!!

  • ShortN0te@lemmy.ml · 1 month ago

    I do not think this can be determined in the tech space by ‘age’ alone. Popularity, usability, and performance are much more important factors.

    As was already brought up in another comment, the GTX 10-series is a nice example. The GTX 1080 is, after 8 years, still a valid GPU for gaming, and the 1050 is a nice little efficient, cheap video encode engine that supports almost all modern widespread codecs and settings (except AV1). A rough sketch of that encode use case follows below.
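
    For example, here’s a minimal Python sketch of driving that encode block through ffmpeg’s NVENC encoders. The filenames and preset are placeholders, and it assumes an ffmpeg build compiled with NVENC support plus an NVIDIA driver installed:

    ```python
    import subprocess

    # Minimal sketch: re-encode a clip on the GPU's NVENC block instead of the CPU.
    # Pascal cards like the GTX 1050 expose h264_nvenc and hevc_nvenc encoders,
    # but no av1_nvenc, matching the "except AV1" caveat above.
    subprocess.run(
        [
            "ffmpeg",
            "-i", "input.mkv",     # placeholder source file
            "-c:v", "hevc_nvenc",  # HEVC encode on the GPU's dedicated hardware
            "-preset", "p5",       # middle-of-the-road speed/quality preset
            "-c:a", "copy",        # pass the audio stream through untouched
            "output.mkv",
        ],
        check=True,
    )
    ```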

    • SayCyberOnceMore@feddit.uk · 1 month ago

      I agree with this point: age isn’t the measure of usefulness; popularity is.

      Something might be 10 years old and used by many people… while something only 10 months old might no longer be used at all.

      Also, just a thought: if it’s “old”, it’s probably a standard too, so it probably doesn’t actually need much effort (relatively speaking) to maintain…