• kamen@lemmy.world · 8 points · 22 hours ago

    Regardless of the OS, if you’re using the computer for anything productive, the application software, not the OS, will eat the majority of the RAM anyway. If you’re looking at the minimum requirements, chances are you’re not looking to do anything besides browsing the web with 5 tabs open.
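
    To put a number on that claim on your own machine: on Linux the kernel exposes system memory stats in /proc/meminfo, so a few lines of Python can compare what's actually in use against the total. This is a rough sketch, not anything from the article; it assumes a kernel new enough (3.14+) to report MemAvailable.

```python
# Rough illustration (Linux-only): read /proc/meminfo and report
# how much of total RAM is currently in use. Values are in kB.
meminfo = {}
with open("/proc/meminfo") as f:
    for line in f:
        key, rest = line.split(":", 1)
        meminfo[key] = int(rest.split()[0])  # first token is the kB value

total_kb = meminfo["MemTotal"]
available_kb = meminfo["MemAvailable"]  # kernel's estimate of reclaimable+free
used_fraction = (total_kb - available_kb) / total_kb
print(f"{used_fraction:.0%} of {total_kb // 1024} MiB in use")
```

    Run it with a browser open and then closed and the difference usually dwarfs what the bare desktop uses.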

    It sucks though, I agree - software should get more efficient over time, just like hardware does. Out of curiosity, do we have anything more specific, e.g. how they tested this, which apps were running, and so on? Or maybe they now deem that more things should be running?

      • kamen@lemmy.world · 1 point · 30 minutes ago

        I remember that with Opera (before the switch to Chromium) I was able to open literally 100+ tabs on a machine with 1 gig of RAM. Sure, the web was simpler back then, but not by much.

    • Jason2357@lemmy.ca · 5 points · 18 hours ago

      Which is exactly what Ubuntu is doing. The desktop, and even most of the native desktop applications that come with it, will run just fine with 1 or 2 GB of RAM. If you use it like a 90s computer for 90s computer tasks, it works fine.

      In practice, however, users will open a web browser to some “modern” websites or a couple of Electron apps and have a very bad experience.

    • GamingChairModel@lemmy.world · 2 points · 19 hours ago

      It sucks though, I agree - software should get more efficient over time, just like hardware does.

      It generally does, for any given computing task. The problem is that software also tends to add more features over time, not least of which is support for new hardware that hits the ecosystem.