• N0x0n@lemmy.ml
    3 days ago

    have at least 160GB of combined VRAM and system RAM.

    Yeaaaah, so it’s possible but at the same time not accessible? Those things are fucking power hungry, and now I get why we all needed to buy LED lights and 12-volt systems and switch off every standby device: AI!

    And I, dummy, innocently thought it was for the planet… 🥲

    • corsicanguppy@lemmy.ca
      3 days ago

      And I innocently thought it was for the planet… 🥲

      It was maybe never for the planet. We accidentally got LED lights, but the entire effort may have been to provide a scapegoat for the energy issues the oil barons didn’t want to address. Dig a bit and you’ll find some neat charts and timelines, and maybe you’ll wonder too whether it was all a smokescreen, the way recycling sometimes seems to be.

    • Simon 𐕣he 🪨 Johnson@lemmy.ml
      3 days ago

      Lol, the suggested hardware for usable performance is ~$50k MSRP for the GPUs alone, and that’s an SXM5 socket, so it’s all proprietary, extremely expensive, purpose-specific hardware.

      My PC currently has a 7900 XTX, which gives me about 156 GB of combined VRAM and system RAM, but even at this level it literally generates 1–3 words per second. DDR5 wouldn’t really help, because it’s a memory bandwidth issue.
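      The "memory bandwidth issue" comes down to simple arithmetic: to generate each token, the model has to stream roughly all of its weights through the memory bus once, so the token rate is capped at bandwidth divided by model size. A rough back-of-the-envelope sketch (the bandwidth and model-size figures below are illustrative assumptions, not measured specs):

      ```python
      def tokens_per_second(bandwidth_gb_s: float, model_size_gb: float) -> float:
          """Upper bound on generation speed when all weights must be
          re-read from memory for every generated token."""
          return bandwidth_gb_s / model_size_gb

      # A large model at 8-bit quantization: assume ~140 GB of weights,
      # close to the ~160 GB combined-memory figure quoted above.
      model_gb = 140.0

      # Dual-channel DDR5 system RAM: roughly 80 GB/s (assumed figure).
      # A model spilling out of 24 GB of VRAM into system RAM is bound
      # by this, which lands in the "a few words per second" range.
      print(tokens_per_second(80.0, model_gb))

      # GDDR6 on a 7900 XTX: roughly 960 GB/s, but only 24 GB fits
      # on-card, so the slow system-RAM path dominates anyway.
      print(tokens_per_second(960.0, model_gb))
      ```

      Which is why faster DDR5 barely moves the needle: the gap between GPU memory bandwidth and system RAM bandwidth is more than an order of magnitude.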

      TBH, for most reasonable use cases, an 8-bit quantization of a smaller model that can run on a laptop will give you more or less what you want.