• grue@lemmy.world
    2 days ago

    This makes me want to buy Intel instead of AMD for the first time in a couple of decades.

    • brokenlcd@feddit.it
      2 days ago

      I mean, if the new gen of GPUs has accelerators, it makes sense. Actually, out of curiosity, does any of the new Intel stuff have any of that? I'm still on the old i5 chips.

      • grue@lemmy.world
        2 days ago

        I’m having a hard time understanding your question, but I’ll try my best:

        if the new gen of GPUs has accelerators

        • GPUs are pretty much nothing but [graphics] accelerators, although they are increasingly general-purpose for parallel computation and have a few other bits and pieces tacked on, like hardware video compression/decompression.

        • If you typo’d “CPU,” then the answer appears to be that Intel desktop CPUs with integrated graphics are much more common than AMD CPUs with integrated graphics (a.k.a. “APUs”) because Intel sprinkles them in throughout their product range, whereas AMD mostly leaves the mid- to top-end of their range sans graphics because they assume you’ll buy a discrete graphics card. The integrated graphics on the AMD chips that do have them tend to be way faster than Intel integrated graphics, however.

        • If you mean “AI accelerators,” then the answer is that that functionality is inherently part of what GPUs do these days (give or take driver support for Nvidia’s proprietary CUDA API) and also CPUs (from both Intel and AMD) are starting to come out with dedicated AI cores.

        does any of the new Intel stuff have any of that? I'm still on the old i5 chips

        “Old i5 chips” doesn’t mean much – that just means you have a midrange chip from any time between 2009 and now. What matters is the model number that comes after the “Core i5” part, e.g. “Core i5-750” (1st gen, from 2009) vs. “Core i5-14600” (the most recent gen before the rebranding to “Core Ultra 5”, from just last year).
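
        If it helps, you can usually read the generation straight off the model number. A rough sketch of the rule of thumb (Python; the naming rule here is my own summary, not Intel's official scheme, and it has exceptions):

        ```python
        import re

        def core_generation(model: str) -> int:
            """Guess the generation from a 'Core iX-NNNN' model string.

            Rule of thumb: 3-digit models are 1st gen, 4-digit models
            lead with one generation digit, 5-digit models lead with
            two (10th gen onward).
            """
            digits = re.search(r"(\d{3,5})", model).group(1)
            if len(digits) == 3:
                return 1
            if len(digits) == 4:
                return int(digits[0])
            return int(digits[:2])

        # core_generation("Core i5-750")   -> 1
        # core_generation("Core i5-14600") -> 14
        ```

        So a five-digit i5 is something from the last few generations, while a three-digit one is genuinely ancient.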


        As far as “it makes sense” goes, to be honest, an Intel CPU would still probably be a hard sell for me. The only reason I might consider one is if I had some niche circumstance (e.g. I was trying to build a Jellyfin server and having the best integrated hardware video encode/decode was the only thing I cared about).

        What I really had in mind when I said it makes me want to buy Intel (aside from joking about rejecting “AI” buzzword hype) is the new Intel discrete GPU (“Battlemage”), oddly enough. It’s getting to be about time for me to finally upgrade from the AMD Vega 56 I’ve been using for over seven(!) years now, so I’ll be interested to see how the Intel Arc B770 compares to the AMD Radeon RX 9070 (or whichever model it ends up competing against).

        • I_Has_A_Hat@lemmy.world
          1 hour ago

          If you can get your hands on a 12th-gen Intel CPU, they are some of the best. It was the first gen with the new architecture, and they over-engineered the hell out of it because they didn’t know what it could handle. They scaled WAY back on the 13th and 14th gens, which is why you hear about so many issues in those processors, but not the 12th gen.

      • ikidd@lemmy.world
        2 days ago

        These companies are putting NPUs into their processors, and nobody will ever build the software to use them because everything is done on GPUs. It’s a dog and pony show.