• EndlessNightmare@reddthat.com · 25↑ 1↓ · 4 hours ago

    If it ever becomes the standard desktop processor, they’ll pull the rug like they have with graphics processors and push everything to AI datacenters.

    Hard pass

  • mlg@lemmy.world · 5↑ 1↓ · 3 hours ago

    I am literally just waiting for China to catch up and knock over all 3 of these TSMC suckers.

    I don’t care if they throw a 2000% tariff on it; I will figure out a way to bypass it so I can enjoy pre-inflation PC prices again, when high-end GPUs were going for $300, SSDs became so cheap that the HDD market actually started falling behind, and you could chuck RAM sticks around like spare change.

  • solrize@lemmy.ml · 10↑ · 4 hours ago

    This is more about the Arm X925 core than about Nvidia. The X925 is a new superscalar Arm core, the first one competitive with current x64 chips in single-threaded performance.

  • Omega_Jimes@lemmy.ca · 146↑ · 10 hours ago

    Do I want another option in the desktop CPU space? YES

    Do I want that option to be Nvidia? NOPE

    • Semperverus@lemmy.world · 20↑ · 9 hours ago

      I’m looking forward to the MilkV boards built on the RISC-V architecture. They have a microATX board that takes regular PC components and has functioning graphics drivers for AMD cards. Nothing is optimized for it, but it’s a 64-core CPU if I recall correctly, and it’s ridiculously low-wattage for what it does.

    • tabular@lemmy.world · 3↑ 1↓ · 7 hours ago

      I mean if it’s 2nd hand… and the free (libre) drivers are good… and AMD hasn’t gone full Intel… maybe??

    • Voroxpete@sh.itjust.works · 24↑ · 10 hours ago

      I think that’s 100% what this is, and it’s a very smart play if that’s the case. Intel are reeling from some significant setbacks, while Nvidia is swimming in cash. There’s never been a better time for them to make a play for the desktop CPU space.

      And they’ve got absolutely no illusions about what’s happening with AI. They’re the ones who are literally paying AI companies to buy their chips. They know the space is collapsing. But as the guys selling the picks and shovels, they can ride out that collapse if they’re smart.

      End of the day, if what we get out of this is a new, serious competitor in the CPU space, that’ll at least be some kind of win. With Nvidia’s money and expertise they could really force Intel to get their shit together. AMD nipping at their heels is the only thing that’s ever kept Intel from completely going to shit, but more competition is even better. With all three major companies playing in both the CPU and GPU spaces, that could be really good for consumers.

    • empireOfLove2@lemmy.dbzer0.com · 6↑ · 9 hours ago

      Both that and vertical integration. They can capture even more of the market by building all-in-one, Nvidia-only machines, so you have to buy the whole rig to use their accelerators.

    • Technus@lemmy.zip · 7↑ · 10 hours ago

      Yeah, that was my question. Why the hell would they develop new silicon when 99% of their fab space is dedicated to feeding the AI bubble?

  • breadsmasher@lemmy.world · 24↑ 1↓ · edited · 10 hours ago

    clock speed of 4GHz, which is far below AMD and Intel’s 5GHz.

    phrasing is odd. A 25% lower clock speed isn’t “far below”

    but also fuck nvidia

    • ramble81@lemmy.zip · 6↑ 2↓ · 6 hours ago

      You willing to take a 25% pay cut? Yeah, that’s hella far, especially when you’re up in the GHz range.

      • Voroxpete@sh.itjust.works · 20↑ · 10 hours ago

        Yeah, we’ve been through this exact same game with multiple iterations of Intel and AMD chips. When AMD first started doing consumer CPUs they badged them according to their equivalent Intel clock speed because one to one comparisons were misleading.

        What are the L1 and L2 cache sizes? What are the bus speeds? How many cores, and how are they architected? Multi-threading? How many stages in the instruction pipeline? There are so many factors beyond clock speed that play into real-world performance.
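        The point that clock speed alone misleads can be sketched with a toy model (all numbers below are invented for illustration, not real chip specs): throughput is roughly IPC × clock, so a wider core at 4 GHz can out-run a narrower core at 5 GHz.

```python
# Toy model: single-threaded throughput ~ IPC * clock.
# The IPC figures are made up for illustration; they are not real chip specs.
def throughput(ipc: float, clock_ghz: float) -> float:
    """Instructions retired per second."""
    return ipc * clock_ghz * 1e9

wide_core = throughput(ipc=6.0, clock_ghz=4.0)    # wide core at "only" 4 GHz
narrow_core = throughput(ipc=4.0, clock_ghz=5.0)  # narrower core at 5 GHz

print(wide_core > narrow_core)  # True: 24e9 vs 20e9 instructions/sec
```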

      • Peffse@lemmy.world · 16↑ · 10 hours ago

        I can’t believe people still look at Hz and treat it as the sole metric of performance.

        Do you think they look at the 2005 Pentium 4’s 3.8GHz and assume it’s only slightly worse than what Nvidia will put on the market?

      • lacaio 🇧🇷🏴‍☠️🇸🇴@lemmy.eco.br (OP) · 1↑ · 10 hours ago

        I’m hopeful Arm will follow the licensing path rather than go full Android. I think stronger Arm computers, built at the ISA level by any company, also mean stronger RISC-V computers. Builders like Rockchip (China) show that Arm and RISC-V computers will bring people alternatives, possibly with smaller fabs or on-demand production.

    • Sturgist@lemmy.ca · 11↑ 1↓ · 9 hours ago

      25% lower clock speed isn’t “far below”

      AHEM! AKTCHEWALEE… it’s 20% which is even less qualified to be “far below” the other two.
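      (Both figures are defensible depending on which clock you treat as the baseline, which is where the confusion comes from; quick arithmetic:)

```python
nvidia, rivals = 4.0, 5.0  # GHz, the figures quoted in the article

# "4 GHz is X% below 5 GHz" -> baseline is the rivals' 5 GHz
below = (rivals - nvidia) / rivals * 100  # 20.0

# "5 GHz is Y% above 4 GHz" -> baseline is Nvidia's 4 GHz
above = (rivals - nvidia) / nvidia * 100  # 25.0

print(below, above)
```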

  • melfie@lemy.lol · 2↑ · 8 hours ago

    Meh, I’m waiting for AMD’s RDNA 5 release in 2027 and hoping for some decent SoCs that are at least comparable to today’s RTX 5080, except without artificially limited VRAM. The current AI Max SoCs are pretty decent, but the RDNA 5 RT cores are going to be what really makes it worthwhile for me personally, since I do a lot of Blender rendering and gaming.