• 1 Post
  • 32 Comments
Joined 9 months ago
Cake day: May 28th, 2024

  • Yeah, and a great post too - because some of your points here just show that everyone ELSE has deprecated PhysX as well. Unity and Unreal both dropped it long ago. It’s basically a moot point for 99.9% of people playing games.

    Instead of using a PPU on the GPU, most people have focused on GPGPU physics calculations. The idea behind PhysX was a difficult one to launch in the first place. Given that most chip real-estate is going to these VPUs, I’m not surprised at all that they ditched the PPU for a more generalized version.


  • It only ever got deployed in a few dozen games

    Is the only sentence in the entire article you need to be aware of.

    This is rage-bait.

    This is a list of the games it affects:

    • Monster Madness: Battle for Suburbia
    • Tom Clancy’s Ghost Recon Advanced Warfighter 2
    • Crazy Machines 2
    • Unreal Tournament 3
    • Warmonger: Operation Downtown Destruction
    • Hot Dance Party
    • QQ Dance
    • Hot Dance Party II
    • Sacred 2: Fallen Angel
    • Cryostasis: Sleep of Reason
    • Mirror’s Edge
    • Armageddon Riders
    • Darkest of Days
    • Batman: Arkham Asylum
    • Sacred 2: Ice & Blood
    • Shattered Horizon
    • Star Trek DAC
    • Metro 2033
    • Dark Void
    • Blur
    • Mafia II
    • Hydrophobia: Prophecy
    • Jianxia 3
    • Alice: Madness Returns
    • MStar
    • Batman: Arkham City
    • 7554
    • Depth Hunter
    • Deep Black
    • Gas Guzzlers: Combat Carnage
    • The Secret World
    • Continent of the Ninth (C9)
    • Borderlands 2
    • Passion Leads Army
    • QQ Dance 2
    • Star Trek
    • Mars: War Logs
    • Metro: Last Light
    • Rise of the Triad
    • The Bureau: XCOM Declassified
    • Batman: Arkham Origins
    • Assassin’s Creed IV: Black Flag
    • Borderlands: The Pre-Sequel






  • How many phone numbers do you know off the top of your head?

    In the 90s, my mother could rattle off 20 or more.

    But they’re all in her phone now. Are luddites going to start abandoning phones because they’re losing the ability to remember phone numbers? No, of course not.

    Either way, these fancy prediction engines already have better critical thinking skills than most of the flesh-and-bone people I meet every day. The world might actually be smarter on average if those people didn’t open their mouths.



  • kitnaht@lemmy.world to Linux@lemmy.ml · AMD vs Nvidia (edited, 11 days ago)

    Nobody is bitching. Rage less. My constructive point is that NVidia is a better option. NVidia’s CUDA stack is software - and unfortunately for us, that means it’s also paired with their hardware.

    Many people care whether choosing something is going to hobble their workflow. On this point, if you’re using Blender, choosing AMD is going to hobble your productivity. I’m just stating facts.



  • kitnaht@lemmy.world to Linux@lemmy.ml · AMD vs Nvidia (edited, 11 days ago)

    Blender works with AMD hardware just great

    No it doesn’t. That’s our point. It runs at roughly 30% of the speed of its competition. That’s not “working just great”… it’s working slowly and like shit. The whole damn point of a GPU is to accelerate that work. The work that AMD’s HIP backend does in Blender could take an hour, while an NVidia card would pump it out in 20 minutes.
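    For reference, the backend choice is literally one preference switch. Here’s a minimal sketch of flipping it from Blender’s Python console (property names match recent Blender releases, but may shift between versions):

```python
import bpy

# The Cycles add-on preferences hold the compute backend setting.
prefs = bpy.context.preferences.addons["cycles"].preferences

# "CUDA"/"OPTIX" on NVidia, "HIP" on AMD - same renderer, different backend speed.
prefs.compute_device_type = "HIP"

# Refresh the device list and enable everything the chosen backend can see.
prefs.get_devices()
for device in prefs.devices:
    device.use = True

# Render this scene on the GPU instead of the CPU.
bpy.context.scene.cycles.device = "GPU"
```

    Same scene, same Cycles settings; the only thing that changes is which backend does the work, and that’s exactly where the hour-vs-20-minutes gap shows up.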


  • kitnaht@lemmy.world to Linux@lemmy.ml · AMD vs Nvidia (11 days ago)

    Everyone’s gonna suggest AMD here because of your requirement of no proprietary drivers; but unless you’re some sort of high-value target for a foreign government, I’d honestly choose the more pragmatic route of just using the proprietary NVidia driver and going NVidia. Especially if I’m not budget-constrained on the card.

    The fact of the matter is, AMD has simply fallen behind. NVidia cards are (and have been for like 3 generations now) more performant. There’s a good reason why they dominate the market right now: they’re simply better.

    It really depends on how far you want to take your open-source zealotry; there are parts of the CPU microcode that can see everything you do. Those are proprietary. Your BIOS is proprietary. You’re probably running 100 different proprietary blobs even IF you choose not to use the drivers that NVidia supplies; so why hobble yourself with a slower card that doesn’t have CUDA support (which is also very good for AI work, if you’re interested in that at all)?

    I certainly understand wanting to push in that direction for the sake of pushing in that direction, but: are performance and stability less important than avoiding a proprietary driver?
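    (And on the AI point: a minimal sketch, assuming PyTorch, of checking whether a GPU backend is actually visible. The ROCm build for AMD reuses the same torch.cuda calls, so this runs on either vendor.)

```python
import torch

# True on the CUDA build with an NVidia card, and on the ROCm build with AMD,
# since ROCm support is exposed through the same torch.cuda namespace.
if torch.cuda.is_available():
    print("GPU backend:", torch.cuda.get_device_name(0))
else:
    print("No GPU backend visible; workloads fall back to the CPU.")
```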





  • “Separate the art from the artist” is accepted by SANE people, who don’t have time to milk tirades and play victim on the internet so that strangers will think they’re virtuous.

    There is not a thing you do, have, own, buy, or operate that isn’t tied to slavery, human exploitation, etc. in some way or another. You only do this because you want to virtue-signal to others that you think “good thoughts” and so you can be praised for being brave.

    You’re not brave. You don’t make a difference. And nobody cares.

    If you eat any kind of meat, you’re an evil “carnist”. If you don’t adopt pets from the shelter, then you’re contributing to pet farming. If you don’t drink from a paper straw, then you’re killing turtles.

    Everyone everywhere has some problem with someone’s something. You literally cannot avoid it all. Own a smartphone? Then you’re evil and deserve to die because you’re carrying an item that was made by slavery! Let me guess…you’re not gonna give up the smartphone…are you? Yeah - I didn’t think you would.