this post was submitted on 04 Jul 2025
387 points (95.5% liked)

Technology

[–] just_another_person@lemmy.world 90 points 1 week ago (20 children)

My mind is still blown as to why people are so interested in spending 2x the cost of the entire machine they're playing on, AND a hefty power utility bill, to run these awful products from Nvidia. Generational improvements are minor on the performance side, and fucking AWFUL on the product and efficiency side. You'd think people would have learned their lesson a decade ago.

[–] eager_eagle@lemmy.world 56 points 1 week ago (4 children)

they pay because AMD (or anyone else, for that matter) has no product to compete with a 5080 or 5090

[–] Chronographs@lemmy.zip 45 points 1 week ago

That’s exactly it, they have no competition at the high end

[–] just_another_person@lemmy.world 34 points 1 week ago (5 children)

Because they choose not to go full idiot, though. They could make their top-line cards compete if they crammed enough into the pipeline and required a dedicated PSU, but that's not where their product line is aimed. That's why it's smart.

For reference: AMD has the most deployed GPUs on the planet right now. There's a reason it's in every gaming console except the Switch 1/2, and why OpenAI just partnered with them for chips. The goal shouldn't just be making a product that churns out results at the cost of everything else, but to be cost-effective and efficient. Nvidia fails at that on every level.

[–] eager_eagle@lemmy.world 25 points 1 week ago (3 children)

this openai partnership really stands out, because the server world is dominated by nvidia, even more so than consumer cards.

[–] SheeEttin@lemmy.zip 19 points 1 week ago (2 children)

Yup. You want a server? Dell just plain doesn't offer anything but Nvidia cards. You want to build your own? The GPGPU translation stuff like ZLUDA is brand new and not really supported by anyone. If you want to participate in the development community, you buy Nvidia and use CUDA.
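
To make that lock-in concrete, here's a minimal sketch (assuming Numba's CUDA target, which only runs on Nvidia hardware) of the kind of GPGPU code people write today; getting it onto AMD means a ROCm/HIP port or a translation layer like ZLUDA, not just swapping the card:

```python
# Minimal sketch of CUDA lock-in: a Numba kernel written against the CUDA
# target, which needs an Nvidia GPU and the CUDA driver to launch.
import numpy as np
from numba import cuda

@cuda.jit
def saxpy(a, x, y, out):
    i = cuda.grid(1)              # global thread index
    if i < out.size:
        out[i] = a * x[i] + y[i]

x = np.arange(1_000_000, dtype=np.float32)
y = np.ones_like(x)
out = np.empty_like(x)

threads = 256
blocks = (x.size + threads - 1) // threads
saxpy[blocks, threads](np.float32(2.0), x, y, out)   # CUDA-style launch config
```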

[–] qupada@fedia.io 20 points 1 week ago (2 children)

Fortunately, even that tide is shifting.

I've been talking to Dell about it recently; they've just announced new servers (releasing later this year) which can take either Nvidia's B300 or AMD's MI355X GPUs. Available in a hilarious 19" 10RU air-cooled form factor (XE9685), or ORv3 3OU water-cooled (XE9685L).

It's the first time they've offered a system using both CPU and GPU from AMD - previously they had some Intel CPU / AMD GPU options, and AMD CPU / Nvidia GPU, but never before AMD / AMD.

With AMD promising release-day support for PyTorch and other popular libraries, we're also part-way there on software. I'm not going to pretend needing CUDA isn't still a massive hump in the road, but "everyone uses CUDA" <-> "everyone needs CUDA" is one hell of a chicken-and-egg problem, and it isn't getting solved overnight.
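
For what it's worth, the PyTorch part of that hump is smaller than it sounds: the ROCm builds expose AMD GPUs through the same torch.cuda API, so device-agnostic code along these lines (a minimal sketch, not anyone's production setup) runs unchanged on either vendor; the real lock-in is hand-written CUDA kernels and CUDA-only libraries:

```python
# Minimal sketch: the same PyTorch code path on an Nvidia (CUDA) or AMD (ROCm)
# build -- ROCm wheels surface the GPU through the torch.cuda API.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
backend = "ROCm" if torch.version.hip else ("CUDA" if torch.version.cuda else "CPU")
print("using", device, "via", backend)

model = torch.nn.Linear(1024, 1024).to(device)
x = torch.randn(64, 1024, device=device)
y = model(x)                      # identical call on either vendor's GPU
print(y.shape, y.device)
```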

Realistically, facing that kind of uphill battle, AMD is just going to have to compete on price - they're quoting a 40% performance-per-dollar improvement over Nvidia for these upcoming GPUs, so perhaps they are - and win hearts and minds with rock-solid driver/software support, so that people who do have the option (i.e. in-house code, not 3rd-party software) look to write it with something other than CUDA.
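
Taking that quoted figure at face value, the arithmetic is simple: 40% better performance per dollar means matching Nvidia's performance at roughly 71% of the price, or about 1.4x the performance for the same spend. A back-of-the-envelope sketch with a made-up price:

```python
# Back-of-the-envelope sketch of the quoted 40% perf-per-dollar claim.
# The price below is hypothetical, purely for illustration.
claimed_ppd_advantage = 1.40       # AMD perf/$ = 1.4x Nvidia perf/$ (vendor claim)

nvidia_price = 30_000.0            # hypothetical accelerator price, USD
amd_price_for_same_perf = nvidia_price / claimed_ppd_advantage
amd_perf_for_same_spend = claimed_ppd_advantage

print(f"same performance at ~${amd_price_for_same_perf:,.0f} "
      f"({amd_price_for_same_perf / nvidia_price:.0%} of the Nvidia price)")
print(f"or ~{amd_perf_for_same_spend:.1f}x the performance for the same spend")
```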

To note, this is the 3rd generation of the MI3xx series (MI300, MI325, now MI350/355). I think it might be the first one to make the market splash that AMD has been hoping for.

[–] felsiq@lemmy.zip 4 points 1 week ago

AMD's also apparently unifying their server and consumer GPU architectures for RDNA5/UDNA, iirc, which I'm really hoping helps with this too

[–] SheeEttin@lemmy.zip 3 points 1 week ago* (last edited 1 week ago)

I know Dell has been doing a lot of AMD CPUs recently, and those have definitely been beating Intel, so hopefully this continues. But I'll believe it when I see it. These things rarely pan out in terms of price/performance and support.

[–] eager_eagle@lemmy.world 2 points 1 week ago* (last edited 1 week ago)

yeah, I helped put together hw requirements for two servers recently; an alternative to Nvidia wasn't even on the table

[–] Ulrich@feddit.org 2 points 1 week ago (6 children)

Then why does Nvidia have so much more money?

[–] iopq@lemmy.world 1 points 1 week ago

Because of vendor lock-in

[–] Naz@sh.itjust.works 9 points 1 week ago (1 children)

I have overclocked my AMD 7900XTX as far as it will go on air alone.

Undervolted every step on the frequency curve, cranked up the power, 100% fan duty cycles.

At its absolute best, it's competitive or trades blows with the 4090D, and is 6% slower than the RTX 4090 Founders Edition (the slowest of the stock 4090 lineup).

The fastest AMD card is equivalent to a 4080 Super, and the next gen hasn't shown anything new.

AMD needs a 5090-killer. Dual socket or whatever monstrosity which pulls 800W, but it needs to slap that greenbo with at least a 20-50% lead in frame rates across all titles, including raytraced. Then we'll see some serious price cuts and competition.

[–] bilb@lemmy.ml 1 points 1 week ago* (last edited 1 week ago)

And/or Intel. (I can dream, right?) Hell, perform a miracle, Moore Threads!

[–] RazgrizOne@piefed.zip 17 points 1 week ago (1 children)

Once the 9070 dropped, all arguments for Nvidia stopped being worthy of consideration outside of very niche/fringe needs.

[–] CheeseNoodle@lemmy.world 5 points 1 week ago (1 children)

Got my 9070 XT at retail (well, retail + VAT, but that's retail for my country) and my entire PC costs less than a 5090.

[–] RazgrizOne@piefed.zip 2 points 1 week ago* (last edited 1 week ago) (27 children)

Yeah, I got a 9070 + 9800X3D for around $1100 all-in. Couldn't be happier with the performance. Expedition 33 runs max settings at 3440x1440 at 80-90 fps.

[–] Static_Rocket@lemmy.world 7 points 1 week ago (1 children)

Well, to be fair, the 10 series was actually an impressive improvement over what was available. I switched to AMD after that for better software support, and I know the improvements have dwindled since then.

[–] just_another_person@lemmy.world 3 points 1 week ago (1 children)

AMD is at least running the smart game on their hardware releases with generational leaps instead of just jacking up power requirements and clock speeds as Nvidia does. Hell, even Nvidia's latest lines of Jetson are just recooked versions from years ago.

[–] FreedomAdvocate@lemmy.net.au 1 points 1 week ago

> AMD is at least running the smart game on their hardware releases with generational leaps instead of just jacking up power requirements and clock speeds as Nvidia does.

AMD could only do that because they were so far behind. GPU manufacturers, at least Nvidia, are approaching the limits of what they can do with current fabrication technology, other than simply throwing "more" at it. Without a breakthrough in tech, all they can really do is jack up power requirements and clock speeds. AMD will be there soon too.

[–] Shizu@lemmy.world 3 points 1 week ago

Cause numbers go brrrrrrrrr

[–] Chozo@fedia.io 3 points 1 week ago (1 children)

But but but but but my shadows look 3% more realistic now!

[–] cyberpunk007@lemmy.ca 8 points 1 week ago (2 children)

The best part is, for me, ray tracing looks great. When I'm standing there and slowly looking around.

When I'm running and gunning and shit's exploding, I don't think the human eye is even capable of perceiving the difference between raster and ray tracing at that point.

[–] Chozo@fedia.io 3 points 1 week ago

Yeah, that's what's always bothered me about the drive for the highest-fidelity graphics possible. In motion, those details are only visible for a frame or two in most cases.

For instance, some of the PC mods I've seen for Cyberpunk 2077 look absolutely gorgeous... in screenshots. But once you get into a car and start driving or get into combat, it looks nearly indistinguishable from what I see playing the vanilla game on my PS5.

[–] FreedomAdvocate@lemmy.net.au 2 points 1 week ago* (last edited 1 week ago) (4 children)

It absolutely is, because ray tracing isn't just about how precise or good the reflections/shadows look; it's also about getting reflections and shadows from things that are outside your field of view. That's the biggest difference.

One of the first "holy shit!" moments for me was playing Doom, I think, and walking down a corridor and being able to tell there were enemies around the corner by seeing their reflection on the opposite wall. That was never possible before, and it's only possible thanks to ray tracing. Same with being able to see shadows from enemies that are behind you, off-screen to the side.
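
There's a concrete geometric reason for that: a reflected ray gets intersected against the whole scene, so it can hit geometry that was never drawn on screen, whereas screen-space reflections can only reuse pixels already in the frame. A toy sketch of the reflection step (standard r = d - 2(d·n)n formula; the scene data is made up):

```python
# Toy sketch of why ray-traced reflections can show off-screen objects:
# the reflected ray is traced into the full scene, not sampled from the
# frame buffer. All positions/normals here are illustrative.
import numpy as np

def reflect(d, n):
    """Reflect direction d about unit normal n: r = d - 2*(d.n)*n."""
    return d - 2.0 * np.dot(d, n) * n

view_dir    = np.array([0.0, 0.0, -1.0])   # camera looks down -Z
wall_normal = np.array([0.0, 0.0,  1.0])   # mirror-like wall facing the camera
hit_point   = np.array([0.0, 1.0, -5.0])   # where the view ray hits the wall

r = reflect(view_dir, wall_normal)          # bounces back toward +Z

# An "enemy" behind the camera (z > 0) is outside the rendered frame, but the
# reflected ray still reaches it -- screen-space techniques cannot, because
# that geometry was never rasterised into the frame buffer.
enemy_z = 3.0
t = (enemy_z - hit_point[2]) / r[2]         # parametric distance along the ray
print("reflected direction:", r, "-> reaches off-screen enemy at t =", t)
```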
