this post was submitted on 04 Jul 2025
387 points (95.5% liked)

[–] JuxtaposedJaguar@lemmy.ml 16 points 1 week ago (1 children)

I don’t really care how a frame is generated if it looks good enough (and doesn’t come with other notable downsides like latency). This almost feels like complaining about screen space reflections being “fake” reflections. Like yeah, it’s fake, but if the average player experience is consistently better with it than without it then what does it matter?

But it does come with increased latency, and it disrupts the artistic vision of games. With MFG you're seeing more fake frames than real frames. It's snake oil in that Nvidia isn't distinguishing between fake frames and real frames. I forget what the exact comparison is, but when they say "The RTX 5040 has the same performance as the RTX 4090" and that figure rests on 3 fake frames for every real frame, that's incredibly deceptive.

[–] FreedomAdvocate@lemmy.net.au 3 points 1 week ago* (last edited 1 week ago) (1 children)

He’s talking about DLSS upscaling - not DLSS Frame Generation - which doesn’t add latency.

[–] iopq@lemmy.world 1 points 4 days ago (1 children)

It does add latency: you need 1-2 ms to upscale the frame. However, if you are using a lower render resolution (rather than upscaling to a higher output resolution while rendering internally at the same one), the overall latency will be lower because you have a higher frame rate.
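The trade-off described here can be put into rough numbers. A minimal sketch; the millisecond figures below are illustrative assumptions, not measurements:

```python
def effective_fps(render_ms, upscale_ms=0.0):
    """Frames per second given per-frame render time plus any fixed upscale cost."""
    return 1000.0 / (render_ms + upscale_ms)

# Hypothetical numbers: native 4K render takes 16 ms (~62 fps);
# rendering at a lower internal resolution takes 9 ms, upscaling adds 1.5 ms.
native = effective_fps(16.0)       # ~62.5 fps
dlss = effective_fps(9.0, 1.5)     # ~95.2 fps
```

So even though the upscale step itself costs time, the whole frame finishes sooner, which is why end-to-end latency can still go down.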

[–] FreedomAdvocate@lemmy.net.au 1 points 4 days ago (1 children)

Yeah, so it doesn't add latency. It takes like 1-2 ms iirc in the pipeline, which, like you said, is less than (or at worst negligibly more than) what it would take to render at native resolution.

[–] iopq@lemmy.world 0 points 4 days ago (1 children)

Which also means it's not possible to use it to go to 1000 fps

[–] FreedomAdvocate@lemmy.net.au 1 points 4 days ago (1 children)

So it has limits? Oh no….. At 1000fps you can't run many rendering effects at all. Luckily no one, and I do literally mean no one, plays games at 1000fps.

[–] iopq@lemmy.world 0 points 4 days ago (1 children)

Yes, but that also means there's no FPS advantage at all at 500 Hz using DLSS, and people do play at 500 Hz.

[–] FreedomAdvocate@lemmy.net.au 1 points 3 days ago (1 children)

If you’re playing games at 500fps you don’t need DLSS. What is your point? Again - it’s for situations where you can’t get a good framerate at the settings you want to use.

How is this hard to understand?

[–] iopq@lemmy.world 0 points 3 days ago (1 children)

My point is my 2060 can't reach 500 fps even if you run the game with DLSS. You need a more powerful GPU; DLSS can only meaningfully increase your FPS when the base FPS is low. It can't boost you from 250 to 500.
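The arithmetic behind this claim is simple: a fixed per-frame upscale cost eats the entire frame budget at very high frame rates. The 1.5 ms figure below is an assumed value for illustration:

```python
def frame_budget_ms(target_fps):
    """Total time available per frame at a given frame rate."""
    return 1000.0 / target_fps

upscale_ms = 1.5                        # assumed fixed per-frame upscale cost
budget_500 = frame_budget_ms(500)       # 2.0 ms total per frame at 500 fps
render_left = budget_500 - upscale_ms   # only 0.5 ms left for the actual render

# Starting from 250 fps native (4 ms/frame), the lower-resolution render
# would have to finish in 0.5 ms -- an 8x speedup that dropping the
# internal resolution alone won't deliver.
speedup_needed = frame_budget_ms(250) / render_left   # 8.0
```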

[–] FreedomAdvocate@lemmy.net.au 1 points 2 days ago (1 children)
[–] iopq@lemmy.world 0 points 2 days ago (1 children)

It would if DLSS didn't add latency, but it does

[–] FreedomAdvocate@lemmy.net.au 1 points 2 days ago (1 children)

It adds rendering time, not "latency" btw.

DLSS improves framerates at basically no cost, to let people hit playable or high framerates at quality levels they couldn't without it. It's not for hitting 500fps, it's for hitting 30/60/100 etc.

[–] iopq@lemmy.world 1 points 1 day ago (1 children)

It doesn't render anything, so it can't add rendering time; it just generates an upscaled version of an already rendered frame.

[–] FreedomAdvocate@lemmy.net.au 1 points 23 hours ago (1 children)

Ok so you definitely don't understand how DLSS works lol.

DLSS has to be implemented by the developers of the game. They literally have to use the DLSS APIs in their game code. DLSS requires things like the player input and motion vectors for all scenes, materials, and objects that are in the frame. It adds time to the rendering pipeline. The more powerful your GPU the less rendering time it adds.
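The integration described above can be sketched roughly as follows. Every name here is a hypothetical stand-in, not the real NVIDIA NGX/Streamline API; it only illustrates where an upscaler of this kind sits in a frame:

```python
# Sketch of a per-frame flow with a DLSS-style upscaler inside the pipeline.
# All function names are hypothetical stand-ins, NOT the real NVIDIA API.

def rasterize(width, height):
    # Stand-in for the game's internal low-resolution render pass.
    return {"image": f"{width}x{height}", "depth": "depth-buffer"}

def upscale(low_res, motion_vectors, target):
    # Stand-in for the upscaler call: it consumes the rendered low-res frame
    # plus per-pixel motion vectors that the game engine must supply.
    return {"image": f"{target[0]}x{target[1]}", "source": low_res["image"]}

def post_process(frame):
    # Tonemapping, UI, etc. run AFTER upscaling, on the full-resolution frame.
    return frame

def render_frame():
    low = rasterize(1280, 720)                     # internal render resolution
    mv = "motion-vectors-from-game-engine"         # provided via the game's DLSS integration
    high = upscale(low, mv, target=(3840, 2160))   # upscaler runs mid-pipeline
    return post_process(high)

frame = render_frame()
print(frame["image"])   # 3840x2160
```

The point of the sketch is that the upscale step is a stage of the pipeline rather than something bolted on after it, which is why integrating it requires work from the game's developers.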

We're getting way off track now anyway, so to go back to the start: DLSS Super Resolution is amazing because it lets you get a framerate bump with anywhere from little-to-no visible change to IQ to a very noticeable degradation, depending on how much of a framerate bump you get. It is one of the most significant advancements in gaming this century IMO.

On my PC with a 4070 Super, I can play COD BO6 at a near locked 120fps on my 4K 120hz VRR tv at "4K" using DLSS, whereas my PC definitely cannot do that without DLSS. It looks like native 4K, and believe me I've taken many screenshots and compared them at 300% zoom lol.

[–] iopq@lemmy.world 0 points 20 hours ago (1 children)

That screenshot said generated, not rendered. DLSS generates the final frame taking the motion vectors and the rendered lower resolution frame. It does not go in the rendering pipeline since the lower resolution frame has to be completely done rendering

[–] FreedomAdvocate@lemmy.net.au 1 points 10 hours ago

DLSS is applied in the rendering pipeline before post processing effects. It is part of the rendering pipeline.

You clearly don’t know what you’re talking about. We’re done here.