I Hate That Fake Frames are Good Now…
Get 20% off DeleteMe US consumer plans when you go to and use promo code LTT20 at checkout.
DeleteMe International Plans:
NVIDIA is here at CES 2026 and is showing off the all new DLSS 4.5, 6x Frame Gen, Dynamic Frame Gen, and more. What does all this mean? Well, you’ll just have to watch and find out if this is an incremental upgrade, or a major one.
Discuss on the forum:
Check out our Channel Partners:
Secretlab – Get great chairs at:
PIA – Get the VPN of our choice:
dbrand – Buy a “Circuit” series skin for your device:
► GET OUR MERCH:
► GET EXCLUSIVE CONTENT ON FLOATPLANE:
► SPONSORS, AFFILIATES, AND PARTNERS:
Purchases made through some store links may provide some compensation to Linus Media Group. Affiliate links powered in part by
Linus Sebastian is an investor in Framework Computer, Inc and HexOS by Eshtek.
CHAPTERS
—————————————————
0:00 Intro
2:02 DLSS 4.5
6:03 6X Multi Frame Gen
8:27 Dynamic Frame Gen
11:19 Where Reflex 2?
11:39 G-Sync Pulsar
16:05 Outro

That AI stuff is still not worth 32 GB of good RAM being priced at $900.
DLSS is great, I think we can all agree on that, but Frame Gen feels like it's made for people Jensen's age. Because while it may look like 200 frames, it feels like I'm using a GT 1030 to play my games…
This feels like a sales pitch
If it has 1% Ghosting/Trailing
If it has 1% Input lag
If it has 1% Fake frames
Then it's Completely USELESS for people who have standards and care about their singleplayer & competitive experience.
Throw it in the trash.
Game devs are happy. Someone really said "the more you buy, the more you save", but they were aiming that at game developers.
Lossless Scaling does something… NVIDIA: "WRITE THAT DOWN, WRITE THAT DOWN"
Soon Nvidia will be charging you extra to get more real frames in!
Fake frames are not good now. But it's nice that YouTube compression hides it and lets devs get away with bad optimization.
Now I just need $6,000 for a GPU. 😂😅
Love these kinds of videos. Linus just calling out all the good AND BAD right in the creators' faces so they can actually hear real feedback and concerns… since they usually just ignore them while collecting fat racks behind a desk.
Dynamic fake frames: I've only tried Lossless Scaling's adaptive fake frames, in Serious Sam 4, but it was surprisingly good.
It would be even better if we could run our 240Hz and up monitors natively without fake frames, as generated frames will always have missing information which cannot be recreated.
I'll admit, the DLSS 4.5 looks pretty slick. But until Reflex 2 comes around, FG will have to stay on the shelf, at least for me.
Just did some quick testing on an RTX 4090. In KCD2 the new model is way sharper in motion and has limited occlusion artifacts. Now the bad news: IT HAS a big impact on performance. Quality DLSS is now only slightly faster than native rendering. DLAA that was previously hitting 60 FPS now gets 48. On the other hand, Performance DLSS looks better than ever, so it's usable. It would be a huge win if not for the performance cost.
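For scale, a quick back-of-the-envelope check of those figures; the 60 and 48 FPS values come from the comment above, everything else is just arithmetic:

```python
# Converting the reported FPS to frame times shows what the new
# DLSS model costs per frame. 60 and 48 FPS are the commenter's
# numbers; this is not an independent benchmark.

def frame_time_ms(fps: float) -> float:
    """Frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

old_dlaa = frame_time_ms(60)  # ~16.7 ms with the previous model
new_dlaa = frame_time_ms(48)  # ~20.8 ms with the DLSS 4.5 model

print(f"Added per-frame cost: {new_dlaa - old_dlaa:.1f} ms")
# -> Added per-frame cost: 4.2 ms
```

Roughly 4 ms of extra work per frame, which is why the hit looks so big at high frame rates and would shrink in relative terms at lower ones.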
I guess the 30 series GPUs are now the PEAK NVIDIA
I literally always turn off Motion Blur and DLSS, and even with an RTX 2080 (now a $200-250 card) I can get 100+ FPS at 1440p in most games. I see no reason or need for this DLSS bullshit.
50 ms latency, no thanks. Anything over 25 ms is unplayable unless it's a visual novel, turn-based RTS, or similar game, in which case you don't need the extra frames anyway…
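For context on where numbers like that come from: interpolation-based frame gen typically has to hold back one real frame so it can generate the in-between frames, so the added latency scales with the real frame time. A rough sketch of just that one term (the one-frame hold-back assumption is mine; real pipelines add other costs on top):

```python
# Rough latency model for frame interpolation, assuming one real
# frame is held back for interpolation. Illustrative, not measured.

def added_latency_ms(real_fps: float) -> float:
    """Extra latency from holding back one real frame."""
    return 1000.0 / real_fps

for real_fps in (30, 60, 120):
    print(f"{real_fps:>3} real FPS -> +{added_latency_ms(real_fps):.1f} ms "
          f"on top of base latency")
# 30 real FPS adds ~33 ms by itself, which is why a low base frame
# rate blows past a 25 ms comfort threshold before anything else
# in the pipeline is counted.
```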
Only for games that support it.
The start of the video kind of made it sound like you were comparing DLSS 4.5 to ground truth and not to the old model. It took me a while to realise that the improvements weren't over native rendering, which honestly would have been a much more interesting comparison, even if I realise nvidAI wouldn't like that.
Here I am playing Arkham Knight, marveling at how good it looks while running so well, without the need for fake frames.
I do hope this doesn't make developers fall into lazy optimization, relying heavily on DLSS and Frame Generation to disguise native performance. I am looking at you, Randy Pitchford.
I tested frame gen several times, and always disabled it immediately… frame gen is 💩
Something to consider which might apply to only a few games, using Diablo 4 as the example.
All watt readings are for the ENTIRE PC, not just the GPU. It's just a little screen on the PSU that reports power usage.
– native rendering on my PC: around 400 W in my test scene at 5K
– DLSS Balanced: 270 W at 5K
– DLSS Quality + frame gen: around 290 W at 5K
DLSS seems to use significantly less power than native rendering… and when you are not pixel peeping, and it is a low-precision game, it works well. For games like Hades or Silksong I would likely not use frame gen (but those games really don't need it, because Silksong doesn't even tax the GPU enough to spin up the fan, and Hades idles at around 150 W).
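Restating those readings as savings makes the gap clearer. A tiny sketch using only the wattage figures from the comment above (FPS wasn't reported, so frames-per-watt can't be computed):

```python
# The commenter's whole-PC power readings at 5K, restated as
# savings versus native rendering. The watt figures are theirs.

native = 400  # watts, native rendering
readings = {
    "DLSS Balanced": 270,
    "DLSS Quality + frame gen": 290,
}

for mode, watts in readings.items():
    saving = (native - watts) / native * 100
    print(f"{mode}: {watts} W ({saving:.0f}% less than native)")
# DLSS Balanced: 270 W (33% less than native)
# DLSS Quality + frame gen: 290 W (28% less than native)
```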
I fucking love this timeline. GPUs are $5,000 to deliver me fake frames in a game that developers optimize for 720p 30 FPS, knowing that fake frames and upscaling will take care of the rest…
Why is Linus shilling for Nvidia?
I love "solving" TAA by using a hardware exclusive feature
9:06 I thought that was Gamers Nexus for a sec
I can clearly see the flickering/strobing in the G-Sync section and it's extremely annoying, but I'm gonna assume it's only visible on camera… hopefully?
My games play great on an RTX 5080 with DLSS 4. I truly do not notice a difference.
If you can set the dynamic frame gen to try to lock the framerate to your monitor's max refresh rate, that would be pretty cool tbh. Say your GPU can do 140-150 FPS with drops to 100 FPS and your monitor is 165 Hz; the dynamic frame gen can just fill the gaps and keep your FPS at 165. Would be great! Don't expect 30-40 FPS to turn into 160 FPS comfortably though 🙂
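A minimal sketch of the pacing logic that comment describes. This is a guess at how a dynamic multiplier could be chosen, not NVIDIA's actual algorithm; the 165 Hz cap and the FPS figures are taken from the comment:

```python
# Hypothetical dynamic frame gen pacing: per real frame, pick a
# generation multiplier so displayed FPS sits near the refresh cap.

import math

REFRESH_HZ = 165  # monitor cap from the comment

def multiplier(real_fps: float, max_mult: int = 4) -> int:
    """Total frames shown per real frame needed to reach the cap."""
    return min(max_mult, max(1, math.ceil(REFRESH_HZ / real_fps)))

for real_fps in (150, 140, 100, 40):
    m = multiplier(real_fps)
    displayed = min(real_fps * m, REFRESH_HZ)
    print(f"{real_fps} real FPS -> {m}x -> ~{displayed:.0f} FPS displayed")
# 40 real FPS needs the full 4x and still *feels* like 40, which is
# the commenter's point about not expecting miracles from a low base.
```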
holy camera angles batman
I thought they'd call it quits on adding more fake frames at this point, but the other two things are actually exciting ngl!
Cool stuff, can't wait to see it not working as intended and still having awful ghosting and smearing, because of course it will.
Even if you hate Nvidia as a company, their technologies are absolutely top-notch and unfortunately, the competition is lagging behind.
Would it work in VR?
I can say that ULMB 2 curbstomps OLED in terms of motion clarity and blur. Been maining my PG27AQN for 2.5 years and going. Pulsar will probably improve it a little tho.
So they basically caught up to Lossless Scaling… neat 👍
Ca…can…..can my 5070 Ti do this?…..
I was just looking at the latency rather than the FPS, and NVIDIA needs to answer for this.
10:54 That must be a soundboard, right?
I have a 144Hz monitor and a 5080 (don't judge me, at the time it was barely more expensive than a 5070 Ti).
I enjoy my PC not screaming so loudly, so I turned on 2x or 3x frame gen to lower my GPU temp. That put me at around 48-72 real FPS. The 72 one was pretty fun. The game was Cyberpunk 2077.
So I would really like the people at NVIDIA to keep improving this tool. Stable FPS sounds nice, and there are a lot of games where latency is irrelevant. In online shooters, if it so happens that you have a bad server connection and an additional 50 ms of latency, it is horrendous; I never want to add separate GPU-caused latency on top if I compete like that. But in a city builder, a factorio-like, medieval RPGs, strategy games: give me more FPS at the price of latency.
Heh, with the RAM and GPU prices, the only upgrade to my PC in the next few years will be better cooling for my 5800X3D and RTX 5080 XDDD With the 64 GB of DDR4 RAM that I have and need, upgrading the whole system to DDR5 would mean spending more on RAM than on the CPU and motherboard combined XDDDDD Just kill us, NVIDIA, with your AI.
0:31 But then I saw her face 🎶
Welp, great, now game devs will not fix stuttering and will instead use this as the fix.
Why does the thumbnail look AI?
Games are now optimised for frame gen rather than for Windows/Linux/console.
Gotta love how they keep trying to impress us with the fact that they're training their models on these magical "higher quality" images just because they're rendering without the plague that is TAA. Guess what: I'll just turn TAA off myself, and I can have that magically higher quality with my own actually real frames.
This is theoretically really good.
It raises the ceiling with less hardware, which is actually a benefit.
But do I believe Nvidia will pass the benefit on to consumers, or will they try to scam us with the 5070 = 4090 logic?
Yeah, I'd say the latter 🙄
Maybe the RAM prices aren't so bad.
I want GPUs that focus on rasterization!
Maybe Reflex 2 is the friends we made along the way
If you "hate dlss" you should probably find a new hobby.