I used this mod last year (I thought it was a frame gen thing); I just installed it and it worked the first time. However, a few weeks ago I played The Last of Us Part I again after a long break and realized this mod no longer works, even though I didn't change anything. Could newer game patch updates be the cause? Help
Question: should you use this specific version of DLSSTweaks for TLoU, or can you use the general-use one from the GitHub? Not sure if there is a difference; I was just wondering if I should use this over the general one.
Some NV profile settings were recently found in 3.1.11+ that can do some of what DLSSTweaks offers; unfortunately, changes to them are system-wide and can't be made specific to a game EXE for some reason.
I'm trying to force Preset C for all quality modes, but so far the HUD shows no difference, as if the override was not applying, though the INI reports everything should be okay. Should I believe the HUD is incorrect in this case?
Yep, F is what I use in almost every game. NV themselves seem to recommend using F for DLAA in their programming guide too, but DLSS usually makes it default to A for some reason; not sure what's up with that.
AFAIK preset F uses pretty much the same model that 2.5.1 included; a lot of people recommended 2.5.1 over other versions for its sharpness too.
Certain games might still work better using other presets though. If you want to find the best one for a game, the easiest way is probably to use the dev DLL from https://github.com/NVIDIA/DLSS/tree/main/lib/Windows_x86_64/dev, enable the HUD overlay, then use the keybind that overlay mentions to change the preset during gameplay.
So I did some testing, and the only difference between presets that I could see is the sharpening level, A being the strongest and F being the softest. I was wondering why the game looks so oversharpened; preset F looks great with DLAA at 3840x2160 DLDSR, smoothness 50%. Any way to hide the overlay? Without disabling it in the INI, I mean.
You do know that DSR is called Downscaling, NOT upscaling, and the only benefit is better AA, don't you? Maybe not.
Display @ Native is a hardware limitation; it's fixed, hardware-locked @ Native by definition. So in the following, Native is the reference point, e.g. Lower = lower than Native.
One other consideration: without post-processing (upsampling/downsampling), integer scaling is better, because pixels are integers. This means a 2×2 scaling factor, up or down, for example four 1080p screens = one 4K screen. Fractional scaling just doesn't fit, for example three 1080p screens = 3/4 of a 4K screen, so data is lost (maths-wise, fractions are dropped). A 4:1 ratio, or in DSR terms 4.00x Native downscaling: Render @ 4K, Display @ 1080p Native.
Generalising this for any use case (quick worked example below):
Upscaling (benefit: higher FPS; cost: image quality): Render @ Lower to Display @ Native
Downscaling (benefit: the best AA you can get; cost: much lower FPS): Render @ Higher to Display @ Native
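To put the integer-vs-fractional point into numbers, here's a tiny Python sketch; nothing in it comes from the game or the mod, it's just the arithmetic described above (DSR-style factors like 4.00x and 2.25x are area factors, so each axis scales by the square root of the factor).

# Quick scaling arithmetic: rendered resolution for a given display resolution.
def render_resolution(display_w, display_h, area_factor):
    axis = area_factor ** 0.5          # area factor -> per-axis factor
    return display_w * axis, display_h * axis

# Integer case: 4.00x on a 1080p display = exactly 2x2 rendered pixels per
# native pixel (one 4K frame = four 1080p frames).
print(render_resolution(1920, 1080, 4.00))   # (3840.0, 2160.0)

# Fractional case: 2.25x = 1.5 rendered pixels per native pixel on each axis,
# so samples never line up 1:1 with display pixels and have to be blended.
print(render_resolution(1920, 1080, 2.25))   # (2880.0, 1620.0)

# Upscaling goes the other way: a DLSS Quality-style 2/3 axis ratio at 4K
# renders 2560x1440 and displays it at 3840x2160.
print(3840 * 2 // 3, 2160 * 2 // 3)          # 2560 1440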
Nvidia DSR is Downsampling*, because a post-processing layer is added. *Technically the words mean the same thing, but Nvidia always uses the term "sampling" officially, while most gamers say "scaling".
The DSR process significantly improves image quality, and with the addition of the 13-tap Gaussian filter aliasing artifacts experienced with traditional downsampling are greatly reduced or entirely eliminated, further improving image quality.
This can mitigate the fractional scaling loss of data to some extent, but Integer scaling is still better. Source: DSR | Technology | GeForce
What does DLDSR do? It's still downscaling, but tweaked by DLSS: Render @ above Native (using DLSS) to Display @ Native.
Every form of AA ever invented, including the best, which is DLAA, is just trying to gain the downscaling benefit without the extreme GPU cost. Downscaling at 4.00x Native is the best AA you can get, but it costs the most FPS: eye-wateringly expensive, it makes the RT cost look trivial. 2.25x Native is still good AA, but costs only one eye poked out, not both; RT still looks moderate. Nvidia marketing BS = DL scaling ("same quality, 2x more efficient"), so just pepper-sprayed eyes, with RT starting to get close. 2.25x DL Native means Render @ 2.25x Native by upscaling from Native, then Display @ Native.
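Rough numbers for that cost ranking, using a 2560x1440 display as the example; rendered pixel count is only a crude proxy for the FPS cost, so treat this as an illustration, not a benchmark.

# Rendered resolution and pixel count for a 1440p display at different factors.
native_w, native_h = 2560, 1440

for name, area_factor in [("Native / DLAA", 1.00),
                          ("DLDSR 2.25x", 2.25),
                          ("DSR 4.00x", 4.00)]:
    axis = area_factor ** 0.5
    w, h = round(native_w * axis), round(native_h * axis)
    print(f"{name:>13}: render {w}x{h} ({w * h / 1e6:.1f} MPix)")

# Native / DLAA: render 2560x1440 (3.7 MPix)
#   DLDSR 2.25x: render 3840x2160 (8.3 MPix)
#     DSR 4.00x: render 5120x2880 (14.7 MPix)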
Now DLAA is basically just upscaling the AA, by training the AI on downscaling, just like DLSS; it's another attempt at better AA. By far the best without actually downscaling, and with a small FPS hit like most AA.
About the overlay: if you enable WatchIniUpdates in the INI then you should be able to alt-tab out of the game and change OverrideDlssHud at any time to enable/disable it. (The next release will probably remove WatchIniUpdates and make it always watch for updates instead; it might also have a way to let presets be changed without needing OverrideAppId.)
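If anyone wants to flip that HUD setting without digging through the INI by hand each time, here's a rough Python sketch of the same alt-tab workflow. The INI path is a placeholder, and it assumes the key appears as a plain "OverrideDlssHud = <number>" line and that WatchIniUpdates is already on; check your own dlsstweaks.ini first.

# Toggle OverrideDlssHud between 0 and 1 in dlsstweaks.ini; with
# WatchIniUpdates enabled the running game should pick the change up live.
import re
from pathlib import Path

ini_path = Path(r"C:\Games\The Last of Us Part I\dlsstweaks.ini")  # placeholder path

text = ini_path.read_text(encoding="utf-8")
match = re.search(r"(?im)^(\s*OverrideDlssHud\s*=\s*)(\d+)", text)
if match:
    new_value = "0" if match.group(2) != "0" else "1"
    ini_path.write_text(text[:match.start(2)] + new_value + text[match.end(2):],
                        encoding="utf-8")
    print("OverrideDlssHud set to", new_value)
else:
    print("OverrideDlssHud key not found - edit the INI manually")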
Better AA is the exact reason I use DLDSR. I would use DSR, but I am on a 1440p display and 5120x2880 is just too much even for my 4090. I like to play at 75 fps maxed out and this works just fine; mostly I am at 90% GPU usage.
Also, I discovered why the image was so oversharpened. The game uses TAA + its sharpening even with DLSS (DLAA) enabled! So first I disabled both TAA and sharpening, but that resulted in bad shimmering (because of the missing TAA; the same thing happens with RDR2, for example, since the game relies on TAA). But someone found out how to disable only the sharpening and keep TAA functional. It required some hex editing of the .exe file, but it works beautifully. With sharpening off and TAA on, the image looks really great now. I highly recommend disabling the sharpening; it really makes a difference.
@Gerchan, any chance you can link the fix for the sharpening? I was only able to find ones that disable it along with TAA; playing with just the sharpening disabled would probably help this game a lot.
I tried disabling sharpening (with the HEX edit posted in this thread) and using DLAA = true only, but I get a kind of shimmering or ghosting (something like that) on some surfaces when I move the camera.
Technically DLSS 2 should look better, because even though it's using two lower-res frames jittered over a native frame, the win is that it has the extra frame data. Its jitter is better because it's based on machine learning and it can adjust the jitter pattern to get rid of most artifacts. Finally, TAAU uses less data and also has to apply a blur to hide artifacts, then sharpens a blurry image, whereas DLSS 2 is sharpening an already sharp image. That said, I disable sharpening on DLSS 2 because it looks awful. Yes, it depends on more than just sharpening, but that's the trigger, and it's not needed anyway. When it comes to sharpness, a modern display (assuming it's not a low-end one) should do a significantly better job at image sharpening than the GPU, plus it's not even needed; oddly, I find in most games it actually makes the image blurrier in motion, as well as darker.
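For anyone wondering what "jitter pattern" means here: both TAA and DLSS shift the camera by a sub-pixel offset every frame so successive frames sample different spots inside each pixel, and the accumulated history is where the extra detail comes from. A common choice for those offsets is the Halton(2,3) low-discrepancy sequence (which, if I remember right, is also what NVIDIA's DLSS integration docs point to); a minimal sketch, not tied to any particular engine:

# Sub-pixel jitter offsets from the Halton(2,3) sequence, the kind of
# pattern a temporal upscaler applies to the projection each frame.

def halton(index, base):
    # Radical inverse of `index` in the given base, giving a value in [0, 1).
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def jitter(frame, phases=8):
    # Centre the offsets around zero so the average stays on the pixel centre.
    i = (frame % phases) + 1
    return halton(i, 2) - 0.5, halton(i, 3) - 0.5

for frame in range(8):
    x, y = jitter(frame)
    print(f"frame {frame}: offset ({x:+.3f}, {y:+.3f})")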
More info on those NV profile settings can be found at https://github.com/Orbmu2k/nvidiaProfileInspector/issues/156#issuecomment-1661197267
Thanks :) !
If you follow DoktorSleepless's replies in the thread linked below, he also shows some issues F has in Spiderman; seems D is the best one for that game at least: https://www.reddit.com/r/nvidia/comments/10z2ra9/nvidia_publishes_dlss_super_resolution_sdk_31/j81y6hb/
Here are the hex values you need to edit:
Sharpening off:
Find: 74 0A C5 FA 10 86 E4 1A 00 00
Replace: EB 0A C5 FA 10 86 E4 1A 00 00
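If you'd rather not do the find/replace by hand in a hex editor, here's a quick Python sketch that applies the same patch. The EXE path and name below are placeholders, and the pattern may simply not exist in other game versions, in which case the script refuses to touch the file; make a backup either way. (The change flips what looks like a conditional short jump, 0x74 / jz, into an unconditional one, 0xEB / jmp, which is why only the first byte differs.)

# Apply the sharpening patch above to the game EXE (keeps a .bak copy).
from pathlib import Path

exe_path = Path(r"C:\Games\The Last of Us Part I\tlou-i.exe")  # placeholder path/name
find    = bytes.fromhex("74 0A C5 FA 10 86 E4 1A 00 00")
replace = bytes.fromhex("EB 0A C5 FA 10 86 E4 1A 00 00")

data = exe_path.read_bytes()
count = data.count(find)
if count != 1:
    # 0 hits usually means a different game version; more than 1 means the
    # pattern isn't unique, and patching blindly would be risky either way.
    raise SystemExit(f"Expected exactly 1 match, found {count} - not patching.")

exe_path.with_name(exe_path.name + ".bak").write_bytes(data)   # backup original
exe_path.write_bytes(data.replace(find, replace))
print("Sharpening patch applied.")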
RDR2 at launch is a great example.