NVIDIA Slows Performance in New Drivers


vlachos93 : Well. It's official. Nvidia is the new Apple. My next gpu will be from AMD.

GodLikeMeAgain : My next graphics card will be AMD. I will never buy NVIDIA again.

Nick Salvatore : “Wow the new graphics cards aren’t much faster...” (NVIDIA downgrades 10 series) “Wow 2080 is so much better!!”

TheRedDaren : That's because the right side of the frame takes more jpeg -Nvidia (probably)

Mahone : And it's bullshit. Well done, mates, you've just proven that all of your past benchmark vids might be flawed.

nutella4eva : Before grabbing your pitchforks, see the response by Tech Yes City. He offers a pretty valid explanation for the difference in performance.

Meatball : i wonder how many death threats Nvidia just got? ¯\_(ツ)_/¯

Mr Alien : My next gpu will be amd

MankaroyTM : Pffffff, I run Intel HD Graphics *I'm being serious, I have Intel HD Graphics*

Harpuli : Nvidia =scam

Jamal : Nvidia released gimped drivers now and will remove them, so people think it was a "bug". Nvidia waits a couple of months and will release the gimped drivers again. They've been doing this since the GTX 700 series.

Pota2chip Gaming : So Apple bought Nvidia? Lmao

Gunna : The video has been proven incorrect. Please redo this test with the same version of Windows 10, without letting it update, pls.

Tài Trần : My friend told me driver 391.35 is the best driver for the GTX 10 series

Teodor Dimdal : Hasn't a bunch of YouTubers already debunked this and said it was a bug with one specific driver?

Mikoto Misaka -KBHD- : Gotta love how everyone thinks this is intentional and not just a dodgy driver. They let you run old drivers and provide downloads for them for a reason, and that reason is that a lot of the time their latest drivers are dodgy and unstable. The latest Windows update has been known to kill systems (i.e. strain hardware and corrupt it) and delete personal files, but I don't see people switching to Linux or Mac, though????

Gizmo Mogwai : 12 minutes video and 5 YouTube ads? Seriously??

El S. : Wtf???

ASD BEF : FAKE BENCHMARK!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

THE BLACK DRAGON : This is BS and always will be; it's the Spectre Windows patch, *NOT* the drivers, causing the slowdown. Here's a comment from r/Nvidia investigating:

Hello again, nVidia fellows. If you only want the 416.34 driver results, jump straight to the 416.34 section mid-post, or directly to the "Verdict" section at the end (quick TL;DR hint: I haven't noticed any significant performance change between 416.16 and 416.34, except maybe for FC5, which got a couple of extra FPS).

But first and foremost, please let me give credit where it's due. After the issues in my previous tests (https://www.reddit.com/r/nvidia/comments/9lbmgx/driver_41616_faqdiscussion/e75rd03), I tested again yesterday with one of my usual games under 416.16 and Windows 10 v1809, but this time with the Windows 10 Spectre microcode patch disabled via a Windows registry key. These are the results (the first two batches are copied from the 416.16 driver thread).

416.16 driver performance losses with the Win10 v1809 patch installed

Wildlands, three consecutive runs with 411.70 (W10 v1803 April Update, no Spectre patch installed yet):
Avg FPS: 73.22 / 73.13 / 73.96
Min FPS: 64.87 / 66.00 / 66.34
Max FPS: 85.00 / 83.09 / 83.25
Avg CPU: 73.4% / 74.1% / 73.7%
Avg GPU: 90.1% / 88.3% / 89.4%

Wildlands, three consecutive runs with 416.16 (W10 v1809 October Update, which installed the Spectre microcode patch):
Avg FPS: 70.49 / 71.05 / 69.69
Min FPS: 54.78 / 58.88 / 59.41
Max FPS: 80.52 / 79.76 / 79.00
Avg CPU: 77.7% / 74.3% / 78.0%
Avg GPU: 84.8% / 86.2% / 82.3%

Wildlands, three consecutive runs with 416.16 (W10 v1809 October Update, but Spectre patch disabled):
Avg FPS: 72.82 / 73.01 / 72.97
Min FPS: 63.81 / 62.10 / 64.45
Max FPS: 84.90 / 85.09 / 83.51
Avg CPU: 69.8% / 71.1% / 70.9%
Avg GPU: 92.3% / 91.2% / 92.1%

With these numbers, I can confirm that most (if not all) of the performance I lost in the previous driver test was not because of the new nVidia drivers, and not because of the Windows 10 v1809 October update itself, but due to the Spectre microcode patch (which was installed on my machine along with the v1809 October update). Some people already got it installed in the past as a standalone KB4100347 update, so they probably won't notice any difference after the Win10 v1809 October update.

As I replied to a poster below, note that the Meltdown and Spectre patches do not have any direct effect on your GPU or your graphics horsepower, only on your Intel main processor. You won't see any difference in performance unless your gaming is somewhat CPU bottlenecked (as was my case with the mid-range i5-4590 4th gen processor paired with a pretty powerful 1070Ti GPU in DX11 games). If your system is mostly GPU constrained (low-mid end GPU or a high end main processor), the Spectre microcode patch shouldn't affect your gaming performance.

416.34 WHQL driver test

And now back to the early performance benchmark for the new 416.34 WHQL driver, compared to the 416.16 release. The usual disclaimer: remember this is NOT an exhaustive benchmark, just some quick numbers, and I can only judge for my own custom PC configuration. Any other configuration, different nVidia architecture, OS version, etc. may give you different results.

For reference, my benchmark PC is a Windows 10 (v1809 October Update, latest updates applied) custom-built desktop: 8 GB DDR3 RAM, Intel i7-4790k, one Asus Strix GTX 1070Ti Advanced Binned, on a single 1080p 60 Hz monitor with no HDR nor G-Sync. I don't use Ansel nor Freestyle. Stock clocks on both CPU and GPU. You may have noticed that I've just upgraded my CPU to a more powerful i7-4790k, so hopefully all the CPU bottleneck issues I had in previous tests should be gone. Thanks to this, the Batman: Arkham Knight and Shadow of Mordor tests are back.

Unless explicitly stated otherwise, all games run borderless windowed, using the built-in benchmarking tool, with available 'cinematic' options disabled whenever possible (like Motion Blur, Chromatic Aberration, Film Grain Effects, Vignette effects, Depth of Field effects and such, not due to performance, but for my own preference and image quality reasons).

Results below.

First one: Tom Clancy's The Division. 1080p resolution with almost maxed settings (just lowered Extra Streaming Distance and Object Detail a bit), Neutral Lighting, Dx12 enabled, no in-game vSync nor frame cap limiter.

The Division, 3 consecutive runs with 416.16:
Avg FPS: 87.4 / 87.4 / 88.0
Typical FPS: 88.2 / 88.0 / 88.6
Avg CPU: 69% / 73% / 67%
Avg GPU: 95% / 95% / 95%

The Division, 3 consecutive runs with 416.34:
Avg FPS: 87.7 / 87.6 / 87.6
Typical FPS: 88.3 / 88.2 / 88.1
Avg CPU: 69% / 66% / 69%
Avg GPU: 96% / 96% / 95%

Once again performance doesn't change at all in The Division: the same FPS numbers and the same perceived smoothness (stuttering almost non-existent). Notice that the FPS numbers here (taken with an i7-4790k 4 GHz processor) are almost an exact copy of the ones from the previous benchmark of the 416.16 drivers (taken with an i5-4590 3.3 GHz CPU). Kudos to DirectX 12, as this shows the new API is much less CPU dependent than Dx11. If only The Division were as stable on Dx12 as on Dx11... :)

Next one, a Dx11 game: Ghost Recon: Wildlands at 1080p, mostly V.High but no Gameworks options enabled.

Wildlands, three consecutive runs with 416.16:
Avg FPS: 78.91 / 78.04 / 77.93
Min FPS: 67.38 / 69.79 / 66.87
Max FPS: 89.46 / 87.82 / 88.73
Avg CPU: 50.4% / 51.8% / 49.9%
Avg GPU: 95.8% / 95.9% / 95.8%

Wildlands, three consecutive runs with 416.34:
Avg FPS: 78.41 / 78.34 / 78.48
Min FPS: 59.46 / 68.52 / 68.18
Max FPS: 89.46 / 88.82 / 88.65
Avg CPU: 48.5% / 48.0% / 48.4%
Avg GPU: 95.9% / 96.2% / 92.1%

Wildlands stayed the same on the new 416.34 drivers too, with just a slightly lower average CPU usage (which is good, given the steady performance). Smoothness is great on both drivers. Once I upgraded my CPU to the i7, those lag spikes I got in my previous test during fast driving scenes went away (consistent with the increased CPU bottleneck I had on my old i5 due to the Spectre patch). Notice how performance here went up by a whopping 10% just by upgrading from the i5 CPU to the i7. Again we can see how Dx11 games are much more reliant on CPU horsepower than Dx12 games.

Next is FarCry 5, a Dunia Engine game (a heavily modified fork of the original CryEngine from Crytek). Stunning graphics, not very hardware demanding, this time optimized by Ubi in a partnership with AMD instead of nVidia, unlike previous FarCry games. Settings are 1080p, maxed ultra settings with TAA and FoV 90.

FarCry 5, three consecutive runs with 416.16:
Min FPS: 65 / 66 / 65
Avg FPS: 83 / 85 / 86
Max FPS: 106 / 107 / 107

FarCry 5, three consecutive runs with 416.34:
Min FPS: 69 / 72 / 70
Avg FPS: 86 / 88 / 88
Max FPS: 108 / 109 / 109

FarCry 5 performance seems to have improved a small bit with this driver. Minimum, maximum and average FPS are consistently up by a couple of frames per second. Smoothness perception and frame pacing are maybe a hair better too with 416.34. (As a sidenote, this is another Dx11 game in which I got more than 10% extra performance just by upgrading the CPU.)

Now an Unreal Engine game: Batman: Arkham Knight at 1080p, maxed settings and all Gameworks options enabled.

Batman: AK, three consecutive runs with 416.16:
Min FPS: 42 / 43 / 44
Max FPS: 124 / 124 / 125
Avg FPS: 83 / 84 / 84

Batman: AK, three consecutive runs with 416.34:
Min FPS: 42 / 43 / 43
Max FPS: 123 / 124 / 123
Avg FPS: 82 / 83 / 85

No changes at all with the new drivers and Arkham Knight.

And finally, Middle Earth: Shadow of Mordor, a LithTech Engine game. Settings are 1080p, maxed Ultra with FXAA antialiasing.

Shadow of Mordor, three consecutive runs with 416.16:
Avg FPS: 131.41 / 131.26 / 131.21
Max FPS: 195.08 / 191.32 / 194.61
Min FPS: 75.66 / 87.11 / 82.55

Shadow of Mordor, three consecutive runs with 416.34:
Avg FPS: 130.95 / 130.93 / 130.83
Max FPS: 191.45 / 191.34 / 192.30
Min FPS: 73.50 / 85.06 / 83.17

And once again, Shadow of Mordor doesn't show any noticeable changes on the new driver. Average results are extremely stable and consistent between runs (awesome benchmark tool); thanks to this we can spot a tiny 0.3-0.4 FPS loss here (small but consistent), but the difference is so small that it could be anything in the environment.

Verdict

System stability has been good all around. I got another crash-to-desktop while testing The Division, but it seems this is pretty common with Pascal architecture cards and Dx12 since it was released a couple of years ago, under all driver branches. The rest of my usual test games ran fine: Wildlands, FarCry4, FarCry5, XCOM2, EVE: Online, Dauntless, Terraria, World of Tanks Blitz, Batman: Arkham Knight, BattleTech, the Mass Effect trilogy, Monster Hunter: World and Middle Earth: Shadow of Mordor (short testing game sessions).

Performance-wise, this driver is stable. No significant gains or losses compared to 416.16, except maybe the couple of FPS gained in FC5, which is always welcome. Some people are reporting flickering issues in different scenarios (and even the patch notes acknowledge them in FC5), but I haven't hit this bug myself. Also, for those playing Monster Hunter: World like myself, I've not noticed any difference before and after the driver update. ;)

Thank you for reading, and sorry for the longer-than-usual post.
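[Editor's note: the post above says the Spectre microcode patch was "disabled via a Windows registry key" without naming the key. The mechanism Microsoft documents (e.g. in KB4073119) is the FeatureSettingsOverride value pair; the .reg sketch below is an assumption about what the poster used, not a quote from the post. Disabling these mitigations has security implications, and a reboot is required for the change to take effect.]

```
Windows Registry Editor Version 5.00

; Disable the Spectre Variant 2 and Meltdown OS mitigations
; (delete both values to restore the default, enabled state)
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management]
"FeatureSettingsOverride"=dword:00000003
"FeatureSettingsOverrideMask"=dword:00000003
```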
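[Editor's note: the "whopping 10%" claim in the Wildlands section can be rechecked directly from the averages quoted in the post. A minimal sketch, using only the 416.16 Avg FPS runs reported for the i5-4590 (with Spectre patch) and the i7-4790k:]

```python
# Recompute the CPU-upgrade gain from the Wildlands averages quoted above.
def mean(runs):
    return sum(runs) / len(runs)

i5_runs = [70.49, 71.05, 69.69]  # 416.16, i5-4590, Spectre patch installed
i7_runs = [78.91, 78.04, 77.93]  # 416.16, i7-4790k

pct_gain = (mean(i7_runs) / mean(i5_runs) - 1) * 100
print(f"Average-FPS gain from the CPU upgrade: {pct_gain:.1f}%")  # about 11%
```

This lands slightly above the 10% the poster cites (about 11% versus the patched i5 runs, or about 7% versus the patch-disabled i5 runs), consistent with the claim.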

Robert VanZant : I've seen this on the 1080 Ti as well !!!!! Reverting to the old driver

▐ ᴇʟᴍᴏᴅᴏ7▐ : I am an Nvidia and Intel user as of this generation because I heavily need some of their technologies. However, I can state that these two companies are pretty much like Apple in that what they love most is inflation and speculation on their chip pricing.

DryChicken : Your video is wrong. It's been proven that it didn't downgrade the performance at all.

David-Joe Hollingworth : Everyone is so quick to jump on the "nVidia nerfing old cards" bandwagon, but it could just be a bad driver. It's happened before, and it's just coincidental with the release of RTX. I'm not a fanboy, I just think everyone should give nVidia a chance to fix their mistake before calling them out.

Pat Gaming Apa's Style : Just know that I am here from TECH YES CITY. They called you out on your flawed benchmark, so please double-check it next time.

elcriC tcefreP A : I don't think this is anything new. I remember spending a whole day downloading drivers from nVIDIA's website to assess which one was the best for my card, a GeForce FX 5700, back in 2003-2004. I did the same for ATi cards I had, like the HD 2600 in 2008 or so. In most cases, the best performing (and even most stable) driver was never the latest one. I don't know exactly why, but I suspect it has to do with driver optimization for new graphics cards, as I don't see companies spending time optimizing software for old/older hardware. The way I see things is that they don't deliberately slow performance in new drivers; the drivers are just simply optimized for the newest hardware (and this goes for both nVIDIA and AMD).

Last Ouji : This is why i use intel hd

David Košič : Now that others have found where you made your mistake, you should just delete this video because it's misleading.

Rishi Kapadia : Silently THEY ARE TELLING US TO BUY RTX CARDS😂

ʁɔvʎнdǝвǝdǝu ʞин : At the prices NVIDIA charges, they deserve to have more than just shit poured on them! They're as expensive as elite escorts, so they're obliged to put up with any flak from their customers!!!

Mr.07 : Rx 580

Devil Jin : Fuck nvidia

LtGoldenRod : Please remove this video and redo the test.

Howie Tark : Should report this video for misleading content.

BillnWa : It appears there is another explanation that doesn't have anything to do with Nvidia.

Dundi : 399.24: 58% usage. 416.16: 45-48% usage. Guess that's enough to blame Intel instead of Nvidia.

M Ólliver : I downgraded from 416.16 to 399.24 after watching this video and it's true. I've gained up to 10 fps in Assassin's Creed Unity and there are fewer bugs too.

Kingsley Ho : Vega, I'm coming

Caleb Able : Next they'll say "it was to improve the lifespan of the product."

All about Games : This is nothing new from Nvidia. That's their standard move when a new GPU generation is released. NEVER BUY NVIDIA!

supe kami : I hate nvidia. MAKE RADEON GREAT AGAIN !

Vexxed : I’m skeptical of these results. This seems almost too big to be true. I’ll wait for the tech press to investigate before pointing fingers though.

Caleb Able : Now my intel HD4400 graphics just got proportionally better compared to the Nvidia 10 series.

StreetFerrari : What is the best driver version to use with the 1060?

Moody's Mode : They always do the same thing and people keep buying, so why be surprised?

dos54 : Pretty easy to see that people would notice something like this. Gamers pay attention to frame rate quite a bit. Nvidia is either quite possibly the stupidest company to ever exist with management that's stupider than anyone would've thought possible and a big giant death wish as a company... Or it's a bug. As in not intentional. Apple would've been able to at least explain them throttling as "extending battery life". This move would just be plain suicidal for Nvidia. I don't bite.

bob jones : Wow, nvidia=apple now. Great...

Athlaz : Seems like something went wrong when you installed the new driver. I tested the same drivers on a GTX 1080 and there was no difference between them outside the margin of error.

Op Huskyy : Time to uninstall all my drivers for my GPUs

#3dik show : next GPU AMD.