GTA V GPU Performance Review
Final Thoughts
All of that time spent testing and troubleshooting each card comes together in three simple charts. So let’s break this down into three parts: NVIDIA, AMD, and then AMD vs NVIDIA. Please bear in mind that my testing method involved leaving FRAPS running over the black cut scenes, where the frame rate shot up to 257 FPS; this in turn inflated my average FPS figures.
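If you want to correct for that cut-scene spike yourself, one option is to recompute the average from the FRAPS frametimes log while discarding the cut-scene frames. Below is a minimal sketch of that idea; the file name, column layout and the 200 FPS cut-off are assumptions for illustration, not the exact process used for this review.

# Minimal sketch: recompute an average FPS from a FRAPS frametimes CSV while
# excluding frames faster than a chosen cut-off (e.g. anything above 200 FPS,
# which in this run only happened during the black cut scenes).
# The file name, column layout and cut-off value are assumptions for illustration.
import csv

CUTOFF_FPS = 200.0  # frames faster than this are treated as cut-scene frames

def corrected_average_fps(path):
    frametimes_ms = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        prev = None
        for row in reader:
            t = float(row[1])  # cumulative time in ms for this frame
            if prev is not None:
                frametimes_ms.append(t - prev)
            prev = t
    # keep only frames whose instantaneous FPS is at or below the cut-off
    kept = [ft for ft in frametimes_ms if ft > 0 and 1000.0 / ft <= CUTOFF_FPS]
    return 1000.0 * len(kept) / sum(kept)

print(corrected_average_fps("frametimes.csv"))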
NVIDIA has really cemented itself in the upper sections of the charts with its extremely strong Maxwell-based graphics cards. They performed very well in our tests, delivering very playable FPS figures across all three display settings. However, at 4K the GTX 970 runs into its VRAM issue: the estimated VRAM usage was over 4.5GB, which, as we all know from recent press, is 1GB over the GTX 970’s 3.5GB threshold.
Onto AMD: I am extremely surprised by how well the R9 290 and 290X have performed here. Something tells me the AMD driver has given these cards a massive performance boost, placing them well above where I expected them to land in the charts.
So let’s compare the two and tackle the age-old question of AMD vs NVIDIA. In GTA V, with our test bench and chosen settings, the NVIDIA GTX Titan X dominates; what else were you expecting? However, AMD seems to have worked wonders with its driver for the R9 290X and R9 290, giving them performance boosts that put them ahead of the rest of NVIDIA’s Maxwell range and make the Titan X sweat; not bad for cards that are around 18 months old and significantly cheaper.
So, have you bought GTA V? What graphics card are you using? Let us know in the comments and on our forums.
Thank you to all our partners who provided the hardware and software that made this analysis possible.
I get this crash very often… and it’s totally random when it happens
ERR_GFX_D3D_INIT
I got that when my GPU drivers crashed. Perhaps try pulling back the graphics in some way? Like a lower resolution or something.
playing at 720p with a GTX 980? :/
It’s an option, but it’s weird that you’re getting it so often; I have a GTX 980 as well.
That happens when my video card driver resets because the card freaked out due to a too high overclock. I had to knock my overclock down almost 50% to get GTA to play consistently without doing this.
can it play on a Samsung Galaxy E7?
bravo 290X, beating the latest NVIDIA GPUs with two-year-old hardware
Graphs like that mean dick since the min was much lower, well below playable rates. I bet if you took the average, the 980 is much better overall. These kinds of graphs that only show min and max mean dick.
post links of other reviews pls
Something isn’t right with these benchmark numbers. The only differences between the 290 and the 290X are the number of shaders and, depending on the model, the core clock: the 290X has 10% more SPUs, so it should be getting a slightly better frame rate. The 290 should not be getting higher scores (minimum and average FPS) than the 290X. Also, the minimum frame rate should be going down as the resolution increases, not staying around the same or higher. For example, the 290 gets a higher minimum frame rate at 1440p than at 1080p (roughly a 50% increase).
Either something other than the GPU is limiting the game, the settings are being (automatically) changed, or the benchmarks need to be run multiple times to get a better representation of how the game is running.
Have to agree. The R9 290 getting a much higher min FPS at 1440p than at 1080p with the same settings doesn’t make any sense. Experimental common sense dictates that at least three loops must be made to check for consistency and to reach the final average (to use on these kinds of graphs), and it’s really not logical that at least three correctly made runs would produce a higher average minimum at 1440p than at 1080p (and above the 290X at the same time). It’s either a typo, or the run(s) weren’t using the correct settings for some reason. Or the benchmarking process used isn’t consistent enough to provide stable and reliable values anyway.
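Something like this is all I mean by looping the benchmark; the numbers below are made up, purely to show the idea of averaging the passes before charting them.

# Sketch only: average min/avg FPS across three benchmark passes before charting.
# The (min_fps, avg_fps) values here are invented example numbers.
from statistics import mean

# results from three separate passes, same card, same settings
passes = [(24.0, 61.3), (26.5, 60.8), (25.1, 62.0)]

min_fps = mean(p[0] for p in passes)
avg_fps = mean(p[1] for p in passes)
print(f"charted min: {min_fps:.1f} FPS, charted avg: {avg_fps:.1f} FPS")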
In the second graph, shouldn’t the bars be the same length if they’re the same score?
Just saying. That’s how bar graphs tend to work.
You don’t want to look biased or anything 😉
I like you, Eteknix, but I’m calling BS on your results after doing my own testing with a Titan X on the same settings and using the benchmark tool. Nowhere does the Titan X ever get to 90 FPS at 1440p, and no, the 290X is nowhere near as quick as the Titan X running this game.
https://www.youtube.com/watch?v=9WmwlTBo6JY
I’ll just leave this here as proof. Cheers guys.
Yeah, the 290X is behind a 970 in this game; who in their right mind would believe a 290X beats a Titan X?
https://www.youtube.com/watch?v=k9walJtvG60
Yeah, I’ve looked at some other sites that did benchmarks and the numbers are iffy. The problem with using graphs that show min and max only is that they are very deceptive: you can stand in one spot looking at a wall and get great numbers for one card. The 290 cards dropping to sub 25-27 FPS at min versus a 980 that never even goes under 45 is definitely an issue to look at.
Thanks for this link. Even though the game benchmark is horrible, it still doesn’t show the 290X or the Titan X averaging 90 FPS like Eteknix stated.
GTX 750 same settings, 1920×1200 = 30fps-ish average. I’m good with that for now.
wtf why is 4k so shitty
Um, because it’s four times the pixel count of full HD (3840×2160 vs 1920×1080)? Can’t you understand why that would torture most graphics cards?