
GTA V GPU Performance Review

Grand Theft Auto V – 1080p, 1440p and 4K Benchmarks



“Grand Theft Auto V is an open-world action-adventure video game developed by Rockstar North and published by Rockstar Games. It was released on 17 September 2013 for the PlayStation 3 and Xbox 360. An enhanced version of the game was released on 18 November 2014 for the PlayStation 4 and Xbox One, and 14 April 2015 for Microsoft Windows. The game is the first main entry in the Grand Theft Auto series since 2008’s Grand Theft Auto IV. Set within the fictional state of San Andreas (based on Southern California), the single-player story follows three criminals and their efforts to commit heists while under pressure from a government agency. The open world design lets players freely roam San Andreas, which includes open countryside and the fictional city of Los Santos (based on Los Angeles).” – From Wikipedia.

[Benchmark charts: minimum and average FPS results at 1080p, 1440p and 4K]

Well well well, that’s a little surprising. The GTX Titan X sits at the top, as expected, but the R9 290X is really putting up a fight.



20 Comments

    1. I got that when my GPU drivers crashed. Perhaps try pulling back the graphics in some way, like lowering the resolution?

    2. That happens when my video card driver resets because the card freaked out due to too high an overclock. I had to knock my overclock down almost 50% to get GTA to play consistently without it doing this.

    1. Graphs like that don’t mean much, since the min was much lower – well below playable rates. I bet if you took the average, the 980 would be much better overall. These kinds of graphs that show only min and max tell you very little.

  1. Something isn’t right with these benchmark numbers. The only difference between the 290 and the 290X is the number of shaders and, depending on the model, the core clock – the 290X has 10% more SPUs, so it should be getting a slightly better frame rate. The 290 should not be getting higher scores (minimum and average FPS) than the 290X. Also, the minimum frame rate should go down as the resolution increases, not stay around the same or go higher. For example, the 290 gets a higher minimum frame rate at 1440p than at 1080p (roughly a 50% increase).
    Either something other than the GPU is limiting the game, the settings are being (automatically) changed, or the benchmarks need to be run multiple times to get a better representation of how the game is running.

    1. Have to agree. The R9 290 getting a much higher min FPS at 1440p than at 1080p with the same settings doesn’t make any sense. Basic experimental practice dictates that at least three loops be run to check for consistency and reach a final average (to use in these kinds of graphs), and it’s really not plausible that at least three correctly made runs would produce a higher min average at 1440p than at 1080p (while also beating the 290X). It’s either a typo, or the run(s) weren’t using the correct settings for some reason. Or the benchmarking process used isn’t consistent enough to provide stable and reliable values anyway.
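    The multi-run aggregation the commenter describes can be sketched as follows – a minimal example, with entirely illustrative FPS numbers (these are not eTeknix’s actual benchmark data):

    ```python
    # Aggregate min and average FPS across multiple benchmark loops.
    # All frame-rate samples below are hypothetical, for illustration only.

    def aggregate_runs(runs):
        """Given per-run FPS samples, return (averaged min FPS, averaged avg FPS)."""
        per_run_min = [min(r) for r in runs]
        per_run_avg = [sum(r) / len(r) for r in runs]
        avg_min = sum(per_run_min) / len(per_run_min)
        avg_fps = sum(per_run_avg) / len(per_run_avg)
        return avg_min, avg_fps

    # Three loops of the same benchmark at the same settings:
    runs = [
        [48, 72, 65, 31, 80],
        [50, 70, 66, 29, 78],
        [47, 74, 63, 33, 79],
    ]
    avg_min, avg_fps = aggregate_runs(runs)
    print(f"min FPS (avg of 3 runs): {avg_min:.1f}")  # 31.0
    print(f"avg FPS (avg of 3 runs): {avg_fps:.1f}")  # 59.0
    ```

    Averaging the per-run minimums like this smooths out one-off stutters; a single run’s min can swing wildly, which is exactly why a min that rises with resolution is a red flag.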

  2. In the second graph, shouldn’t the bars be the same length if they’re the same score?

    Just saying. That’s how bar graphs tend to work.

    You don’t want to look biased or anything 😉

  3. I like you, eTeknix, but I’m calling BS on your results. After doing my own testing with a Titan X on the same settings and using the benchmark tool, the Titan X never gets to 90 FPS at 1440p, and no, the 290X is nowhere near as quick as the Titan X running this game.

    https://www.youtube.com/watch?v=9WmwlTBo6JY

    I’ll just leave this here as proof. Cheers guys.

      1. Yeah, I looked at some other sites that did benchmarks and they’re iffy too. The problem with graphs that show only min and max is that they’re very deceptive – you can stand in one spot looking at a wall and get great numbers for one card. The 290 cards dropping to sub-25–27 FPS minimums versus a 980 that never even goes under 45 is definitely an issue worth looking at.

      2. Thanks for this link. Even though the game’s benchmark is horrible, it still doesn’t show the 290X or the Titan X averaging 90 FPS like eTeknix stated.

    1. Um, because it’s nearly 1.8× the pixel count of full HD? Can’t you see why that would torture most graphics cards?
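    The pixel-count arithmetic behind that reply is easy to verify – a quick sketch (standard resolutions, nothing assumed beyond them):

    ```python
    # Pixel counts for common benchmark resolutions, relative to 1080p.
    # 1440p is ~1.78x the pixels of 1080p; 4K is exactly 4x.
    resolutions = {
        "1080p": (1920, 1080),
        "1440p": (2560, 1440),
        "4K":    (3840, 2160),
    }
    base = 1920 * 1080
    for name, (w, h) in resolutions.items():
        pixels = w * h
        print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
    ```

    So the jump from 1080p to 1440p is a ~78% increase in rendered pixels, not a doubling – still more than enough to drag minimum frame rates down on most cards.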
