
AMD R9 290X & R9 280X Vs Nvidia GTX Titan & GTX 780

Introduction


[Image: the four graphics cards tested, pictured together]

***Please see our latest 4K Gaming article here which adds GTX 780 Ti, GTX 780 Ti SLI and R9 290X CFX to the mix***

With GPUs getting ever more powerful and 4K monitors becoming available for consumer purchase, we thought we'd use AMD's R9 290X launch as a springboard to look at the 4K gaming performance of AMD's and Nvidia's top two single-GPU graphics cards. Of course, since writing this article Nvidia have revealed their intention to release a GTX 780 Ti graphics card, which is worth bearing in mind when looking at these benchmarks. AMD are also expected to reveal an R9 290 graphics card at some stage this year. So this is by no means a comprehensive or complete look at 4K performance on high-end AMD and Nvidia GPUs, but we think it is an interesting place to start.

Firstly, let's recap the graphics cards we're using; all four are pictured above:

  • AMD R9 290X – read our review here.
  • Nvidia GTX Titan – read our review here.
  • Nvidia GTX 780 – read our review here.
  • Sapphire Vapor-X OC AMD R9 280X – read our review here.

Next we’ve managed to get a hold of a 4K monitor for this testing as AMD were kind enough to provide us with the Sharp PN-K321 4K monitor.


The Sharp PN-K321 uses a 32-inch IGZO panel with a resolution of 3840 x 2160. Being a first-generation 4K panel, it uses two 1920 x 2160 displays stitched together by an advanced display controller chip. The monitor can run 4K at up to 60Hz, which is best done through DisplayPort.
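The arithmetic behind that DisplayPort recommendation is easy to sanity-check. A rough sketch (our own back-of-the-envelope figures, not from Sharp's spec sheet, and ignoring blanking overhead):

```python
# Rough bandwidth check for 4K at 60Hz with 24-bit colour.
width, height, refresh_hz, bits_per_pixel = 3840, 2160, 60, 24

raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"Raw 4K60 pixel data: {raw_gbps:.1f} Gbit/s")  # ~11.9 Gbit/s

# DisplayPort 1.2 (HBR2, 4 lanes): 5.4 Gbit/s per lane; 8b/10b coding leaves 80% as payload.
dp12_payload_gbps = 4 * 5.4 * 0.8
print(f"DP 1.2 payload capacity: {dp12_payload_gbps:.2f} Gbit/s")  # 17.28 Gbit/s
```

So DisplayPort 1.2 has the headroom for a single 4K 60Hz stream; first-generation panels like the PN-K321 split the image into two half-width tiles because each half of the display controller can only drive 1920 x 2160.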

[Image: the Sharp PN-K321 monitor]

We’ve used the usual selection of games that we’d normally do in our graphics card reviews so we’ve got a selection of 7 games and one synthetic benchmark to show you: Alien vs Predator, Bioshock Infinite, Hitman Absolution, Sleeping Dogs, Unigine Heaven 4, Tomb Raider, Dirt Showdown and Metro Last Light. Without any further ado let’s see exactly how these AMD and Nvidia GPUs got on at some 4K gaming.


83 Comments

  1. Nvidia got annihilated, simple as that, and got what had been coming to them for a while. I'll definitely welcome AMD cards to my builds; it's quite shameful to see the GTX Titan, the oh-so-hyped graphics card currently on the market, get beaten by an R9 290X that sits $350 lower.

    1. I wouldn’t say annihilated, AMD looking good this time around though, only took them 8 months to release something that actually has some balls. You know what everyone is thinking though – what will Nvidia’s response be? Price cuts?… not likely to be significant enough to sway anyone that wants the best. New card?.. hope so, but only if it isn’t a dual GPU card.

      1. I do believe nVidia are struggling to make a dual-GPU card from their current 700 series; they're too much of a blowtorch GPU for that.

    2. You forget that the GTX Titan came out just after the GTX 680, and when it arrived it was about 1.5–2 times as powerful as the strongest single GPU on the market. Thus the hype… Of course it starts getting beaten when Nvidia put out a new series of cards and AMD, even later, a new one too. "Nvidia got annihilated" are the words of an AMD fanboy and make no sense now. Nvidia's prices will probably fall to match those of AMD, and in winter/spring 2014 Nvidia will release their 800 series…

      1. So you’re sitting behind the screen criticising me how much of an AMD fanboy I am (and I admit I am, built computers since 2005, back when having a Radeon 9550 was beyond god-tier), while you immediately defend Nvidia by the fact that they built the most powerful singe-GPU card (until AMD dished out R9’s). Sorry but that is a really weak argument, and if you find no sense, I don’t care, your lack of reading skills shines. When Nvidia launched the Titan, I knew and I’ve even bet money between my colleagues that AMD will push something much better and cheaper, and the current situation speaks by itself: Nvidia keeps hype along with their 4-figure premium card, while AMD will run away with the buyers who were patiently waiting. We are still waiting to see how will 780 Ti will be priced, but I could bet another series of money it will run higher in cost than R9 290X, just you watch, I’ve been too long in the computer business to not spot the trends.

        1. I'm not an Nvidia fanboy; I've had cards from both manufacturers and I've been equally pleased with both. Of course the companies will keep beating each other – if they didn't, one of them would probably be gone by now. But you could turn it around and say Nvidia will have the best cards in 6 months when they launch the 800 series (if the 780 Ti doesn't take that spot). It's been that way for a long time: Nvidia pushes the limits and makes the highest-performance cards on the market at a very steep price, while AMD gives you a lot more bang for the buck. The thing is, there is value in both; there will always be people wanting the best of the best who have enough money to buy it, and there will always be people who need to get the most for the least money. If you think my point was to defend Nvidia you misunderstood me; my point was that it's wrong to say that they got annihilated by AMD, or that the Titan isn't what it was made out to be (overpriced or not). If the R9 290X had been out a month after the Titan I'd agree with you, but one company being over 6 months ahead of another is quite a lot, which is also the reason for the almost insane price of a Titan. I wouldn't be surprised if the Titan falls in price now, and sometime in Q1 2014 we'll see a Titan 2 which again will be 1.5x as powerful as the 290X, at a mad price no doubt.

          1. Thing is, currently you can pick up two R9 290Xs for the price of one Titan. So even if nVidia release a new all-killing GPU, which no doubt they will, AMD won't go down, because their prices are better. And that's the real deal here. If you consider price to performance in this case, you could be looking at things a whole new way.

          2. These cards need to be half the price of a Titan, as they will have half the lifespan due to the high temps and extreme power consumption that will take their toll on the card's components…

          3. Look, I know your bromance with nVidia means you get angry with any card that starts beating them, but still, try to understand. Most of these cards have a two or three year warranty, and if you intend to run a card like this for longer than that, then chances are it's going to fail after a while, be it nVidia or AMD. And a 95c operating temperature is nothing new – just because it's rated to hit 95c doesn't mean it always will, either; my A10 is rated for 115c. nVidia's cards tend to run hotter by default, but they are not rated to hit temperatures as high. The same goes for Intel CPUs. I'm not knocking either; just understand that the temperatures you are used to dealing with may not be the same as the temperatures you are likely to see on an AMD chip – AMD's cards are rated for higher temperatures and last longer at those temperatures than nVidia's equivalents.

          4. Firstly, 95c is not the maximum; it is the average temp these cards will run at in the average person's PC, just as 80c is the average for the Titan. The Titan's absolute maximum safe temp is 95c, as stated on the Nvidia website, whereas I can't seem to find the R9 290X's maximum temp.

            Secondly, I do not have a bromance with Nvidia. I just appreciate good-quality products with enough room in them to be tweaked without the need for custom cooling, nor do I want them heating up the rest of my components by 10-15c and diminishing their lifespan.

            If this card had produced the impressive results it has while using the same amount of power and running at the same temps as the Titan, I would have been more impressed than I currently am. I just hope that when the likes of ASUS, MSI etc. get hold of them, they can sort the cooling out like AMD should have done in the first place.

            This is a good card, no one can deny it, but in the 8 or so months that the Titan has been out I would have thought they'd have figured out how to make this piece of tech a little more economical and cooler.

          5. I know man, just teasing a little. Thing is though, the card running at 95c is probably not going to shorten the card’s life, nor reduce your bounds for overclocking – realistically though, reference coolers are never going to be that good for overclocking either way, and suppliers such as overclockers will supply cards with custom coolers preinstalled. The other components on the board are likely to be rated for temperatures up to 120c or higher. Until the next process is available – that is, until the next generation of cards – these cards are going to run hot. The only thing in your system that you should be concerned about is the CPU – and this being a leaf-blower fan, you don’t really need to worry yourself too much there. Motherboards are also designed to keep the ridiculously hot components apart from heat-sensitive components, either way. The only issue you might have is were you to run a Haswell i7 on a 120mm radiator in a cooling loop with four of these cards.
            But then, you wouldn’t do that with the Titan, either, would you?

          6. Other than Haswell and the ancient Pentium-D, that’s completely backwards on the CPU side. In general, Intel CPUs run cooler than AMD. And first through third generation Core i5/i7 processors overclock better than competing CPUs from AMD. Intel CPUs are historically better on thermals and power consumption. On the GPU side I agree… AMD’s cards are rated for higher temps, and typically run hotter.

          7. I really hope one of them just dies, so we won't argue anymore. The GTX 780 Ti will cost more than the R9 290X, and if it still falls below Titan performance, they may as well die now.

    3. All i’v seen so far is a very small and insignificant performance difference between a R9 290X and a Titan, migh also add the GTX 780 there, since it seems to just a bit weaker than 290. The price difference is a problem, but dont expect nvidia to just sit around watching AMD owning the market because they have low prices. A 5-10 FPS difference is not “annihilation”. I expected to see 20+ fps difference, maybe more ? Because with all the hype and the “Titan Killer” bs it sounded like it will be capable do to just that, what i get instead for now is something that just sits right there on par with the competitors performance wise, only the price is a good pro so far.

    4. Lol, "annihilated"… 8 months later, by a card that uses more power, runs hotter, and is much louder. Right…

      1. Maxwell is Q4 2014 at minimum; they taped out too late. Maxwell will probably come after AMD's next-gen GPUs, if the current info we know is correct.

          1. It’s normally about a one year time period before designs being finished/early engineering samples and the product reaching the consumer in an actual card. Also from what I can see there these are mobile rebrands and not actual Maxwell chips. AMD has had stacked memory ready and working well for a couple of years, and with Kaveri launching soon, if HSA picks up it wouldn’t be totally crazy if AMD released a new fully HSA compatible, TrueAudio compatible line using stacked memory to complement the 290X. They could do it in a hurry too, as they could use GCN designs on the newer denser(But still officially 28nm) process node Hawaii/Bonaire is on, most of the work would be on the PCB, which is relatively light.

          2. These are mobile samples for notebooks. Maxwell is coming in H2 2014, Q3 at best. They're both waiting on TSMC volume production. Maxwell and GCN 2.0 are very similar from a top-down point of view: Nvidia is going the 1/4 FP route, while AMD has moved the geometry units to the SIMDs and doubled the ROP count. The only major difference must be the texture units. AMD is aiming at smaller dies, Nvidia at less power for the same throughput at the cost of a larger die.

    5. Whatchatalkinbout Willis? The introduction of the GTX 780 pretty much killed the Titan's appeal for single-card setups: almost the same performance for a 40% lower price. The GTX 780 Ti will probably also beat the Titan – it's a card that's almost a year old and was overpriced from the start.

    6. The Titan is not just a gaming card. The CUDA cores are very important for 3D rendering software; it's a mix of both Quadro and gaming cards, whereas ATI is solely for gaming. Try rendering heavy scenes in Maya/Max/V-Ray… you'll want to break the ATI cards in two! That said, it may be good from the gaming point of view, but that still excludes the gorgeous visual effects that only nVidia provides. Subtle differences, but if you've got an eye for detail, nVidia is the way to go. Too few GOOD games are powered by ATI/AMD… it has always been nVidia. Of course, I'd love it if ATI gives them tough competition; it's better for all consumers. No point in arguing with each other 🙂

      1. Right, but people who do 3D rendering are a small minority, and I've done rendering using a render box packed with FirePros. Granted, it became a habit for me to buy aftermarket cooling, mostly Corsair, but they still did a bloody good job rendering projects that had 10mil+ polys.

        1. nVidia GeForce is for people like me… I do rendering quite a bit, but I can't afford to have 2 different cards, and I do a lot of gaming as well. My colleague chose one of the HD 7000 cards and he is miserable now, because the HD series doesn't do a good job of rendering compared to GeForce cards. Of course it works great with games.
          FirePros and Quadros lie in the same category… both are bad at gaming. I've tried LA Noire on a FirePro and a Quadro FX4000 at my workplace; both suck at gaming… lol.

  2. This is good, so we don't need to get a headache anymore. Just go AMD: cheaper, stronger – what else do you need?

    1. If the Titan were to run using that much power and hitting those temps, I'm pretty certain it would crush all cards easily…

      The R9 290X is too hot and too power-hungry; the life of this card will be tiny in comparison to other cards…

      1. I understand your point but here's how I see it. You can ONLY buy reference GTX Titans, and these cost $1000. Non-reference R9 290X cards will be about $575, and these non-reference designs will eliminate the temperature problems. With regards to power, our results suggest about 15% more than the Titan at absolute peak load. In terms of performance, our results show around 7.5-15% more than the Titan depending on the game. In my opinion a non-reference R9 290X would hands down be better than a GTX Titan at current pricing: it costs a lot less, has more performance, would have similar temps and noise (that will vary by partner solution) and has higher power consumption but more performance. Anyone who then pulls in the GTX 780 argument needs to acknowledge that it has higher power consumption than the Titan and less performance than the 290X for a higher price. The only reason the current "fanboy war" is able to perpetuate is because of two reasons:

        1) AMD isn’t letting AIBs release non-reference solutions (yet) so there’s no point of comparison
        2) Nvidia isn’t telling anyone how they are going to respond (in terms of new GPUs) or adjusted pricing.

        Both AMD and Nvidia need to get their act together because as soon as they do the consumer will benefit IMO.
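That cost/benefit argument reduces to simple price-performance arithmetic. A minimal sketch using only the figures quoted in the comment above ($1000 Titan, ~$575 non-reference 290X, and the midpoint of the quoted 7.5-15% performance range – illustrative inputs, not new benchmark data):

```python
# Price/performance comparison using the figures quoted in the comment.
titan_price, r290x_price = 1000.0, 575.0
titan_perf = 1.00   # normalise the Titan to 1.0
r290x_perf = 1.10   # midpoint of the quoted 7.5-15% advantage

titan_value = titan_perf / titan_price      # performance per dollar
r290x_value = r290x_perf / r290x_price

print(f"290X performance per dollar: {r290x_value / titan_value:.2f}x the Titan's")
```

On these numbers the 290X delivers roughly 1.9x the performance per dollar, which is why the comment calls a non-reference 290X a hands-down win at current pricing.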

        1. Actually not correct. There is a variation on the Titan:
          http://www.gigabyte.cz/products/page/vga/gv-ntitanoc-6gd-b/
          A Titan, overclocked and including a custom cooler (not mounted).
          Also of note: if you force the fan speed up substantially (85%), trading cooling for noise, you can push the Titan up to 950 or 1056MHz. (I have seen that, and it improves performance.)
          Note: tested with a stock Titan using Gigabyte's OC program.

          1. I understand; the GTX Titan has a boost clock in a similar fashion to AMD's R9 290X. The point is that Nvidia's limit is 80 degrees, so if you increased that to 90/95 degrees it would pretty much never reach the throttle point (as the Nvidia cooling solution is very good), but AMD's R9 290X does, because its cooler just sucks.

            With regards to your Gigabyte example: so what if they sell the GTX Titan with a bundled cooler? Retailers do that too. My point is that users don't want to have to build their own cooling solution. It should come pre-fitted, but Nvidia doesn't allow that, so non-reference designs are not available unless consumers buy a card and DIY a cooler onto it.

          2. I only raised the fan speed. (I tried a few times to play with the limit, but it didn't change anything…) Forgot to note: the boosted stable frequency was observed under Dirt Showdown.
            My link goes to a factory-overclocked Titan. (Otherwise, OK, understood.)

  3. I wonder if Nvidia will issue $250-300 credits back to those of us who got our wallets tromped? I still like Nvidia – for now… but that's the fun in competition – we will see!

    1. You’d be surprised actually. If you have a well ventilated case it often performs better than an open air test bench because you have concentrated and channelled air flow that removes heat effectively. (Providing you have 2 fans in the front and one at the back that can shift a decent amount of air). Of course if your case has poor airflow it will perform worse as you say, turning effectively into an oven/hotbox.

  4. The R9 290X is a 4K gaming GPU; you can see it shining at 2K and up. It was designed to beat the Titan at 1080p and it does – well, not completely, but I call losing to the Titan by 5-10 fps a win, because if your screen runs at 60Hz and the Titan is dishing out 80fps while the 290X does 70, what's the point? You pay almost 50% less, and it's also future-proof – 4K res, that is. The heat is a problem; you've got to go liquid cooling. But if you're getting a 290X you know what you're getting into anyway; anyone buying a $500 anything knows what they're doing, right?

  5. The GeForce 780 Ti is faster than the R9 290X in most tests I have seen. You should redo the test and add the latest GeForce 780 Ti to your benchies.

      1. So weird that you read this review, yet posted "the 780 will handle 4K just fine". Smh. I guess people have to realize that for some people, "just fine" doesn't mean much more than "faster than a slideshow". In THAT review, 780 Ti SLI isn't working "just fine" at 4K… let alone a single anemic 780.

  6. Just out of curiosity: I have a GTX 780 and I am thinking of buying a 4K monitor. Will I have to buy another card and go SLI, or will the 780 do OK?

    1. The GTX 780 will handle 4K just fine. However, if you want to hit higher framerates (45+) then you'll need to roll back some of those settings to medium-high, whereas if you're happy with ~30 FPS then you can keep high to ultra settings. A lot will vary by game and the settings chosen. If you want every game to be playable at max settings you'll need SLI 780s.

      1. if we’re being honest… my rig is extremely overclocked and overvolted. Everything is under water. Running Quad Titans. As luck would have it – nvidia sucks my sack and has yet to put out working drivers that will allow SLI to work while using surround in windows 8.1. Yea so i have this 10,000 dollar set up, and I get around 25fps in my game of choice. Rock on right?

        The reason im telling you this is while im running higher than 4k reso 4680×2560, I’m also running overclocked. So we can assume they cancel eachother out for the most part give or take. As the former poster mentioned – With maximum settings you will likely get around 30 fps give or take with one. That being said, some games will annihilate your framebuffer so keep an eye on that. Nothing you can do w/ adding another 780, your stuck as far as that goes. For comparisons sake, even in wow when using AA I use more than 3gb 😛

        In most games without AA you should be completely fine though and you can always add another 780 in the future. Just remember due to nvidia’s shoddy marketing tactics, you cant run 4, they put a limit to force people into buying titan to run 4 (worked on me and now 4 wont work in most games becuase im using surround and microsofts newest OS)

        Im in the middle of building a new rig and Im going AMD this time. Have 3 290x coming. Wont go nvidia again until far inthe future when I know the customer support and software/drivers have caughten up to the ridiculous price premium you pay.
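The framebuffer warning above can be put in rough numbers. A sketch of raw render-target sizes at 4K (illustrative only – real games allocate many more buffers, use varied formats, and drivers may compress):

```python
# Approximate VRAM cost of 4K render targets.
width, height = 3840, 2160
bytes_per_pixel = 4   # 32-bit RGBA (or a 32-bit depth/stencil buffer)

def target_mib(samples=1):
    """Size of one colour or depth buffer, in MiB, at the given MSAA sample count."""
    return width * height * bytes_per_pixel * samples / 2**20

print(f"One 4K buffer, no AA:    {target_mib():.0f} MiB")        # ~32 MiB
print(f"Colour + depth, 4x MSAA: {2 * target_mib(4):.0f} MiB")   # ~253 MiB
```

Even before textures and geometry, multisampled 4K targets run to hundreds of MiB, which is how AA at high resolutions can overflow a 3GB card as the commenter describes.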

        1. 4K is a waste of time and is uber expensive; better to get a 1080p 120Hz screen, or 1440p and 1-2 high-end cards to go with it. 4K runs like crap on most newer games anyway.

          Some people are happy with 30fps, but I want silky-smooth fast games, not ones that run like a slug and look a bit sharper.

          1. Some people have the money. Judging from your comments, you obviously don’t.

            That does not mean it is a waste of time at all.

          2. It’s not a question of money… I’m running tri-sli Titan and evaluated a 4k monitor. Framerates were unacceptable to me so I went back to 1440P

            Some of us actually care about more than looking at a screenshot and actually play the game. Framerates all over the place on the histogram with an average barely touching 50fps unless you start dialing back detail *is not good*.

            Unless you’re just obsessed with res and don’t give a crap about how lousy the gameplay actually is, the current GPU gen is just NOT there yet. Money isn’t the issue

          1. It will also generate a lot of heat, turning it into a big heat box and turning your room or house up 100 degrees.

      2. What 780 do you own? One from an alien planet? “780 will handle 4k just fine”… Said no one with a clue, no benchmarking site, and no gamer ever. What’s your definition of “just fine”? Medium detail and 35fps?

  7. 4K sucks; unless you are happy with like 20fps minimums, I wouldn't bother. I'm sticking with 1080p and 120Hz for now; I may go 1440p max, but that would require 1-2 high-end cards to get "nice fps".

    The proof is in the pudding: look at the benchmarks of a 780 running the UE4 Elemental demo at 1080p – roughly 40fps minimum and over 60 average. I'd class that as pretty good, because dips to 40 are tolerable but not the best. Now imagine upping the res to 4K; it would run like ass.

    Well, I think I made my point.

  8. Has anyone tried a 4K monitor with 3D? I have not seen any active 3D monitors that are 4K, but I was wondering if anyone has tried using a 60Hz 4K monitor with passive glasses, and what the frame rates were like.

    1. Firstly, I doubt anything would be playable unless you owned a 3-way SLI/CF build, and even then the game/card/movie would have to support it. 3D requires twice the amount of data as the original resolution.
      Secondly, even if you could, DP 1.2 or HDMI 2.0 probably couldn't do it at 60Hz, given the amount of data that would have to stream through.

      Keep dreaming big though.
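The bandwidth-doubling claim checks out on paper. A sketch, ignoring blanking overhead and assuming frame-sequential (active) 3D at 60Hz per eye:

```python
# Active 3D doubles the effective refresh rate: 60Hz per eye -> a 120Hz pixel stream.
width, height, bits_per_pixel = 3840, 2160, 24

for refresh_hz in (60, 120):
    gbps = width * height * refresh_hz * bits_per_pixel / 1e9
    print(f"4K @ {refresh_hz}Hz: {gbps:.1f} Gbit/s")

# DP 1.2 carries about 17.28 Gbit/s of payload, so 4K at 120Hz (~23.9 Gbit/s)
# exceeds it, as the reply suggests.
```

Passive-glasses schemes interleave the two eyes within one frame rather than doubling the frame rate, but they halve the effective vertical resolution per eye instead, so there is a trade-off either way.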

    1. Cause, uh… the Vapor-X is the better card? The Vapor-X cooler beats the Tri-X (unless it's the dual-fan Vapor-X).

        1. Wait, what? It's not like Dual-X vs Vapor-X, where the Vapor-X was the superior cooler? I thought the Vapor-X was supposed to be the top-of-the-line cooler D:

          1. Three fans always beat two; the Vapor-X is just a heatsink design, no different to the Tri-X except that it's bigger with another fan. I was referring to the 2-fan versions; there are apparently 3-fan ones now. I was also referring to MSI's 290 cooler, which is somewhat worse.

          2. The Vapor-X isn’t the top of the line cooler. The Toxic edition is clocked higher than the Vapor-X so the Toxic cooler has to be able to conduct more heat away than the Vapor-X. The three fans also run at a slower speed to achieve that superior level of cooling which makes them quieter. Even the Vapor-X was upgraded to 3 fans when it was refreshed.

          3. With the ACP application installed from the AMD Crimson driver you get 10 degrees cooler temps – 62c instead of 70-72c. It helps with OC stability as well; I can get 1150/1550 @ 80-84c.

  9. I have the same system as you, with the Intel 8-core Haswell (yes, I spent like $5k on this computer), and it does not run most of the newest games I've tried lately in 4K. I went back to my 1080p monitor for that reason. IDK, I have my doubts about what you say.

  10. THIS site proves him right and you wrong. What planet are you on? Just look at the 290X CF review. What kind of idiot truly believes that 4k max settings at a sustained 60fps is “easily doable?” Oh… The kind who thinks that a modern Intel CPU could “bottleneck” this gen of GPUs *at 4k max settings which is a far too intensive workload for current gen GPU processing power*

    EVERY benchmark from EVERY site shows you need AT LEAST two top end cards to get near playable sustained framerates (>40fps) on any modern game. Keep thinking you know what you’re talking about though… Don’t bother actually reading and learning

    1. Dude, you're dense. I know I'm right, as I have the cards and a 4K monitor, lol. This site, and the claim that you need 2 cards, assumes you're using maxed anti-aliasing, which at 4K you don't need. So instead of believing this stupidity, go buy a 4K monitor and see for yourself that you're wrong.

  11. It is funny that not more than 3 months ago, in every review, the Titan beat the R9 290X in all benchmarks – funny how that happened when Nvidia had no flaws. And now that AMD is catching up, all the benchmarks for yet-to-be-released products point to AMD. Funny how this happens so often in hardware reviews.

    1. it doesn’t matter they’re all the same universal owned by one organization in this tech industry.

  12. dude its hard to read your word vomit because you don’t know how to use punctuation you just keep typing on and on although you did use one period at some point so it looked like you were going to catch yourself but no you just kept going kinda like I’m doing now isn’t this annoying to read when you have to insert the punctuation yourself because someone else won’t do it the right way but instead gets lazy and just types forever wow I don’t know how you can stand typing in this way it’s a serious strain for me to ignore punctuation like this and just keep typing so I’m going to stop it now but please for everyone’s sake in the future use proper punctuation I’m not a grammar nazi but it’s almost painful to read such ugly text walls when you write that way

  13. Why can't you use a good HDMI cable that supports 4K @ 60Hz? Why is it so fucking hard to get 60Hz from an $80 HDMI cable and a 4K 60Hz-ready TV and GPU? WTF is going on here?

  14. Doesn’t matter anyways, these cards don’t do 4k @60Hz so basically it’s 30fps even if you do get above 30frames its not 60Hz. The new 480s do 120Hz now with the hdmi 2.0 and displayport 1.4. I wasted 400bucks on a sapphire r9 290 vapor-x oc 2-3 years ago 🙁 oh well.
