AMD R9 290X CrossFire V Nvidia GTX 780 Ti SLI
Introduction
When I wrote our first “4K Gaming Showdown” article nearly six months ago, it proved very popular among our readers. However, one of the most common pieces of feedback I received was along the lines of: “As 4K monitors are so expensive, most people buying one will probably also be going with SLI or CrossFire of high-end graphics cards – can you test that too?”. To a large extent I agree with that idea. Of course there are cheaper 4K monitors out there, but most of the good-quality 60Hz ones (which you really need for a fluid gaming experience) still cost thousands of dollars, and if you’re willing to spend that much on a monitor then you’re likely to spend a similar amount on GPUs. I’ve therefore taken this as an opportunity to see what SLI and CrossFire bring to the table for (60Hz) 4K gaming – as requested by our readers. In this article we will be pitting Nvidia GTX 780 Ti SLI against AMD R9 290X CrossFire in a variety of games. Based on specified MSRPs the GTX 780 Ti SLI ($699 x 2) is a much more expensive option than the AMD R9 290Xs ($549 x 2), but with recent (mining-induced) inflationary pricing on R9 290X graphics cards the gap is actually a lot smaller than you might think, which begs the question – which combination should you choose for the best 4K gaming experience?
As you can see above we have a rather complicated mix of cards – sadly we do not have two identical cards for each GPU. That said, we will be running all cards at reference clocks for the respective GPUs, and the solitary reference graphics card, the Nvidia GTX 780 Ti, will be set to maximum fan speed to prevent thermal throttling, as it is the only card of the four where cooling is a limiting factor. With those modifications made we have two reference-speed R9 290Xs and two reference-speed GTX 780 Tis, all able to perform at their maximum potential without any thermal throttling. Without further ado, let’s get on with looking at the monitor, the performance and my thoughts on 4K with dual GPUs.
Anyone interested in the reviews of the above graphics cards can find those listed below in order of their appearance in the above picture (top to bottom):
So, firstly, I hate you so much right now. Just an average day in the office for you, playing with things like this, while I stare at databases all day….
Now that that’s out of the way…
If I were choosing between the two, I would have to side with the green team today. Sure, it costs more, but as you said, you get something for it. I also think Nvidia’s drivers are better made in general (though I haven’t had an Nvidia card since the 8800 GTS 512MB).
You should see (just because you can) how all the setups work with two or even three 4K monitors, OR one 4K monitor and two rotated 1080p monitors, one on either side.
Haha, it isn’t as fun as you think, although I am hoping we can get a pair of GTX Titan Blacks (or at least one) in for testing – now that would be fun! In response to your last request, sadly we only have one 4K monitor, so that isn’t something we can really do. I can’t imagine two 1080p panels either side of 4K would be any good for gaming – maybe productivity, but definitely not gaming – and I’m not sure you can even Eyefinity/Surround different-resolution monitors.
I think you can Eyefinity different-resolution monitors, as I seem to recall doing it myself by mistake once. I’ve got two monitors that run at 1080 and 1050 respectively. Either way, I would think it would work quite well if the sizes matched. Seeing as the monitor you used is 1920 high and a normal 1080 monitor is 1920 wide, the resolutions should match, which means you’d only have to make sure they’re the same physical size. I think it would favour racing games more, especially if you have a smaller desk.
Green team.
More performance,
Less heat,
Less noise.
If you are going for dual cards and 4K, do it right and pay those few extra bucks.
Nvidia SLI beats AMD by a few points, but the AMD spanks the SLI by 20 points!!
The R9 290X is very fast and cheaper, but has some significant disadvantages like heat and a very noisy fan. The 780 Ti is more expensive, but its cooling solution is much better.
And it has a lot higher overclock potential.
I would like to point out that the weighted method is fine for most reviews; HOWEVER, this review includes all games at the one resolution and both setups, 780 Ti SLI and 290X CrossFire. So if you took all of the GAMES and added up the frame rates, the 290X comes in at 327, 290X CrossFire at 591, and 780 Ti SLI at 579. Adding in the BENCHMARK, which you cannot play, brings the CrossFire and SLI setups within 2 total frames/sec, but with the 290X still on top (see the sketch below). I want you guys to realize that in GAMING, according to this website’s own results, the 290X CrossFire is a bit better than the 780 Ti SLI setup. Each setup has its wins and losses, so choose your setup based on the games you play, or go for the VALUE and realize that more times than not the AMD system wins or has a margin that is too small to FEEL.
Also, in terms of the runt frames for AMD, remember you can always use those shiny new beta drivers that eliminate almost all instances of them.
I like the testing method of this site, but it bugs me that some of the tests they perform are not included in the performance analysis and because of this there is a MUCH different conclusion.
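The arithmetic being debated here is easy to replay. Below is a minimal sketch, in Python, of how an equal-weight total (or average) of per-test FPS can point at a different “winner” depending on whether a synthetic benchmark is counted alongside the games; every per-test number in it is a made-up placeholder rather than a figure from this review.

```python
# Minimal sketch: how including or excluding a synthetic benchmark in an
# equal-weight summary can change the "winner". All figures below are
# hypothetical placeholders, not the site's published results.

games = {
    "Game A": {"290X CF": 95,  "780 Ti SLI": 90},
    "Game B": {"290X CF": 105, "780 Ti SLI": 100},
    "Game C": {"290X CF": 88,  "780 Ti SLI": 84},
    "Game D": {"290X CF": 70,  "780 Ti SLI": 74},
    "Game E": {"290X CF": 118, "780 Ti SLI": 116},
    "Game F": {"290X CF": 113, "780 Ti SLI": 115},
}
benchmark = {"Synthetic benchmark": {"290X CF": 55, "780 Ti SLI": 70}}

def total_fps(results, setup):
    """Sum the average FPS of one setup across every test in `results`."""
    return sum(test[setup] for test in results.values())

for label, data in (("Games only", games),
                    ("Games + benchmark", {**games, **benchmark})):
    cf = total_fps(data, "290X CF")
    sli = total_fps(data, "780 Ti SLI")
    leader = "290X CF" if cf > sli else "780 Ti SLI"
    print(f"{label}: 290X CF = {cf}, 780 Ti SLI = {sli} -> leader: {leader}")
```

With these made-up figures the CrossFire setup leads on games alone, while adding the benchmark tips the combined total the other way – exactly the kind of sensitivity the comment above is describing.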
It’s also worth noting that 780 Ti SLI hits a framerate cap caused by a glitch in Hitman Absolution, which balances out the 780 Ti SLI advantage in Unigine. We’ve never tested 290X CrossFire or 780 Ti SLI before, so I’m not sure what your comment “according to this website’s own results” is meant to mean. I’m also unsure what your last comment means: “I like the testing method of this site, but it bugs me that some of the tests they perform are not included in the performance analysis and because of this there is a MUCH different conclusion.” We haven’t left anything out; we simply re-ran the same tests from our previous 4K analysis and added BF4, but we couldn’t retest BF4 on all graphics cards as some of them we no longer have, so we left it out of the index.
I did not know about the Hitman frame cap for Nvidia; I was just saying that based on your own results in THIS article there was a different performance conclusion than what was stated, if you used all of the results. I am of the opinion that they are, for all intents and purposes, equal at 4K. You quoted “I like the testing method of this site, but it bugs me that some of the tests they perform are not included in the performance analysis and because of this there is a MUCH different conclusion” – here I am talking about how you did your FINAL performance analysis, your weighted average. Certain games are omitted from this particular article’s final weighted performance, even though all of the games were fully tested at 4K, as this was just a VS article and not a regular graphics review article. So in my opinion, because this is not a standard graphics review article, all games should have been included in your final performance figure, because that shows they are essentially identical in performance, with the ‘winner’ jumping back and forth based on the game being tested.
What I am trying to say is that it looks like you, the reviewer, are being biased towards Nvidia because of how the FINAL performance review is CALCULATED. Once again, I did not read anywhere in your article about the artificial frame cap for Nvidia in Hitman; if it was stated, I am sorry for that, and personally I think the game should be removed if there is a glitch hindering performance on one graphics card manufacturer or the other.
Interesting points, but what games have I omitted? Only Battlefield 4 was omitted, because we didn’t have results for all the cards. If anything, most observers would say this review is biased towards AMD because of the number of AMD-optimised titles (Sleeping Dogs, Dirt Showdown, Bioshock Infinite). If we included the same number of Nvidia-optimised titles (like Batman Arkham Origins, Call of Duty Ghosts, etc.) then Nvidia’s lead would be even greater. I’m really struggling to see how this is an Nvidia-biased article. And I will also add that we work very closely with both AMD and Nvidia, so there is no reason for us to actively try and make one look artificially better than the other.
According to the testing procedure: “Those 7 weights are 1 for each test we do excluding Battlefield 4 and Hitman Absolution (so 6 games and 1 benchmark) which are omitted as we do not have results for all the graphics cards in those games.” So according to this, you did not include Battlefield 4 and Hitman.
I should note that I do not think that you did this on purpose. You just used what you normally do because you have not tested older equipment on the newer games.
My point is that this was a specific head-to-head between 290X CrossFire and 780 Ti SLI, at least that is what I read from the title, and because of this the two setups that matter are fully tested on all games at 4K. So why not include everything in the performance conclusion, or add a section that does, or something along those lines? Keep the standard performance conclusion, but also note that 290X CrossFire does win if ALL games are included.
I also want to note that I am not biased; I am just an engineer who looked at the full data and noticed that they are so close, yet the recommendation goes to Nvidia for ultimate performance at 4K, while I would say it is a toss-up, too close to call, or however you think it should be worded.
Each definitely has its pros and cons. I would recommend buying based on the games you play, because they switch sides or are nearly even on so many occasions.
Once again, I am not attacking your article or writing style or your conclusion; I am just trying to point out that your method of calculating final performance might not be the most accurate for this particular article, as it is a somewhat special article covering one resolution and two graphics setups.
Adding a clarification.
I look at the data not the words, your data shows that the SLI setup is better by a statistically significant margin. I know that the last thing you say is it is a tough choice but your final calculated DATA shows that 780Ti is the much better option for maximum performance.
It all comes back to the calculated data, because graphs speak volumes, at least to engineers like me.
Okay, I see your point – you think that the inclusion of Unigine and the exclusion of BF4 and Hitman skews the results towards Nvidia. I guess it does, BUT all games will serve to skew the results in one direction or the other unless they are “neutral games”. The way I see it, we have Tomb Raider, Bioshock Infinite and Sleeping Dogs that skew towards AMD, while only the Unigine benchmark skews towards Nvidia. I also understand your point that Unigine isn’t strictly a game – in an ideal world we would have had more Nvidia titles to choose from to make it balanced (that is something we’re implementing in upcoming reviews), but I guess the testing is what it is right now. I appreciate your feedback and your time; do you think that Unigine should be removed from the summary? I won’t reinstate Hitman, though, because of the aforementioned FPS glitch on Nvidia cards.
I have to agree with hizzyshizzlizz. Let’s start with why Unigine is a bad choice to benchmark: it uses tessellation up the wazoo without it being observable by the human eye. If we can’t see that much tessellation, why is it there in the first place? To simulate a potential workload, I guess. But if it’s not visible, then it’s a useless workload, and therefore I disagree with its use in benchmarks in 2014.
As for non-DirectX 11 games: drop them. AvP is from 2010 and it’s not even an MMO or MOBA, meaning it’s only interesting to niche fans who never got the game when it came out in the first place. It’s a useless game to benchmark in 2014.
If you want to test non-DirectX 11 games, then go with the really popular ones, like World of Warcraft or StarCraft 2 or something. Otherwise, no one in their right mind would waste time playing an old game at 4K with the cream of the crop of graphics cards, unless all their friends are still playing it.
Adding more “Nvidia-friendly” games is the last thing that you should ever do. As mentioned above, one should be guided by the popularity of a game when not testing a specific technological achievement. Battlefield 4 is an important game because it’s the “latest & greatest” in a very popular series, not because it’s “AMD-friendly”.
You also left Mantle out of the picture. For people who play a lot of EA games, choosing the 290X is a complete no-brainer – namely people who will play Dragon Age in single-player and Battlefield 4 in multiplayer. For those who WON’T play either, the decision logic should be different. Now, I know there’s only BF4 and Thief supporting Mantle at the moment, but do keep this in mind for 2014 and 2015.
The 780 Ti GHz Edition is the maximum-tier GTX 780 Ti. It’s a GTX 780 Ti performing up to 20% faster than the original 780 Ti.
Yes, but it is also 20% more expensive than the original. The price tag is extremely high…
And? All cards were run at reference clocks, as stated in the review. Quote: “As you can see above we have a rather complicated mix of cards – sadly we do not have two identical cards for each GPU. That said, we will be running all cards at reference clocks for the respective GPUs.”
Good.
Respect for reviewing $8,000 worth of hardware. I’m sure it took a very long time, because since it was posted a lot has changed for the better – this week in particular!
Starting with the display, Samsung is selling a 28″ 60Hz 4K monitor for $700 on Amazon. The PowerColor R9 290X PCS+ was listed for $580 on Newegg earlier, and it overclocks on air like no other 28nm card that exists today. CrossFiring it on a 7-slot ATX board does worry me, but a quality 140mm case fan blowing air at the cards should keep them cool.
Holy sh!t that’s good pricing on a 60Hz 4K monitor…link? I might add it into the article, could help a lot of people out.
Of course and thanks for the reply!
http://www.amazon.com/gp/product/B00IEZGWI2/ref=s9_simh_gw_p147_d0_i1?pf_rd_m=ATVPDKIKX0DER&pf_rd_s=center-2&pf_rd_r=1FCSJ6CBVM0D7MQN14DR&pf_rd_t=101&pf_rd_p=1688200382&pf_rd_i=507846
And the 290X PCS+
http://www.newegg.com/Product/Product.aspx?Item=N82E16814131548
Thank you very much, I’ve added in the information at the end of the article for other readers.
I use this to test… I don’t see anything out of the ordinary.. :) and I can add the receipt plus a pic of my Samsung 4K :) It’s an excellent 4K! haha (well, I get what I pay for)
http://www.youtube.com/watch?v=WexJRnud32U
According to a user on AVS Forum, his has bad backlight bleeding. I’d wait for more feedback.
http://www.avsforum.com/t/1521823/just-bought-a-samsung-u28d590d-28-4k-60hz-display
Hmm.. I have no backlight bleeding; I think he’s just unlucky? 😀
Backlight bleeding is a potential issue for essentially all modern panels. Luck of the draw, just like dead pixels.
They will normally replace monitors with bad bleeding anyways.
I own the Samsung U28D590D (bought for $565 (RM1,840) in Malaysia, at the Jayacom shop). Good 4K LCD!! Brilliant colour reproduction (for a TN panel), fast response… oh, and the ability to run 60Hz (OC to 75Hz :P). At the moment I’m using a single Sapphire R9 290X Tri-X OC, but I’m still getting 35-40 FPS at 4K as I disable MSAA – it’s kind of useless at 4K in my opinion.
Yes, higher resolutions need less AA without question.
Considering how utterly ridiculous and inefficient the concept of AA is to begin with, I would say you’ve hit the most relevant aspect of 4k…
Getting rid of trashy AA, especially blurry crap like FXAA and TXAA.
Two things I missed in this review:
Another configuration I would have liked to see on your 4K display is two R9 290s in CrossFire, and for reasons other than price alone. I strongly feel the 290 will punch in the same weight class as the two more expensive cards, but I won’t take the plunge until someone proves it.
The 290X and 780 Ti in dual-card setups were putting out more than 60 FPS on a 4K display in most games. However, your 60Hz display cannot show you those frames, and this is the highest end of the high end. Each frame costs a lot of money, and the processing power that generates frames which cannot be displayed is wasted and misused power!
A paying customer would raise the graphics detail and perhaps add a little AA – I mean, find the optimal settings to put out 60 FPS.
Yes, we are currently looking into R9 290 CFX and GTX 780 SLI to update our 4K results; we’re just having trouble sourcing two of each card, whereas we had two R9 290Xs and two GTX 780 Tis on hand to test with. Yes, I agree with you about frame rates going above 60, but obviously it makes more sense for us to show the maximum the cards can do rather than all our graphs stopping at 60 because of V-Sync! The reason the settings are as they are is because during the first stages of our 4K testing we were using single GPUs, so we had to find a balance between playability and quality.
Worth noting that as all the numbers are averages, a portion of the power would still be put to use, as the minimum FPS would still drop below 60 if the average is approximately 60. I’d love to see some minimum FPS numbers, as that’s the number one thing that would affect playability on a 4K monitor 🙂
I’ve never really had someone request minimum FPS figures from us before, however, it is something I will look into for future testing. Thanks for the feedback.
Umm… It’s really the only metric of any relevance if we want to talk about playability.
Average to some extent and peak are completely useless metrics….
You do not feel a spike in FPS as anything but roughness; it would feel far better and smoother to eliminate upward spikes…
Do you think the more variable input latency is a characteristic anyone wants when both solutions provide comparable minimums? It’s pointless, feels bad and throws off the interaction timing.
Average metrics give very false impressions of how well a card actually performs, unless you do something like minimum weighted average.
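Since several comments here hinge on minimum versus average FPS, here is a minimal sketch of how the two metrics (plus a “1% low” figure) are commonly derived from a frame-time trace. The trace itself is invented for illustration and is not data from this review.

```python
# Minimal sketch: average FPS can mask the dips that minimum / 1% low expose.
# The frame-time trace below is made up for illustration only.

frame_times_ms = [16.7] * 50 + [40.0, 45.0, 16.7, 16.7, 38.0] + [16.7] * 45

# Average FPS over the whole run: frames rendered divided by elapsed time.
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

# Minimum instantaneous FPS: the single slowest frame.
min_fps = 1000.0 / max(frame_times_ms)

# "1% low": average FPS over the slowest 1% of frames (at least one frame).
slowest = sorted(frame_times_ms, reverse=True)
n = max(1, len(slowest) // 100)
one_percent_low = 1000.0 / (sum(slowest[:n]) / n)

print(f"average = {avg_fps:.1f} FPS, minimum = {min_fps:.1f} FPS, "
      f"1% low = {one_percent_low:.1f} FPS")
# A run that averages ~58 FPS can still dip to ~22 FPS, which is what you feel.
```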
This is just pitiful.. NVIDIA, really? 2, 4, at most 9 frames of difference over AMD.. two thumbs up for AMD!!! Not only is it cheaper, but in two games the difference is 20+ and 18 frames against NVIDIA.. Nvidia’s 2 to 9 frame advantage is nothing.. the difference in performance is not worth the money, plus it is more expensive and consumes more power.. TRULY GREAT JOB AMD!!!
We must be seeing different benchmarks. What I saw was NVIDIA matching or outdoing AMD in most of the benchmarks. And “worth the money”? The cards are both $700!
Hmm, I don’t know… the PowerColor R9 290X PCS+ is selling for $550 (http://www.newegg.com/Product/Product.aspx?Item=N82E16814131548) while the cheapest GTX 780 Ti is selling for $690+… this is based on Newegg pricing… :D But there’s a good chance you are a green-team fan.. and please, I’m not meaning to start GPU bashing, that’s just what current pricing is..
True, GTX 780 Ti performance is 5-8 FPS better than the AMD R9 290X, but… are 5-8 FPS worth the extra $100-140? 😀
Hmm… you are right… but Nvidia still has better drivers as well as PhysX. The 290X also runs hotter and louder. However, the 290X might be a better option if going strictly for price/performance.
The real issue with benchmarks is which versions of the GPUs were used. My 290 PCS+ performs like a 290X, so how would a 290X PCS+ perform? On top of that, with the Omega drivers out now, AMD has taken a software advantage over Nvidia at this point in time.
AMD has a 3D technology (AMD HD3D), AMD has a G-Sync rival (FreeSync, not available yet but likely coming this year, and it’s open rather than proprietary), AMD doesn’t have PhysX because it is Nvidia software (AMD has things like TressFX, though – neither is really relevant as DX11.2 is the programming standard of choice; AMD also has better OpenCL/GL support), and AMD’s Eyefinity is more flexible than Nvidia’s Surround (Eyefinity supports more displays and is more mature). ShadowPlay is the one Nvidia advantage out of that list I’d agree with you on; it’s a very strong piece of hardware-level software. G-Sync is a win for Nvidia right now because you can buy G-Sync monitors, but FreeSync is not ready yet.
‘ShadowPlay’ is no longer an advantage. Since your post, AMD has released their own GPU hardware-encoded recording system (the basic functionality has been there for a long time). It’s part of Raptr, called ‘Game DVR’, with essentially zero measurable effect on performance, just like Nvidia’s GPU encoding.
Nothing you really want to play supports PhysX, and even where it does, it’s not exactly a killer feature. ShadowPlay? I can live without that all day long.
In Canada the R9 290X averages for $599.99 – $629.99 while the Nvidia GTX 780 Ti is $769.99 – $799.99. I’ll save the nearly $200 and lose the few frames.
So when NVIDIA outperforms AMD, it’s only by 3-6 frames, but when AMD outperforms, it’s by 8-10 frames. Considering that most games are optimised for NVIDIA cards as opposed to AMD, I guess I’m gonna be getting me some AMD cards.
thanks for helping me make my decision .. you’re awesome.
You are missing the fact that 4K monitors cap at 60Hz; what really matters is consistency – how often you dip below the 60Hz threshold.
Almost never. Lowest is about 45 fps.
Nvidia paid for the display, so no wonder it’s shown to win in all aspects, and they didn’t even try or speak about Mantle.
You’re such a moron. Nvidia did not pay for the display; they supplied one which we used and which was later returned so other media could use it. In our first 4K article, AMD provided us with a display. It makes no difference who provides the display – the display is for testing, not a bribe – and I think your statement just reflects your own character and assumptions. Oh boo hoo, we didn’t talk about Mantle in our article, big deal – only one game on the list supports it. Oh, and we just wrote an entire article about Mantle here – obviously that’s because we love Nvidia so much and because they buy our displays….. -_- : http://www.eteknix.com/testing-amds-mantle-battlefield-4-thief-and-pvz-garden-warfare/
Well, he wasn’t being rude, and then you come up with this argument; I don’t think I will trust a word you write from now on. Next time, try and be respectful to the people who offer constructive criticism.
Accusing me of being bought by Nvidia and having bias in favour of them without any evidence is constructive criticism? I don’t think so.
Wondering if you turned AA off or kept it on? I watched a video on YouTube where the guy stated in the description that at 4K resolution it is no longer necessary, because hard edges aren’t visible any more, which makes sense – there’s no need to smooth out already smooth edges.
You are exactly right. Turn off AA at anything above 1080p. This is why AMD is so good at these resolutions; AA tends to be their weakness.
Holy shit…. talk about being unprofessional. He probably is an AMD fanboy (which is fine, I am too), and while his one-sentence opinion about your review may have been a little off, it did not justify you acting like a 15-year-old. This is how people lose their jobs, Ryan. Smooth.
You’re right, I’ll make a concerted effort to not respond to comments from now on.
Very hard to understand how the inferior 780 Ti beats the 290X at 4K. 3GB of RAM on a 384-bit bus is not enough to do 4K against the 290X. Very weird; I get better performance on a single R9 290 PCS+ than these are getting in CrossFire/SLI.
It is hilarious to read this article again, a year on. The R9 290X can be had for $270 now O.O That’s a decrease in price of over $600 per card – damn! By contrast, the GTX 780 Ti still costs around $500, a decrease of only $200. That price difference is ludicrous. I’m an Nvidia fanboy and own three GTX 780s, but I can NOT deny the fact that AMD has the win when it comes to what you get for your money. I am seriously considering switching back to the red team just for the savings alone. Case in point, the Titan X was just released at an MSRP of $1,000 and only performs marginally better than a factory-overclocked GTX 980. Why the hell would anyone pay $400+ for a minimal performance gain? You would be far better off buying two GTX 980s ($550-600 each) or, better yet, two R9 290X cards ($270 each) and get better performance with change to spare if you went the AMD way. It’s difficult to understand why retailers continue to sell the Nvidia cards at such a high price. I know the demand has to be there, but wow – it really is a HUGE price difference for similar performance.
I sold the 780s to buy 290Xs and I lost joy and money. The 290Xs gave me “black screens” (the video drops out in the middle of a game). The driver had no scaling built in, so I had to play everything at native resolution with no overclocking. They were also obnoxiously noisy.
Some Hawaii GPUs had this problem; it was solved with a BIOS update. You can add 0.2 mV on the core to solve it…
I have been an ATI/AMD fan since the early 90s. Although I only switched back to AMD processors in 2010, my graphics cards have always been ATI/AMD. I love my graphics. Compared to my friends who use high-end Nvidia cards, I have the most stable system of all. I do agree that problems may pop up, but that’s mostly for people who don’t know how to, or shouldn’t, work a computer. There is a very small performance difference, but the reduced pricing of the red team makes it all worth it. In my humble opinion, why pay thousands for a top-of-the-line hex-core Intel when an AMD product with 5% to 15% less performance could be had for less than a quarter of the price? The same goes for graphics.
I haven’t looked, but would it not be cheaper to buy an R9 295X2, as it’s two GPUs on one board?
I’m also a massive Nvidia fanboy(bloke) but I would never actively belittle someone for choosing an AMD card (not saying anyone here has).
I prefer Nvidia as they are more stable than the AMD cards, purely due to the drivers. Admittedly this has massively improved in recent times, but you only have to look at the recent GTA V release and all the issues that AMD users encountered until the updated drivers were issued (the Steam forums were full of fixes and solutions). I have said it before, though, that no game should be released in a state where one set of people cannot legitimately play it because of their choice of hardware.
I prefer AMD’s commitment to new technologies, as they are trying to get better gaming results rather than better “first quarter” results (Nvidia, I’m looking at you here)…… FreeSync vs G-Sync is a perfect example of this.
Power requirements for a R9 295×2 would require a power supply upgrade for some, which adds to the total cost.
I am running an R9 290X and have had no issues with GTA V.
Did you get this kind of information from an NVIDIA fanboy or a real living AMD user?
this comment makes no sense to me ??
No issues with GTA V on AMD cards.
hahahahahahaha nice try m8.
A simple search of just the Steam forums (I haven’t even checked R* yet) shows MASSIVE issues for AMD users when the game was released. Please go and research more.
It looks like the Nvidia card holds its value better to me.
One lost $600 of value
Other only lost $200
Which one has better resale value?
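For what it’s worth, the depreciation being compared here can be put into rough numbers. The sketch below uses the launch MSRPs from the article ($549 and $699) together with the later street prices quoted in this thread (roughly $270 and $500); these are commenter-reported figures, not verified pricing.

```python
# Quick worked check of the resale-value point above.
# MSRPs are from the article; "later" prices are as quoted by commenters.

cards = {
    "R9 290X":    {"msrp": 549, "later": 270},
    "GTX 780 Ti": {"msrp": 699, "later": 500},
}

for name, price in cards.items():
    lost = price["msrp"] - price["later"]
    retained_pct = 100.0 * price["later"] / price["msrp"]
    print(f"{name}: ${price['msrp']} -> ${price['later']} "
          f"(lost ${lost}, retained {retained_pct:.0f}% of MSRP)")
```

By these figures the 780 Ti retains a noticeably larger share of its launch price, which is the point being made, even if the absolute dollar losses quoted above reflect the mining-inflated street prices rather than MSRP.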
It’s the VRAM. People are getting ready for 4K gaming. The Titan X is a single GPU with 12GB, and VRAM doesn’t stack across multiple cards.
And this is the most useless review ever. Where the hell are the -MINIMUM- FPS figures, which are all anyone ever cares about?