Nvidia GTX 1080 Ti CPU Showdown
Hitman
At stock, the Ryzen chip was head to head with the 7700K. Pushing the overclock saw it beat the 5820K by 10FPS, and it also maintained the highest minimum frame rate by an impressive margin too.
Neck and neck for an overclocked 1800X and the stock 7700K here. However, we see Ryzen maintaining a better minimum frame rate, which in many ways is more important than the average or maximum; dropped frames are any PC gamer's worst nightmare.
Very impressive, with a similar result, neck and neck with Intel, but a much higher minimum frame rate; we’re starting to notice a trend here!
Great review, but I'm curious as to why the 4.1GHz 1800X often beats the non-OC 1800X by so much, yet in some benchmarks gets beaten by the non-OC. It's so strange.
Well, it just goes to show you that BIOS/OS/memory/threading issues are not fully resolved yet, and that these games aren't made 100% with the Ryzen architecture in mind. (Peter even mentioned this in the conclusion.)
Once that improves, expect even higher performance from the AMD camp.
Yup, some benches freaked out, I’m not sure it’s the CPU at all, most likely a BIOS and memory issue. Fingers crossed for a fix asap.
Once they fix the damn memory issue, too right I’ll be re-testing!
We recently tested the 1800X and were very impressed with its performance; you can check out the review here. We saw many comments about Ryzen not being ideal for gaming, which we think relates to bugs with memory performance. By diving into these further tests today, we hope to clear some of that up and see how it does at 1080P, 1440P and 2160P (better known as 4K to the cool kids).
It's possible the overclocked 1800X is thermal throttling (even on water) at some points of high load, and then everything gets FUBAR.
Thermal throttling seems to be the issue, or the game engine just wasn't able to find any benefit from the faster CPU and relied more heavily on the GPU in that particular scenario.
I know the result looks stupid, but the numbers are the numbers, it's just what I got 🙂
You need to make your page selector / Next Page >> navigation bar a bit more noticeable! Nearly thought this was a one page article 😀
‘I’ll give you a bit of a tip, high frame rates are great, but they’re not all that. You can run some games as low as 20FPS and it’ll play great, just look at classics like Zelda 64 and Goldeneye, they ran at 20FPS and they felt pretty smooth.’
Sweet lord. This is one shocking article, all I can say.
He is a fucking nintoddler idiot. Nice faked benchmark as always, which favours AMD. Where is the video that shows the 1800X getting more FPS? Reddit AMD fanboys will wank to this trash benchmark for weeks now.
‘People are often quick to leap to the maximum frame rate.’ What? Who? No one even reports on maximum frame rate anymore; it's redundant.
There are so many mistakes and bizarre results in this article that I can't even get my head around it. 20FPS is smooth? The writer is confusing frame differential with playable frame rates and saying a drop from 100 to 60FPS is more noticeable than a steady 20FPS? Fuck me, I thought Peter was actually OK sometimes, but this article has made me re-evaluate whether eTeknix is even a tech site.
AMD must be behind this; it's so blatantly bonkers that I can't even form the sentences to convey how mind-blowingly WRONG the editorial is.
I own Ryzen and am waiting on a motherboard, but holy Christ, guys, this feels like such an AMD shill piece that I am speechless.
Shill? Not really. We've just published our GTX 1080 Ti review, praising Nvidia… soooo, what's your point?
How are you the only website on earth to record the Ryzen beating the 7700K by sometimes 30%? I am actually fairly sure some of those numbers are made up. How does increasing the frequency of Ryzen by 10% increase the minimum in Hitman at 1080p by around 40%?
I could be here all week trying to work out how you got the results, but the conclusion and the '20FPS is super smooth' comment make me feel that you are being paid by AMD for this. I can't believe you are actually that stupid.
The Rise of the Tomb Raider results are completely wrong (apart from the 4K); 1440p and 1080p are complete crap. I would assume you are using DX12 (there are no details on your test procedures). It pushes all the data through a single core first, and Rise of the Tomb Raider scales best with clock speed and IPC.
Maybe because the 7700K already bottlenecks the 1080; what do you think is going to happen with a 1080 Ti? 😉
https://www.reddit.com/r/Amd/comments/5xcc0r/jokers_bench_showing_7700k_cpu_bottleneck/
Well, we are on eTeknix; it's hard to guess what is going to happen with a 1080 Ti. Perhaps a new follow-up article tomorrow will show it being outperformed by a GTX950Sxi that's just been released, but it won't matter anyway according to Peter, as 20FPS is all we need; it's super smooth.
As for the 7700K bottlenecking the 1080, yes of course, we all know how many games use all the cores flat out, so going to Ryzen with the extra cores will give 10,000% more performance, just like X99 always has done as the 'gamers' platform' that no one has ever noticed either.
There was me thinking that the AMD briefing shown online, with its specific situations in which the 7700K had all cores maxed out at 100% when gaming AND streaming, was an unusual case, but we have gone from that fairly realistic scenario to a new one, where games suddenly max out all 8 threads on a 7700K even at a 4.8GHz overclock.
I think for my sanity I'll bugger off now. I do have to thank eTeknix for the read, though; it's given me faith that AMD dollars can make a difference. Just a shame this shill piece is being pushed as something 'real'. Any idiot can see the figures are just fabricated. Shame on you, Peter.
I've just put my 1080 on eBay to get a GTX 950 – after all, 20 is the new 100 (as in FPS). No one needs high frame rates anymore!
You completely misread his comment. He's making the point that the drops from the average are what you notice: going from 60 to 40, or from 100 to below 60. The point was that the 20FPS was consistent.
OK, let's copy and paste directly from the last page then.
Ctrl+C, Ctrl+V.
‘they ran at 20FPS and they felt pretty smooth.’
Another one. Ctrl+C, Ctrl+V.
‘You can run some games as low as 20FPS and it’ll play great’
His point is bollocks, not sure how to make it sound nice.
You would notice a solid 20FPS (or does he mean 22-20 or 20-18? He doesn't tend to post much information) as unplayable, rather than a little judder from 100FPS to 60FPS. Unless you are running a 20Hz panel, that is, and then they would all play beautifully well? Perhaps I missed that one.
But it's not 1967.
Well, not the last time I checked; perhaps I am caught in some hideous alternative world where everyone is an idiot, people can't read, 20FPS is smooth and 10% overclocks on a processor magically give 40% increased results in minimum frame rates.
But no, it's still bollocks. And this is coming from an early Ryzen adopter, waiting on a motherboard to run his 1800X. Not a fanboy (though they seem to be coming over hysterically to say how Peter is right – perhaps the Roy Taylor squad spreading the word?).
The thing is, the game was made to run at 20-30FPS, and with slower controls (not as fast as mouse/keyboard) it worked 'fine' (badly), but on PC, especially with FPS games, if you use a mouse it's unplayable.
Context is important to each statement. That’s what you are not reading. Your normalcy bias is being shattered. I understand your frustration.
Thanks, at least someone understands my frustration! And your statement actually makes more sense than Peter's article, so a big fat kudos and thumbs up from me. I will let you go polish your Ryzen avatar now – job done, report back to the bosses – you managed to derail the conversation from idiotic statements in an article to idiotic statements from the readers. Class act!
The only shill I see is you, John. Jesus Christ, I've never seen someone so mad at an article. Just because he states that constant FPS is better than major FPS dips, that somehow makes this wrong? He used it as an example, but you, being the idiot that you are, are taking it and using it as a reason for this whole article being fake.
So fuck off back to /v/ with your bullshit.
THANK YOU CK. An example, apparently not a good one lol, but my point is constant frame rates > ones that swing between a wider max/min. Ryzen had a higher minimum, so it's a smoother gaming experience; that's about the long and short of it really 🙂
We listed the hardware, we detailed the game settings; anyone who wants to test them, go nuts!
His comment about 20 FPS desperately needs clarifying, but that doesn't make him an AMD shill. I think such a bold statement requires more evidence. I'm not saying you are an Intel fanboy, so please don't get me wrong. I understand all the points you are making and I agree with them. I just don't agree with calling someone a shill without definitive evidence; to me there isn't any. It's quite possible that his 7700K was experiencing issues. Or maybe his Ryzen system is without any of the flaws that the rest of the tech community experience. Check out Guru3D's review of the 1080 Ti. Near the end of the article there is a page on Ryzen 1800X performance. While the 5960X is not a 7700K, it is a fast CPU for gaming, especially at 4.3GHz. The 1800X is not far behind it. In Dishonored 2 it is within margin of error at 1080p. In Watch Dogs 2 it is only slightly behind. The other 1080p tests show the 1800X clearly being beaten by the 5960X.
“Smooth” means with little or no variation. Consistency is what is meant by “smooth”.
They are only showing minimum and average frame rates. Were you just looking at the graphs? Those results match other reviews which bothered to record and publish those metrics. For maximum frame rates, the 7700K is king, no question. But that's not what this article is about.
Read their methodology and you would find out why they skewed the results, by overclocking only the Ryzen and turning features off to make the Intel numbers lower, lol. It's just like AdoredTV's benchmarks; he did the same.
Good god, I did not say that 20FPS is all we need. I guess some people just skimmed and went batshit crazy because I didn't say what we all know: 60FPS+ is superior, duh.
Think back to the days of the games I mentioned, Zelda 64 for example. Did that game feel like a juddering mess? No, it had a steady frame rate. A sturdy minimum frame rate, e.g. 60FPS, is what you want, not a card/chip combo that's giving mostly very high frame rates but with drops below your refresh rate, such as 60FPS/Hz.
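To put some rough numbers on that consistency point, here's a quick Python sketch. The frame times below are made up purely for illustration (not taken from the benchmark runs); the idea is that a run can win on average FPS and still be the one you notice juddering, because its minimum dips below a 60Hz refresh.

```python
def fps_stats(frame_times_ms):
    """Return (average FPS, minimum FPS) for a list of per-frame render times in milliseconds."""
    fps = [1000.0 / t for t in frame_times_ms]
    return sum(fps) / len(fps), min(fps)

# Hypothetical frame-time traces, purely illustrative:
run_a = [8.3] * 95 + [25.0] * 5   # mostly ~120FPS with a few dips to 40FPS
run_b = [14.3] * 100              # a steady ~70FPS

for name, run in (("Run A", run_a), ("Run B", run_b)):
    avg, low = fps_stats(run)
    print(f"{name}: average {avg:.0f}FPS, minimum {low:.0f}FPS")

# Run A wins on average, but its minimum falls below a 60Hz refresh,
# so it's the run you'd actually notice stuttering.
```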
I feel most people grasped what I was talking about, but others are foaming at the mouth; it's amazing the disparity I'm seeing here and elsewhere from a select few readers…
I stand by my benchmarks; the numbers are what they are. Until someone else does the same tests (since we are the first), you'll just have to take my word for it.
Also, I busted my ass benching these all bloody weekend just to be called a liar; feels good, thanks a lot.
Hmmmm
That is quite an indictment. An indictment, I might add, that is backed by zero supporting evidence. On the other hand, Microsoft has publicly acknowledged there is a bug which negatively affects some aspects of Ryzen's performance, particularly in games… Now one giant and one major corporation have publicly discussed this bug and its negative effects on gaming performance. Not to mention a slew of reputable online gaming and PC sites.
With fewer than 300 comments on Disqus in 7 years and a clear bias against AMD, your comments ring more than a little hollow here…
Actually, other sites are showing the Ti performs better with the 1700X/1800X than it does with the 7700K, even at 1080p. Just to point that out. No idea why… perhaps the extra CPU power is pushing things more? Who knows.
Nvidia's driver, compared to AMD's driver, utilizes more resources on the PC.
Higher CPU usage and higher RAM usage.
There are a couple of benches out in the wild comparing the drivers' system utilization when running games.
More cores mean that the Nvidia driver can be parked on a core or two and the rest of the game can use 2, 4 or 6 physical cores.
TL;DR: 4 cores are not enough for gaming.
Can you share other sites/benchmarks showing this?
Have to look again; the easiest one to find is Linus Tech Tips on YouTube (from memory). Also, go to eTeknix and look at theirs too.
https://www.extremetech.com/gaming/245604-review-gtx-1080-ti-first-real-4k-gpu-drives-better-amd-intel
That link is against the 6900K, which the 1800X just about matches; the 7700K easily beats the 6900K, so we also know it easily beats the 1800X. Simple really.
Hehe, looking through your posts on Disqus made me giggle. How anyone can be such a fan of a few companies is beyond me. You seriously need help.
Or maybe they are in the employ of PR agencies who engage the services of internet posters for very low wages; sometimes some free hardware goes their way. The time invested in trolling forums and websites versus the pay and/or hardware is just not worth it going by sheer numbers, but as you can see the level of diction and general IQ is rather low, ergo the appeal of doing these kinds of low-level jobs for such PR agencies.
You sound smart.
Care to elaborate, or are you just going to take partial quotes and give no context of your own?
Great review thanks!
I'm an AMD guy, but when you have one processor OC'd in a graph you must also OC the others.
To be fair, it's only OC'd by 100MHz, and it's to show the difference between the OC and non-OC on that particular CPU.
If you want a non-overclocked comparison, then you have that as well.
It's much higher than 100MHz; it's clocked 500MHz higher.
Where is the 7700K @ 5GHz in this review?
This bench is a joke.
lol Triggered, You got Rekt
nice
You do realize that at stock the 7700K is still clocked higher than the OC'd R7 chip, right? So why would it matter? If anything, wouldn't it just be a closer comparison, since it's closer to the stock clocks on the 7700K?
When a 2.5% OC results in a ~2.5% increase in FPS, you say that indicates “no CPU bottleneck”. Why?
It's not just overclocked 100 MHz.
It's overclocked 500 MHz, since it's 4.1 GHz on all cores. XFR hitting 4.1 GHz on one core isn't really an overclock either, just extended turbo.
We tested all three at stock, so there's that metric. There was word that AMD Ryzen could keep up with the Intel chips (at stock) while overclocked, so I wanted to test how much of a difference that would actually make. People know the Intel chips are up to the task; this test was really all about proving that Ryzen is a good gaming chip, or at least finding out whether it can be… which it can.
I’ll be doing more tests though, the comments are not going unread 🙂
That shows bias though doesn’t it? Sounds like you want to prove something and not present it.
Regardless I’m just happy for the competition.
Not one word about the Windows 10 scheduler bug; the tests were run on Windows 10's bugged scheduler instead of Windows 7's correct one, which yields boosts of up to 26% on minimum FPS and up to 9% on average FPS.
And that's just one of the many bugs right now; there's also the CCX one, which isn't patched on Windows 7 either. These AMD/Intel comparisons are useless in their current state; run the tests on a non-bugged scheduler at the very least, or don't even bother.
If in its current state the 1800X is already ahead of the 7700K, then logically, once all the bugs are fixed for Ryzen, the delta would be even bigger. So the same result applies: the 1800X beats the 7700K. One exception, though, would be if the Far Cry Primal result turned in favour of the 1800X.
Hey dude, you’re right, they were all run on the same OS level. There’s a few bugs out there right now, and trust me, this isn’t the last batch of testing. Got a lot of other product launches to get around this month, but rest assured we’ll be testing as much as possible to hammer out the finer details as things develop.
I'm an AMD guy. I'm still running an FX-8350, even. She still holds her own with everything I throw at her and has for years. I'm a gamer, and most of what I've heard is that they are not that good in that area. In every test I have seen they seem to do just fine, a little behind the Intel chip but not by much. Give them a minute and a couple of updates and I'm sure that will be taken care of in no time. But with that said, at everything else besides gaming the AMD chip excels. Streaming is one of them. Now I don't know about you, but if I'm a gamer who does a lot of streaming and whatnot, the AMD chip might just be better at handling the game, stream, chat program and whatever else you might have going. It takes workloads a lot better. Plus, to me it's a huge upgrade over an 8350; I'm dancing in the street here.
I'll be waiting for Zen+; let them fix all the bugs and whatnot, and next year I'll upgrade to Ryzen.
Conclusion: Ryzen is a hell of a CPU!
https://uploads.disquscdn.com/images/86c5674a6215e7f3c54fa92639d696067bd171847e43e805148f2543d5a120f6.jpg
get yours at : https://teespring.com/en-GB/PCMRTEES 😀
Why not also overclock the 7700K to its maximum potential, like 4.9GHz or 5GHz?
The OC means the XFR feature that AMD has, not manual overclocking.
@LoLyeah: Unlikely, no, since that would be stock.
4.1GHz will definitely mean a 4.1GHz all-core OC.
To gimp the Intel CPU. That's when you move on and read somewhere else.
The i7-7700K can be overclocked to 5.0GHz on air. But it will not last long under those conditions, and the manufacturer does not provide any warranty. Out of the box, the Ryzen is faster, with a valid warranty.
None of what you said is accurate. For one, the 7700K can run at 4.9GHz on aftermarket air all day long without issue. And the 7700K pushes a 4.2GHz base/4.5GHz turbo out of the box compared to the 1800X's 3.6GHz base/4.0GHz turbo.
And why does overclocking void the warranty then, if clearly there are no issues? The manufacturer certainly disagrees, don't you think?
And does AMD provide a warranty for an overclocked Ryzen?
I call BS on these skewed results.
The Ryzen X types (XFR) overclock automatically, so yes, they are under warranty. Obviously, overclocking any CPU manually voids the warranty.
@arminbarron: I've run my i7 2600K since 2011 in active daily use at 4.5GHz, stable, and it still works fine with a GTX 980… If you overclock properly there's a very small chance of "killing" your CPU.
Why only overclock the 1800X and turn off features on Intel? Easy answer: to skew the results in AMD's favour.
I am an AMD guy, for the record. But these tests show a different picture than everyone else's. Linus, for example, ran his tests at 4K for some of these games, and the 7700K in those tests performed much better. What's up with that?
I want AMD to look good, of course, because it's the product I'm interested in, but I am more interested in the truth. NOT SAYING ANYONE FABBED, LIED, OR FIXED. Don't bite my head off! I'm more thinking overlooked mistakes or something? Or have the Intel fanboys just wrecked Ryzen so badly for the past week or two that it's hard to see positive results?
That's easy – the bottleneck @ 4K is the GPU, and the clock difference becomes a bit more important.
That's why, if you're really going to test CPU power in games, you do it with the 1080 Ti @ 1080p – Guru3D did this, and found that pairing the 1080 Ti with an 1800X actually drops performance to GTX 1070 levels in certain games.
As the resolution goes up and the burden is shifted heavily onto the GPU, performance is nearly equalized among all CPUs – i5s, i7s, and Ryzen 7s perform similarly now. Not a very good indicator of how good your CPU is, because often even an FX-8350 performs fairly well in these conditions.
Read their methodology: they skewed the numbers by turning off things on Intel and overclocking the 1800X only.
That's why it's different from every other review.
IMO the GTX 1080 Ti unleashes the power of Ryzen. The 8 cores of Ryzen matter with this powerful video card.
http://www.3dmark.com/compare/spy/1329439/spy/1330283/spy/1324506/spy/1326272
But I want to see if it's just from the GTX or something else. It would be good to see a test on Windows 7 with a GTX 980 Ti or Radeon Fury X.
Only an idiot would bottleneck his system by installing Windows 7 on it today. MS does not support these new CPUs on anything but Windows 10; you will be getting worse performance on Windows 7. AMD cannot make Ryzen 100% compatible with Windows 7 without MS providing a patch, which they will never do. And it's a good thing. It's about time people moved to Windows 10.
Could we have the CPU usage, just in case the 1080 Ti is still holding back the Ryzen?
Lol, I doubt the Ryzen is holding back the Ti, looking at Intel's CPUs.
As seen in BF1 and GTA V, the 7700K was almost at peak CPU usage, while the Ryzen wasn't even close to that with a 1080.
The Ti has around 35% more performance than the 1080, and the Ryzen has a similar amount in reserve in CPU usage.
I'd like to know.
Nvidia doesn't have real async compute, so it uses CPU overhead instead; that's why usage goes up on the CPU.
It seems the Intels are at the end of their rope; let's see how much overhead the Ryzen can absorb.
I'd still like to see.
@zacharyrhodin: Only true in the case of DX12 and Vulkan titles. Most titles are still DX11, and async doesn't work in DX11. On DX11 the NVIDIA graphics driver has less CPU overhead than the AMD graphics driver.
Sorry, but those are fake results.
Hitman and Rise of the Tomb Raider minimum FPS @ 1080p are lower than the minimum FPS @ 1440p? LOL
Your results are skewed, and the FPS numbers are way different from what other reviewers are getting for the 1080 Ti.
CPUs are hitting their limits (maxing out) at 1080p.
Didn't take long for the 7700K to get knocked off its perch.
That doesn't explain the minimum FPS numbers @ 1080p being lower than the minimums at 1440p.
Even if the CPU is hitting its limit at 1080p, overall FPS for both averages and minimums should be lower at higher resolutions, but the difference between it and other, faster CPUs would be smaller.
Those are made up numbers.
Check the TechSpot review and compare the 1080 Ti numbers:
http://www.techspot.com/review/1352-nvidia-geforce-gtx-1080-ti/
Sorry, but more bad news:
Review: The Nvidia GTX 1080 Ti is the first real 4K GPU, but who drives it better, AMD or Intel?
https://www.extremetech.com/gaming/245604-review-gtx-1080-ti-first-real-4k-gpu-drives-better-amd-intel
And if you look at the Tom Clancy numbers, those 'are fake results' too:
http://www.techspot.com/review/1352-nvidia-geforce-gtx-1080-ti/page3.html
Your point is?
The 6900K still beats it in most games, and the margin is high at 1080p and 1440p; they are tied at 4K because of the GPU bottleneck.
But the 6900K is already behind the 7700K in games because of IPC and higher clocks, which still makes the benchmarks here fake.
AMD themselves admitted that in gaming it would be around 15-20% slower than Broadwell-E (around 5% less IPC and 10-15% lower clock speed).
Listen to AMD's call with GamersNexus.
Ryzen is a great CPU; however, for gaming Intel still has the edge for now, until either they resolve the latency/scheduler issues or Zen 2 arrives.
You're getting it wrong: the minimum FPS @ 1440p (Ryzen: 88 FPS) is lower than the minimum @ 1080p (Ryzen: 120 FPS); even the maximum @ 1440p is close to the minimum @ 1080p.
The difference between min and max FPS becomes smaller at higher resolutions because CPUs perform worse there, bringing them closer to their absolute minimum performance. At lower resolutions their maximum performance spikes up a lot, but not their minimum.
Nope, I am talking about the 7700K numbers, not Ryzen.
The 7700K is getting higher minimums @ 1440p than @ 1080p in Rise of the Tomb Raider and Hitman?
Which really doesn't make sense and isn't even possible.
Have you ever run the RotTR benchmark twice?
Have fun; it's the worst benchmark I've ever seen.
All reputable tech reviewers have the same conclusion on Ryzen; I don't believe they are all paid by Intel.
The conclusion is: Ryzen is the best CPU for productivity for the money, hands down, it crushes all Intel offerings, and it is very good for gaming, but for pure gaming Intel still takes the cake.
Whether or not that will change with future patches and microcode/Windows updates has yet to be seen.
I think with Zen+/Zen2, it is going to be much much better.
Is there a reason why you overclocked the AMD CPU but not the Intel ones?
Because AMD overclocking is cheaper than Intel; the B350 motherboard is very cheap and OC-able. Intel Z-series motherboards are expensive; it's about 1:2 or 1:3 between a Z-series Intel board and a B350 AM4 one.
Well yes, however the Intel proc is cheaper compared to the AMD. In the end it evens out. Also, apples being cheaper than oranges does not mean you have to compare them to each other 😉
Isn't it obvious? To change the results in AMD's favour, the same reason they did not overclock both to the max on air or water and then run benchmarks, which Intel would be miles ahead in.
Same reason these results don't match any other reviewers' as well, lol. Same as AdoredTV, where he turns off Intel features and underclocks it so it matches Ryzen, then runs benchmarks, lol. Not Intel's fault Ryzen can't overclock, or that it clocks low at stock.
Price difference is mentioned
There are benchmarks that show the performance delta between overclocked and stock 7700K/5820K already. This article is about Ryzen; you can compare it to the stock Intel CPUs, and they overclocked the 1800X because it might be interesting to see.
I think they just couldn’t be bothered. Benchmarking takes a long-ass time.
Comparing an overclocked proc to a stock one is useless, and the article is a comparison about which CPU fits best with the 1080 Ti –> "Nvidia GTX 1080 Ti CPU Showdown".
In the end the result is useless. Because they were lazy…
It would be interesting to see how the 1080 Ti does paired with each generation of Intel processors. Like, say, the K-series i5 and i7, going back to the 2500K/2600K and up to the 7600K/7700K. It would be a ton of work, but it would help people know if they need to upgrade their CPU if they want to upgrade to a 1080 Ti.
That would be amazing.
That is no use; that CPU would bottleneck the GPU. BUT YES, I agree it would be fun for some of my nerdgasms!
Nice, but I run my 5820K at 4.5GHz easily. I'm fine with Intel for now.
butthurt
I find it very odd that every other site shows Intel in the lead…
Not Joker.
It's not odd; you should watch the resolution being used. All you see is 1080p comparisons. Also, most (almost all) games love to drive on a single thread, and Intel = boss at that 🙂
I have yet to find a single reason online to upgrade my i7-5930. I would almost be downgrading if I chose a brand new CPU, especially the raisin.
The Ryzen 1800X is better, but there isn't a reason to upgrade a higher-end system like yours. Maybe Ryzen 2 would be a better time to make the jump. On the other hand, people who invested in higher-end systems like yours should be happy they are holding up just fine even against today's CPUs. I built a 6600K system and am considering a Ryzen 8-core system; I would only need the CPU and motherboard. I'm comparing that cost ($500) to a 7700K ($340).
Better to put that $160 into better accessories, like cooling kits, better RAM, etc.
What about a faster SSD or GPU? After 8GB of RAM @ 2100+ you won't gain a lot more… though you could double up the amount to 16GB.
Don't do it. I just went from a Haswell i5 to a 1700X and got worse performance; for me it is only slightly better than my FX-6300, due to the fact it is crap at overclocking.
I got a high-end ROG board and top memory too; it is all boxed up to RMA now, back to the i5.
My question is: did you play ABOVE 1080p? Probably not, so yes, the single thread of any Intel i5/i7 will beat the hell out of a Ryzen :). Most games are programmed for single threads.
My buddy is ordering me a 7700K through his work for $241; no way in hell I'm adopting this new tech and all its quirks. A lot of software has to catch up to Ryzen now, and even Windows. These results are skewed too, with minimum frame rates being higher at 1440p than 1080p, and Ryzen all of a sudden being faster at 1080p despite me watching live benchmarks and seeing that's not the case.
You are wrong: resolutions up to 1080p are heavy for a CPU; after you go higher, the GPU likes to take over, and therefore the tests at 1440p ARE faster than at smaller resolutions. However, I still have my doubts about the outcome. You can sit with pgr and do some studies.
Eh, until you get 25-35% better IPC on top of that, it's not worth it IMO. So, Coffee Lake or Ice Lake.
i.e. next year or 2019.
http://wccftech.com/amd-ryzen-performance-negatively-affected-windows-10-scheduler-bug/
What is the performance difference between Win 7 and Win 10?
Your tests are seriously broken if you can somehow get better minimum performance at 1440p than at 1080p, lol.
You might want to check and try again.
You are just mad that your 7700K trash lost in virtually EVERY test!
Deal with it!
I am only pointing out the obvious problems with the benchmarks. It is clear there is a real error in these numbers, and that isn't based on comparison with any other CPU; just looking at the 7700K numbers in isolation, they are somehow better at higher resolutions!
But everyone knows the 7700K is faster in every game than any of the Ryzen processors too.
Everyone else's figures back this up. That's only another reason to point out why these results are anomalous.
Taking the 7700K numbers in isolation and looking at no other numbers in the charts, you can see something is wrong.
Under NO circumstances should the setup be faster at 1440p in Tomb Raider than it is at 1080p. There is either a major mistake here or the results have problems.
Something has gone weird in this test and isn't right.
Not least that in 99/100 other reviews the 7700K beats the 1800X anyway in games, despite being $200 cheaper. I didn’t even mention Ryzen’s results to be fair, just pointed out a serious error.
Benchmarks are ****** up. They also go against what many other sites have found. AMD doing an Intel and splashing the cash around?
You are wrong: resolutions up to 1080p are heavy for a CPU; after you go higher, the GPU likes to take over, and therefore the tests at 1440p ARE faster than at smaller resolutions. However, I still have my doubts about the outcome.
Nope. No setup should have a higher minimum frame rate at 1440p than it does at 1080p. It's an obvious flaw in the results/data.
Notice how ALL the other processors tested exhibit the expected frame rate drop, as per normal. There is a serious flaw here.
That's not how it works. Why are people so emphatic when they are clearly just repeating hearsay, and as such must realize that they *don't* really understand how it works?
1080p isn't "heavy for the CPU". It's *light for a really fast GPU*.
It's not really that complicated… In modern 3D applications the vast majority of the work is done on the GPU. The CPU runs game logic, tracks position within the world, and as a result manages what is going to show up in the player's "window" on the game world. It then stages the basic geometry for the GPU, along with loading the API the game engine uses with the various parameters the GPU will need in order to render the scene. That's it. The GPU does the rest.
Again… very simple example. Two CPUs. One is capable, in a given game, of managing 100fps and the other is capable of handling 75fps.
If paired with a GPU that, at a given res and level of detail, can’t break 50fps, both CPUs look equal. If the GPU can handle 150fps, you then see the difference between the two CPUs.
So no… there is no universe where a 7700K in a given game does BETTER at 1440p than at 1080p, regardless of the GPU. The lowest res, paired with the fastest GPU, represents *the best the CPU can ever manage to do in that game*.
Thinking it could, and that this isn’t either a typo or a broken test, demonstrates a really fundamental lack of understanding of how rendering is done.
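That reasoning can be written down as a toy model in Python. The FPS caps below are illustrative numbers, not measurements from this or any review: the frame rate you see is roughly the lower of what the CPU can stage and what the GPU can render at that resolution, which is why dropping the resolution exposes CPU differences and raising it hides them.

```python
# Toy bottleneck model with illustrative numbers (not measurements):
# observed FPS ~= min(CPU's staging rate, GPU's render rate at that resolution).

CPU_CAPS = {"CPU A": 100, "CPU B": 75}            # frames/s each CPU can prepare in this game
GPU_CAPS = {"1080p": 150, "1440p": 90, "4K": 50}  # frames/s the GPU can render at each resolution

for res, gpu_fps in GPU_CAPS.items():
    for cpu, cpu_fps in CPU_CAPS.items():
        observed = min(cpu_fps, gpu_fps)
        print(f"{res:>5} | {cpu}: ~{observed}FPS")

# At 4K both CPUs sit at ~50FPS (GPU-bound, so they look identical); at 1080p
# the gap between them finally shows, and neither CPU ever does better at a
# higher resolution than it does at a lower one.
```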
+KieraDanvers You're joking, right?
Actually, many benchmarks of the 1080 Ti get better results at 1440p compared to 1080p. Don't ask me why, though. It's probably optimized for high-res gaming.
Why would you overclock the Ryzen but not the Intel chips? That is kind of disingenuous; I was hoping to see real-world, fair comparisons of what I could achieve with a 1080 Ti.
LOLLLL, just 4.1GHz vs 4.5GHz??? XDDDD And you say that? Intel fanboy rekt.
All caps and emoticons, the mark of a true fanboy with the IQ of a rock.
#1 I have a Ryzen 1700 at 3.9GHz.
#2 Fanboys like you and others on any side are morons who pledge allegiance to shareholders who don't give a crap about you; I'm simply a fan of technology.
#3 Since when does asking for a fair comparison make anyone a fanboy? It doesn't, but dumb replies like yours do. I wanted to be able to see what overclock vs overclock can do on each system.
#4 They state right at the beginning that both Intels are stock, 3.3GHz and 4.2GHz, while the 1800X is pegged to its maximum. So where did you get 4.5 from?
You don't even deserve a real reply, but I just wanted to put your fanboy antics on blast in this comment. Thanks for displaying what a true fanboy is. Sorry, that might have been too much for you to read without lapsing in attention span.
He probably means the boost clock, which is 4.5GHz; he also seems to disregard that the higher-core-count Intel is clocked lower and in all probability wouldn't be able to reach 4.5GHz.
It is a strange comparison to say the least.
I read in another review that if you bring the 7700K up to 5GHz it provides a 25% boost in FPS on average using a 1080 Ti. So that would actually come close to the increase of the Ryzen, which is far more expensive.
Agreed, Mrx David is a clear fanboy. Well said.
Actually, he makes a good point. They're comparing two products, so having the Ryzen run overclocked and the Intel stock is a strange choice. Much better would've been to test both stock and both overclocked to max stable, providing a useful result, instead of comparing apples with apples modified to look like oranges 😉
The 7700K is the best chip in the world. LOL
The difference between these tests and everyone else's is the 2400MHz RAM speed on all systems tested, whereas all the other reviewers tested the Intel systems with the fastest RAM possible.
Indeed, we'll be retesting once we've got our Ryzen benches working with higher-speed RAM.
What bullcrap; that won't add any FPS difference in games…
Games don't use 8 cores.