Battlefield 1 DirectX 11 & 12 Performance Analysis
Battlefield 1 – 1080p, 1440p and 4K DirectX 11 Benchmarks
At 1080p, almost every card was capable of running the game past 60FPS. The R9 Fury X and above averaged over 100FPS, and the 980 Ti and above managed minimums above 100FPS, so the game certainly runs very well at ultra settings on a wide range of graphics cards.
Pushing the resolution up to 1440p, the game is a lot more demanding, dropping the top card by around 30FPS on average. Even down at the GTX 970, though, it's pretty smooth sailing, and a few tweaks to the graphics settings would easily keep you above the 60FPS target.
4K is where we separate the men from the boys: none of the cards maintained a minimum frame rate above 60FPS, and even a 30FPS minimum is a challenge for some of them. Ideally, you'll want a Fury X or GTX 1070 at minimum for 4K, although reduced graphics settings would help, as 4K ultra is pretty taxing.
Would’ve been interesting if you’d done the test vs. a system with only an i5, for instance.
With a GTX 1080 (just upgraded from a 980 Ti) I ALWAYS try to avoid DX12 and opt for DX11, because I tend to lose up to 10FPS just by using DX12…
The only upside to using DX12 is that I don't get that stutter every 3–4 seconds that you get with DX11 in processor-heavy games like GTA5, for example.
At the end of the day, when hardware and software don't match up right, you'll get crap like that^^^ (i.e. DX12 -> CUDA [Nvidia] // DX12 -> SM Unit [AMD])
Because DirectX is written by Microsoft, and we all know how great Microsoft is at coding…. /s
The only way to remedy these trade-offs is for AMD or Nvidia to write their own respective APIs. But that will never happen.
“The only way to remedy these trade-offs is for AMD or Nvidia to write their own respective APIs. But that will never happen.”
AMD partially did this with Mantle, which lives on as Vulkan. I think that counts.
The DX12 API may give you some GPU savings by getting rid of some of DX11's implicit barriers and cache flushes, and it usually tames driver madness under heavy traffic, such as texture streaming that can hitch in a DX11 implementation. BUT, it is an API designed to reduce CPU overhead, so testing at 4K doesn't really tell you much. You would get better indicators running at 720p on mid-range CPUs.