Battlefield 1 DirectX 11 & 12 Performance Analysis
Introduction
The Battlefield franchise has a rich heritage on the PC platform and has pushed the visual boundaries further with each release. DICE’s commitment to realistic, cinematic audio and astounding lighting effects never ceases to amaze. Over the years, the Battlefield games have embraced destructible environments and a chaotic multiplayer experience. The latest entry is set during the First World War and aims to bring trench warfare to life. Traditionally, developers shied away from this historical period because attritional warfare wasn’t particularly exciting and was difficult to emulate in a video game. Despite this, DICE forged ahead and ensured Battlefield 1 was a relatively accurate historical portrayal without compromising the overall fun factor. As expected, the game received widespread critical acclaim and is commonly perceived as a modern classic.
From a technical standpoint, Battlefield 1 is a modern marvel and supports the latest DirectX 12 API. Given AMD’s strong showing in other DirectX 12 performance assessments, I’m interested to see how their graphics range compares against NVIDIA’s offerings. It’s also important to analyse Battlefield 1’s optimisation and give readers an indication of the kind of experience they can expect across contrasting performance tiers. Traditionally, Battlefield games run quite well, although Battlefield 4 was a noticeable exception, marred by irritating crashes. Thankfully, Battlefield 1 consigns this to the history books and helps to restore DICE’s reputation.
It would’ve been interesting if you’d run the test against a system with only an i5, for instance.
With a GTX 1080 (just upgraded from a 980 Ti) I ALWAYS try to avoid DX12 and opt for DX11, because I tend to lose up to 10 fps just by using DX12.
The only upside to using DX12 is that I won’t get that stutter every 3–4 seconds like you get with DX11 in processor-heavy games like GTA 5, for example.
At the end of the day, when hardware and software don’t match up properly, you’ll get crap like that ^^^ (i.e. how DX12 maps onto NVIDIA’s CUDA cores versus AMD’s compute units).
Because DirectX is written by Microsoft, and we all know how great Microsoft is at coding… /s
The only way to remedy these trade-offs is for AMD or NVIDIA to write their own respective APIs, but that will never happen.
“The only way to remedy these trade-offs is for AMD or NVIDIA to write their own respective APIs, but that will never happen.”
AMD partially did this with Mantle, which has since evolved into Vulkan. I think that counts.
The DX12 API may give you some GPU savings by getting rid of some of DX11’s implicit barriers and cache flushes, and it usually tames driver overhead under heavy traffic, such as texture streaming that can cause hitching in a DX11 implementation. BUT it is an API designed to improve CPU performance, so testing at 4K doesn’t reveal much; you would get better indicators by running at 720p on mid-range CPUs.
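To make the “implicit barriers” point above concrete, here is a minimal sketch of the explicit resource-barrier model D3D12 uses. The function name TransitionToShaderResource is hypothetical and the render-target-to-shader-resource transition is just one illustrative case, but the structures and the ResourceBarrier call are the standard D3D12 API. Under D3D11, the driver had to infer these transitions and insert the barriers and cache flushes itself on every relevant bind, which is part of the overhead the commenter is describing.

    #include <d3d12.h>

    // Hypothetical helper: transition a texture from render-target use to
    // pixel-shader-resource use. In D3D12 the application declares the
    // before/after usage explicitly; in D3D11 the driver inferred this on
    // every bind and inserted the barrier/cache flush implicitly.
    void TransitionToShaderResource(ID3D12GraphicsCommandList* cmdList,
                                    ID3D12Resource* texture)
    {
        D3D12_RESOURCE_BARRIER barrier = {};
        barrier.Type  = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
        barrier.Flags = D3D12_RESOURCE_BARRIER_FLAG_NONE;
        barrier.Transition.pResource   = texture;
        barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
        barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_RENDER_TARGET;
        barrier.Transition.StateAfter  = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;

        // One explicit, application-batched call instead of driver guesswork.
        cmdList->ResourceBarrier(1, &barrier);
    }

Because the application decides when and how to batch these transitions, a well-written D3D12 renderer spends fewer CPU cycles inside the driver, which is exactly why CPU-bound scenarios (720p on mid-range CPUs) expose the API difference better than GPU-bound 4K testing does.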