
Battlefield 1 DirectX 11 & 12 Performance Analysis

Battlefield 1 – 1080p, 1440p and 4K DirectX 11 Benchmarks


At 1080p, almost every card was capable of running the game past 60FPS. The R9 Fury X and above averaged over 100FPS, and the 980 Ti and above managed minimums of 100FPS or more, so the game certainly runs very well, even at ultra settings, on a wide range of graphics cards.

[Chart: Battlefield 1 DirectX 11 benchmark results – 1080p]

Pushing the resolution up to 1440p makes the game a lot more demanding, dropping the top card by around 30FPS on average. Even down at the GTX 970 it's pretty smooth sailing, and a few tweaks to the graphics settings would easily keep you above that 60FPS target.

[Chart: Battlefield 1 DirectX 11 benchmark results – 1440p]

At 4K, we separate the men from the boys: none of the cards maintain a minimum frame rate above 60FPS, and even a 30FPS minimum is a challenge for some of them. Ideally, you'll want to be running the Fury X or GTX 1070 at minimum, although some improvements could be made by reducing the graphics settings, as 4K ultra is pretty taxing.

[Chart: Battlefield 1 DirectX 11 benchmark results – 4K]

 


Peter Donnell

As a child in my 40s, I spend my days combining my love of music and movies with a life-long passion for gaming, from arcade classics and retro consoles to the latest high-end PC and console games. So it's no wonder I write about tech and test the latest hardware while I enjoy my hobbies!


4 Comments

  1. With a GTX 1080 (just upgraded from a 980 Ti), I ALWAYS try to avoid DX12 and opt for DX11, because I tend to lose up to 10fps just by using DX12…

    The only upside to using DX12 is that I won't get that stutter every 3-4 seconds that you get with DX11 in processor-heavy games like GTA5, for example.

    At the end of the day, when hardware and software don’t match up right, you’ll get crap like that^^^ (i.e. DX12 -> CUDA [nvidia] // DX12 -> SM Unit [AMD])
    Because DirectX is written by Microsoft, and we all know how great Microsoft is at coding…. /s

    The only way to remedy these trade-offs is for AMD or Nvidia to write their own respective APIs, but that will never happen.

    1. “The only way to remedy these trade-offs is for AMD or Nvidia to write their own respective APIs, but that will never happen”

      AMD partially did this with Mantle, now Vulkan. I think that counts.

  2. The DX12 API may give you some GPU savings by getting rid of some of DX11's implicit barriers and cache flushes, and it usually tames the driver madness under heavy traffic, such as texture streaming, which can cause hitching in a DX11 implementation. BUT it is primarily an API to reduce CPU overhead, so testing at 4K doesn't really tell you much; you would get better indicators by running at 720p on mid-range CPUs.
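For readers wondering what the "implicit barriers" mentioned in that comment actually are: in DX11 the driver tracks resource hazards and inserts the necessary transitions behind the scenes, while in DX12 the application records them explicitly on the command list. Below is a minimal C++ sketch of one such explicit transition; the helper name, the command-list and texture variables, and the chosen before/after states are purely illustrative assumptions, not code from the game or the drivers.

    #include <d3d12.h>

    // Explicitly transition a texture that was just written by a copy (e.g. a
    // streamed-in texture) so the pixel shader can sample it. A D3D11 driver
    // would have performed this hazard tracking implicitly on the app's behalf.
    void TransitionToShaderResource(ID3D12GraphicsCommandList* cmdList,
                                    ID3D12Resource* texture)
    {
        D3D12_RESOURCE_BARRIER barrier = {};
        barrier.Type                   = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
        barrier.Flags                  = D3D12_RESOURCE_BARRIER_FLAG_NONE;
        barrier.Transition.pResource   = texture;
        barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
        barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_COPY_DEST;
        barrier.Transition.StateAfter  = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;

        // One explicit call; scheduling it well (or badly) is now the
        // application's job rather than the driver's.
        cmdList->ResourceBarrier(1, &barrier);
    }

Moving that bookkeeping out of the driver is where DX12's CPU savings come from, which is why, as the commenter notes, a CPU-bound test (low resolution, mid-range CPU) shows the API difference far better than a GPU-bound 4K run.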

