Battlefield 1 (DX11)

Battlefield 1 leads off the 2017 benchmark suite with a bang as DICE brought gamers the long-awaited AAA World War 1 shooter a little over a year ago. With detailed maps, environmental effects, and pacy combat, Battlefield 1 provides a generally well-optimized yet demanding graphics workload. In light of DX12-related performance issues in this title, DX11 is utilized for all cards.

The Ultra preset is used with no alterations. As these benchmarks are from single player mode, our usual rule of thumb still applies: multiplayer framerates generally dip to around half of our single player results.

[Charts: Battlefield 1 average and 99th percentile framerates at 3840x2160, 2560x1440, and 1920x1080, Ultra Quality]

Battlefield 1 has shown itself to be rather favorable on Vega hardware, and against Vega 56 at 4K, the GTX 1070 Ti FE can only manage a draw. At lower resolutions, the Vega 56 loses its advantage, but the difference is slim.

78 Comments

  • BrokenCrayons - Thursday, November 2, 2017 - link

    This review was a really good read. I also like that the game screenshots were dropped from it since they didn't exactly add much, but do eat a little of my data plan when I'm reading from a mobile device.

    As for the 1070 Ti, agreed, it's priced a bit too high. However, I think most of the current-gen GPUs are pushing the price envelope right now. Except maybe the 1030, of course, which has a reasonable MSRP and doesn't require a dual-slot cooler. That's really the only graphics card outside of an iGPU I'd seriously consider if I were in the market at the moment, but then again I'm not playing a lot of games on a PC because I have a console and a phone for that sort of thing.
  • Communism - Thursday, November 2, 2017 - link

    Literally the same price as a 1070 non-Ti was a week ago.

    Those cards sold so well that retailers are still gouging them to this day.
  • Communism - Thursday, November 2, 2017 - link

    And I should mention that the only reason that retail prices of 1070, Vega 56, and Vega 64 went down is due to the launch of 1070 Ti.
  • timecop1818 - Thursday, November 2, 2017 - link

    Still got that fuckin' DVI shit in 2017.
  • DanNeely - Thursday, November 2, 2017 - link

    Lack of a good way to run dual-link DVI displays via HDMI/DP is probably keeping it around longer than originally intended. This includes both relatively old 2560x1600 displays that predate DP and HDMI 1.4 and thus could only do DL-DVI, and cheap 'Korean' 2560x1440 monitors from 2 or 3 years ago. The basic HDMI/DP-to-DVI adapters are single link and max out at 1920x1200. A few claim 2560x1600 by overclocking the data rate by 100% to stuff it down a single link's worth of wires; this is mostly useless, though, since virtually no DVI monitors other than HDMI 1.4-capable displays (which don't need it) can actually accept a signal that fast. Active DP-to-DL-DVI adapters can theoretically do it for $70-100, but they all came out buggy to one degree or another, and sales were apparently too low to justify a new generation of hardware that fixed the issues.
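    The single-link ceiling described here follows from DVI's TMDS limits: one link tops out at a 165 MHz pixel clock, and dual link doubles that. A rough sketch of the math, assuming CVT reduced-blanking timings (160-pixel horizontal blank, at least 460 µs of vertical blank), shows why 1920x1200@60 squeaks under a single link while 2560x1600@60 needs two; the helper name here is illustrative, not from any real driver API.

```python
# Sketch: estimate the CVT-RB pixel clock for a mode and compare it
# against DVI's single-link (165 MHz) and dual-link (330 MHz) limits.
import math

SINGLE_LINK_MHZ = 165.0  # DVI single-link TMDS pixel-clock limit
DUAL_LINK_MHZ = 330.0    # two TMDS links double the pixel rate

def cvt_rb_pixel_clock_mhz(h_active, v_active, refresh_hz=60):
    """Approximate CVT reduced-blanking pixel clock in MHz.

    Assumes CVT-RB's fixed 160-pixel horizontal blank and sizes the
    vertical blank so it lasts at least 460 microseconds per frame.
    """
    h_total = h_active + 160
    # Fraction of each frame that must be vertical blanking time.
    frac = 460e-6 * refresh_hz
    v_blank = math.ceil(frac * v_active / (1 - frac))
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

for w, h in [(1920, 1200), (2560, 1440), (2560, 1600)]:
    clk = cvt_rb_pixel_clock_mhz(w, h)
    link = ("single link" if clk <= SINGLE_LINK_MHZ
            else "dual link" if clk <= DUAL_LINK_MHZ
            else "beyond DL-DVI")
    print(f"{w}x{h}@60: ~{clk:.0f} MHz -> {link}")
```

    This is why the "overclocked" single-link adapters mentioned above have to run the link roughly 60-100% out of spec for 1440p/1600p, and why the cheap passive HDMI/DP-to-DVI dongles stop at 1920x1200.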
  • Nate Oh - Saturday, November 4, 2017 - link

    This is actually precisely why I don't mind DVI too much, because I have and still use a 1-DVI-input-only Korean 1440p A- monitor from 3 years ago, overclocked to 96Hz. DVI probably needs to go away at some point soon, but maybe not too soon :)
  • ddferrari - Friday, November 3, 2017 - link

    So, that DVI port really ruins everything for ya? What are you, 14??

    There are tons of overclockable 1440p Korean monitors out there that only have one input- DVI. Adding a DVI port doesn't increase cost, slow down performance, or increase heat levels- so what's your imaginary problem again?
  • Notmyusualid - Sunday, November 5, 2017 - link

    @ ddferrari

    What are you - ddriver incarnate?

    I'm older than 14, and I wish the DVI port wasn't there, as it is work to strip them off when I make my GPUs into single-slot water-cooled versions. Removing / modifying the bracket is one thing, but pulling out those DVI ports is another.
  • Silma - Thursday, November 2, 2017 - link

    How can you compare the Vega 56 to the GTX 1070 when it's 7 dB noisier and consumes up to 78 watts more?
  • sach1137 - Thursday, November 2, 2017 - link

    Because the MSRPs of both cards are the same. Vega 56 beats the 1070 in almost all games. Yes, it consumes more power and is noisier too, but for some people that doesn't matter: it gives 10-15% more performance than the 1070. And when overclocked, you can extract even more from Vega 56.