The Test

Returning to the topic of Spectre and Meltdown, there has been a flurry of security-related activity to address these exploits. The security fixes ultimately incur a performance penalty, though the penalty only measurably affects select cases, such as certain database and I/O-heavy workloads.

More relevant to us are the workstation-level mitigations: in this case, Windows updates and BIOS updates carrying microcode changes. These mitigations have run into a number of complications, such as random reboots and data loss on Intel processors, and freezing on AMD ones. Microsoft issued an emergency Windows patch days after Spectre and Meltdown were publicly disclosed, and just this week released another emergency patch to disable an Intel microcode update that was responsible for the rebooting and potential data corruption issues. The current industry guidance is to hold off on firmware updates until stable, tested updates are available.

Incidentally, NVIDIA has also patched its drivers to harden them against Spectre attacks. Earlier reports misinterpreted this as meaning the GPUs themselves were vulnerable, but to reiterate: GPUs do not engage in speculative execution, which is what these vulnerabilities exploit.

Suffice it to say, we are looking into the effect of Spectre and Meltdown mitigations on our GPU benchmarks. For the time being, however, benchmarks (including those in this review) are being run without any Meltdown/Spectre mitigations enabled, allowing them to remain comparable to our existing dataset.

For our review of the EVGA GeForce GTX 1070 Ti FTW2, we have used NVIDIA's 388.71 driver. The 2017 benchmark suite remains identical to the one described in the GTX 1070 Ti Founders Edition review. As with all our other GPU reviews, gaming results are reported as average framerates and/or framerates at the 99th percentile.

As always, we try to use the best performing API for a particular graphics card.

CPU: Intel Core i7-7820X @ 4.3GHz
Motherboard: Gigabyte X299 AORUS Gaming 7 (BIOS version F7)
Power Supply: Corsair AX860i
Hard Disk: OCZ Toshiba RD400 (1TB)
Memory: G.Skill TridentZ DDR4-3200 4 x 8GB (16-18-18-38)
Case: NZXT Phantom 630 Windowed Edition
Monitor: LG 27UD68P-B
Video Cards: AMD Radeon RX Vega 64 (Air Cooled)
AMD Radeon RX Vega 56
AMD Radeon RX 580
AMD Radeon R9 Fury X
NVIDIA GeForce GTX 1080 Ti Founders Edition
NVIDIA GeForce GTX 1080 Founders Edition
EVGA GeForce GTX 1070 Ti FTW2 iCX
NVIDIA GeForce GTX 1070 Ti Founders Edition
NVIDIA GeForce GTX 1070 Founders Edition
NVIDIA GeForce GTX 980
Video Drivers: NVIDIA Release 388.71
AMD Radeon Software Crimson ReLive Edition 17.12.2
OS: Windows 10 Pro (Creators Update)
Comments
  • DnaAngel - Tuesday, May 22, 2018

    Don't hold your breath if you think a simple die shrink of the same architecture is going to be "a decent bump in performance". It will be slight (~10%), as is typical for refreshes.

    To get a "decent bump in performance" (>20%) you have to wait till the next architecture generation. Navi/Volta in this case.
  • DnaAngel - Monday, May 21, 2018

    AMD has Navi. Yeah, and? Vega was supposed to be the "Pascal killer", and yet a $475 1070 Ti matches or outperforms their $800 Vega 64 at 1080p/1440p in most titles LOL.

    Navi will just be playing catchup to Volta anyway.
  • Hixbot - Thursday, February 1, 2018

    Soo.. what you're saying is mining is the problem. OK got it.
  • JoeyJoJo123 - Monday, February 5, 2018

    Sure, if you want to be obtuse about it. I clearly explained that miner demand is just _one_ of many facets of the GPU pricing issue. Miner demand is no different from gamer demand, at least in terms of how it affects supply and therefore pricing. 1 GPU bought for mining or gaming is 1 less GPU in circulation, and when there's a low enough amount of GPUs on the market, the price is going to go up.

    And like I already explained, supply could be "fixed" by ordering many more cards to be produced, but because the demand isn't necessarily stable, AIB partners are hesitant to put more on the market: they'll be the ones on the losing end, stuck with supply that won't sell, should alternative coins tank in price.
  • Tetracycloide - Friday, February 2, 2018

    TLDR of your 3 point explanation is simply "Miners." All the things you've said are just extra details of how "Miners" is the explanation.
  • JoeyJoJo123 - Monday, February 5, 2018

    Nice reading comprehension. It's a supply-side issue that won't be fixed, since suppliers aren't confident in the sustainability of demand. Because of that, the supply side won't burn itself out (they're running a business, and generating excess supply carries a large risk) and would rather let GPU pricing sort itself out in a low-supply/high-demand market.

    There are also the GPU scalpers and the 3rd-party seller market making pricing worse than it already is, since they're draining supply even though they're not the end users demanding the product. (And these guys are the ones marking up GPU prices, not Newegg, Amazon, or most brick-and-mortar retailers.)

    Look, I hate memecoin miners, too. They're wasting a shitload of energy to mine fictitious and worthless money to then put it on a highly volatile stock market like rollercoaster simulator, and they like to brag about how if every pleb had invested in memebucks they'd be "millionaires" when the fact of any volatile market is that very few are big winners, and most are incurring losses.

    But the problem is more than just the miners themselves. There's the supply side that won't ramp up production. There are the 3rd-party market and scalpers selling GPUs at exorbitant prices. And even memory manufacturers like Samsung play a part, with the rising price of GDDR5(X) increasing the BOM cost for any GPU made.

    If you had even a single brain cell in your head you would've understood from my post that "Oh, yeah, miners are just one piece of the problem. I get ya."
  • mapesdhs - Tuesday, February 6, 2018

    I gave up trying to convey the nuance about these issues last week. Some people just want to believe in simplistic answers so they can blame a certain group and vocally moan, even though they're often part of the problem. There are other factors as well, such as game devs no longer pushing visual complexity, review hype/focus on high-frequency gaming & VR (driven by gamers playing mostly FPS titles and others that fit this niche), and the basic nature of current 3D tech being a natural fit for mining algorithms (shaders, etc.). In theory there is a strong market opportunity for a completely new approach to 3D gfx, a different arch, a proper GPU (modern cards are not really GPUs; their visual abilities are literally the lowest priority), because atm the cards AMD/NVIDIA are producing are far more lucratively targeted at Enterprise and AI, not gamers. The latter just get the scraps off the table now, something The Good Old Gamer nicely explained a few months ago with a pertinent clip from NVIDIA:

    https://www.youtube.com/watch?v=PkeKx-L_E-o

    When was the last time a card review article even mentioned new visual features for 3D effects? It's been many years. Gamers are not playing games that need new features; they're pushing for high-refresh displays (a shift enhanced by freesync/gsync adoption), so game devs aren't adding new features, as that would make launch reviews look bad (we'll never have another Crysis in that way again). Meanwhile, the products themselves are mathematically ideal for crypto mining tasks, a problem which makes (as the above chap says) both the AIBs and AMD/NVIDIA very reluctant to increase supply, as that would create a huge supply glut once the mining craze shifts and the current cards get dumped, damaging newer product lines (miners have no brand loyalty, and AIBs can't risk the unsold-stock potential, though in the meantime they'll happily sell to miners directly).

    I notice toms has several articles about mining atm. I hope AT doesn't follow suit. I didn't read the articles, but I bet they don't cover the total environmental cost regarding the massive e-waste generated by mining conglomerates. I'd rather tech sites that say they care about their readers didn't encourage this mining craze, but then it's a bandwagon many want to jump on while the rewards appear attractive. Ironically, at least LTT is doing a piece intended to show just how much of a con some of these mining setups can be.
  • boozed - Wednesday, January 31, 2018

    Magic beans
  • StevoLincolnite - Wednesday, January 31, 2018

    I bought my RX 580 for $400AUD almost a year ago. It actually hit $700 AUD at one point. Was nuts.

    Normally I would buy two... but this is the first time I have gone single-GPU since the Radeon X800 days, when you needed a master GPU.
    The costs are just out of control. Glad I am only running a 1440p display so I don't need super high-end hardware.
  • IGTrading - Wednesday, January 31, 2018

    What I find most interesting is that the AMD Fury X absolutely destroys the GeForce 980 in all benches :) .

    I guess all those nVIDIA buyers feel swindled now ....
