The EVGA GeForce GTX 1070 Ti FTW2 Review: iCX Brings the Lights and Sensors
by Nate Oh on January 31, 2018 9:00 AM EST
A few months ago, NVIDIA released the GeForce GTX 1070 Ti series with their Founders Edition card, accompanied by a number of partner boards. Targeting the competing Radeon RX Vega 56, the launch put the GTX 1070 Ti right between the GeForce GTX 1080 and 1070 in terms of price and performance, filling a gap that was not particularly wide in the first place. That level of performance was achieved through a new 19 SM configuration of GP104 with a 180W TDP and 1607MHz core clock over the GeForce GTX 1070 with its 15 SMs, 150W TDP, and 1506MHz core clock.
Accordingly, all partner GTX 1070 Ti cards adhere to the reference 1607MHz core and 1683MHz boost clocks. This keeps the model within its $450 – $500 MSRP window without significantly cannibalizing sales of the neighboring GTX 1080 and 1070. EVGA, for their part, rolled out four GeForce GTX 1070 Ti models at launch, all featuring the same clocks. But as NVIDIA did with the Founders Edition, EVGA is also pushing overclocking as one of the selling points, leaning on their Precision XOC utility and its GTX 1070 Ti-specific overclock autoscan.
Going straight to the higher-end with the FTW model, today we are taking a look at the EVGA GeForce GTX 1070 Ti FTW2, equipped with the iCX temperature sensor and cooling system.
GeForce GTX 1070 Ti Specification Comparison

| | EVGA GTX 1070 Ti FTW2 | NVIDIA GTX 1070 Ti Founders Edition | EVGA GTX 1070 Ti SC Black Ed. |
|---|---|---|---|
| CUDA Cores | 2432 | 2432 | 2432 |
| Texture Units | 152 | 152 | 152 |
| ROPs | 64 | 64 | 64 |
| Core Clock | 1607+MHz | 1607MHz | 1607+MHz |
| Boost Clock | 1683+MHz | 1683MHz | 1683+MHz |
| Memory Clock | 8Gbps GDDR5 | 8Gbps GDDR5 | 8Gbps GDDR5 |
| Memory Bus Width | 256-bit | 256-bit | 256-bit |
| VRAM | 8GB | 8GB | 8GB |
| TDP | 180W | 180W | 150W |
| Power Connectors | 2x 8-pin | 1x 8-pin | 1x 8-pin |
| Cooling | Dual fan open air | Blower | Dual fan open air |
| GPU | GP104 | GP104 | GP104 |
| Manufacturing Process | TSMC 16nm | TSMC 16nm | TSMC 16nm |
| Launch Date | 11/02/2017 | 11/02/2017 | 11/02/2017 |
| Launch MSRP | $499 | $449 | $469 |
| Current MSRP | $569 | - | $519 |
Because of the enforced reference clocks, we have the interesting scenario where EVGA’s factory overclock tiers of SC to FTW do not actually denote factory overclocks, though presumably the GTX 1070 Ti FTW2 remains capable of higher manual overclocks than the others. Without such factory overclocks, the distinguishing elements of the GeForce GTX 1070 Ti FTW2 come down to the iCX cooler, power system, dual BIOS, and, naturally, RGB LED capability, a feature set identical to EVGA’s GTX 1070 and 1080 FTW2 iCX models.
In any case, manual overclocking is still permitted, and EVGA has tried to make it as straightforward as possible with “XOC Scanner,” a confusingly named Precision XOC feature that is exclusive to the GTX 1070 Ti for the time being. In short, XOC Scanner scans, tests, and applies an overclock in a single step, as opposed to the multiple steps normally needed with OC ScannerX. Without manual overclocking, the GTX 1070 Ti FTW2 is nominally specified at reference clocks; even though the FTW2 features a higher power limit, improved cooling, and a better power subsystem than the Founders Edition, out-of-the-box performance will be very similar.
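Conceptually, an automated "scan, test, apply" flow like XOC Scanner's walks the clock offset upward, stress-testing at each step, and settles on the highest offset that passed. The sketch below illustrates that idea only; the stability check is simulated with a plain callback, since a real tool would drive the vendor's tuning API rather than these hypothetical helpers.

```python
# Conceptual sketch of an automated scan-test-apply overclock loop.
# All names here are illustrative, not EVGA's actual implementation.

def find_stable_offset(is_stable, max_offset_mhz=200, step_mhz=25):
    """Raise the core-clock offset in fixed steps until a stability
    test fails, then return the last offset that passed."""
    best = 0
    for offset in range(step_mhz, max_offset_mhz + step_mhz, step_mhz):
        if is_stable(offset):   # short stress/artifact test at this offset
            best = offset
        else:
            break               # first failure ends the scan
    return best

# Simulated silicon: this sample chip happens to be stable up to +150 MHz.
chip_limit = 150
result = find_stable_offset(lambda off: off <= chip_limit)
print(result)  # -> 150
```

The single-step convenience comes from fusing the scan loop and the final "apply" into one action, where older OC ScannerX workflows required the user to run the scan and then commit the result separately.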
With the exception of EVGA’s hybrid-cooled GTX 1070 Ti, the launch cards and the later FTW2 Ultra Silent are still priced within the GTX 1070 Ti’s price window, reflecting their quality and feature differences. Quite noticeably, however, EVGA store pricing at the time of writing is very much inflated from launch, with almost every product out of stock.
For those who haven’t come across the extraordinary amount of cryptocurrency news coverage of the past few months, demand for graphics cards in mining cryptocurrency continues to exceed sensibility and has somehow reached a new high. The end result is that current GTX 1070 Ti FTW2 pricing is not applicable at all to more normal circumstances, something that is especially significant to the GTX 1070 Ti lineups given their price window between the GTX 1080 and 1070.
In more concrete terms, the going rate for GTX 1070 Ti models at the time of writing is in the $500 to $1000 range – assuming they can be found in stock. The specific card of today’s review, the EVGA GeForce GTX 1070 Ti FTW2, is currently listed at $1300 on Amazon, $100 more than purchasing a Titan Xp directly from NVIDIA. The pricing inflation is such that prebuilt gaming PCs may provide more value than a marked-up graphics card. That said, it is safe to say that in this market, there is little concern of sales cannibalization by the GeForce GTX 1070 Ti. As for its nominal competitor, the Radeon RX Vega 56 is listed around $1000 or more, and seems to be in much shorter supply.
The other current event, no less significant in impact, is the outing of the Spectre and Meltdown CPU exploits and the corresponding performance-affecting security patches, which we will touch upon in a later section.
Early 2018 GPU Pricing Comparison (Crypto-INSANITY Edition)

| AMD | Price | NVIDIA |
|---|---|---|
| Radeon RX Vega 64 | $1200+ | GeForce GTX 1080 Ti |
| Radeon RX Vega 56 | $1000+ | GeForce GTX 1080 |
| | $650+ | GeForce GTX 1070 Ti |
| Radeon RX 580 (8GB) | $500+ | GeForce GTX 1070 |
| | $300+ | GeForce GTX 1060 (6GB) |
47 Comments
DnaAngel - Tuesday, May 22, 2018 - link
I wouldn't hold your breath if you think a simple die shrink of the same architecture is going to be "a decent bump in performance". It will be slight (~10%), as typical refreshes are. To get a "decent bump in performance" (>20%) you have to wait till the next architecture generation. Navi/Volta in this case.
DnaAngel - Monday, May 21, 2018 - link
AMD has Navi. Yea, and? Vega was supposed to be the "Pascal killer" and yet a 475 dollar 1070Ti matches or outperforms their 800 dollar Vega 64 at 1080/1440p in most titles LOL. Navi will just be playing catchup to Volta anyway.
Hixbot - Thursday, February 1, 2018 - link
Soo.. what you're saying is mining is the problem. OK got it.

JoeyJoJo123 - Monday, February 5, 2018 - link
Sure, if you want to be an obtuse retard about it. I clearly explained that miner demand is merely just _one_ of many facets of the GPU pricing issue. Miner demand is no different from Gamer demand, at least in terms of how it affects supply and therefore pricing. 1 GPU bought for mining or gaming is 1 less GPU in circulation, and when there's a low enough amount of GPUs on the market, the price is going to go up. And like I already explained, supply could be "fixed" by ordering many more cards to be produced, but because the demand isn't necessarily stable, AIB partners are hesitant to supply more on the market, because they'll be the ones on the losing end when they're stuck on supply that won't sell, should alternative coins tank in price.
Tetracycloide - Friday, February 2, 2018 - link
TLDR of your 3 point explanation is simply "Miners." All the things you've said are just extra details of how "Miners" is the explanation.

JoeyJoJo123 - Monday, February 5, 2018 - link
Nice reading comprehension. It's a supply side issue that won't be fixed since suppliers aren't confident in the sustainability of demand. And because of that, the supply side won't be burned out (since they're running a business and generating excess supply has a large risk associated with it) and would rather let the GPU pricing handle itself in a low supply/high demand market. There's also the GPU scalpers and 3rd party seller market making the pricing worse than it is, since they're draining supply even though they're not the end-users demanding the product. (And these guys are the ones marking up the GPU prices, not Newegg, Amazon, or most brick and mortar retailers.)
Look, I hate memecoin miners, too. They're wasting a shitload of energy to mine fictitious and worthless money to then put it on a highly volatile stock market like rollercoaster simulator, and they like to brag about how if every pleb had invested in memebucks they'd be "millionaires" when the fact of any volatile market is that very few are big winners, and most are incurring losses.
But the problem is more than just the miners themselves. There's supply side that won't ramp up production. There's 3rd party market and scalpers selling the GPUs at exorbitant prices, and even memory manufacturers like Samsung playing a part due to rising price of GDDR5(x), which increases the BOM cost for any GPU made.
If you had even a single brain cell in your head you would've understood from my post that "Oh, yeah, miners are just one piece of the problem. I get ya."
mapesdhs - Tuesday, February 6, 2018 - link
I gave up trying to convey the nuance about these issues last week. Some people just want to believe in simplistic answers so they can blame a certain group and vocally moan, even though they're often part of the problem. There are other factors as well, such as game devs not making games more visually complicated anymore, review hype/focus on high frequency gaming & VR (driven by gamers playing mostly FPS titles and others that fit this niche), and just the basic nature of current 3D tech being a natural fit for mining algorithms (shaders, etc.) In theory there is a strong market opportunity for a completely new approach to 3D gfx, a different arch, a proper GPU (modern cards are not GPUs; their visual abilities are literally the lowest priority), because atm the cards AMD/NVIDIA are producing are far more lucratively targeted at Enterprise and AI, not gamers; the latter just get the scraps off the table now, something The Good Old Gamer nicely explained a few months ago with a pertinent clip from NVIDIA: https://www.youtube.com/watch?v=PkeKx-L_E-o
When was the last time a card review article even mentioned new visual features for 3D effects? It's been many years. Gamers are not playing games that need new features, they're pushing for high refresh displays (a shift enhanced by freesync/gsync adoption) so game devs aren't adding new features as that would make launch reviews look bad (we'll never have another Crysis in that way again), and meanwhile the products themselves are mathematically ideal for crypto mining tasks, a problem which makes (as the above chap says) both the AIBs and AMD/NVIDIA very reluctant to increase supply as that would create a huge supply glut once the mining craze shifts and the current cards get dumped, damaging newer product lines (miners have no brand loyalty, and AIBs can't risk the unsold stock potential, though in the meantime they'll happily sell to miners directly).
I notice toms has several articles about mining atm. I hope AT doesn't follow suit. I didn't read the articles, but I bet they don't cover the total environmental cost re the massive e-waste generated by mining conglomerates. I'd rather tech sites that say they care about their readers didn't encourage this mining craze, but then it's a bandwagon many want to jump on while the rewards appear attractive. Ironically, at least LTT is doing a piece intended to show just how much of a con some of these mining setups can be.
boozed - Wednesday, January 31, 2018 - link
Magic beans

StevoLincolnite - Wednesday, January 31, 2018 - link
I bought my RX 580 for $400 AUD almost a year ago. It actually hit $700 AUD at one point. Was nuts. Normally I would buy two... But this is the first time I have gone single GPU since the Radeon x800 days where you needed a master GPU.
The costs are just out of control. Glad I am only running a 1440P display so I don't need super high-end hardware.
IGTrading - Wednesday, January 31, 2018 - link
What I find the most interesting is that the AMD Fury X absolutely destroys the GeForce 980 in absolutely all benches :). I guess all those nVIDIA buyers feel swindled now...