Meet The GeForce GTX 980 Ti

Like the rest of NVIDIA’s high-end cards in this generation, the reference GeForce GTX 980 Ti is launching with NVIDIA’s standard metal cooler. This design has served NVIDIA well since the launch of the GTX Titan in 2013 and continues to be the blower design to beat at the high end, easily handling the 250W TDP of NVIDIA’s high-end cards without generating a ton of noise in the process.

As with so many other aspects of the GTX 980 Ti, the card’s cooler and build are a near-copy of the GTX Titan X’s. The only difference in the cooler is the paint job; the GTX Titan X got a unique black paint job, while the GTX 980 Ti gets the more standard bare aluminum finish with black lettering and a black-tinted polycarbonate window.

Otherwise there’s very little to be said about the GTX 980 Ti’s design that hasn’t been said before, so we’ll just recap what we said about the cooler design from our review of the GTX Titan X.

For the GTX 980 Ti, NVIDIA has opted to leave well enough alone, having made virtually no changes to the shroud or cooling apparatus. And truth be told it’s hard to fault NVIDIA right now, as this design remains the gold (well, aluminum) standard for a blower. Looks aside, after years of blowers that rattled, or were too loud, or didn’t cool discrete components very well, NVIDIA is sitting on a very solid design, and I’m not really sure how anyone would top it (but I’d love to see them try).

In any case, our favorite metal shroud is back once again. Composed of a cast aluminum housing and held together using a combination of rivets and screws, it’s as physically solid a shroud as we’ve ever seen. The card measures 10.5” long overall, which at this point is NVIDIA’s standard size for high-end GTX cards.

Drilling down, we have the card’s primary cooling apparatus, composed of a nickel-tipped, wedge-shaped heatsink and a ringed radial fan. The heatsink itself is attached to the GPU via a copper vapor chamber, something that has been exclusive to NVIDIA’s 250W cards and provides the best possible heat transfer between the GPU and the heatsink. Meanwhile the rest of the card is covered with a black aluminum baseplate, providing basic heatsink functionality for the VRMs and other components while also protecting them.

Finally at the bottom of the stack we have the card itself, complete with the GM200 GPU, VRAM chips, and various discrete components. The GM200 PCB places the GPU and VRAM chips towards the front of the card, while the VRMs and other discrete components occupy the back. As with the GTX Titan X, GTX 980 Ti features NVIDIA’s reworked component placement to improve airflow to the discrete components and reduce temperatures, along with employing molded inductors.

NVIDIA once again employs a 6+2 phase VRM design, with 6 phases for the GPU and another 2 for the VRAM. This means that the GTX 980 Ti has a bit of power delivery headroom – NVIDIA allows the power limit to be increased by 10% to 275W – but hardcore overclockers will find that there isn’t an extreme amount of additional headroom to play with. Based on our sample, the actual shipping voltage at the max boost clock is a bit higher than the GTX Titan X’s, coming in at 1.187v, so in non-TDP-constrained scenarios there is some additional headroom through overvolting, up to 1.23v in the case of our sample.
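
For readers who want the arithmetic spelled out, here is a minimal sketch of the headroom figures above; the 250W TDP, +10% power limit, 1.187v, and 1.23v values are taken from the text, while the linear percent-to-watts mapping is my assumption rather than anything NVIDIA documents.

```python
# Back-of-the-envelope numbers for the power and voltage headroom described
# above; the linear percent-to-watts mapping is assumed, not NVIDIA-documented.
base_tdp_w = 250          # GTX 980 Ti reference TDP
max_power_target = 1.10   # +10% power limit exposed to overclockers

print(f"Max board power: {base_tdp_w * max_power_target:.0f} W")        # 275 W

stock_vcore_v = 1.187     # shipping voltage at max boost on our sample
max_overvolt_v = 1.23     # overvolting ceiling on our sample
print(f"Overvolting headroom: {max_overvolt_v - stock_vcore_v:.3f} V")  # 0.043 V
```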

In terms of overall design, unlike the GTX Titan X and its 24 VRAM chips, for the GTX 980 Ti NVIDIA only needs 12 VRAM chips to reach the card’s 6GB of VRAM, so all of the VRAM is located on the front of the card. Halving the VRAM capacity simplifies the card a bit – there are now no critical components on the back – and it brings down total VRAM power consumption slightly. Despite this, however, NVIDIA has not brought back the backplate from the GTX 980, having removed it on the GTX Titan X due to the VRAM chips that card placed on the rear.
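
As a quick sanity check on that layout, the sketch below works out the per-chip density and per-channel chip count implied by 12 chips feeding 6GB over a 384-bit bus; the 32-bit channel width is the standard GDDR5 figure rather than something stated in the review.

```python
# Memory layout implied by the figures above; per-chip density is inferred
# from the totals, and 32-bit channels are the standard GDDR5 assumption.
total_vram_gb = 6
chip_count = 12                              # GTX 980 Ti, all front-mounted
bus_width_bits = 384
channels = bus_width_bits // 32              # 12 memory channels

mb_per_chip = total_vram_gb * 1024 // chip_count    # 512 MB (4 Gb) parts
chips_per_channel = chip_count // channels          # 1 here, 2 on GTX Titan X
print(f"{mb_per_chip} MB per chip, {chips_per_channel} chip(s) per channel")
```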

Moving on, in accordance with the GTX 980 Ti’s 250W TDP and the reuse of the metal cooler, power delivery for the GTX 980 Ti is identical to that of its predecessors. This means a 6-pin and an 8-pin power connector at the top of the card, providing up to 225W, with the final 75W coming from the PCIe slot.
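
Put another way, the connector layout leaves the card a comfortable margin over its rated TDP; the per-connector limits below are the PCIe specification figures, and the margin calculation is simply my framing of the numbers in the text.

```python
# Power budget implied by the connector layout above (PCIe spec limits).
pcie_slot_w = 75      # PCIe x16 slot
six_pin_w = 75        # 6-pin PEG connector
eight_pin_w = 150     # 8-pin PEG connector

available_w = pcie_slot_w + six_pin_w + eight_pin_w   # 300 W total
tdp_w = 250
print(f"Available: {available_w} W, TDP: {tdp_w} W, margin: {available_w - tdp_w} W")
```

That roughly 50W of slack is also why the +10% (275W) power target described earlier still fits comfortably within spec.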

Meanwhile, display I/O follows the same configuration we’ve seen on the rest of the high-end GTX 900 series: 1x DL-DVI-I, 3x DisplayPort 1.2, and 1x HDMI 2.0, with a total limit of 4 displays. In the case of the GTX 980 Ti the DVI port is somewhat antiquated at this point – the card is generally overpowered for the relatively low maximum resolutions of DL-DVI – but on the other hand the HDMI 2.0 port is actually going to be of some value here, since it means the GTX 980 Ti can drive a 4K TV. And if you have money to spare and need to drive more than a single 4K display, the GTX 980 Ti also features a pair of SLI connectors for even more power.
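
For anyone planning a multi-monitor setup, here is a small tabulation of the reference output configuration described above; the connector counts and the 4-display limit come from the text, and the data structure itself is purely illustrative.

```python
# Reference display outputs on the GTX 980 Ti as listed above.
outputs = {
    "DL-DVI-I": 1,
    "DisplayPort 1.2": 3,
    "HDMI 2.0": 1,
}
max_active_displays = 4                    # simultaneous heads supported

total_connectors = sum(outputs.values())   # 5 physical connectors
print(f"{total_connectors} connectors, up to {max_active_displays} active at once")
```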

Finally, taking a look at the long term, I wanted to quickly touch upon the subject of the VRAM capacity difference between the GTX 980 Ti and the GTX Titan X. The larger 12GB framebuffer is essentially NVIDIA’s only remaining selling point for the GTX Titan X, and the Titan will remain their only 12GB card for some time to come. For NVIDIA this means that they can pitch the GTX Titan X as a more future-proof card than the GTX 980 Ti, as it would be hard-pressed to run out of VRAM.

The question for the moment, then, is whether 12GB is worth any premium at all, let alone the GTX Titan X’s $350 premium. The original GTX Titan, by comparison, was fortunate enough to come out with 6GB right before the current-generation consoles launched, and with them their 8GB memory configurations. This led to a rather sudden jump in VRAM requirements in games that the GTX Titan was well positioned to handle, whereas the GTX 780 Ti and its 3GB of VRAM can struggle in the very latest games at 4K resolutions. Much like 6GB was in 2013, 12GB is overkill in 2015, while 6GB is the more practical amount for a 384-bit card at this time.

But to answer the question at hand, unlike the original GTX Titan, I suspect 12GB will remain overkill for a much longer period of time, especially without a significant technology bump like the consoles to drive up VRAM requirements. And consequently I don’t expect GTX 980 Ti to have any real issues with VRAM capacity in games over the next couple of years, making it better off than the GTX 780 Ti, relatively speaking.

Comments

  • Laststop311 - Monday, June 1, 2015

    How is 6GB the minimum RAM needed until FinFET GPUs? Even at 1440p with max settings no game requires 6GB of RAM. Even if a game can use 6GB of RAM, the way some games are programmed they just use up extra RAM if it is available, but that RAM isn't crucial to the operation of the game. So it will show high RAM usage when in reality it can use far less and be fine.

    You are overly paranoid. 4GB of RAM should be just fine to hold you off a year or two until FinFET GPUs come out, at least at 1440p. If you are smart you will skip these and just wait for 2H 2016, when 14/16nm FinFET GPUs are going to make a large leap in performance. That generation of GPUs should be able to be kept long term with good results. That is when you would want an 8GB card to keep things running smoothly for a good 3-4 years, since you should get a good lifespan out of the first FinFET GPUs.
  • chizow - Monday, June 1, 2015

    Again, spoken from the perspective of someone who doesn't have the requisite hardware to test or know the difference. I've had both a 980 and a Titan X, and there are, without a doubt, games that run sluggishly, as if you are moving through molasses, as soon as you turn up bandwidth-intensive settings like MSAA, texture quality, and stereo 3D and hit your VRAM limits, even with the FRAPS meter saying you should be getting smooth frame rates.

    With the Titan X, none of these problems, and of course VRAM usage shoots over the 4GB ceiling I was hitting before.

    And why would I bother to keep running old cards that aren't good enough now and wait for FinFET cards that MIGHT be able to run for 3-4 years after that? I'll just upgrade to 14/16nm next year if the difference is big enough; it'll be a similar 18-24 month timeframe to when I usually make my upgrades anyway. What am I supposed to do in the meantime while I wait for good enough GPUs? Not play any games? Deal with slow 2-3GB cards at 1440p? No thanks.
  • Refuge - Monday, June 1, 2015

    So you are saying I shouldn't be asking questions about something I'm spending my hard-earned money on? And not a small sum at that?

    You, sir, should buy my car. It is a great deal, just don't ask me about it, because that would be stupid!
  • Yojimbo - Monday, June 1, 2015

    He's not questioning your concern; he's questioning your criteria.
  • Peichen - Sunday, May 31, 2015

    Why is the most popular mid-to-high-end card, the GTX 970, not on the comparison list? It is exactly half the price of the 980 Ti, and it would be great to see whether it is exactly 50% of the speed and uses half the power as well.
  • dragonsqrrl - Sunday, May 31, 2015

    It's definitely more than 50% of the performance and power consumption, but yes, it would've been nice to include it in the charts.
  • PEJUman - Monday, June 1, 2015

    Ryan's selection is not random. It seems he selects the likely upgrade candidates and nearest competitors. It's the same reasoning why there is no R9 290 here. Most 970 and R9 290 owners probably know how to infer their card's performance from the unharvested versions (980 and 290X).

    Granted, it's odd to see the 580 here when the 970 would be more valuable technically.
  • mapesdhs - Wednesday, June 3, 2015

    Plus, most requests I've seen on forums have been for 970 SLI results rather than a 970 on its own, as 970 SLI is the more likely config to come anywhere near a 980 Ti, assuming VRAM isn't an issue. Data for 970 SLI would thus show where in the resolution/detail space performance tails off because it needs more than 4GB.
  • bloodypulp - Monday, June 1, 2015

    The 295X2 still crushes it. But blind Nvidia fanboys will claim it doesn't matter because it is either a) not a single GPU or b) AMD (and therefore sucks).
  • PEJUman - Monday, June 1, 2015

    I own 290 Crossfire currently, and previously had a single 780 Ti. Witcher 3 still sucks on my 290 CF, as well as on the 295X2. So... it depends on your game selection. I also have to spend more time customizing most of my games to get optimal settings on my 290 CF than I did on my 780 Ti.
