Meet The GeForce RTX 2060 (6GB) Founders Edition Card

As for the card itself, the design follows the scheme we've already seen with the RTX 2080 Ti, RTX 2080, and RTX 2070 Founders Editions, the main highlight being the new open-air cooler design. This time around, the RTX 2060 Founders Edition ships with stock reference clocks and, presumably, the stock TDP.

Like the RTX 2070 Founders Edition, the RTX 2060 Founders Edition has a single 8-pin power connector at the front of the card, and it lacks NVLink SLI connectors, as only the RTX 2080 and above support SLI. Internally, the board appears very similar to the RTX 2070 Founders Edition's. Like the other RTX 20 series cards, the RTX 2060 continues the trend of increasing TDP, standing at 160W compared to the 120W of the GTX 1060 6GB. I/O is the same story, including the DVI port customary for mid-range and mainstream cards, which are often paired with budget DVI monitors, particularly as a drop-in upgrade for an aging video card.

On top of this is the VR-centric USB-C VirtualLink port, which carries an associated 30W that is not included in the overall TDP.

As mentioned in the other RTX 20 series launch articles, the reference design change poses a potential issue for OEMs, as unlike blowers, open-air designs cannot guarantee self-cooling independent of chassis airflow. As a higher-volume and nominally mainstream part, the RTX 2060 Founders Edition would be the more traditional part found in OEM systems.


134 Comments


  • B3an - Monday, January 7, 2019 - link

    More overpriced useless shit. These reviews are very rarely harsh enough on this kind of crap either, and I mean tech media in general. This shit isn't close to being acceptable.
  • PeachNCream - Monday, January 7, 2019 - link

    Professionalism doesn't demand harshness. The charts and the pricing are reliable facts that speak for themselves and let a reader reach conclusions about the value proposition or whether the product is worth purchasing. Since opinions between readers can differ significantly, it's better to exercise restraint. These GPUs are given out as free media samples and, if I'm not mistaken, other journalists have been denied pre-NDA-lift samples for blasting the company or the product. With GPU shortages all around and the need for a day-one release to get the search engine placement that drives traffic, there is incentive to tiptoe around criticism when possible.
  • CiccioB - Monday, January 7, 2019 - link

    It all depends on your definition of "shit".
    Shit may be something that costs too much for you (so Porsche, Lamborghini, and Ferrari are shit, but for someone else so are Audi, BMW, and Mercedes, and for someone else still, all C-segment cars), or it may be something that does not work as expected or underperforms relative to the resources it has.
    So for someone else, shit may be a chip that, with 230mm^2, 256GB/s of bandwidth, and 240W, performs like a chip that is 200mm^2, has 192GB/s of bandwidth, and uses half the power.
    Or it may be a chip that, with 480mm^2, 8GB of the latest HBM technology, and more than 250W, performs just a bit better than a 314mm^2 chip with GDDR5X that uses 120W less.

    To each their own definition of "shit" and of what should be bought to incentivize real technological progress.
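The comparison above boils down to simple perf-per-area and perf-per-watt arithmetic. A quick sketch of that calculation, using the commenter's illustrative figures (not real benchmark data):

```python
# Rough efficiency comparison between two hypothetical chips that deliver
# equal performance (arbitrarily set to 100) but differ in die area and power.
# Figures are the commenter's illustrative numbers, not measured results.

def efficiency(perf, area_mm2, power_w):
    """Return (perf per mm^2, perf per watt)."""
    return perf / area_mm2, perf / power_w

# Chip A: 230 mm^2 at 240 W; Chip B: 200 mm^2 at 120 W (half the power).
a_per_area, a_per_watt = efficiency(100, 230, 240)
b_per_area, b_per_watt = efficiency(100, 200, 120)

print(f"Chip A: {a_per_area:.3f} perf/mm^2, {a_per_watt:.3f} perf/W")
print(f"Chip B: {b_per_area:.3f} perf/mm^2, {b_per_watt:.3f} perf/W")
print(f"Chip B power efficiency advantage: {b_per_watt / a_per_watt:.1f}x")  # 2.0x
```

At equal performance, halving the power doubles perf-per-watt, which is the kind of gap the comment is calling out.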
  • saiga6360 - Tuesday, January 8, 2019 - link

    It's shit when your Porsche slows down when you turn on its fancy new features.
  • Retycint - Tuesday, January 8, 2019 - link

    The new feature doesn't subtract from its normal functions though - there is still an appreciable performance increase despite the focus on RTX and whatnot. Plus, you can simply turn RTX off and use it like a normal GPU? I don't see the issue here.
  • saiga6360 - Tuesday, January 8, 2019 - link

    If you feel compelled to turn off the feature, then perhaps it is better to buy the alternative without it at a lower price. It comes down to how much the eye candy is worth to you at performance levels that you can get from a sub-$200 card.
  • CiccioB - Tuesday, January 8, 2019 - link

    It's shit when these fancy new features are held back by the console market, which has difficulty handling less than half the polygons that Pascal can, let alone the new Turing GPUs.
    The problem is not the technology that is made available, but the market that is held back by obsolete "standards".
  • saiga6360 - Tuesday, January 8, 2019 - link

    You mean held back by economics? If Nvidia feels compelled to sell ray tracing in its infancy for thousands of dollars, what do you expect of console makers who are selling the hardware at a loss? Consoles sell games, and if the games are compelling without the massive polygon counts and ray tracing, then the hardware limitations can be justified. Besides, this can hardly be said of modern consoles, which can push some form of 4K gaming at 30fps in AAA games that aren't even sold on PC. Ray tracing is nice to look at, but it hardly justifies the performance penalties at the price point.
  • CiccioB - Wednesday, January 9, 2019 - link

    The same may be said of 4K: fancy to look at, but 4x the performance cost versus Full HD is too much.
    But as you can see, there are more and more people looking at 4K benchmarks to decide which card to buy.
    I would trade resolution for better graphics any day.
    Ray-traced films on Blu-ray (so in Full HD) look far better than any rasterized graphics at 4K.
    The path for graphics quality has been traced. Bear with it.
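The "4x the performance" shorthand in the comment above comes straight from the pixel counts: 4K UHD renders exactly four times as many pixels per frame as 1080p. A trivial check:

```python
# 4K UHD vs. Full HD pixel counts: the source of the "4x the performance
# cost" rule of thumb for rendering at 4K.
uhd = 3840 * 2160   # 8,294,400 pixels
fhd = 1920 * 1080   # 2,073,600 pixels
print(uhd / fhd)    # 4.0
```

Real-world scaling is rarely exactly 4x, since not all rendering work is per-pixel, but it sets the upper bound the commenter is referring to.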
  • saiga6360 - Wednesday, January 9, 2019 - link

    4K vs ray tracing seems like an obvious choice to you, but people vote with their money, and right now 4K is the far less cost-prohibitive eye-candy choice. One company doing it alone will not solve this, especially at such a cost-versus-performance ratio. We got to 4K and adaptive sync because they became affordable; it wasn't always so, but we are here now, and ray tracing is still just a fancy gimmick too expensive for most. Like it or not, it will take AMD and Intel getting on board with hardware ray tracing across platforms, but before that, a game that truly shows the benefits of ray tracing. Preferably one that doesn't suck.
