As we mentioned back on Monday, NVIDIA was going to be making some kind of GeForce announcement this evening at the NVIDIA Gaming Festival 2012 in Shanghai, China. NVIDIA’s CEO Jen-Hsun Huang has just finished his speech, announcing NVIDIA’s next ultra-premium video card, the GeForce GTX 690.

Launching later this week, the GeForce GTX 690 will be NVIDIA’s new dual-GPU flagship video card, complementing their existing single-GPU GeForce GTX 680. With a pair of fully enabled GK104 GPUs on board, NVIDIA is shooting for GTX 680 SLI performance on a single card, and with the GTX 690 they just might get there. We won’t be publishing our review until Thursday, but in the meantime let’s take a look at what we know so far about the GTX 690.

                        GTX 690          GTX 680          GTX 590          GTX 580
Stream Processors       2 x 1536         1536             2 x 512          512
Texture Units           2 x 128          128              2 x 64           64
ROPs                    2 x 32           32               2 x 48           48
Core Clock              915MHz           1006MHz          607MHz           772MHz
Shader Clock            N/A              N/A              1214MHz          1544MHz
Boost Clock             1019MHz          1058MHz          N/A              N/A
Memory Clock            6.008GHz GDDR5   6.008GHz GDDR5   3.414GHz GDDR5   4.008GHz GDDR5
Memory Bus Width        2 x 256-bit      256-bit          2 x 384-bit      384-bit
VRAM                    2 x 2GB          2GB              2 x 1.5GB        1.5GB
FP64                    1/24 FP32        1/24 FP32        1/8 FP32         1/8 FP32
TDP                     300W             195W             375W             244W
Transistor Count        2 x 3.5B         3.5B             2 x 3B           3B
Manufacturing Process   TSMC 28nm        TSMC 28nm        TSMC 40nm        TSMC 40nm
Launch Price            $999             $499             $699             $499

First and foremost, the GTX 690 won’t be launching until this Thursday (May 3rd), and while we won’t be able to publish our review until then, NVIDIA has provided a bounty of information on the GTX 690 ahead of the formal launch. Spec-wise – and this is something NVIDIA is trying to make clear from the start – unlike the GTX 590, the GTX 690 is targeting close to full GTX 680 SLI performance. As GK104 is a much smaller and less power-hungry GPU from the get-go, NVIDIA doesn’t have to do nearly as much binning to get suitable chips and keep power consumption in check. With the GTX 690, NVIDIA is able to reach their target TDP of 300W with all functional units enabled and with clockspeeds above 900MHz, which means performance should indeed be much closer to the GTX 680 SLI than the GTX 590 was to its SLI counterpart.

Altogether, for the GTX 690 we’re looking at a pair of fully enabled GK104 GPUs (1536 CUDA cores each) clocked at 915MHz, paired with 4GB of 6GHz GDDR5 (2GB per GPU), all on a single card. The GPU boost target will be 1019MHz, though until we have a chance to review the card it’s hard to say what the average clockspeeds will be in most games. On paper, this means the GTX 690 should be able to reach at least 91% of the GTX 680 SLI’s performance, and probably closer to 95% depending on where GPU boost tops out.
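For context on where those percentages come from, here’s a minimal back-of-the-envelope sketch in Python. It assumes performance scales linearly with core clock – a simplification, since no game scales perfectly – but it reproduces the 91% floor and the mid-90s boost-clock estimate:

```python
# Naive clock-scaling estimate: the GTX 690 and GTX 680 use the same fully
# enabled GK104 GPU with identical memory clocks, so to a first
# approximation relative performance is just the ratio of core clocks.

GTX680_BASE, GTX680_BOOST = 1006, 1058  # MHz
GTX690_BASE, GTX690_BOOST = 915, 1019   # MHz

worst_case = GTX690_BASE / GTX680_BASE    # both cards pinned at base clock
best_case  = GTX690_BOOST / GTX680_BOOST  # both cards sustaining boost

print(f"GTX 690 vs. GTX 680 SLI (base clocks):  {worst_case:.1%}")  # ~91.0%
print(f"GTX 690 vs. GTX 680 SLI (boost clocks): {best_case:.1%}")   # ~96.3%
```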

As for power consumption, NVIDIA is designing the GTX 690 around a 300W TDP, with a typical board power (and default power target) of 263W. Compared to the 375W TDP of the GTX 590, this will allow the card to be used in more power- and cooling-limited computers, something NVIDIA lost going from the GTX 295 to the GTX 590. The tradeoff, of course, is that clockspeeds had to be lowered relative to the GTX 680 to pull this off, which is why even with a more liberal GPU boost, both the base and boost clocks are slightly below the GTX 680’s. As always, NVIDIA is going to be doing some binning here to snag the best GK104 GPUs for the GTX 690, which is the other factor in bringing power consumption down versus the 2 x 195W GTX 680 SLI.

With that said, similar to what AMD did with their dual-GPU Radeon HD 5970 and HD 6990, NVIDIA is in practice targeting two power/performance levels with the GTX 690: a standard performance level and a higher performance level for overclockers. On the hardware side of things, the card is equipped with two 8-pin PCIe power sockets, enabling it to safely draw up to 375W – 75W over its standard 300W TDP. Delivering that power will be 10 power phases (5 per GPU), so the GTX 690 will have power delivery capabilities nearly identical to the GTX 680’s.

Meanwhile, on the software side, the GTX 690 will have an adjustable power target and clock offsets just like the GTX 680. NVIDIA is giving the GTX 690 a maximum power target of +35%, which given the card’s default power target of 263W means it can be set to draw up to 355W. Until we have a chance to review the card it’s not clear just how many bins above the boost clock GPU boost can go, but even if the GTX 690 were unable to quite catch the GTX 680 SLI at standard settings, the combination of a higher power target and a core clock offset should be more than enough to overclock it to GTX 680 specs. On paper at least, some further overclocking should even be possible: the standard power target for the GTX 680 is 170W, so at 2 x 170W for GTX 680 SLI, all else held equal there’s still at least 15W of additional headroom to play with.
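Since several power figures are in play across these two paragraphs, a quick sketch of the arithmetic may help. The only numbers below not taken from the text above are the PCIe spec limits (75W from the x16 slot, 150W per 8-pin connector):

```python
# Power budget arithmetic for the GTX 690, per the figures quoted above.

SLOT_POWER = 75   # W, PCIe x16 slot (PCIe spec)
EIGHT_PIN  = 150  # W, per 8-pin PCIe connector (PCIe spec)

# What the board can safely draw: the slot plus two 8-pin connectors
connector_limit = SLOT_POWER + 2 * EIGHT_PIN   # 375W

# What software will allow: the 263W default target raised by up to +35%
default_target = 263                           # W
max_target     = default_target * 1.35         # ~355W

# A GTX 680 SLI setup runs 2 x 170W power targets by comparison
sli_target = 2 * 170                           # 340W

print(f"Connector limit:      {connector_limit}W")              # 375W
print(f"Max power target:     {max_target:.0f}W")               # 355W
print(f"Headroom vs. 680 SLI: {max_target - sli_target:.0f}W")  # ~15W
```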

Speaking of specifications and performance, for those of you who are curious about PCIe bridging, NVIDIA has finally moved away from NF200 in favor of a third-party bridge chip. With the GTX 590 and earlier dual-GPU cards, NVIDIA used their NF200 bridge, a PCIe 2.0-capable bridge chip designed by NVIDIA’s chipset group. However, as NVIDIA no longer has a chipset group, they also no longer have a group to design such chips, and with NF200 now outdated in the face of PCIe 3.0, NVIDIA has turned to PLX to provide a PCIe 3.0 bridge chip.

As for the card’s construction, while we don’t have the complete specifications on hand, we know that the basic design of the GTX 690 is very similar to the GTX 590’s. This means it will have a single axial fan sitting at the center of the card, with a GPU and its RAM on either side. Heat from one GPU goes out the rear of the card, while heat from the other GPU goes out the front. Heat transfer will once again be provided by a pair of nickel-tipped aluminum heatsinks attached to vapor chambers, which also marks the first time we’ve seen a vapor chamber used with a 600 series card.

NVIDIA tells us that they’ve done some further work here to minimize noise by tweaking their fan ducting to reduce obstructions – primarily by eliminating variations in baseplate height that had previously been necessary to accommodate the GPUs – and they’re claiming that the GTX 690 should be notably quieter than the GTX 680 SLI. The GTX 590 was already a bit quieter than the GTX 580 SLI, so given the quieter nature of the GTX 680 SLI, this is something we’ll be paying particular attention to.

Elsewhere, compared to the GTX 590, the biggest change most buyers will likely notice is the shroud material. NVIDIA has replaced the GTX 590’s plastic shroud with a metal one, specifically a mixture of cast aluminum parts and injection-molded magnesium parts. Ostensibly the use of metal rather than plastic further reduces noise, but along with the polycarbonate windows over the heatsinks, I suspect this was largely done to reinforce the card’s ultra-premium nature and to make it look more lavish.

Meanwhile, when it comes to display connectivity, NVIDIA is using the same 3x DL-DVI and 1x miniDP port configuration that we saw on the GTX 590. This allows NVIDIA to drive three 3D Vision monitors over DL-DVI – the first DisplayPort-enabled 3D Vision monitors have only just started shipping – with the tradeoff being reduced external ventilation.

Finally, we have the matter of pricing and availability. In typical fashion, NVIDIA has given us the pricing at the last moment. The MSRP on the GTX 690 will be $999, effectively double the price of the GTX 680. Given what we know of the GTX 690’s specs this doesn’t come as any great surprise, as NVIDIA has little incentive to price it significantly below a pair of GTX 680 cards ($1,000) when performance will be within 10%, particularly since AMD’s own dual-GPU card has yet to launch. This makes the GTX 690 the most expensive GeForce ever, eclipsing even 2007’s $830 GeForce 8800 Ultra.

Spring 2012 GPU Pricing Comparison

AMD                Price    NVIDIA
                   $999     GeForce GTX 690
                   $499     GeForce GTX 680
Radeon HD 7970     $479
Radeon HD 7950     $399     GeForce GTX 580
Radeon HD 7870     $349
                   $299     GeForce GTX 570
Radeon HD 7850     $249
                   $199     GeForce GTX 560 Ti
                   $169     GeForce GTX 560
Radeon HD 7770     $139

Availability will also be a significant issue. As it stands, NVIDIA cannot keep the GTX 680 in stock in North America, and while the GTX 690 may be a very low-volume part, it requires two binned GPUs, which are going to be even harder to come by. We’ll know more on Thursday, but as it stands this will probably be the lowest-volume ultra-performance card launch in years. While I have no doubt that NVIDIA can produce these cards in sufficient volume once they have plenty of GPUs, until TSMC’s capacity improves NVIDIA has no chance of meeting the demand for GK104 GPUs, and that bodes very poorly for the GTX 690. Consequently, while this technically won’t be a paper launch, it’s certainly going to feel like one; given the low supply, only a couple of major retailers will have cards on May 3rd, with wider availability not occurring until May 7th.

Wrapping things up, while we have the specs for the GTX 690, this only begins to scratch the surface. Specs won't tell you about real-world performance; for that you'll have to check back on May 3rd for our complete review.

Comments

  • plopke - Sunday, April 29, 2012 - link

    My gaming pc's is most of the time build with a budget around 1000 dollar :). But I guess some people out there feeling a bit nutty. Now I only wished the nextgen console would come out :\. These kind of cards for me personally are getting each new iteration more silly because of very very aging console hardware that come with ports that utilies only part of what a modern gaming pc can do. Plus the best/funniest games for me personally these days for pc thanks to steam are small indie games.

  • StevoLincolnite - Sunday, April 29, 2012 - link

    Well, there is a market for these kinds of cards, systems with only 1 PCI-E slot being one of them, along with small form factor cases.

    Also... Getting 2 of these cards and chucking them in SLI would probably do some wonders. (And use a lot less space and power than 4x 680's.)

    However, these days the high-end GPU's can slaughter any game at 1080P, so I would hazard a guess this is more targeted towards the 2560x1440, 2560x1600 and Surround Vision/Eyefinity users.
    Which, let's face it... if you can afford that kind of display setup you can afford a high-end GPU and vice versa.

    Also can't forget the GPU compute market either; packing 2 GPU's in a single card is a good way to drive up performance per card.
  • Granseth - Sunday, April 29, 2012 - link

    I partly agree with you, but this time around you don't get a compute platform; if you want GPU compute you have to go back to the 580/590 or AMD.
    I'm also curious why they skimped on the memory and didn't go for 4GB per GPU, since that might be an upcoming problem and you are paying a premium for these cards.

    But I see the point of the 690: if you are going SLI, or have a small form factor case, you might as well get this card.
  • Sabresiberian - Sunday, April 29, 2012 - link

    I imagine that they didn't want to increase the noise level of the GTX 690, and more memory would have meant more power, so more heat, and a noisier card.

    FXAA is much more memory-efficient; the GTX 680 with 2GB standing up to the Radeon HD 7970 with 3GB is a strong indication there. Still, the last video card I bought was a bit of a mistake because it doesn't have quite enough memory, and I'm a bit gun-shy about getting a video card with "only" an effective 2GB capacity.

    ;)
  • Nfarce - Sunday, April 29, 2012 - link

    "However, these days the high-end GPU's can slaughter any game at 1080P"

    I just got my hands on an EVGA 680 Superclocked Signature card (with backplate) after replacing a 570SC (camping on NewEgg for half a day hitting F5 paid off). For now I'm running a 25.5" 1920x1200 monitor, and whereas I was running high-30s in frames on Crysis 1 & Warhead at 4xx, I'm now right at about 50FPS with the EVGA overclocked card. That's hardly slaughtering the frames. I still have some fine tuning to do and haven't even fired up BF3 yet. Moving up to three 24" screens at 5760x1080 or one 30" 2560x1600 (which I'm still trying to decide on for the same price) will require another card in SLI to keep the frames up.
  • Hrel - Sunday, April 29, 2012 - link

    Utilize* "These kinds* of cards, for me personally, are getting more silly with each new iteration. Because of a very very aging console hardware that comes with ports that utilize* only part of what a modern gaming PC can do".

    Don't even say I'm a grammar Nazi, I left mistakes in there. Just trying to help.
  • Sabresiberian - Sunday, April 29, 2012 - link

    Clearly you aren't bothering to try to understand why, so I'm not sure what your point is unless you think it's proper to brag about spending less money on gaming than someone else.

    It's not; it just shows your ignorance.
  • Latzara - Sunday, April 29, 2012 - link

    @Sabresiberian ... I don't think he's bragging -- personally I don't think that being able to perceive value as opposed to price is anything to brag about; quite the opposite, we should all be more attuned to this than just checking price tags ...

    This card, to me ofc, just isn't worth its price, and I will never pay for a top-of-the-line card because they are always, without fail, overpriced -- an upper mid-range card is a better value for money and has never failed me in terms of the performance needed, and I don't think it would fail the vast majority of users based on the performance/cost/available titles scale -- you can always pump up the resolution to something that, again, 95% (or more) of users don't use to somewhat explain where the money is going, but I still think it's pure waste ...

    There is always a market for them, to be sure; whoever wants to pay more than it's worth purely for bragging rights is welcome to it.
  • Blindsay04 - Sunday, April 29, 2012 - link

    At that price I'd rather just get 2x 680s.
  • B3an - Sunday, April 29, 2012 - link

    Well, if this turns out to be quieter than two 680's as is indicated, then I'd go for this instead. It seems it will have almost identical performance to 2x 680's (memory clocks are the same, and the GPU boost clock is very nearly as high). And most people, if not everyone, will notice a quieter card over something like a 1 - 2% frame rate increase, which is basically nothing.

    And with two of these GPU's it's going to run literally every game at 60+ FPS even at 2560x1600; plus, seeing as most monitors don't have a refresh rate higher than 60Hz at native res, it would be impossible to notice any FPS increase from 2x 680's anyway.
