NVIDIA’s GeForce GTX 590: Duking It Out For The Single Card King
by Ryan Smith on March 24, 2011 9:00 AM EST

It really doesn’t seem like it’s been all that long, but it’s been nearly a year and a half since NVIDIA last had a dual-GPU card on the market. The GeForce GTX 295 was launched in January of 2009, the first card based on the 55nm die shrink of the GT200 GPU. For most of the year the GTX 295 enjoyed bragging rights as the world’s fastest video card; however, the launch of the Radeon HD 5000 series late in 2009 effectively put an end to the GTX 295’s run as a competitor.
Even with the launch of the GTX 400 series in March of 2010, a new dual-GPU card from NVIDIA remained the stuff of rumors—a number of rumors claimed we’d see a card based on GF10X, but nothing ever materialized. Without a dual-GPU card, NVIDIA had to settle for having the fastest single-GPU card on the market through the GTX 480, a market position worth bragging about, but one that was always shadowed by AMD’s dual-GPU Radeon HD 5970. Why we never saw a dual-GPU GTX 400 series card we’ll never know—historically NVIDIA has not released a dual-GPU card for every generation—but it’s a reasonable assumption that GF100’s high leakage made such a part unviable.
But at long last the time has come for a new NVIDIA dual-GPU card. GF100’s refined follow-up, GF110, put the kibosh on leakage and allowed NVIDIA to crank up clocks and reduce power consumption throughout their GTX 500 lineup. This also seems to have been the key to making a dual-GPU card possible, as NVIDIA has finally unveiled their new flagship card: the GeForce GTX 590. Launching a mere two weeks after AMD’s latest flagship card, the Radeon HD 6990, NVIDIA is gunning to take back their spot at the top. But will they reach their goal? Let’s find out.
| | GTX 590 | GTX 580 | GTX 570 | GTX 560 Ti |
|---|---|---|---|---|
| Stream Processors | 2 x 512 | 512 | 480 | 384 |
| Texture Address / Filtering | 2 x 64/64 | 64/64 | 60/60 | 64/64 |
| ROPs | 2 x 48 | 48 | 40 | 32 |
| Core Clock | 607MHz | 772MHz | 732MHz | 822MHz |
| Shader Clock | 1214MHz | 1544MHz | 1464MHz | 1644MHz |
| Memory Clock | 853MHz (3414MHz data rate) GDDR5 | 1002MHz (4008MHz data rate) GDDR5 | 950MHz (3800MHz data rate) GDDR5 | 1002MHz (4008MHz data rate) GDDR5 |
| Memory Bus Width | 2 x 384-bit | 384-bit | 320-bit | 256-bit |
| VRAM | 2 x 1.5GB | 1.5GB | 1.25GB | 1GB |
| FP64 | 1/8 FP32 | 1/8 FP32 | 1/8 FP32 | 1/12 FP32 |
| Transistor Count | 2 x 3B | 3B | 3B | 1.95B |
| Manufacturing Process | TSMC 40nm | TSMC 40nm | TSMC 40nm | TSMC 40nm |
| Price Point | $699 | $499 | $349 | $249 |
Given that this launch takes place only two weeks after the Radeon HD 6990, it’s only natural to make comparisons to AMD’s recently launched dual-GPU card. In fact as we’ll see the cards are similar in a number of ways, which is a bit surprising given that the last time both companies had competing dual-GPU cards, the GTX 295 and Radeon HD 4870X2 were quite different in design.
But before we get too far, let’s start at the top with the specs. As is now customary for dual-GPU cards, NVIDIA has put together two of their top-tier GPUs and turned down the clocks in order to meet a power/heat budget. In single-card configurations we’ve seen GF110 hit 772MHz for the GTX 580, but that was for a card that can hit 300W load under the right/wrong circumstances. For the GTX 590 the clocks are down to 607MHz, while the functional unit count remains unchanged with everything enabled. Meanwhile memory clocks have also been reduced to the lowest we’ve seen since the GTX 470: 853.5MHz (3414MHz data rate). NVIDIA has never hit very high memory clocks on the GTX 500 series, so it stands to reason that routing two 384-bit busses only makes the job harder.
All told, at these clocks comparisons to the GTX 570 are more apt than comparisons to the GTX 580. Even compared to the GTX 570, each GPU on the GTX 590 has only 83% of the rasterization throughput, 88% of the shading/texturing capacity, and 99.5% of the ROP capacity. Where the GTX 590 has the edge on the GTX 570 on a per-GPU basis is that, with all of GF110’s functional units enabled and a 384-bit memory bus, it has 108% of the memory bandwidth and 120% of the L2 cache. As a result, while performance should be close to the GTX 570 on a per-GPU basis, it will fluctuate depending on the biggest bottleneck, with shading/texturing-bound games among the worst scenarios and L2 cache/memory bandwidth-bound games among the best. Consequently, total performance should be close to that of the GTX 570 SLI.
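Those per-GPU ratios fall straight out of clock speed times functional unit count, using the figures from the spec table above. Here’s a quick back-of-the-envelope sketch; the only numbers not in the table are the L2 cache sizes (768KB on a fully-enabled GF110 versus 640KB on the cut-down GTX 570):

```python
# Per-GPU throughput of the GTX 590 relative to the GTX 570,
# estimated as core clock x functional unit count for each subsystem.
gtx590 = {"core": 607, "cores": 512, "rops": 48, "bus": 384, "data_rate": 3414, "l2_kb": 768}
gtx570 = {"core": 732, "cores": 480, "rops": 40, "bus": 320, "data_rate": 3800, "l2_kb": 640}

# Rasterization scales with core clock alone (same setup engine on both).
raster = gtx590["core"] / gtx570["core"]

# Shading/texturing scales with clock x enabled CUDA cores (512 vs. 480).
shade = (gtx590["core"] * gtx590["cores"]) / (gtx570["core"] * gtx570["cores"])

# ROP throughput scales with clock x ROP count (48 vs. 40).
rop = (gtx590["core"] * gtx590["rops"]) / (gtx570["core"] * gtx570["rops"])

# Memory bandwidth scales with bus width x effective data rate.
bw = (gtx590["bus"] * gtx590["data_rate"]) / (gtx570["bus"] * gtx570["data_rate"])

# L2 cache is a straight capacity comparison.
l2 = gtx590["l2_kb"] / gtx570["l2_kb"]

print(f"raster {raster:.0%}, shading {shade:.0%}, ROP {rop:.1%}, "
      f"bandwidth {bw:.0%}, L2 {l2:.0%}")
# → raster 83%, shading 88%, ROP 99.5%, bandwidth 108%, L2 120%
```

This is only a paper estimate, of course; real games bottleneck on different subsystems at different times, which is why the article expects performance to fluctuate around the GTX 570 baseline.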
As was the case with the 6990, NVIDIA is raising the limit on power consumption. The GTX 590 is rated for a TDP of 365W, keeping in mind that NVIDIA’s definition of TDP is the maximum power draw in “real world applications”. The closest metric from AMD would be their “typical gaming power”, for which the 6990 was rated for 350W. As a result the 6990 and GTX 590 should be fairly close in power consumption most of the time. Normally only Furmark and similar programs would generate a significant difference, but as we’ll see the rules have changed starting with NVIDIA’s latest drivers. Meanwhile for the idle TDP NVIDIA does not specify a value, but it should be under 40W.
With performance on paper that should rival the GTX 570 SLI—and by extension the Radeon HD 6990—it shouldn’t come as a big surprise that NVIDIA is pricing the GTX 590 to be competitive with AMD’s card. The MSRP of the GTX 590 will be $699, the same as where the 6990 launched two weeks ago. The card we’re looking at today, the EVGA GeForce GTX 590 Classified, is a premium package that will be a bit higher at $729. EVGA won’t be the only vendor offering a premium GTX 590 package, and while we don’t have a specific breakdown by vendor, expect a range of prices. Ultimately, cards at the $699 MSRP will be competing with the 6990, the 6970CF, and the GTX 570 SLI.
As for availability, it’s a $700 card; NVIDIA isn’t expecting any real problems, but these are low-volume cards, so it’s quite likely they’ll go in and out of stock.
March 2011 Video Card MSRPs

| NVIDIA | Price | AMD |
|---|---|---|
| GeForce GTX 590 | $700 | Radeon HD 6990 |
| | $480 | |
| | $320 | Radeon HD 6970 |
| | $240 | Radeon HD 6950 1GB |
| | $190 | Radeon HD 6870 |
| | $160 | Radeon HD 6850 |
| | $150 | |
| | $130 | |
| | $110 | Radeon HD 5770 |
123 Comments
Cali3350 - Thursday, March 24, 2011 - link
Last Page you have a seeming paragraph that says "Quickly, let's also..." and then stops.

tipoo - Thursday, March 24, 2011 - link
Also "Unlike AMD isn’t using an exotic phase change thermal compound here" on the meet the card page

tipoo - Thursday, March 24, 2011 - link
Another one "This doesn’t the game in any meaningful manner, but it’s an example of how SLI/CF aren’t always the right tool for the job." on the computation page.

ahar - Thursday, March 24, 2011 - link
Page 2: "...NVIDIA’s advice about SLI mirror’s AMD’s advice..."
mirrors
beepboy - Thursday, March 24, 2011 - link
"Quickly, let's also"

Nice review.
slickr - Thursday, March 24, 2011 - link
For $700 I'd rather buy a whole new PC.

What's the point of playing games at larger resolutions than 1600x1050?
In fact I'd say that 720p resolution is probably the best to play games at, because it tends to be easier to follow since pixels kind of move faster and you have more precision and smoother gameplay experience.
I'd be keeping my AMD 6870 that is for sure!
HangFire - Thursday, March 24, 2011 - link
I've once heard that the secret to happiness is learning to like the taste of cheap beer.

nyran125 - Sunday, June 19, 2011 - link
did you know thats actually true lol. If you can have your coffee black, then if milk runs out you still get to enjoy life......

cjl - Thursday, March 24, 2011 - link
That depends entirely on your GPU. Several can push high resolutions at >60fps, and it's just as smooth. Gaming at 2560x1600 is just an awesome experience.

Azethoth - Sunday, March 27, 2011 - link
Exactly, some of us have panels with native 2560x1600. I _could_ game at some miserable 1600x1050 resolution, or I could play at my native resolution. I choose 2560x1600 and ignore all review results at inferior resolutions. Damn you Crysis, damn you!