Meet The 2013 GPU Benchmark Suite & The Test

Having taken a look at the compute side of Titan, let’s finally dive into what most of you have probably been waiting for: our gaming benchmarks.

As this is the first major launch of 2013, it’s also the first time we’ll be using our new 2013 GPU benchmark suite. The suite should be considered a work in progress for the moment, as it’s not yet complete. With several high-profile games due in the next 4 weeks (and no other product launches expected), we expect to expand the suite to integrate those games as they arrive. In the meantime we have composed a slightly smaller suite of 8 games that will serve as our base.

AnandTech GPU Bench 2013 Game List

Game                  Genre
DiRT: Showdown        Racing
Total War: Shogun 2   Strategy
Hitman: Absolution    Action
Sleeping Dogs         Action/Open World
Crysis: Warhead       FPS
Far Cry 3             FPS
Battlefield 3         FPS
Civilization V        Strategy

Returning to the suite will be Total War: Shogun 2, Civilization V, Battlefield 3, and of course Crysis: Warhead. With no performance-demanding AAA strategy games released in the last year, we’re effectively in a holding pattern for new strategy benchmarks, hence we’re bringing Shogun and Civilization forward. Even two years after its release, Shogun 2 can still put an incredible load on a system at its highest settings, and Civilization V remains one of the more advanced games in our suite thanks to its use of driver command lists for rendering. With Company of Heroes 2 due in the near future we may finally get a new strategy game worth benchmarking, while Total War will return with Rome 2 towards the end of this year.

Meanwhile Battlefield 3 is still among the most popular multiplayer FPSes, and though newer video cards have lightened its system-killer status, it still takes a lot of horsepower to play. Furthermore the engine behind it, Frostbite 2, is used in a few other action games, and will be used for Battlefield 4 at the end of this year. Finally we have the venerable Crysis: Warhead, our legacy entry. As the only DX10 title in the current lineup it’s good for tracking performance against our oldest video cards, plus it’s still such a demanding game that only the latest video cards can play it at high framerates and resolutions with MSAA.

As for the new games in our suite, we have added DiRT: Showdown, Hitman: Absolution, Sleeping Dogs, and Far Cry 3. DiRT: Showdown is the annual refresh of Codemasters’ DiRT racing franchise, based upon their continually evolving racing engine. Meanwhile Hitman: Absolution is last year’s highly regarded third person action game, and notably for this day and age it features a built-in benchmark, albeit a somewhat CPU-intensive one. Sleeping Dogs is a rare treat: a benchmarkable open world game, giving us a chance to cover a genre that almost never lends itself to benchmarking. And finally we have Far Cry 3, the latest rendition of the Far Cry franchise. A popular game in its own right, its jungle environment can be particularly punishing.

These games will be joined throughout the year by additional titles as we find games that meet our needs and standards, and for which we can create meaningful benchmarks and validate their performance. As with 2012 we’re looking at having roughly 10 game benchmarks at any given time.

Meanwhile from a settings and resolution standpoint we have finally (and I might add, begrudgingly) moved from 16:10 resolutions to 16:9 resolutions in most cases, to better match the popularity of 1080p monitors and the recent wave of 1440p IPS monitors. Our primary resolutions are now 2560x1440, 1920x1080, and 1600x900, with an emphasis on 1920x1080 at lower settings ahead of dropping to lower resolutions, given the increasing marginalization of monitors with sub-1080p resolutions. The one exception is our triple-monitor resolution, which stays at 5760x1200. This is purely for technical reasons, as NVIDIA’s drivers do not consistently offer us 5760x1080 on the 1920x1200 panels we use for testing.

As for the testbed itself, we’ve changed very little. Our testbed remains our trusty 4.3GHz SNB-E, backed with 16GB of RAM and running off of a 256GB Samsung 470 SSD. The one change we have made is that, having validated our platform as being able to handle PCIe 3.0 just fine, we are forcibly enabling PCIe 3.0 on NVIDIA cards where it’s normally disabled. NVIDIA disables PCIe 3.0 by default on SNB-E systems due to inconsistencies in the platform, but as our goal is to remove every non-GPU bottleneck, we have little reason to leave PCIe 3.0 disabled, especially since most buyers will be on Ivy Bridge platforms where PCIe 3.0 is fully supported.
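
For readers who want to check what their own cards have negotiated, nvidia-smi exposes the current and maximum PCIe link generation through its query interface. Below is a minimal sketch of such a check; the field names are from nvidia-smi’s documented --query-gpu options, but treat the whole thing as illustrative rather than part of our test procedure:

```python
import subprocess

# Query the current and maximum PCIe link generation for each NVIDIA GPU.
# pcie.link.gen.current and pcie.link.gen.max are standard nvidia-smi
# --query-gpu fields; the output is plain CSV.
result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,pcie.link.gen.current,pcie.link.gen.max",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True)

for line in result.stdout.strip().splitlines():
    name, gen_now, gen_max = [f.strip() for f in line.split(",")]
    note = "OK" if gen_now == gen_max else "below max link speed"
    print(f"{name}: PCIe gen {gen_now} of {gen_max} ({note})")
```

One caveat: GPUs downshift their link speed at idle to save power, so the current generation should be read while the card is under load.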

Finally, we’ve also used this opportunity to refresh a couple of our cards in our test suite. AMD’s original press sample for the 7970 GHz Edition was a reference 7970 with the 7970GE BIOS, a configuration that was more-or-less suitable for the 7970GE, but not one AMD’s partners followed. Since all of AMD’s partners are using open air cooling, we’ve replaced our AMD sample with HIS’s 7970 IceQ X2 GHz Edition, a fairly typical representation of the type of dual-fan coolers that are common on 7970GE cards. Our 7970GE temp/noise results should now be much closer to what retail cards will do, though performance is unchanged.

Unfortunately we’ve had to deviate from that almost immediately for CrossFire testing. Our second HIS card was defective, so due to time constraints we’re using our original AMD 7970GE as our second card for CF testing. This has no impact on performance, but it means that we cannot fairly measure temp or noise. We will update Bench with those results once we get a replacement card and run the necessary tests.

Finally, we also have a Powercolor Devil13 7990 as our 7990 sample. The Devil13 was a limited run part and has been replaced by the plain 7990, the difference between them being a 25MHz advantage for the Devil13. As such we’ve downclocked our Devil13 to match the basic 7990’s specs. The performance and power results should perfectly match a proper retail 7990.

CPU: Intel Core i7-3960X @ 4.3GHz
Motherboard: EVGA X79 SLI
Power Supply: Antec True Power Quattro 1200
Hard Disk: Samsung 470 (256GB)
Memory: G.Skill Ripjaws DDR3-1867 4 x 4GB (8-10-9-26)
Case: Thermaltake Spedo Advance
Monitor: Samsung 305T
Video Cards:

AMD Radeon HD 7970
AMD Radeon HD 7970 GHz Edition
PowerColor Radeon HD 7990 Devil13
NVIDIA GeForce GTX 580
NVIDIA GeForce GTX 680
NVIDIA GeForce GTX 690
NVIDIA GeForce GTX Titan

Video Drivers: NVIDIA ForceWare 314.07
NVIDIA ForceWare 314.09 (Titan)
AMD Catalyst 13.2 Beta 6
OS: Windows 8 Pro

Comments

  • Ryan Smith - Thursday, February 21, 2013 - link

    PCI\VEN_10DE&DEV_1005&SUBSYS_103510DE

    I have no idea what a Tesla card's would be, though.
  • alpha754293 - Thursday, February 21, 2013 - link

    I don't suppose you would know how to tell the computer/OS that the card has a different PCI DevID other than what it actually is, would you?

    NVIDIA Tesla C2075 PCI\VEN_10DE&DEV_1096
  • Hydropower - Friday, February 22, 2013 - link

    PCI\VEN_10DE&DEV_1022&SUBSYS_098210DE&REV_A1

    For the K20c.
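
For anyone who wants to pull the same hardware IDs on their own machine, here is a minimal Windows-only sketch; Win32_VideoController and its PNPDeviceID property are standard WMI, though the parsing here is purely illustrative:

```python
import re
import subprocess

# List display adapters and their PnP hardware IDs via WMI (Windows only).
# The PCI\VEN_xxxx&DEV_xxxx&SUBSYS_... strings quoted in this thread are the
# PNPDeviceID property of the Win32_VideoController class.
output = subprocess.check_output(
    ["wmic", "path", "Win32_VideoController", "get", "Name,PNPDeviceID"],
    text=True)

for line in output.splitlines():
    match = re.search(r"PCI\\VEN_([0-9A-F]{4})&DEV_([0-9A-F]{4})", line)
    if match:
        vendor, device = match.groups()
        # VEN_10DE is NVIDIA's PCI vendor ID; the DEV field is what separates
        # Titan (1005) from the Tesla parts quoted above.
        print(f"vendor {vendor}, device {device}: {line.strip()}")
```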
  • brucethemoose - Thursday, February 21, 2013 - link

    "This TDP limit is 106% of Titan’s base TDP of 250W, or 265W. No matter what you throw at Titan or how you cool it, it will not let itself pull more than 265W sustained."

    The value of the Titan isn't THAT bad at stock, but 106%? Is that a joke!?

    Throw in an OC-for-OC comparison, and this card is absolutely ridiculous. Take the 7970 GE... 1250MHz is a good, reasonable 250MHz OC on air, a nice 20%-25% boost in performance.

    The Titan review sample is probably the best case scenario and can go 27MHz past turbo speed, 115MHz past base speed, so maybe 6%-10%. That $500 performance gap starts shrinking really, really fast once you OC, and for god's sake, if you're the kind of person who's buying a $1000 GPU, you shouldn't intend to leave it at stock speeds.

    I hope someone can voltmod this card and actually make use of a waterblock, but there's another issue... Nvidia is obviously setting a precedent. Unless they change this OC policy, they won't be seeing any of my money anytime soon.
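
To put concrete numbers on the math being argued here, a quick back-of-the-envelope calculation; the inputs are the figures quoted above plus Titan’s official 837MHz base clock, and note that clock-speed gains don’t map 1:1 onto framerate, which is why the performance estimate above is lower:

```python
# Back-of-the-envelope math for the figures quoted above. All inputs are
# quoted specs; nothing here is a new measurement.
base_tdp_w = 250
power_target = 1.06                   # the 106% maximum power target
print(f"Sustained power ceiling: {base_tdp_w * power_target:.0f} W")  # 265 W

# 7970 GHz Edition: +250 MHz over its 1000 MHz base clock.
print(f"7970GE clock gain: {250 / 1000:.0%}")   # 25%

# Titan: +115 MHz over its 837 MHz base clock. Clock gains don't translate
# 1:1 into framerate, hence the lower 6-10% performance estimate above.
print(f"Titan clock gain: {115 / 837:.0%}")     # ~14%
```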
  • JarredWalton - Thursday, February 21, 2013 - link

    As someone with a 7970GE, I can tell you unequivocally that 1250MHz on air is not at all a given. My card can handle many games at 1150MHz, but with other titles and applications (say, running some compute stuff) I'm lucky to get stability for more than a day at 1050MHz. Perhaps with enough effort playing with voltage mods and such I could improve the situation, but I'm happier living with a card for a couple years that doesn't crap out because of excessively high voltages.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    " After a few hours of trial and error, we settled on a base of the boost curve of 9,80 MHz, resulting in a peak boost clock of a mighty 1,123MHz; a 12 per cent increase over the maximum boost clock of the card at stock.

    Despite the 3GB of GDDR5 fitted on the PCB's rear lacking any active cooling it too proved more than agreeable to a little tweaking and we soon had it running at 1,652MHz (6.6GHz effective), a healthy ten per cent increase over stock.

    With these 12-10 per cent increases in clock speed our in-game performance responded accordingly."

    http://www.bit-tech.net/hardware/2013/02/21/nvidia...

    Oh well, 12 is 6 if it's nVidia bash time, good job mr know it all.
  • Hrel - Thursday, February 21, 2013 - link

    YES! 1920x1080 has FINALLY arrived. It only took 6 years from when it became mainstream, but it's FINALLY here! FINALLY! I get not doing it on this card, but can you guys PLEASE test graphics cards, especially laptop ones, at 1600x900 and 1280x720? A lot of the time when on a budget, playing games at a lower resolution is a compromise you're more than willing to make in order to get decent quality settings. PLEASE do this for me, PLEASE!
  • JarredWalton - Thursday, February 21, 2013 - link

    Um... we've been testing 1366x768, 1600x900, and 1920x1080 as our graphics standards for laptops for a few years now. We don't do 1280x720 because virtually no laptops have that as their native resolution, and stretching 720p to 768p actually isn't a pleasant result (a 6.7% increase in resolution means the blurring is far more noticeable). For desktop cards, I don't see much point in testing most below 1080p -- who has a desktop not running at least 1080p native these days? The only reason for 720p or 900p on desktops is if your hardware is too old/slow, which is fine, but then you're probably not reading AnandTech for the latest news on GPU performance.
  • colonelclaw - Thursday, February 21, 2013 - link

    I must admit I'm a little bit confused by Titan. Reading this review gives me the impression it isn't a lot more than the annual update to the top-of-the-line GPU from Nvidia.
    What would be really useful to visualise would be a graph plotting the FPS rates of the 480, 580, 680 and Titan along with their release dates. From this I think we would get a better idea of whether or not it's a new stand out product, or merely this year's '780' being sold for over double the price.
    Right now I genuinely don't know if I should be holding Nvidia in awe or calling them rip-off merchants.
  • chizow - Friday, February 22, 2013 - link

    From Anandtech's 7970 Review, you can see relative GPU die sizes:

    http://images.anandtech.com/doci/5261/DieSize.png

    You'll also see that the prices of these previous flagships have been mostly consistent, in the $500-650 range (except for a few outliers like the GTX 285, which came in hard economic times, and the 8800 Ultra, which was Nvidia's last ultra-premium card).

    You can check some sites that use easy performance rating charts, like computerbase.de, to get a quick idea of relative performance increases between generations; you can quickly see that going from a new generation (not half-node) like G80 > GT200 > GF100 > GK100/110 should offer a 50%+ increase, generally closer to the 80% range over the predecessor flagship.

    Titan would probably come a bit closer to 100%, so it does outperform expectations (all of the Kepler line did, though), but that certainly does not justify the 2x increase in sticker price. Nvidia is trying to create a new ultra-premium market without even giving us a premium alternative. This all stems from the fact that they're selling their mid-range part, GK104, as their flagship, which only occurred due to AMD's ridiculous pricing of the 7970.
