157 Comments

  • ehpexs - Tuesday, February 19, 2013 - link

    Great card, too rich for my blood though. For those who can afford one or two (or four) enjoy. I'll stick to my $550 pair of 7950s.
  • Wreckage - Tuesday, February 19, 2013 - link

    Remember the 7970's were $1100 at launch.
  • TheCrackLing - Tuesday, February 19, 2013 - link

    Then how did I manage to pay only $1150 for 2 at launch?

    The 7970s were around $550-600 at launch, nowhere near $1100.
  • Wreckage - Tuesday, February 19, 2013 - link

    I was (obviously) responding to his statement "pair of 7950s". If I could edit my post I suppose I could change it to CF and up the price $50. Either way Titan is in line with AMD pricing.
  • Stuka87 - Tuesday, February 19, 2013 - link

    So according to your logic, we can expect to pay 2k for cards that are twice as fast as the titan in the future?
  • just4U - Tuesday, February 19, 2013 - link

    Well... I recall paying almost 800 for a GeForce3 at launch. So hmmm... I don't think the high end has gone up much (if at all) over the past decade. Sometimes it comes down if Nvidia/AMD are duking it out on pricing, but overall it's remained fairly consistent.
  • JonnyDough - Wednesday, February 20, 2013 - link

    And it shouldn't. These cards get cheaper and cheaper for them to produce. Their profit margins just continue to climb. In other words, we're getting poorer and poorer in comparison to the upper class but nobody is taking notice...game on, until you can't afford to live.
  • shompa - Wednesday, February 20, 2013 - link

    Cheaper and cheaper to produce? How do you know that? Do you have wafer prices from TSMC and GlobalFoundries? Or are you like the many uneducated people who believe everything automatically gets cheaper with smaller process technology? *hint* Wafer prices go up with each shrink. That's why the majority of all microprocessors are manufactured at 65-90nm!

    Look also at AMD's profit margins. Oh... they are losing money. Guess they have zero profit.

    Too many uneducated people on the internet!
  • rupert3k - Saturday, June 1, 2013 - link

    Is that tone really necessary?
  • TheJian - Wednesday, February 20, 2013 - link

    ROFL. AMD lost 1.18B this last year and NV only made 725mil (with a 300mil Intel payment).

    NV made ~800mil in FY07, lost money in the FY08 and FY09 fiscal years (about 100mil), made 235 in '10 and 530 in '11, and finally hit 725 last year... It's taken them 5yrs to come close to what they USED TO MAKE.

    You apparently don't read balance sheets or earnings reports.

    For AMD I'll just give you the sum of the last 5-6 years. They lost a total of roughly 5.1B... THEY HAVEN'T MADE MONEY over the last 5 years; they lost their fabs, wrote down ATI, laid off 30% of employees, etc... They just lost 1.18B last year, for Christ's sake. You'd better PRAY AMD stops giving away games and raises the price of their GPUs/CPUs before they go bankrupt. At the rate they are burning cash now, they will be out of funding by the end of the year. Do you understand that? 5 years = 5B+ in losses. Read a balance sheet once in a while before you say junk like this.

    Also note, just looking at NV, they are getting ~300mil per year right now from INTC. So they aren't even making what I said! AMD's margins are at 15%! NV's are 53%. Regardless of what you think of the price, neither is sticking it to you compared to the performance gains they are giving you every year and what it costs to deliver them. Looking at the entire life of the company (I'm not going to actually do the math), I don't think AMD has actually made $1 of profit... LOL. They're gouging you? Feel free to pull up every year of earnings they have had since their existence. I think you'll find they have actually LOST money.
    http://investing.money.msn.com/investments/financi...
    AMD's 10yr statement, at a quick look, shows they've lost at least a few billion. Is this computing in your head yet? In the last 10yrs AMD in total has lost ~3-4 billion dollars. They aren't making SQUAT! If it weren't for large backers they'd have gone bankrupt ages ago. They were just downgraded by Fitch TWO grades to JUNK BOND status (just like the USA, two downgrades since Obama took office)... In financial terms, it means NOT INVESTMENT GRADE.

    They are getting killed by INTC/NVDA. Now a cadre of a good 5 players is entering their cpu/server business. They will continue to lose money and will be lucky to make it to 2014 without yet another borrowing fiasco, this time at even higher rates of interest due to junk status.

    NVDA finally hit record cash/revenue this year (and margins, but by a decimal, helped by Intel 300mil), after 5 years! What margins are you talking about? While a FAR better company than AMD, gaining share, entering new markets etc, NV isn't getting RICH either. Their future looks bright (AMD looks bankrupt or bought by 2014 without help), but unless you can prove otherwise they are in no way ripping you off. BOTH companies should be charging $50+ more on every card under $500. Granted the high end is what it is (middle income people don't drive Lamborghini's either), but the low end is costing them both with their current war that's been on for ~5yrs.

    AMD has 1B in cash. If they lose another 1B this year that's gone, how do you think they run the company with no cash? No money coming from consoles will go on the books until the end of the year (and those sales won't be phenomenal IMHO, look at vita/3ds/wiiu failures and cuts), and mobile won't bring them a dime until mid 2014 at best on the books with no ARM until then. Are you doing the math here?

    Get a better job, or quit buying things your budget can't afford! While you're at it, vote in a president who is PRO BUSINESS and ANTI TAX/SPEND. Start voting for people who CUT GOVT SPENDING & TAXES; then maybe you'll pay a little less in taxes, and more of us will be working to cover it, because guess what happens when you cut taxes? (see the Coolidge presidency, or Reagan; Coolidge was Reagan's hero...LOL...well duh - heard of the ROARING 20's?). Companies hire workers, and people start small businesses... Which causes... Wait for it... REVENUE to come in to cover the tax cuts! It should go without saying you need to CUT spending also when doing this for it to work (they keep kicking the can down the road today, again next month).

    Guess what came after the roaring 20's? A TAX AND SPENDER! What do they create? THE DEPRESSION! LOL. What is Obama creating? The depression v2.0 (well, amped up to 16.5 trillion levels I guess depression v9.9?).

    http://www.calvin-coolidge.org/coolidge-administra...

    He didn't bail out flood victims, didn't bail out farmers (piss off), not even his hometown state when they had a natural disaster! He definitely would have kicked out every one of the 20mil illegals YOU & I are about to pay for medically etc (actually you're already paying, just not getting the service, and the high risk fund is already broke...LOL). When you wonder why your medical is so high in 2015 and why service sucks, you can thank your president and the illegal aliens. Coolidge vetoed every spending bill that was pork too! NO PORK passed his plate. That's how you get the DEBT DOWN.

    https://en.wikipedia.org/wiki/Herbert_Hoover
    1929, 8 months later depression...LOL. How does he respond, govt works projects (spend, govt growth) etc...FAILURE. Sound familiar? "SHOVEL READY" anybody?...ROFL. He raised the top tax bracket from 25-63%...ROFL. Sound familiar? Raised taxes on Business! Sound familiar?
    "Hoover's defeat in the 1932 election was caused primarily by his failure to end the downward economic spiral, although his support for strong enforcement of prohibition was also a significant factor". This country just voted in Hoover's big brother TWICE!

    So take away your money, take away your rights (no drinking for you!) and spend spend spend, tax tax tax...Sound familiar? Obama attacking guns, our constitution, spying on everyone, raising taxes, killing business, taxing rich who create jobs (upping the brackets) unemployment everywhere...See the problem?

    Get a better job, or vote your govt out. Those are your options...LOL. Your graphics card price is NOT the problem and even if they give it to you FREE you won't live a better life...ROFLMAO

    $20 says you voted obama.
    http://www.calvin-coolidge.org/president-calvin-co...
    "He was concerned about all Americans, especially the working class man and woman. He wanted them to be free, independent, and self-reliant. Like Jefferson, he wanted them to be able to rise as far as their abilities permitted."

    The exact opposite of WELFARE obama. "you didn't build that"

    A few more for ya: "He never sought to make history for himself."... "his conservative, Constitution-based principles of government". His nickname was SILENT CALVIN.

    Obama's on TV every chance he gets. Obama's nickname? THE WELFARE PRESIDENT. He golfed with Tiger Woods Sunday... ROFL, spent 8hrs with Tiger's coach first, golfed 18 with Tiger, who left; Obama went another 9... LOL. No time to fix the debt though. Obama believes in destroying the constitution & your ability to be SELF RELIANT. If you rise too high, he'll have to take your money and redistribute it... LOL. It's YOUR job to make a better life for yourself and the govt's job to get out of your way. I clearly see you think the rich OWE you and desire govt intervention to get what you think they owe you. I want to game on, but I keep having to pay for PEOPLE like you who won't look to themselves to improve their lives. :(
    Reality bites eh? ;)
  • WiNV - Wednesday, February 20, 2013 - link

    And the AMD fagbots have gone mad. lol
  • Asmodian - Wednesday, February 20, 2013 - link

    Wow! How did politics come into it? Just because it is here:

    "Calvin Coolidge (1872-1933), the 30th U.S. president, led the nation through most of the Roaring Twenties, a decade of dynamic social and cultural change, materialism and excess."

    Interesting how we went into a depression right after Coolidge, it somehow reminds me of 2001-2009.

    Hoover was Coolidge's Secretary of Commerce and "he promoted partnerships between government and business under the rubric 'economic modernization'. In the presidential election of 1928, Hoover easily won the Republican nomination."
  • Asmodian - Wednesday, February 20, 2013 - link

    Oh, and I am quite happy to buy a very nice $1000 video card. I plan to buy a Titan as soon as I can.
  • WiNV - Wednesday, February 20, 2013 - link

    Me too; a single monster graphics card is far better than dual-GPU, CrossFire, and SLI crap.
    Can't wait to get my hands on that puppy.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    Exactly.

    Nothing wrong with that; everything right with it, instead.

    Can't count the thousands of dollars of computer equipment here; not sure where all the crybaby extreme-poverty whiners come from.

    It's quiet, low heat, low electric usage, sipping idle power, and ready for awesome gaming - with a low-wattage PSU needed - and it's the best card there is.

    Awesome fps and the massive extra nVidia-only features, with no more whining about tessellation losing, as it's the king there.

    Feels well made in the hand, top end stuff.
  • Spunjji - Thursday, February 21, 2013 - link

    LOL no, what are these you bring to the discussion, facts?!? GODDAMMIT OBAMA

    etc.
    /satire
  • CeriseCogburn - Saturday, February 23, 2013 - link

    Perhaps he was saying he knows why you poorfags cannot afford a video card and constantly whine that it should be given to you extra cheap, at a loss, at the price you demand and not a dime more.
    Take heart, he was blaming the Commie Usurper, not you crybaby whiners, whom, clearly, I blame 100%.

    At least he was nice enough to provide a scapegoat to blame the crybaby whiners poorboy problems on.
  • Gastec - Wednesday, February 27, 2013 - link

    You shameless bastard! Daring to call us poor when there's a Crysis (3) out there!
  • CeriseCogburn - Monday, March 4, 2013 - link

    Never let a good 3 crying sighs go to waste!
  • connor4312 - Wednesday, February 20, 2013 - link

    You, sir, are incredible. Seriously, it's like ten different trolls and fanboys in one person. How you got from fanboying over nVidia to Calvin Coolidge's politics to beating on Obama for playing golf to moaning about welfare, on an article about video cards, I may never know (it's certainly not worth my time to read your whole soliloquy), but that is definitely one of the most impressive specimens of trolling/spamming I've ever seen. Kudos to you.
  • CeriseCogburn - Thursday, February 21, 2013 - link

    Congratulations to all you whining amd fanboy freaks who lost the POINT of the rebuttal....

    Some amd fruitcake OWS idiot squealed profits are going through the roof (jhonny dough boy)...

    It was pointed out AMD is a failing, debt-ridden, dying, sack-of-in-the-red-losses loser company just about blown to bankruptcy smithereens...

    But then when do amd fanboys pay attention to their helping hand demise of their big fave AMD ?
    Answer: NEVER
    Reason: They constantly beg and bag for lower pricing and squeal amd does just that every time. Then they wait for nVidia to DESTROY amd pricing with awesome nVidia hardware... at that point they take their pauper pennies and JUMP on that AMD card-- raping the bottom line into more BILLIONS of losses for amd, while they attack nVidia and scream corporate profit...

    LOL
    Moral of the history and current idiocy of the flapping lipped fools who get it 100% incorrect while they bloviate for their verdetrol master amd.

    The amd fanboys have just about destroyed amd as a going venture...

    I congratulate the clueless FOOLS for slaying the idiot crash monkey lousy video card junk producer amd.
  • Scott586 - Thursday, February 21, 2013 - link

    Wow TheJian, shut up already. We stopped reading 1/2-way thru the 2nd sentence.
  • Spunjji - Thursday, February 21, 2013 - link

    Thanks for taking what was a relevant comment correcting the previous poster and turning it into an inaccurate, ill-informed vitriolic rant about politics. That was... unprecedented.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    I enjoyed it, a lot. It was not however unprecedented, mr liar.

    As I so clearly recall, very recently on another little review here, the left wing raging amd fan OWS protester went on a wild diatribe cursing out those who dared not use their powerful system to organize political action and other such ventures like curing cancer, thus "disusing" their personal systems...

    So no, the wacko collectivist beat him to it.
  • BurtGravy - Thursday, February 21, 2013 - link

    That escalated quickly.
  • iEATu - Wednesday, May 29, 2013 - link

    I stopped reading when you started talking about Reagan and how cutting taxes creates businesses. Based on the history with Reagan, revenue did NOT come in, because big businesses took advantage of the tax cuts and nothing changed, except to make big business more powerful and the richest people richer.
  • 3DPA - Saturday, August 10, 2013 - link

    Wow... this certainly got off topic. The only hope for America is for the US to become a tax haven for the rest of the world... similar to what Singapore did (and they are hugely prosperous). When you factor in underfunded pensions, Social Security, and Medicare, the real US debt comes to around $86 trillion. You need to drop the US corporate tax rate from the highest in the world (I think around 30% to 35%) to the lowest. When you look at electronics manufacturing, the cost of building off-shore in China really only saves you 7% in labor, because 90% of electronic assembly is automated. So you save 7%, but then you lose 4% shipping it back to the US thanks to the price of oil. So you really end up saving 3%. Given all the hassles of dealing with manufacturing half-way around the world, 3% savings is really not that attractive. So why do companies move electronics manufacturing overseas? Because they save on tax... to the tune of 10% to 15%. You want to bring electronics manufacturing jobs back to the US? Just lower the corporate tax rate to something more globally competitive, like 15% to 20%. This will create a HUGE influx of job growth in the US. And what happens when more people are working? You broaden the tax base, so tax revenue increases. Also, remember that close to $1 trillion of corporate profits is sitting in foreign banks ($100 billion of that belongs to Apple) because they don't want to pay the huge US corporate tax rates. Lower that tax rate and all that money flows back into the US, where it can be used for investment in the US... which creates more opportunity... etc... etc... etc...

    So don't believe it when the Obama administration says it can't solve the US debt issue. It can be solved. But it just kills a socialist/communist president to cut corporate America a break, even though not doing so causes us even more harm.

    You know... I have yet to meet anyone who actually thinks Obama is doing a good job or is good for the country. I hear more negative talk, more jokes, have seen more negative books published about him, and have heard more doomsday scenarios with his name at the middle of them than for any other president in memory.

    So how did he get re-elected?

    Well...there is one good thing I can say about Obama: He made me appreciate Bill Clinton.
  • azixtgo - Sunday, November 10, 2013 - link

    Higher prices mean fewer sales; lower prices mean lower margins. They have to find some balance, and I doubt anyone here knows enough to know where that balance is. My thinking is that both companies make too many chips (desktop market). They price desirable chips too high for the average person. Maybe they should make fewer chips and make upgrades more desirable.

    Both consoles have sold millions already. I doubt it will be that game changing for them. Another thing is that AMD is letting their desktop CPU market suffer. They have not put out better chipsets for their FX processors in a good while so even people who would buy the fx processors may not do so when they see what they get with the motherboard (they do not even have mATX motherboards for these processors AFAIK.) I doubt the APUs will make up for the FX shortfalls. Intel on the other hand offers a less complicated setup. Companies do not just lose profits on lower prices and I am inclined to think the prices can stand to be lower based on what both nvidia and amd have done recently

    e.g. the 280X is a 7970 being sold for $300, and the GTX 780 now sells for $499. Are they at the balance point? I don't know. I just don't think they are optimizing their market presence at all.
  • Sabresiberian - Wednesday, February 20, 2013 - link

    No, the GK110 isn't cheaper to produce, not by a long shot, and Intel's profit margins for their CPUs are quite a bit higher than Nvidia's or AMD's for their graphics solutions. That's pretty amazing considering the GPUs are far bigger.

    I was hoping for a price around $800, but expecting it to be $900-1200. Sure, I wish it was less, but Nvidia isn't out of line here.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    I agree. It's quiet, it sips power, it's the fastest, it has the features amd lacks, which are many and considerable. It has stable drivers, it is well built, it has 6G of ram, thus making it the most future proof to date, just recently of utmost importance to all the amd fanboys 3G quacking Skyrim mods baggers, now they can eat dirt, then mud from the delicious tears.

    Yeah pretty sick of the constant CEO price "absolutears" here on every dang release, squealing out price perf bang for buck whines and lies and pauper protestations in betwixt bragging about their knowledge and prowess and spewing fanboy fave.

    If you're a fanboy get off your butt and toss a few papers so you can buy a ding dang video card without wailing like a homeless vagrant.
  • Olaf van der Spek - Tuesday, February 19, 2013 - link

    Who cares about your pair of cards? Nobody but you!
  • Iketh - Tuesday, February 19, 2013 - link

    lol hater!
  • CeriseCogburn - Sunday, February 24, 2013 - link

    As compared to the crawl-into-the-street-and-stone-yourself missive you threw in another post? LOL

    You won the hate war bub !

    I thought the gentleman owning the two 7950's made a very decent comment.
    Yes it's shocking coming from someone with 2 amd cards, but for once, it occurred.
  • chizow - Tuesday, February 19, 2013 - link

    This is much worse than the Ultra imo; at least in the case of the 8800GTX/Ultra, the performance somewhat justified the price relative to the rest of the market. We are somewhat spoiled by the bevy of card releases in recent years, but that's also the curse of the 680 and now Titan: the performance difference is nowhere close to the increase in price tag.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    You're NUTS to pretend you deserve 1-to-1 price-to-performance pricing on up the line, or that it is even a standard or usual expected, measured outcome.

    What you do have is years now of idiot web articles and postings from insanely focused, minuscule-minded, Scrooge-like weirdos futzing around dicing up beans to fill web space. So now your brain is fried. FPS is all the drool cup can visibly contain.

    Congratulations on the complete brainwashing. When you screamed 100% in the prior threads, it was found to be 20%, 30%, etc. outlying 40%.

    Facts don't matter, all the bang for the buck historical fantasy BS in your gourd, does.
  • joqqy - Wednesday, February 20, 2013 - link

    I'll wait until price drops, quite content with what I have now.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    Since you spent $550, you could spend a grand.
    I accept your notional decision, but it is not out of your price range, you are after all running 2x #2 flagships.

    In fact yours is the first and ONLY reasonable pricing comment (complaint) in the entire thread.

    Congratulations for that. Appreciate it. Happy gaming to you.
  • Deo Domuique - Friday, March 8, 2013 - link

    I strongly believe the best setup one could currently have is what you have...

    2x 7950 is the most bang for your buck! Two great cards at a great price... although I'm no fan of CrossFire or SLI. Still, even a single 7950 holds the best spot in my mind.
  • sensiballfeel - Tuesday, February 19, 2013 - link

    $1000 for a card slower than a GTX 690?

    I've been running two 580s for years now and skipped the 680 for being too slow, expecting something else in the pipeline. This is it, and Nvidia wants to double the price to $1000?

    Nvidia has lost their mind. Good card, but the price is beyond ridiculous. Nice try Nvidia, but no thanks.
  • Menty - Tuesday, February 19, 2013 - link

    Meh, it's a single card rather than an SLI-on-a-stick card. That makes it better, in my book.
  • Wolfpup - Tuesday, February 19, 2013 - link

    I think so too. And IMO this makes sense...no one NEEDS this card, the GTX 680 is still awesome, and still competitive where it is. They can be selling these elsewhere for more, etc.

    Now, who wants to buy me 3 of them to run Folding @ Home on :-D
  • IanCutress - Tuesday, February 19, 2013 - link

    Doing some heavy compute, this card could pay for itself in a couple of weeks over a 680 or two. On the business side, it all comes down to 'does it make a difference to throughput', and if you can quantify that and cost it up, then it'll make sense. Gaming, well that's up to you. Folding... I wonder if the code needs tweaking a little.
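
    For what it's worth, Ian's point can be put in rough numbers. Here is a back-of-the-envelope sketch of FP64 throughput per dollar, assuming launch prices and the rated figures discussed in this thread (Titan's guaranteed 1.3 TFLOPS, GK104's 1/24-rate FP64, and the ~$2500 K20 street figure cited below); real application throughput will be lower across the board:

```python
# Rough FP64 value comparison. Prices and TFLOPS ratings are assumptions
# taken from launch specs and this thread, not measured benchmarks.
cards = {
    # name: (price_usd, fp64_tflops)
    "GTX Titan": (999, 1.3),        # NVIDIA's guaranteed FP64 figure
    "GTX 680":   (499, 3.09 / 24),  # GK104 runs FP64 at 1/24 the FP32 rate
    "Tesla K20": (2500, 1.17),      # street price figure cited in this thread
}

for name, (price, tflops) in cards.items():
    print(f"{name:10s}: {tflops:5.2f} TFLOPS FP64, "
          f"{1000 * tflops / price:4.2f} GFLOPS per dollar")
```

    By this crude metric Titan delivers roughly 5x the FP64 per dollar of a GK104 card, which is the "pays for itself" argument in numbers.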
  • wreckeysroll - Tuesday, February 19, 2013 - link

    Price is going to kill this card. See the slides in the previews for performance. Titan is not much faster than what they have on the market now, so it's not just the same price as a 690, it's 30% slower as well.

    Game customers are not pro customers.

    This card could have been nice before someone slipped a gear at nvidia and thought gamers would eat up this $1000 rip-off. A few will like anything, but not many. A big error was made here in pricing this at $1000. A sane price would have sold many more than this lunacy.

    Nvidia dropped the ball.
  • johnthacker - Tuesday, February 19, 2013 - link

    People doing compute will eat this up, though. I went to NVIDIA's GPU Tech Conference last year; people were clamoring for a GK110-based consumer card for compute, after hearing that Dynamic Parallelism and Hyper-Q were limited to the GK110 and not on the GK104.

    They will sell as many as they want to people doing compute, and won't care at all if they aren't selling them to gamers, since they'll be making more profit anyway.

    Nvidia didn't drop the ball; it's that they're playing a different game than you think.
  • TheJian - Wednesday, February 20, 2013 - link

    Want to place a bet on them being out of stock the day they go on sale? I'd be shocked if you can get one within a day, if not a week.

    I thought the $500 price would slow some people down on the Nexus 10, but I had to fight for hours to get one bought, and they sold out in most places in under an hour. I believe most overpriced Apple products have the same problem.

    They are not trying to sell this to the middle class ya know.

    Asus prices the Ares II at $1600. They only made 1000, last I checked. These are not going to sell 10 million, and selling for anything less would just mean less money and problems meeting production. You price your product at what you think the market will bear, not what Wreckeysroll thinks the price should be. Performance like a dual-chip card's is quite a feat of engineering. Note the Ares II uses ~475 watts; this will come in around 250W. Again, quite a feat. That's also around ~100W less than a 690.

    Don't forget this is a card that is $2500 of compute power. Even Amazon had to buy 10000 K20's just to get a $1500 price on them, and had to also buy $500 insurance for each one to get that deal. You think Amazon is a bunch of idiots? This is a card that fixes 600 series weakness and adds substantial performance to boot. It would be lunacy to sell it for under $1000. If we could all afford it they'd make nothing and be out of stock in .5 seconds...LOL
  • chizow - Friday, February 22, 2013 - link

    Except they have been selling this *SAME* class of card for much less than $1000 for the better part of a decade. *SAME* size, same relative performance, same cost to produce. Where have you been, and why do you think it's now OK to sell it for 2x as much when nothing about it justifies the price increase?
  • CeriseCogburn - Sunday, February 24, 2013 - link

    LOL same cost to produce....

    You're insane.
  • Gastec - Wednesday, February 27, 2013 - link

    You forget about the "bragging rights" factor. Perhaps Nvidia won't make many GTX Titans, but all those they do make will definitely sell like warm bread. There are enough "enthusiasts" and other kinds of trolls out there (most of them in the United States) willing to give anything to show the Internet their high scores in various benchmarks and/or post a flashy picture of their shiny "rig".
  • herzwoig - Tuesday, February 19, 2013 - link

    Unacceptable price.
    Less than promised performance.

    Pro customers will get a Tesla; that is what those cards are for, with the attendant support and drivers. Nvidia is selling this as a consumer gaming card and trying to reshape the high-end gaming SKU as an even more premium product (doubly so!!)

    Terrible value, and whatever performance it has going for it is eroded by the nonsensical pricing strategy. Surprising level of miscalculation on the greed front from Nvidia...
  • TheJian - Wednesday, February 20, 2013 - link

    They'll pay $2500 to get that, unless they buy 10000 like Amazon (which still paid $2000/card). Unacceptable for you, but I guess you're not their target market. You can get TWO of these for the price of ONE Tesla at $2000, and ONLY if you buy 10000 like Amazon. Heck, for the price of one Tesla I could get two of these, a new i7-3770K, plus a board... LOL. They're selling this as a consumer card with Tesla performance (sans support/insurance). Sounds like they priced it right in line with nearly every other top-of-the-line card released for years in this range: 7990, 690, etc., on down the line.

    Less than promised performance? So you've benchmarked it then?

    "Terrible value and whatever performance it has going for it"
    So you haven't any idea yet, right? Considering a 7990 basically costs $1000 too, and uses 475W vs. 250W while being twice the size, this isn't so nonsensical. This card shouldn't heat up your room either. There are many benefits; you just can't see beyond those AMD goggles you've got on.
  • WhoppingWallaby - Thursday, February 21, 2013 - link

    Dude, you have some gall calling another person a fanboy. We could all do without your ranting and raving, so go troll elsewhere or calm down a little.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    Oh shut up yourself you radeon rager.

    You idiots think you have exclusive rights to spew your crap all over the place, and when ANYONE disagrees you have a ***** fit and demand they stop.

    How about all you whining critical diaper pooping fanatics stop instead ?
  • IanCutress - Tuesday, February 19, 2013 - link

    It's all about single-card performance. Everything just works easier with a single card. Start putting SLI into the mix and you need to account for drivers, and doing compute requires a complete reworking of code. Not to mention the potentially lower power draw and OC capabilities of Titan over a dual-GPU card.

    At any given price point, getting two cards up to that cost will always be quicker than a single card in any scenario that can take advantage of them, if you're willing to put up with it. So yes, two GTX 680s, a 690, or a Titan is a valid question, and it's up to user preference which one to get.

    I need to double check my wallet, see if it hasn't imploded after hearing the price.
  • wreckeysroll - Tuesday, February 19, 2013 - link

    lost their minds?
    how about fell and cracked their head after losing it. Smoking too much of that good stuff down there in California.

    How stupid do they take us for? Way to thumb your customers in the eye, nvidia. $1000 for a single-GPU kit.

    Good laugh for the morning.
  • B3an - Tuesday, February 19, 2013 - link

    Use some ****ing common sense. You get what you pay for.

    6GB with a 384-bit memory bus, and a 551mm2 GPU. Obviously this won't be cheap, and there's no way this could be sold for anywhere near the price of a 680 without losing tons of money.

    Nvidia already had this thing in supercomputers anyway, so why not turn it into a consumer product? Some people WILL buy this; if you have the money, why not. At least NV are not sitting on their arses like AMD, with no new high-end GPUs this year. Even though I have AMD cards, I'm very disappointed with AMD's crap lately, as an enthusiast and someone who's just interested in GPU tech. First they literally gave up on competitive performance CPUs, and now it's looking like they're doing it with GPUs.
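
    As a quick sanity check on those specs: memory bandwidth follows directly from bus width and data rate. A minimal sketch, assuming Titan's 6Gbps-effective GDDR5 (the launch spec is 6.008Gbps) on the 384-bit bus:

```python
# Bandwidth = (bus width in bytes) x (effective data rate per pin).
bus_bits = 384
data_rate_gbps = 6.0  # effective GDDR5 rate; Titan's launch spec is 6.008Gbps

bandwidth_gb_s = (bus_bits / 8) * data_rate_gbps
print(f"{bandwidth_gb_s:.0f} GB/s")  # ~288 GB/s, matching the quoted spec
```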
  • siliconfiber - Tuesday, February 19, 2013 - link

    Common sense is what you are missing.

    The GTX 580, 480, and 285 were all sold for much less than this card, were all used in HPC applications, and had the same or much bigger die sizes and the same or bigger bus. DDR memory is dirt cheap as well.

    I have seen it all now. Largest rip-off in the history of video cards right here.
  • Genx87 - Tuesday, February 19, 2013 - link

    Oh look, I have never seen this argument before. Biggest rip-off in the history of video cards, preceded only by every high-end video card release since the introduction of high-end discrete GPUs. And it will remain a rip-off until the next high-end GPU is released, surpassing this card's rip-off factor.
  • Blibbax - Tuesday, February 19, 2013 - link

    It's not a rip off because you don't have to buy it. The 680 hasn't gotten any slower.

    Just like with cars and anything else, when you add 50% more performance to a high-end product, it's gunna be a lot more than 50% more expensive.
  • johnthacker - Tuesday, February 19, 2013 - link

    The largest rip-offs in the history of video cards are some of the Quadro cards. This is extremely cheap for a card with such good FP64 performance.
  • TheJian - Wednesday, February 20, 2013 - link

    The GTX 580 (40nm) was not in the same league as this and only had 3B transistors; Titan has 7.1B on 28nm. 512 CUDA cores compared to 2688? It came with 1.5GB of memory too; this has 6GB, etc. etc. The 580 did not run like a $2500 pro card at a $1500 discount either. Also, a chip this complicated doesn't YIELD well. It's very expensive to toss out the bad ones.

    Do you know the difference between system memory and graphics memory (you said DDR)? They do not cost the same. You meant GDDR? Well, this stuff is 4x as much, running at 6GHz, not 4GHz.

    The reference boost clock is 876MHz, but these guys got theirs to 1176:
    http://www.guru3d.com/articles-pages/geforce_gtx_t...

    The card is a monster value vs. the $2500 K20. Engineering is not FREE. Ask AMD; they lost 1.18B last year selling crap at prices that would make you happy, I guess. That's how you go out of business. Get it? They haven't made money in 10yrs (lost 3-4B over that time as a whole). Think they should've charged more for their cards/chips over the last ten years? I DO. If Titan is priced wrong, they will remain on the shelf, correct? So if you're right, they won't sell. These will be gone in a day, because there are probably enough people who would pay $1500 for them that they'll sell out quickly. You have to pay $2500 to get this on the pro side.
  • chizow - Friday, February 22, 2013 - link

    Um, GF100/110 are absolutely in the same league as this card. In the semiconductor industry, size = classification. This is not the first 500+mm^2 ASIC Nvidia has produced; the lineage is long and distinguished:

    G80, GT200, GT200b, GF100, GF110.

    *NONE* of these GPUs cost $1K, only the 8800Ultra came anywhere close to it at $850. All of these GPUs offered similar features and performance relative to the competition and prevailing landscape. Hell, GT200 was even more impressive as it offered a 512-bit memory interface.

    The increase in the number of transistors is just Moore's Law; that's expected progress. If you don't know the material you're discussing, please refrain from commenting, thank you.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    Wait a minute, doofus, you said the memory costs the same, and that it's cheap.
    You entirely disregarded the more-than-double core transistor count, the R&D for it, the yield factor, the high build quality, and the new and extra top tier it resides in, not to mention its awesome features the competition did not develop and does not have, AT ALL.
    4 monitors out of the box, single-card 3D and Surround, an extra monitor for surfing, target frame rate, TXAA, no tessellation lag, and on and on.

    Once a product breaks out far from the competition's underdeveloped and undeveloped failures, it EARNS a price tier.

    You're living in the past, you're living with the fantasy of zero worldwide inflation, you're living the lies you've told yourself and all of us about the last 3 top-tier releases, all your arguments exposed in prior threads for the exaggerated lies they were and are, and the Charlie D RUMORS all of you of this same ilk repeat, even as you ignore the years-long DEV time and entire lack of production capability with your tinfoil-hat whine.

    The market has changed, you fool. There was a SEVERE SHORTAGE of manufacturing capacity (negating your conspiracy theory entirely) and there's still pressure, and nVidia has developed a large range of added features the competition entirely lacks.

    You didn't get the 680 for $350 (even though you still 100% believe Charlie D's lie-filled rumor) and you're not getting this for your fantasy lie price either.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    NONE had the same or much bigger die sizes.
    NONE had 7.1 BILLION engineer traced research die points.
    NONE had the potential downside low yield.
    NONE had the twice plus expensive ram in multiples more attached.

    NONE is the amount of truth you told.
  • Stuka87 - Tuesday, February 19, 2013 - link

    Common sense would say nVidia is charging double what they should be.

    A 384-bit memory bus is certainly not a reason for high cost, as AMD uses one in the 79x0 series chips. A large die adds to cost, but the 580 had a big die as well (520mm2), so that can't be the whole reason for the high cost (the GK110 does have more transistors).

    So it comes down to nVidia wanting to scalp customers.

    As for your comments on AMD, what proof do you have that AMD has nothing else in the works? Not sure what crap you are referring to. I have had no issues with my AMD cards or their drivers (or my nVidias, for that matter). Just keep on hating for no reason.
  • AssBall - Tuesday, February 19, 2013 - link

    You speak of common sense, but miss the point. When have you ever bought a consumer card for the pre-listed MSRP? These cards will sell to OEMs for compute and to enthusiasts via Nvidia's partners for much less.

    So it comes down to "derp Nvidia is a company that wants to make money derp".

    Calling someone a hater for unrealistic reasons is much less of an offense than being generally an idiot.
  • TheJian - Wednesday, February 20, 2013 - link

    A chip with 7.1B transistors is tougher to make correctly than one with 3B. Which $500 card from AMD has 6GB of 6GHz memory and this performance? The 7990 is $900-1000 with 6GB and is poorly engineered compared to this (nearly double the watts, two slots, more heat, etc. etc.).

    This is why the K20 costs $2500. They get far fewer of these perfect than much simpler chips. Also, as said before, engineering these is not free. AMD charges less, you say? Their bottom line for last year shows it too... a 1.18B loss. That's why AMD will have no answer until the end of the year: they can't afford to engineer an answer now. They just laid off 30% of their workforce because they couldn't afford them. NV hired 800 people last year for new projects. You do that with profits, not losses. You quit giving away free games or go out of business.

    Let me know when AMD turns a profit for a year. I guess you won't be happy until AMD is out of business. I think you're hating on NV for no reason. If they were anywhere near scalping customers they should have record PROFITS but they don't. Without Intel's lawsuit money (300mil a year) they'd be making ~1/2 of what they did in 2007. You do understand a company has to make money to stay in business correct?

    If NV charged 1/2 the price for this, they would probably be losing a few hundred on each one rather than making a profit of $200 or so.

    The K20 is basically the same card for $2500. You're lucky they're pricing it at $1000 for what you're getting. Amazon paid $2000 each for 10000 of these as K20's. You think they feel robbed? So by your logic, they got scalped 20,000 times, since they paid double the asking price here on 10000 of them?... ROFL. OK.

    What it comes down to is NV knows how to run a business, while AMD knows how to run one into the ground. AMD needs to stop listening to people like you and start acting like NV or they will die.

    AMD killed themselves the day they paid 3x the price they should have for ATI. Thank Hector Ruiz for that. He helped to ruin Motorola too if memory serves...LOL. I love AMD, love their stuff, but they run their business like idiots. Kind of like Obama runs the country. AMD is running a welfare business (should charge more, and overpays for stuff they shouldn't even buy), obama runs a welfare country, and pays for crap like solyndra etc he shouldn't (with our money!). Both lead to credit downgrades and bankruptcy. You can't spend your way out of a visa bill. But both AMD and Obama think you can. You have to PAY IT OFF. Like NV, no debt. Spend what you HAVE, not what you have to charge.

    Another example: IMG.L just paid triple what they should have for the scraps of MIPS. I think this will be their downfall. They borrowed 22 million to fund a 100mil bid for MIPS; it was worth 30mil. That, along with having chip sales up 90% but not charging Apple enough for them. They only made 30mil in 6 MONTHS! Their chip powers the graphics in all of Apple's phones and tablets! They have a Hector Ruiz type running their company too, I guess. Hope they fire him before he turns them into AMD. Until Tegra 4, they have the best GPU on an SoC in the market, but they make 1/10 of what NV does. Hmmm... wrong pricing? Apple pockets 140Bil over the life of the iPad/iPhone... but IMG.L had to borrow 22mil just to buy a 100mil company? They need to pull a Samsung and raise prices 20% on Apple. NV bought Icera with 325mil cash... and still has 3.74B in the bank (which, btw, is really only up from 2007 because of Intel's 300mil/yr, not from overcharging you).
  • CeriseCogburn - Sunday, February 24, 2013 - link

    Appreciate it. Keep up the good work, as in telling the basic facts called the truth to the dysfunctional drones.

    no physx
    no cuda
    no frame rate target (this is freaking AWESOME, thanks nVidia)
    no "cool n quiet" on the fly GPU heat n power optimizing max fps
    no TXAA
    no same game day release drivers

    EPIC FAIL on dual cards, yes even today for amd

    " While it suffers from the requirement to have proper game-specific SLI profiles for optimum scaling, NVIDIA has done a very good job here in the past, and out of the 19 games in our test suite, SLI only fails in F1 2012. Compare that to 6 out of 19 failed titles with AMD CrossFire."

    http://www.techpowerup.com/reviews/NVIDIA/GeForce_...

    nVidia 18 of 19, 90%+ GRADE AAAAAAAAAA

    amd 13 of 19 < 70% grade DDDDDDDDDD
  • Iketh - Tuesday, February 19, 2013 - link

    please drag yourself into the street and stone yourself
  • CeriseCogburn - Sunday, February 24, 2013 - link

    LOL, awww, now that wasn't very nice... May I assume you aren't in the USA and are instead in some 3rd-world hole with some 3rd-world currency and economy, where you can't pitch up a few bucks because there's no welfare available? Thus your angry, hate-filled death wish?
  • MrSpadge - Tuesday, February 19, 2013 - link

    Don't worry.. price will drop if they're actually in a hurry to sell them.
  • mrdude - Tuesday, February 19, 2013 - link

    I doubt it, given the transistor count and die size. This thing isn't exactly svelte with its 7.1 billion transistors. The number of viable chips per wafer must be quite low, hence the price tag.

    What I don't understand is why people would buy a $1000 GPU for compute. I can understand why somebody buys a ~$300 GPU to add a little extra horsepower to their small selection of applications, but if you're paying $1000 for a GPU then you're also expecting a decent set of drivers as well. But both AMD and nVidia have purposely neutered their consumer cards' performance for most professional tasks and applications. As a result, you can buy a cheaper FirePro or Quadro with professional drivers, based on a smaller die/GPU (like a 7850 or 660 Ti), that will outperform this $1000 single-GPU card in a variety of software.

    If I'm paying upwards of $1000 for a GPU, it sure as hell has to work. Buying a consumer grade GPU and relying on consumer (gaming) drivers just means that you'll almost never hit anywhere near the max theoretical throughput of the card. In essence, you're paying for performance which you'll never get anywhere close to.

    This is a perfect card for the fools who overspend on their gaming GPUs. For everyone else it's just a high-priced bore.
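
    mrdude's viable-chips-per-wafer point can be made concrete with the standard dies-per-wafer estimate and a Poisson yield model. A rough sketch, assuming a 300mm wafer and the 551mm2 GK110 die; the defect density is an illustrative guess, not a published TSMC figure:

```python
import math

wafer_diameter_mm = 300.0
die_area_mm2 = 551.0   # GK110 die size

# Gross dies per wafer: wafer area over die area, minus an edge-loss term.
r = wafer_diameter_mm / 2
gross_dies = (math.pi * r**2 / die_area_mm2
              - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Poisson yield model: fraction of dies with zero defects.
d0_per_cm2 = 0.25   # assumed defect density (defects/cm^2), illustrative only
yield_fraction = math.exp(-d0_per_cm2 * die_area_mm2 / 100)

print(f"gross dies/wafer: {gross_dies:.0f}")                   # ~100
print(f"assumed yield:    {yield_fraction:.0%}")               # ~25%
print(f"good dies/wafer:  {gross_dies * yield_fraction:.0f}")  # ~25
```

    A couple dozen fully working 551mm2 dies per wafer, before binning down to Titan's 14-of-15-SMX configuration, goes a long way toward explaining the price tag.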
  • CeriseCogburn - Sunday, February 24, 2013 - link

    All those fools, we have been told over and over, and in fact very recently by the site's own staff, are here!

    That's what this is for, dimwit. Not for crybaby losers who can barely scrape up an HD 5750.

    Let's face it, every one of you whining jerks is drooling uncontrollably for this flagship, and if you're just a loser with a 450W power supply, no worries, they're being sold in high priced systems with that.

    You'd take it in a minute, happily, and max out your games and your 1920x1080 monitor in MOST games.

    I mean I have no idea what kind of poor all you crybabies are. I guess you're all living in some 3rd world mudhole.
  • madmilk - Thursday, February 21, 2013 - link

    They're clearly not in any kind of hurry, given how well Tesla is selling at 3 times the price. These are probably just the rejects, set to a higher voltage and TDP and sold to the consumer market.
  • mrdude - Thursday, February 21, 2013 - link

    Oh yeah, nVidia is never going to jeopardize the cash cow that is Tesla for the HPC crowd, or Quadro for the professional market. The margins there aren't worth giving up in order to bring GPU compute (and its drivers) to the mass market.

    This notion that this is a GPGPU card is silly, frankly. We can throw around the max theoretical GFLOPs/TFLOPs figures all we please, the reality is that you'll never see anywhere close to those in professional applications. There are two reasons for that: Tesla and Quadro.
  • chizow - Tuesday, February 19, 2013 - link

    Yeah, totally agree with the post title, Nvidia has lost their fking minds.

    And PS: The X-Men *STILL* want their logo back.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    This isn't G80-era Kansas anymore, Dorothy.

    Do any of you people live in the USA?

    I mean really, how frikkin poor are all you crybabies, and how do you even afford any gaming system or any games?

    Are you all still running low-end C2Ds, no SSDs, and 1280x1024? Do you live in a box?

    How can you be in the USA and whine about this price on the very top-end product for your Lifetime Hobby?

    What is wrong with you, is the question.
  • Pariah - Tuesday, February 19, 2013 - link

    In most cases, this card won't make sense. There are at least a couple of scenarios where it might, though. One: an ultra-high-end gaming system. That means multiple Titan cards. Because these are single-GPU cards, an SLI Titan setup should scale much better than an SLI'd 690 with 4 GPUs would, and that point goes even further with triple-SLI Titans.

    Secondly, this card is smaller and uses less power than a 690, which means you can use it in much smaller cases, even some mini-ITX cases. That would be one helluva nice portable LAN box.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    This card makes sense for anyone running a mid-range Sandy Bridge and a 1920x1080 monitor.
    After I complained about the 1920x1200 reviews here, pointing out nVidia is 12% BETTER compared to amd at the former resolution, 50 raging amd fanboys screeched that they have a 1920x1200 monitor they run all the time and that they were more than willing to pop the extra 150 bucks for it over a 1920x1080...

    So we can safely assume MOST of the people here have a 1920x1080, for Pete's sake.
    A low-end Sandy is $50 to $80, same for a board, and DDR3 is the cheapest RAM.
    So for less than 200 bucks of prep at most (use your old case+PS), nearly everyone here is ready to run this card, and would find benefit in doing so.

    Now lying about that just because they don't plan on buying one is what most here seem to want to do.

  • Deo Domuique - Friday, March 8, 2013 - link

    This card should cost ~$600-650, not a single cent more. The rest is a la Apple markup for the mindless consumer. Unfortunately, there are a lot of them.
  • trajan2448 - Tuesday, February 19, 2013 - link

    Obviously a great piece of technology. Interested to see what the overclockers can achieve.
    If it was $700 it would make a lot more sense. Nonetheless, it'll be fun to see some fanatics do a tri-SLI overclocked setup and blow up their monitor.
  • TheJian - Wednesday, February 20, 2013 - link

    http://www.guru3d.com/articles-pages/geforce_gtx_t...
    1176MHz, up from the 876 boost clock. Not bad for what's basically a $2500 K20 for $1000. I've never done the homework on it, but I don't think K20s overclock; I could be wrong, though.

    Can't wait to see the review tomorrow. Clearly he'll bench it there and he has 3 :) You should get your answers then :)

    I'm wondering if some hacker will enable the K20 drivers, or if that's even possible. It seems a lot of reviewers got 3, so you should have lots of data by the weekend.
  • Bill Brasky - Tuesday, February 19, 2013 - link

    There were rumors this card would launch at 799-899, which made more sense. But for a grand, this thing had better be pretty darn close to a 690.
  • wand3r3r - Tuesday, February 19, 2013 - link

    The price tag just makes this card a failure. It's a 580 replacement no matter how they label it, so they can shove it. They lost a potential customer...
  • karasaj - Tuesday, February 19, 2013 - link

    So what is the 680?
  • Sandcat - Tuesday, February 19, 2013 - link

    A GK104, which replaced the GF104.

    The GK110 is the replacement for the GF110, which was the GTX 580.
  • Ananke - Tuesday, February 19, 2013 - link

    The 680 was meant as a 560 Ti replacement... however, NVidia decided it turned out too good to be sold too cheap, and changed the model numbering... I have several close friends in marketing at NV :)
    However, NV has been using this GK110 core for HPC from the very beginning in the Tesla cards, since there they really cannot skimp on the double precision.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    BS.
    The 680 core is entirely different, the rollout time is over half a year apart, and that just doesn't happen on a whim in Jan 2012 because of the purported post-mental-breakdown 7970 epic failure by amd...

    So after the scat lickers spew the 7970 amd failure, they claim it's the best card ever, even now.
    R O F L

    Have it both ways rumor mongering retreads. No one will notice... ( certainly none of you do).
  • rolodomo - Tuesday, February 19, 2013 - link

    Their business model has become separating money from the wallets of the well-to-do who have no sense of value or technology (NVIDIA's PR admits this in the article). It is a business model, but a boutique one. It doesn't do much for their brand name in the view of the technorati either (NVIDIA: We Market to Suckers).
  • Wreckage - Tuesday, February 19, 2013 - link

    It's almost as fast as a pair of 7970's that cost $1100 at launch.

    AMD set the bar on high prices. Now that they are out of the GPU race, don't expect much to change.

    At least NVIDIA was able to bring a major performance increase this year, while AMD has become the new Matrox.
  • Stuka87 - Tuesday, February 19, 2013 - link

    AMD is out of the GPU race? What are you smoking? A $1000 card does not put AMD out of the GPU race. The 7970GE competes well with the 680 for less money (they go back and forth depending on the game).

    Now if this card were priced at $500, that would hurt AMD, as the prices on the 660/670/680 would all drop. But that's not the case, so your point is moot. Not to mention this card was due out a year ago and got delayed, which is why the GK104 was bumped up to the 680 slot.
  • bigboxes - Tuesday, February 19, 2013 - link

    This is Wreckage we're talking about. He's trolling. Nothing to see here. Move along.
  • chizow - Tuesday, February 19, 2013 - link

    I agree with his title, that AMD is at fault for the start of all of this, but not necessarily with the rest of his reasoning. Judging from your last paragraph, you probably agree to some degree as well.

    This all started with AMD's pricing of the 7970, plain and simple: $550 for a card that didn't come anywhere close to justifying that price against the last-gen GTX 580. A good card, but completely underwhelming in that flagship slot.

    The 7970 pricing allowed Nvidia to:

    1) price their mid-range ASIC, GK104, at flagship SKU position
    2) undercut AMD to boot, making them look like saints at the time and
    3) delay the launch of their true flagship SKU, GK100/110 nearly a full year
    4) Jack up the prices of the GK110 as an ultra-premium part.

    I saw #4 coming well over a year ago, which was my biggest concern over the whole 7970 pricing and GK104 product-placement fiasco, but I had no idea Nvidia would be so usurious as to charge $1k for it. I was expecting $750-800... $1k... Nvidia can go whistle.

    But yes, long story short, Nvidia's greed got us here, but AMD definitely started it all with the 7970 pricing. None of this happens if AMD prices the 7970 in-line with their previous high-end in the $380-$420 range.
  • TheJian - Wednesday, February 20, 2013 - link

    You realize you're dogging AMD for pricing when they lost 1.18B for the year, correct? Seriously, you guys, how are you not understanding that they don't charge ENOUGH for anything they sell? They had to lay off 30% of the workforce because they don't make any money at your ridiculous pricing. Your idea of pricing is KILLING AMD. It wasn't enough that they laid off 30%, lost their fabs, etc... You want AMD to keep losing money by pricing this stuff below what they need to survive? This is the same reason they lost the CPU war. They charged less for their chips for the whole 3yrs they were beating Intel's P4/PresHot to death in benchmarks... NV isn't charging too much; AMD is charging too LITTLE.

    AMD has lost 3-4B over the last 10yrs. This means ONE thing. They are not charging you enough to stay in business.

    This is not complicated. I'm not asking you guys to do calculus here or something. If I run up X bills to make product Y, and after selling Y can't pay X I need to charge more than I am now or go bankrupt.

    Nvidia is greedy because they aren't going to go out of business? Without Intel's money they are making 1/2 what they did 5yrs ago. I think they should charge more, but this is NOT gouging or they'd be making some GOUGING like profits correct? I guess none of you will be happy until they are both out of business...LOL
  • chizow - Wednesday, February 20, 2013 - link

    1st of all, AMD as a whole lost money; AMD's GPU division (formerly ATI) has consistently operated at a small level of profit. So the impact of GPU pricing on their profits is obviously going to be lost in the sea of red ink on AMD's P&L statement.

    Secondly, the massive losses and devaluation of AMD has nothing to do with their GPU pricing, as stated, the GPU division has consistently turned a small profit. The problem is the fact AMD paid $6B for ATI 7 years ago. They paid way too much, most sane observers realized that 7 years ago and over the past 5-6 years it's become more obvious. The former ATI's revenue and profits did not justify the $6B price tag and as a result, AMD was *FORCED* to write down their assets as there were some obvious valuation issues related to the ATI acquisition.

    Thirdly, AMD has said this very month that sales of their 7970/GHz GPUs in January 2013 alone exceeded sales of those cards in the previous *TWELVE MONTHS* prior. What does that tell you? It means their previous price points that steadily dropped from $550>500>$450 were more than the market was willing to bear given the product's price:performance relative to previous products and the competition. Only after they settled in on that $380/$420 range for the 7970/GHz edition along with a very nice game bundle did they start moving cards in large volume.

    Now you do the math, if you sell 12x as many cards in 1 month at $100 profit instead of 1/12x as many cards at $250 profit over the course of 1 year, would you have made more money if you just sold the higher volume at a lower price point from the beginning? The answer is yes. This is a real business case that any Bschool grad will be familiar with when performing a cost-value-profit analysis.
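
    chizow's volume argument is easy to sanity-check with round numbers. A sketch where the per-card margins are his and the volumes are purely illustrative:

```python
# Illustrative cost-volume-profit comparison; the volumes are invented.
old_volume_per_month = 1_000    # cards/month at the original ~$550 price
old_margin = 250                # assumed profit per card at that price

new_volume_per_month = 12_000   # "12x as many" after the price cuts
new_margin = 100                # assumed profit per card after the cuts

print(f"old pricing: ${old_volume_per_month * old_margin:>9,}/month")  # $250,000
print(f"new pricing: ${new_volume_per_month * new_margin:>9,}/month")  # $1,200,000
```

    If the demand response really was anywhere near 12x, the lower price point wins by almost 5x on gross profit, which is the point of the cost-volume-profit exercise.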
  • CeriseCogburn - Sunday, February 24, 2013 - link

    Wow, first of all, basic common sense is all it takes, not some stupid idiot class for losers who haven't a clue and can't do 6th grade math.

    Unfortunately, in your raging fanboy fever pitch, you got the facts WRONG.

    AMD said it sold more in January than any other SINGLE MONTH of 2012 including "Holiday Season" months.

    Nice try there spanky, the brain farts just keep a coming.
  • frankgom23 - Tuesday, February 19, 2013 - link

    Who wants to pay more for less?
    No new features... this is a paper launch of a useless board for the consumer. I don't even need to see official benchmarks; I'm completely disappointed.
    Maybe it's time to go back to ATI/AMD.
  • imaheadcase - Tuesday, February 19, 2013 - link

    If you would actually READ the article, you would know why.

    I love how people cry a river without actually knowing how the card will perform yet.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    Yes, go back; your true home is with losers and fools and crashers and bankrupt idiots who cannot pay for their own stuff.

    The last guy I talked to who installed a new AMD card for his awesome Eyefinity gaming setup struggled for several days, dozens of hours, to get the damned thing stable. He exclaimed several times that he had finally done it, yet the next day he was at it again. Finally he took the thing outside and threw it against the brick wall, "shattering it into 150 pieces," and "he's not going dumpster diving," he tells me, to retrieve a piece or part of it that might help him repair one of the two other DEAD upper-range AMD cards (of 4 dead AMD cards in the house) he recently bought for his mega gaming system.
    ROFL
    Yeah man, not kidding. He doesn't like nVidia, by the way; he's still an AMD fanboy.
    He is a huge gamer with multiple systems running all day and night, and his "main" is "down"... needless to say it was quite stressful for him, and it has done nothing good for the very long friendship.
    LOL - he took it, and in a seeing-red rage, smashed that puppy to smithereens against the brick wall.

    So please, head back home; lots of lonely AMD gamers need support.
  • iMacmatician - Tuesday, February 19, 2013 - link

    "For our sample card this manifests itself as GPU Boost being disabled, forcing our card to run at 837MHz (or lower) at all times. This is why NVIDIA’s official compute performance figures are 4.5 TFLOPS for FP32, but only 1.3 TFLOPS for FP64. The former assumes that boost is enabled, while the latter is calculated around GPU Boost being disabled. The actual execution rate is still 1/3."

    But the 837 MHz base and 876 MHz boost clocks give 2·(876 MHz)·(2688 CCs) = 4.71 SP TFLOPS and 2·(837 MHz)·(2688 CCs)·(1/3) = 1.50 DP TFLOPS. What's the reason for the discrepancies?
  • Ryan Smith - Tuesday, February 19, 2013 - link

    Apparently in FP64 mode Titan can drop down to as low as 725MHz in TDP-constrained situations. Hence 1.3TFLOPS, since that's all NVIDIA can guarantee.
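    For what it's worth, the peak-rate arithmetic is easy to sanity-check. Here's a small sketch of my own (not an NVIDIA formula); interestingly, 2 x 837 MHz x 2688 comes out to almost exactly 4.5 TFLOPS, and 2 x 725 MHz x 2688 / 3 to 1.3 TFLOPS, matching the official figures:

    // flops_check.cpp - peak rate = 2 FLOPs/FMA x clock x cores x FP64 ratio.
    #include <cstdio>

    static double tflops(double mhz, double cores, double ratio) {
        return 2.0 * mhz * 1e6 * cores * ratio / 1e12;
    }

    int main() {
        const double cores = 2688.0;
        printf("FP32 @ 876 MHz boost: %.2f TFLOPS\n", tflops(876.0, cores, 1.0));        // 4.71
        printf("FP32 @ 837 MHz base:  %.2f TFLOPS\n", tflops(837.0, cores, 1.0));        // 4.50
        printf("FP64 @ 837 MHz base:  %.2f TFLOPS\n", tflops(837.0, cores, 1.0 / 3.0));  // 1.50
        printf("FP64 @ 725 MHz floor: %.2f TFLOPS\n", tflops(725.0, cores, 1.0 / 3.0));  // 1.30
        return 0;
    }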
  • tipoo - Tuesday, February 19, 2013 - link

    It seems that if you were targeting maximum performance, being able to decouple them would make sense: the GPU would have more thermal headroom and would run cooler on average with the fan working harder, letting it hit the boost clocks more often.
  • Ryan Smith - Tuesday, February 19, 2013 - link

    You can always manually adjust the fan curve. NVIDIA is simply moving it with the temperature target by default.
  • Golgatha - Tuesday, February 19, 2013 - link

    WTF nVidia!? Seriously, WTF!?

    $1000 for a video card. Are they out of their GD minds!?
  • imaheadcase - Tuesday, February 19, 2013 - link

    No, read the article, you twat.
  • tipoo - Tuesday, February 19, 2013 - link

    If they released a ten-thousand-dollar card, what difference would it make to you? This isn't exactly their offering for mainstream gamers.
  • jackstar7 - Tuesday, February 19, 2013 - link

    I understand that my setup is a small minority, but I have to agree with the review about the port configuration. Not moving to multi-mDP on a card of this level just seems wasteful. As long as we're stuck with DVI, we're stuck with bandwidth limits that are going to stand in the way of 120Hz for higher resolutions (as seen on the Overlords and Catleap Extremes). Now I have to hope for some AIB to experiment with a $1000 card, or more likely wait for AMD to catch up to this.
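    A rough sketch of the arithmetic behind that bandwidth claim (the ~20% blanking overhead here is an assumption; reduced-blanking timings squeeze these numbers down somewhat, which is how the Catleap-style overclocks get by):

    // pixel_clock.cpp - rough pixel-clock estimates vs. the ~330 MHz
    // dual-link DVI ceiling; the 20% blanking overhead is an assumption.
    #include <cstdio>

    static double mhz(double w, double h, double hz) {
        return w * h * hz * 1.20 / 1e6;   // assumed blanking overhead
    }

    int main() {
        printf("1920x1080 @ 120 Hz: ~%.0f MHz\n", mhz(1920, 1080, 120));  // ~299, fits DL-DVI
        printf("2560x1440 @  60 Hz: ~%.0f MHz\n", mhz(2560, 1440, 60));   // ~265, fits
        printf("2560x1440 @ 120 Hz: ~%.0f MHz\n", mhz(2560, 1440, 120));  // ~531, needs DP
        return 0;
    }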
  • akg102 - Tuesday, February 19, 2013 - link

    I'm glad Ryan got to experience this Nvidia circle jerk 'first-hand.'
  • Arakageeta - Tuesday, February 19, 2013 - link

    The Tesla- and Quadro-line GPUs have two DMA copy engines. This allows the GPU to simultaneously send and receive data on the full-duplex PCIe bus. However, the GeForce GPUs traditionally have only one DMA copy engine. Does the Titan have one or two copy engines? Since Titan has Tesla-class DP, I thought it might also have two copy engines.

    You can run the "deviceQuery" command that is a part of the CUDA SDK to find out.
  • Ryan Smith - Tuesday, February 19, 2013 - link

    1 copy engine. The full output of DeviceQuery is below.

    CUDA Device Query (Runtime API) version (CUDART static linking)

    Detected 1 CUDA Capable device(s)

    Device 0: "GeForce GTX TITAN"
    CUDA Driver Version / Runtime Version 5.0 / 5.0
    CUDA Capability Major/Minor version number: 3.5
    Total amount of global memory: 6144 MBytes (6442123264 bytes)
    (14) Multiprocessors x (192) CUDA Cores/MP: 2688 CUDA Cores
    GPU Clock rate: 876 MHz (0.88 GHz)
    Memory Clock rate: 3004 Mhz
    Memory Bus Width: 384-bit
    L2 Cache Size: 1572864 bytes
    Max Texture Dimension Size (x,y,z) 1D=(65536), 2D=(65536,65536), 3D=(4096,4096,4096)
    Max Layered Texture Size (dim) x layers 1D=(16384) x 2048, 2D=(16384,16384) x 2048
    Total amount of constant memory: 65536 bytes
    Total amount of shared memory per block: 49152 bytes
    Total number of registers available per block: 65536
    Warp size: 32
    Maximum number of threads per multiprocessor: 2048
    Maximum number of threads per block: 1024
    Maximum sizes of each dimension of a block: 1024 x 1024 x 64
    Maximum sizes of each dimension of a grid: 2147483647 x 65535 x 65535
    Maximum memory pitch: 2147483647 bytes
    Texture alignment: 512 bytes
    Concurrent copy and kernel execution: Yes with 1 copy engine(s)
    Run time limit on kernels: Yes
    Integrated GPU sharing Host Memory: No
    Support host page-locked memory mapping: Yes
    Alignment requirement for Surfaces: Yes
    Device has ECC support: Disabled
    CUDA Device Driver Mode (TCC or WDDM): WDDM (Windows Display Driver Model)
    Device supports Unified Addressing (UVA): Yes
    Device PCI Bus ID / PCI location ID: 3 / 0
    Compute Mode:
    < Default (multiple host threads can use ::cudaSetDevice() with device simultaneously) >

    deviceQuery, CUDA Driver = CUDART, CUDA Driver Version = 5.0, CUDA Runtime Version = 5.0, NumDevs = 1, Device0 = GeForce GTX TITAN
  • tjhb - Tuesday, February 19, 2013 - link

    Thank you!

    It seems to me NVIDIA are being incredibly generous to CUDA programmers with this card. I can hardly believe they've left FP64 capability at the full 1/3 rate. (The ability to switch between 1/24 at a high clock and 1/3 at a reduced clock seems ideal; a quick way to check which mode is active is sketched below.) And we get 14 of the 15 SMXs (a nice round number).

    Do you know whether the TCC driver can be installed for this card?
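    On the 1/24-vs-1/3 point, a crude FP64 throughput probe along these lines should show which mode is active (my own sketch, nothing official; at the full 1/3 rate you'd expect on the order of 1.3-1.5 TFLOPS, at 1/24 only a small fraction of that):

    // dp_probe.cu - crude FP64 throughput probe using dependent FMA chains.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void fmaLoop(double *out, int iters) {
        double c = threadIdx.x * 1e-9;
        for (int i = 0; i < iters; ++i)
            c = fma(1.000001, c, 0.999999);              // 1 FMA = 2 FLOPs
        out[blockIdx.x * blockDim.x + threadIdx.x] = c;  // defeat dead-code elimination
    }

    int main() {
        const int blocks = 14 * 16, threads = 256, iters = 100000;
        double *d_out;
        cudaMalloc(&d_out, blocks * threads * sizeof(double));

        cudaEvent_t t0, t1;
        cudaEventCreate(&t0); cudaEventCreate(&t1);
        cudaEventRecord(t0);
        fmaLoop<<<blocks, threads>>>(d_out, iters);
        cudaEventRecord(t1);
        cudaEventSynchronize(t1);

        float ms = 0.0f;
        cudaEventElapsedTime(&ms, t0, t1);
        const double flops = 2.0 * iters * blocks * threads;
        printf("~%.2f TFLOPS FP64\n", flops / (ms * 1e-3) / 1e12);
        cudaFree(d_out);
        return 0;
    }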
  • vacaloca - Wednesday, February 20, 2013 - link

    I'm assuming the TCC driver would not work stock... but if it's anything like the GTX 480, which could be BIOS/softstrap-modded to work as a Tesla C2050, it might be possible to get HyperQ MPI, GPU Direct RDMA, and TCC support by doing the same with a K20 or K20X BIOS. This would probably mean the display outputs on the Titan card would be bricked. That being said, it's not entirely trivial... see below for details:

    https://devtalk.nvidia.com/default/topic/489965/cu...
  • tjhb - Thursday, February 21, 2013 - link

    That's an amazing thread. How civilised, that NVIDIA didn't nuke it.

    I'm only interested in what is directly supported by NVIDIA, so I'll use the new card for both display and compute.

    Thanks!
  • Arakageeta - Wednesday, February 20, 2013 - link

    Thanks! I wasn't able to find this information anywhere else.

    Looks like the cheapest current-gen dual-copy engine GPU out there is still the Quadro K5000 (GK104-based) for $1800. For a dual-copy engine GK110, you need to shell out $3500. That's a steep price for a small research grant!
  • Shadowmaster625 - Tuesday, February 19, 2013 - link

    For the same price as this thing, AMD could make a 7970 with an FX-8350 all on the same die. Throw in 6GB of GDDR5, 288GB/sec of memory bandwidth, and a custom ITX board, and you'd have a generic PC gaming "console". Why don't they just release their own "AMDStation"?
  • Ananke - Tuesday, February 19, 2013 - link

    They will. It's called the Sony PlayStation 4 :)
  • da_cm - Tuesday, February 19, 2013 - link

    "Altogether GK110 is a massive chip, coming in at 7.1 billion transistors, occupying 551m2 on TSMC’s 28nm process."
    Damn, gonna need a bigger house to fit that one in :D.
  • Hrel - Tuesday, February 19, 2013 - link

    I've still never even seen a monitor that has a DisplayPort. Can someone please make a card with 4 HDMI ports, PLEASE!
  • Kevin G - Tuesday, February 19, 2013 - link

    Odd, I have two different monitors at home and a third at work that'll accept a DP input.

    They do carry a bit of a premium over those with just DVI, though.
  • jackstar7 - Tuesday, February 19, 2013 - link

    Well, I've got Samsung monitors that can only do 60Hz via HDMI, but 120Hz via DP. So I'd much rather see more DisplayPort adoption.
  • Hrel - Thursday, February 21, 2013 - link

    I only ever buy monitors with HDMI on them. I think anything beyond 1080p is silly (lack of native content). Both support WAY more than 1080p, so I see no reason to spend more. I'm sure if I bought a 2560x1440 monitor it'd have DP, but I won't ever do that. I'd buy a 19200x10800 monitor one day, though.
  • vacaloca - Tuesday, February 19, 2013 - link

    A while ago when the K20 released and my advisor didn't want to foot the bill, I ended up paying for it myself. Looks like the K20 might be going to eBay, since I don't need HyperQ MPI or GPU Direct RDMA, or ECC for that matter. I do suspect it might be possible to crossflash this card with a K20 or K20X BIOS and mod the softstraps to enable the missing features... but the video outputs would probably be useless (and the warranty void, etc.), so it's not really an exercise worth doing.

    Props to NVIDIA for releasing this for us compute-focused people and thanks to AnandTech for the disclosure on FP64 enabling. :)
  • extide - Tuesday, February 19, 2013 - link

    Can you please run some F@H benchmarks on this card? I would be very, very interested to see how well it folds. Also, could you provide some power consumption numbers (watts at system idle, and watts when only the GPU is folding)?

    That would be great :)
    Thanks!
  • Ryan Smith - Tuesday, February 19, 2013 - link

    OpenCL is broken with the current press drivers. So I won't have any more information until NVIDIA issues new drivers.
  • jimhans1 - Tuesday, February 19, 2013 - link

    Alright, the whining about this being a $1000 card is just stupid; nVidia has priced this right in my eyes on the performance/noise/temperature front. They have never billed this as anything other than an extreme-style GPU, just like the 690. Yes, the 690 will outperform it in raw usage, but not by much I'm guessing, and it will run hotter, louder, and use more power than the Titan, not to mention the possible SLI issues that have plagued ALL SLI/CF-on-one-PCB cards to date. If you want THE high-end MAINSTREAM card, you get the 680; if you want the EXTREME card(s), you get the Titan or 690.

    Folks, we don't yell at Ferrari or Bugatti for pricing their vehicles to their performance capabilities; nobody yelled at PowerColor for pricing the Devil 13 at $1000, even though the 690 spanks it on ALMOST all fronts for $100 LESS.

    Yes, I wish I could afford 1 or 3 Titans; but I am not going to yell and whine about the $1000 price because I CAN'T afford them. It gives me a goal to save my shekels and get at least 2 of them before year's end; hopefully the price may (but probably won't) have dropped by then.
  • chizow - Tuesday, February 19, 2013 - link

    The problem with your car analogy is that Nvidia is now charging you Bugatti prices for the same BMW series you bought 2 years ago. Maybe an M3 level of trim this time around, but it's the same class of car, just 2x the price.
  • Sandcat - Wednesday, February 20, 2013 - link

    The high end 28nm cards have all been exercises in gouging. At least they're being consistent with the 'f*ck the customer because we have a duopoly' theme.
  • Kevin G - Tuesday, February 19, 2013 - link

    The card is indeed a luxury product. Like all consumer cards, it is crippled in some way compared to the Quadro and Tesla lines. Not castrating FP64 performance is big. I guess nVidia finally realized that the HPC market values reliability more than raw compute, hence why EDC/ECC is what gets disabled here; ditto for RDMA, though I strongly suspect RDMA is still used for SLI between GeForce cards, just locked out for another vendor's hardware.

    The disabling of GPU Boost for FP64 workloads is odd. Naturally it should consume a bit more energy to run FP64 workloads, which would result in either higher temps at the same frequency as FP32 or lower clocks at the same temps. The surprise is that users don't have the flexibility to choose or adjust those settings.

    Display overclocking has me wondering exactly what is being altered. DVI and DP operate at distinct frequencies, and moving to a higher refresh rate at higher resolutions should also increase this. Cable quality would potentially have an impact here as well, though for lower resolutions, driving them at a higher refresh rate should still be within the cabling spec.
  • Kepe - Tuesday, February 19, 2013 - link

    The comment section is filled with NVIDIA hate: how they dropped the ball, lost their heads, smoked too much, and so on. What you don't seem to understand is that this is not a mainstream product. It's not meant for those who look at performance-per-dollar charts when buying their graphics cards. This thing is meant for those who have too much money on their hands, not the average Joe building his next gaming rig. And as such, this is a valid product at a valid price point. A bit like Intel's X-series processors: if you look at the performance compared to their more regular products, the $1000+ price is completely ridiculous.

    You could also compare the GTX Titan to a luxury phone. They use extravagant building materials and charge a lot extra for the design and "bling," but raw performance isn't on the level of what you'd expect just from looking at the price tag.
  • jimhans1 - Tuesday, February 19, 2013 - link

    I agree, the pricing is in line with the EXPECTED user base for the card; it is NOT a mainstream card.
  • Sandcat - Tuesday, February 19, 2013 - link

    The disconnect regards the GK110 chip. Sure, it's a non-mainstream card; however, people do have the impression that it's the lock-step successor to the 580, and as such should be priced similarly.

    Nvidia does need to be careful here. They enjoy a duopoly in the market, but goodwill is hard to create and maintain. I've been waiting for the 'real' successor to the 580 to replace my crossfired 5850s and wasn't impressed with the performance increase of the 680. Looks like it'll be another year... at least.

    :(
  • CeriseCogburn - Monday, March 4, 2013 - link

    lol - DREAM ON about goodwill and maintaining it.

    nVidia is attacked just like Intel, only worse. They have the least amount of "goodwill" any company could possibly have, as characterized by the dunderheads all over the boards and the whining reviewers who cannot stand the "arrogant know-it-all confident winners who make so much more money playing games as an nVidia rep"...

    Your theory is total crap.

    What completely overrides it is the simple IT JUST WORKS nVidia tech and end-user experience.
    Add in the many extra features and benefits, and that equals the money in the bank that lets the end user rest easy knowing new games won't become an abandoned, black-holed screen.

    Reputation? The REAL reputation is what counts, not some smarmy internet crybaby loser with lower self-esteem than a confident winner with SOLID products, the BEST of the industry.
    That's arrogance, that's a winner, that's a know-it-all, that's Mr. Confidence, that's the ca$h-and-carry ladies' magnet, and that's what someone rooting for the crybaby underdog loser crash crapster company cannot stand.
  • Galvin - Tuesday, February 19, 2013 - link

    Can this card do 10-bit video, or is it still limited to 8-bit?
  • alpha754293 - Tuesday, February 19, 2013 - link

    Does this mean that Tesla-enabled applications will be able to make use of Titan?
  • Ryan Smith - Tuesday, February 19, 2013 - link

    It depends on what features you're trying to use. From a fundamental standpoint even the lowly GT 640 supports the baseline Kepler family features, including FP64.
  • Ankarah - Tuesday, February 19, 2013 - link

    Highly unusual for a company to have two of their products at the exact same price point, catering to pretty much the same target audience.

    I guess it could be viewed as a poor man's Tesla, but as far as the gaming side goes, it's quite pointless next to the 690, not to mention very confusing to anyone other than those who are completely up to date on the latest news.
  • CeriseCogburn - Monday, March 4, 2013 - link

    Let's see: the fastest single-GPU card in the gaming world, much lower wattage, no need for profiles, consistent FPS improvement with none of the scaling issues across games, and you find it strange?

    I find your complete lack of understanding inexcusable, since you opened the piehole and removed all doubt.
  • Voidman - Tuesday, February 19, 2013 - link

    Finally something I could be excited about. I have a hard time caring much about the latest smartphone or tablet, but a new high-end video card is something different altogether. And then it turns out to be a "luxury product" priced at $1K. Cancel excitement. Oh well, I'm happy with my 680 still, and I'm pretty sure I've got overclocking room on it to boot. But for all those who love to hate on either AMD or Nvidia: this is what happens when one is not pushing the other. I have no doubt whatsoever that AMD would do the same if they were on top at the moment.

  • HanakoIkezawa - Tuesday, February 19, 2013 - link

    The price is a bit disappointing but not unexpected. I was hoping this would be $750-850, not so I could buy one, but so that I could get a second 670 a bit cheaper :D

    But in all seriousness, this coming out does not make the 680 or 670 any slower or less impressive, in the same way the 3970X's price tag doesn't make the 3930K any less of a compelling option.
  • johnsmith9875 - Tuesday, February 19, 2013 - link

    Why not just make the video card the computer and let the Intel chip handle graphics???
  • Breit - Tuesday, February 19, 2013 - link

    Thanks Ryan, this made my day! :)

    Looking forward to part 2...
  • hammer256 - Tuesday, February 19, 2013 - link

    Ryan's analysis of the target market for this card is spot on: this card is for small-scale HPC-type workloads, where the researcher just wants to build a desktop-like machine with a few of these cards. I know that's what I use for my research. To me, this is the real replacement for the GTX 580 for our purposes. The price hike is not great, but put in the context of the K20X, it's a bargain. I'm lusting to get 8 of these cards and a Tyan GPU server.
  • Gadgety - Tuesday, February 19, 2013 - link

    While gamers see little benefit, it looks like this is the card for GPU rendering, provided the software developers at V-Ray, Octane, and others find a way to tap into it. So one of these can replace the three 3GB GTX 580s.
  • chizow - Tuesday, February 19, 2013 - link

    Nvidia has completely lost their minds. Throwing in a minor bone with the non-neutered DP performance does not give them license to charge $1K for this part, especially when DP on previous flagship parts carried similar performance relative to Tesla.

    First it was $500 for a mid-range ASIC in the GTX 680, then a $1200 GTX 690, and now a $1000 GeForce Titan. Unbelievable. Best of luck Nvidia; good luck competing with the next-gen consoles at these price points, or even with yourselves next generation.

    While AMD is still at fault in all of this for their ridiculous launch pricing for the 7970, these recent price missteps from Nvidia make that seem like a distant memory.
  • ronin22 - Wednesday, February 20, 2013 - link

    Bullshit from a typical NV hater.

    The compute side of the card isn't a minor bone; it's the prime feature, along with the single-chip GTX 690-like performance.

    "especially when DP on previous flagship parts carried similar performance relative to Tesla"

    Bullshit again.
    Give me a single card that is anywhere near the K20 in DP performance and we'll talk.

    You don't understand the philosophy of this card, as many around here don't.
    Thankfully, the real intended audience is already recognizing the awesomeness of this card (read the previous comments).

    You can go back to playing BF3 on your 79xx, but please close the door behind you on your way out ;)
  • chizow - Wednesday, February 20, 2013 - link

    Heh, your ignorant comments couldn't be further from the truth about my being an "NV hater". I haven't bought an ATI/AMD card since the 9700 Pro (my gf made the mistake of buying a 5850, though, despite my input), and previously I solely purchased *multiple* Nvidia cards in this flagship market for the last 3 generations.

    I have a vested interest in Nvidia in this respect as I enjoy their products, so I've never rooted for them to fail, until now. It's obvious to me now that between AMD's lackluster offerings and ridiculous launch prices along with Nvidia's greed with their last two high-end product launches (690 and Titan), that they've completely lost touch with their core customer base.

    Also, before you comment ignorantly again, please look up the DP performance of the GTX 280 and GTX 480/580 relative to their Tesla counterparts. You will see they are still respectable, ~1/8th of SP performance, which was excellent compared to the completely neutered 1/32 DP of GK104 Kepler. That's why there is still high demand for flagship Fermi parts, and even GT200, despite their reputation as less desirable parts due to their thermal characteristics.

    Lastly, I won't be playing BF3 on a 7970; try a pair of GTX 670s in SLI. There's a difference between supporting a company through sound purchasing decisions and stupidly pissing away $1K for something that cost $500-650 in the past.

    The philosophy of this card is simple: rob stupid people of their money. I've seen enough of this in the past from the same target audience, and generally that feeling of "awesomeness" is quickly replaced by buyer's remorse, as people realize the slightly higher FPS number in the upper left of their screen isn't worth the massive number on their credit card statement.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    That one's been pissing acid since the 680 launch. He fails to recognize the superior leap of the GTX 580 over the prior gen, which gave him his mental handicap of believing he can get something for nothing, along with sucking down master AMD fanboy Charlie D.'s rumor about the "$350" flagship nVidia card, blah blah blah, 680, blah blah, second tier, blah blah blah.

    So instead the rager now claims he wasted near a grand on two 670s - R O F L - the lunatics never end here, man.
  • bamboo69 - Tuesday, February 19, 2013 - link

    Origin is using EK Waterblocks? I hope they aren't nickel-plated; their nickel blocks flake.
  • Knock24 - Wednesday, February 20, 2013 - link

    I've seen it mentioned in the article that Titan has HyperQ support, but I've also read the opposite elsewhere.
    Can anyone confirm that HyperQ is supported? I'm guessing the simpleHyperQ CUDA SDK example might reveal whether it is.
  • torchedguitar - Wednesday, February 20, 2013 - link

    HyperQ actually means two separate things. One part is the ability to have a process act as a server, providing access to the GPU for other MPI processes. This is supported on Linux using Tesla cards (e.g. K20X) only, so it won't work on GTX Titan (it does work on Titan the supercomputer, though). The other part of HyperQ is that there are multiple hardware queues available for managing the work on multiple CUDA streams. GTX Titan DOES support this part, although I'm not sure just how many of these will be enabled (it's a tradeoff: having more hardware streams allows more flexibility in launching concurrent kernels, but also takes more memory and more time to initialize).

    The simpleHyperQ sample is a variation of the concurrentKernels sample (just look at the code), and it shows how having more hardware channels cuts down on false dependencies between kernels in different streams. You put things in different streams because they have no dependencies on each other, so in theory nothing in stream X should ever get stuck waiting for something in stream Y. When that does happen due to hitting limits of the hardware, it's a false dependency.

    An example would be when you try to time a kernel launch by wrapping it with CUDA event records (this is the simpleHyperQ sample). GPUs before GK110 have only one hardware stream, and if you take a program that launches kernels concurrently in separate streams and wrap all the kernels with CUDA event records, you'll see that suddenly the kernels run one at a time instead of all together. This is because, in order to do the timing for the event, the single hardware channel queues up the other launches while waiting for each kernel to finish, then records the end time in the event, then goes on to the next kernel. With HyperQ's additional hardware streams, you get around this problem.

    Run the simpleHyperQ sample on a 580 or a 680 through a tool like Nsight and look at the timeline: you'll see all the work in the streams show up like stair steps; even though they're in different streams, they happen one at a time. Now run it on a GTX Titan or a K20 and you'll see many of the kernels are able to completely overlap. If 8 hardware streams are enabled, the app will finish 8x faster; if 32 are enabled, 32x faster.

    Now, this sample is extremely contrived, just to illustrate the feature. In reality, overlapping kernels won't buy you much speedup if you're already launching kernels big enough to use the GPU effectively. In that case there shouldn't be much room left for overlapping kernels, except when you have unbalanced workloads where many threads in a kernel finish quickly but a few stragglers run way longer. With HyperQ, you greatly increase the chances that kernels in other streams can immediately start using the resources freed up when some threads in a kernel finish early, instead of waiting for all threads in the kernel to finish before starting the next kernel. (A stripped-down sketch of the event-record experiment follows.)
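    Here's that sketch: a toy version of my own, not the actual simpleHyperQ code. N streams each run a spin kernel wrapped in event records; a single hardware queue serializes them (~N x 10 ms total), while HyperQ lets them overlap (~10 ms):

    // hyperq_toy.cu - false-dependency demo with CUDA events in N streams.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void spin(long long cycles) {
        long long start = clock64();
        while (clock64() - start < cycles) { }  // busy-wait ~cycles GPU clocks
    }

    int main() {
        const int N = 8;
        cudaStream_t s[N];
        cudaEvent_t b[N], e[N];
        for (int i = 0; i < N; ++i) {
            cudaStreamCreate(&s[i]);
            cudaEventCreate(&b[i]);
            cudaEventCreate(&e[i]);
        }

        const long long cycles = 10 * 1000 * 1000;  // ~10 ms at ~1 GHz
        for (int i = 0; i < N; ++i) {
            cudaEventRecord(b[i], s[i]);            // the false-dependency trigger
            spin<<<1, 1, 0, s[i]>>>(cycles);
            cudaEventRecord(e[i], s[i]);
        }
        cudaDeviceSynchronize();

        float ms = 0.0f;
        cudaEventElapsedTime(&ms, b[0], e[N - 1]);  // span from first start to last end
        printf("Span across %d streams: %.1f ms\n", N, ms);
        return 0;
    }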
  • vacaloca - Monday, March 4, 2013 - link

    I wanted to say that you hit the nail on the head... I just tested the simpleHyperQ example, and indeed, Titan has 8 hardware streams enabled. For every multiple higher than 8, the "Measured time for sample" goes up.
  • AeroJoe - Wednesday, February 20, 2013 - link

    Very good article - but now I'm confused. If I'm building an Adobe workstation to handle video and graphics, do I want a TITAN for $999 or the Quadro K5000 for $1700? Both are Kepler, but TITAN looks like more bang for the buck. What am I missing?
  • Rayb - Wednesday, February 20, 2013 - link

    The extra money you are paying is for certified driver support in commercial applications like Adobe CS6 with a Quadro card, vs. a non-certified card.
  • mdrejhon - Wednesday, February 20, 2013 - link

    Excellent! The GeForce Titan will make it much easier to overclock an HDTV set to 120 Hz
    ( http://www.blurbusters.com/zero-motion-blur/hdtv-r... )

    Some HDTVs, such as the Vizio e3d420vx, can be successfully "overclocked" to a native 120 Hz PC signal. This was difficult because an EDID override was necessary; the GeForce Titan should make it a piece of cake!
  • Blazorthon - Wednesday, February 20, 2013 - link

    Purely as a gaming card, Titan is obviously way too overpriced to be worth considering. However, its compute performance is intriguing. It can't totally replace a Quadro or Tesla, but there are still many compute workloads that don't need those extremely expensive extras such as ECC and the Quadro/Tesla drivers. Many of those may be better suited to a Tahiti card's far better value, but CUDA workloads may find Titan to be the first card to truly succeed the GF100/GF110 cards as a gaming-and-compute card. Like I said, though, the price could still be at least somewhat lower. I understand it not being around $500 like GF100/110 launched at, for various reasons, but come on, at most give us an around-$700-750 price...
  • just4U - Thursday, February 21, 2013 - link

    Someone here stated that AMD is at fault for pricing their 7x series so high last year. Perhaps many were disappointed with the $550 price range, but that's still somewhat lower than previously released Nvidia products through the years. Several of those cards (at various price points) handily beat the 580 (which, by the way, never did get much of a price drop), and at the time that's what they were competing against.

    So I can't quite connect the dots on why people say it's AMD's fault for originally pricing the 7x series so high, when in reality it was still lower than newly released Nvidia products over the past several years.
  • CeriseCogburn - Monday, March 4, 2013 - link

    For the most part, correct.
    The 7970 came out at $579, though, not $550. And it was nearly absent for many months, until just the day before the 680's $499 launch.

    In any case, ALL these cards drop in price over the first six months or so, EXCEPT sometimes, when they are especially fast like the 580, they hold at the launch price - which it did: the 580 was $499 until the day the 7970 launched.

    So what we have here is the tampon express. The tampon express has paid attention to nothing but fps/price versus its revised and memory-holed history, so it will continue forever.

    They have completely ignored capital factors like the extreme lack of production space on the node, ongoing prior to the 7970 release and at emergency-low levels prior to the months-later 680 release, with the emergency board meeting and multi-billion-dollar borrowing build-out for die-space production expansion, not to mention the huge change in wafer payment terms, which went from per good die to per wafer, placing the burden of failures on the GPU companies' side.

    It's not like they could have missed that; it was all over the place for months on end. The AMD fanboys were bragging that AMD got die space early, constantly hammering away at nVidia and calling them stupid for not having reserved space, and screaming they would go bankrupt from the low yields they had to pay for on the "housefires" dies.

    So what we have now is well-trained (not potty-trained) crybabies pooping their diapers over and over again, and let's face it, they do believe they have the power to lower the prices if they just whine loudly enough.

    AMD has been losing billions, and nVidia's profit ratio is 10% - but the crying babies' screams are meant to assist their own pocketbooks at any expense, including the demise of AMD, even though they all preach competition and personal CEO capitalist understanding, after they spew out 6th-grader information, make MASSIVE market lies and mistakes through illiterate interpretation of standard articles, or stay in blissful denial of things like die space (mentioned above) or the long-standing, industry-standard tapeout times for producing the GPUs in question.

    They want to be "critical reporters" but fail miserably at it, merely showing crybaby ignorance and therefore false outrage. At least they consider themselves "the good hipster!"
  • clickonflick - Thursday, March 7, 2013 - link

    I agree that the price of this GPU is really high; one could easily assemble a fully mainstream laptop or desktop online with Dell at this price tag. But for gamers, to whom performance matters more than price, it is a boon.
    For more pics check this out:
    clickonflick/nvidia-geforce-gtx-titan
