50 Comments
drexnx - Tuesday, September 1, 2020 - link
the jacket is famous, but will the spatulas be the next Jensen meme?
Makaveli - Tuesday, September 1, 2020 - link
I'm waiting for him to sacrifice an NV gamer to his overlords on stream.
MadManMark - Tuesday, September 1, 2020 - link
LOL yeah I even looked for that ... but didn't realize a 3080 was hidden behind it all along!
willis936 - Tuesday, September 1, 2020 - link
Getting some weird duplicate entries on this one. Makes it harder to follow.
Urthor - Tuesday, September 1, 2020 - link
Samsung confirmed for the 3090 it seems. Big news there.
SarahKerrigan - Tuesday, September 1, 2020 - link
"12:21PM EDT - This marks the first time that NVIDIA has used a foundry other than TSMC for its flagship gaming GPUs"
IBM, for the high-end 6800's? UMC, for 200-series?
Chaitanya - Tuesday, September 1, 2020 - link
That cooler is really odd.
eek2121 - Tuesday, September 1, 2020 - link
I guess that will depend on the noise levels. Despite what most people think, I believe it will be beneficial.
xenol - Tuesday, September 1, 2020 - link
I think it's pretty clever. They optimized the PCB to be smaller and used the footprint where the PCB would've been to extend the cooler. It's almost no different than the GTX 670, where it was like an 8" PCB but a blower gave it another 4".
a00r - Tuesday, September 1, 2020 - link
looks like hot air is going to be blowing into the CPU, guess I will have to go liquid cooling for the CPU
xenol - Tuesday, September 1, 2020 - link
Depending on how you have to mount the cooler, that may not be a good idea since the pump may be higher than the radiator/"tank". But otherwise, I don't think it'll be that bad. It'll certainly be lower than current non-blower designs that dump all of the heat back in the case.
0ldman79 - Tuesday, September 1, 2020 - link
It's a heat pipe, and not the only one. The heat coming from the end of the card shouldn't be as bad as coming directly from the GPU.
It'll probably be about a 5°C increase in the air temperature reaching the CPU, not enough to affect the CPU temps in any real manner.
Case temps are often around 100-110°F and CPU temps are around 150-170°F peak, so it may idle a little warmer, but at full load I don't expect there to be much of a difference.
Makaveli - Tuesday, September 1, 2020 - link
lmao pulling the 3090 out of the oven, how ironic.
nevcairiel - Tuesday, September 1, 2020 - link
He is totally leaning into the Jensen kitchen memes.kenansadhu - Tuesday, September 1, 2020 - link
I must say that was pretty funny.
CaedenV - Tuesday, September 1, 2020 - link
The oven is to help keep the card cool lol
dagnamit - Tuesday, September 1, 2020 - link
I'll say it. $1,500.00 is a good price on the 3090.
/ducks
drexnx - Tuesday, September 1, 2020 - link
impossible to know without 3rd party benches
dagnamit - Tuesday, September 1, 2020 - link
fair enough.
senttoschool - Tuesday, September 1, 2020 - link
3090 has 157% more shader TFLOPs than the 2080 Ti at a 50% price increase, yea I'd say it's a great deal. And we haven't even talked about its increase in RT FLOPs and Tensor FLOPs.
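A quick sanity check of that claim, using only the deltas quoted in the comment above (+157% shader TFLOPs, +50% price; these are the commenter's figures, not independent benchmarks):

```python
# Perf-per-dollar math using the figures quoted above
# (+157% shader TFLOPs for +50% price; both numbers come from the
# comment, not from independent benchmarks).
perf_ratio = 1.0 + 1.57   # 3090 / 2080 Ti shader-TFLOPs ratio
price_ratio = 1.0 + 0.50  # 3090 / 2080 Ti price ratio

perf_per_dollar = perf_ratio / price_ratio
print(f"perf per dollar vs the 2080 Ti: {perf_per_dollar:.2f}x")  # ~1.71x
```

So if those TFLOPs numbers hold, the 3090 is still about 1.7x the 2080 Ti on perf per dollar, though TFLOPs rarely translate one-to-one into frame rates.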
Yojimbo - Tuesday, September 1, 2020 - link
Compared to the 3080, the price/performance is a lot worse. But someone looking for the absolute fastest card isn't looking for the best price/performance, and with 24 GB of VRAM it is aiming at more than just a gaming market, despite the GeForce branding. The size of the card might limit how it's used outside gaming, however. Some organizations have used 2080 Tis instead of V100s or T4s to do rendering or machine learning work in servers. It's cheaper but there are certain disadvantages to it. NVIDIA doesn't want people to do it. The size of the 3090 might add one more disadvantage to dissuade people.
Kjella - Tuesday, September 1, 2020 - link
Well it's $1000 less than the Titan RTX... but I really hope the rumors of an AIB version of the 3080 with 20GB of RAM are true. I don't really want to downgrade from 11GB on a 1080 Ti to 10GB two generations later, even if it's better in every other way. There's a lot of interesting machine learning stuff that would benefit without going all out on the 3090.
Yojimbo - Tuesday, September 1, 2020 - link
I think the 10 GB in the 3080 is more spacious than the 11 GB in the 1080 Ti because of the compression technology. Also, the games that push VRAM capacity in the future are likely to use DirectStorage, which might also alleviate the VRAM capacity crunch a bit.
Note that the PS4 has 8 GB for the entire system and was designed for 1080p, while the PS5 will have 16 GB for the entire system, designed for 4K and with a CPU that can handle more complex tasks. The 980 Ti, which came out a year and a half after the PS4 release, has 6 GB of VRAM.
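For intuition, a toy sketch of the idea behind lossless delta color compression (this is only an illustration of the concept, not NVIDIA's actual, undocumented algorithm): a low-contrast tile is stored as one base pixel plus small per-pixel deltas that fit in far fewer bits than raw values.

```python
# Toy illustration of delta color compression (NOT NVIDIA's real
# algorithm): store one base pixel per tile plus per-pixel deltas,
# which pack into fewer bits when the tile is low-contrast.

def compress_tile(tile):
    """Return (base, deltas, bits_per_delta) for a list of 8-bit values."""
    base = tile[0]
    deltas = [p - base for p in tile]
    span = max(abs(d) for d in deltas)
    bits = max(1, span.bit_length() + 1)  # sign bit + magnitude
    return base, deltas, bits

# A flat sky-gradient tile: 16 pixels all within +3 of the base value.
tile = [200, 201, 200, 202, 203, 200, 201, 202,
        200, 203, 201, 200, 202, 201, 200, 203]
base, deltas, bits = compress_tile(tile)

raw_bits = len(tile) * 8            # uncompressed: 8 bits per pixel
packed_bits = 8 + len(tile) * bits  # base + packed deltas
print(f"{raw_bits} bits raw -> {packed_bits} bits packed ({bits}-bit deltas)")
```

This is also why compression helps bandwidth on typical scenes but guarantees nothing about capacity: a noisy tile needs full-width deltas, so games still have to fit their working set in physical VRAM.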
CaedenV - Tuesday, September 1, 2020 - link
But don't the games have to support that compression? I don't think it helps with everything. The RAM sizes seem odd and smaller than I was hoping for. I was expecting at least 12GB on the 3070 and 3080, and was sad when it was only 8 and 10GB.
Yojimbo - Tuesday, September 1, 2020 - link
No, the compression is done through the hardware and drivers, as far as I know. The games just do their normal thing.
UltraWide - Tuesday, September 1, 2020 - link
It's a home run for sure! This generation is locked in for Nvidia.
L3R4F - Tuesday, September 1, 2020 - link
Because who needs reviews, benchmarks, or seeing what the competition has to offer.
sing_electric - Tuesday, September 1, 2020 - link
Toms Hardware: "Just buy it!"
Koenig168 - Tuesday, September 1, 2020 - link
If the 3070 can really outperform the 2080Ti at USD499, that is pretty mind-boggling!! (waiting for independent benchmarks)
evilpaul666 - Tuesday, September 1, 2020 - link
At RTX, probably, for the five games that use it two years after its launch.
SirDragonClaw - Tuesday, September 1, 2020 - link
It outperforms it in rasterization too. And there will be over 150 titles by the end of next year using RTX, hardly a small amount...
Icehawk - Tuesday, September 1, 2020 - link
LOL at his "come to Papa" when he pulled the 3090 out of the oven. That thing is a BEAST, whoa nelly it's huge.
Very happy to hear the 3080 is at $699 instead of the rumored $799. It ain't cheap, but I have been saving and will be coming from a 970, so this is just going to be a massive upgrade.
imaheadcase - Tuesday, September 1, 2020 - link
"12:38PM EDT - So why they aren't comparing it to RTX 2080 Ti, I have no idea"
Because it isn't, the 3080 was like they said...
Anyway, did they say when the 3090 is getting sold? I want one.
prophet001 - Tuesday, September 1, 2020 - link
Sweeet!!
Now just bring out that Zen 3 and I'll prolly do a new build. :D
TomWomack - Tuesday, September 1, 2020 - link
Big-datacentre-Ampere can do 64-bit FP operations on the tensor cores; presumably this has been turned off in the RTX30xx to avoid accidentally providing really useful capabilities to people only wanting to spend $699 on their compute accelerators, but one can dream ...
Kjella - Tuesday, September 1, 2020 - link
FP64 is really niche; in the A100 presentation the big focus was TF32, with even less precision than FP32. I'm hoping they kept that part, most likely yes, but I haven't seen it in writing anywhere.
Yojimbo - Tuesday, September 1, 2020 - link
There are only a few FP64 units in the GPUs in the RTX 30xx cards, so even if they are capable of FP64 tensor core operations (they may be, for compatibility reasons), it would be so slow that it would not make sense to use the card for that purpose.
And as Kjella said, FP64 is used for scientific simulations; the vast majority of compute outside that realm is FP32, with a lot of AI switching to lower precision than that.
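For anyone curious what TF32 costs in precision: it keeps FP32's 8-bit exponent but cuts the mantissa from 23 bits to 10. A rough software simulation (using truncation, which may differ from the hardware's actual rounding) shows the worst-case relative error is about 2^-10, roughly 0.1%:

```python
import struct

def to_tf32(x):
    """Approximate TF32 by truncating a float32 mantissa from 23 bits
    to 10 (TF32 keeps FP32's full 8-bit exponent range). This uses
    truncation; the real tensor-core rounding may differ."""
    bits = struct.unpack('>I', struct.pack('>f', x))[0]
    bits &= ~((1 << 13) - 1)  # zero the 13 low mantissa bits
    return struct.unpack('>f', struct.pack('>I', bits))[0]

x = 1.2345678
y = to_tf32(x)
print(f"{x} -> {y}, relative error ~{abs(x - y) / x:.1e}")
```

That error is tolerable for most deep-learning training, which is why the A100 pushes TF32, but it's hopeless for FP64-class scientific simulation.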
Catalina588 - Tuesday, September 1, 2020 - link
The mobo industry will have to create a new generation of motherboards to support multiple Ampere cards due to the ultra-wide, slot-spanning width. And you'll need six 8-pin power connectors and $3,000 in GPU cards for dual 3090s. But the 3070 and 3080 seem very reasonably priced. Good luck getting one before Christmas!
A5 - Tuesday, September 1, 2020 - link
Real question, is dual-GPU even still supported?
Kjella - Tuesday, September 1, 2020 - link
The specs are up on Nvidia's site, NVLink is supported. But gaming support for SLI is so shitty I'm tempted to answer no.
Kjella - Tuesday, September 1, 2020 - link
On the 3090, not the 3070/3080.
Yojimbo - Tuesday, September 1, 2020 - link
The specs on Nvidia's site mention 2 8-pin power connectors for the 3090 and nothing about the 12-pin connector. What happened to it?
Unashamed_unoriginal_username_x86 - Tuesday, September 1, 2020 - link
His delivery on BFGPU was perfect. Can't wait for the arch deep dive; I'd like to see how they got 2x IPC and 2.7x total performance on raster. I expected RT perf would have been the bigger jump.
hammer256 - Tuesday, September 1, 2020 - link
That cooler... Is that going to send hot air to the CPU compartment?
Wait... What's the power consumption of the 3080?
jeremyshaw - Tuesday, September 1, 2020 - link
Just like existing axial fan coolers.shaolin95 - Tuesday, September 1, 2020 - link
Toilets won't stop flushing at AMD HQs lol
Agent Smith - Tuesday, September 1, 2020 - link
Imagine if you just bought a 2080 Ti and it's now beaten by a basic 3070 for half the price.
Ha ha ha...
voicequal - Tuesday, September 1, 2020 - link
Unless you're lucky, it will be months before you can get these cards.
CaedenV - Tuesday, September 1, 2020 - link
Personally, even at 4K I am fairly happy with the gaming experience I get with my good old GTX 1080. My primary bottleneck is not processing, it is RAM usage. Yes, it could be better, but it keeps most games at or above 4K 30fps at decent settings (even HZD).
My 2 motivations to upgrade are more than 8 GB of RAM, and modern HDMI connectivity for my TV so I can do 4K 60Hz and HDR at the same time. Performance be damned (provided it is similar to or better than the 1080, which should be easy); whoever gives me modern HDMI and at least 12 GB of RAM for the lowest price will get my business this fall/winter.
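The HDMI complaint is easy to quantify. Using the standard 4K60 timing (4400 x 2250 pixels per frame including blanking intervals) and HDMI 2.0's roughly 14.4 Gb/s usable data rate (18 Gb/s raw minus 8b/10b coding overhead), full 4:4:4 RGB fits at 8 bits per component but not at the 10 bits HDR wants:

```python
# Why 4K60 + HDR doesn't fit on the GTX 1080's HDMI 2.0 port:
# required data rate for RGB 4:4:4 at the standard 4K60 timing,
# which is 4400 x 2250 pixels per frame including blanking.
HDMI20_DATA_RATE = 14.4e9  # bits/s usable after 8b/10b coding

def data_rate(h_total, v_total, fps, bits_per_component):
    """Video data rate in bits/s for 3 components per pixel."""
    return h_total * v_total * fps * 3 * bits_per_component

sdr = data_rate(4400, 2250, 60, 8)   # 8-bit:  ~14.26 Gb/s, just fits
hdr = data_rate(4400, 2250, 60, 10)  # 10-bit: ~17.82 Gb/s, does not
print(f"8-bit:  {sdr/1e9:.2f} Gb/s, fits={sdr <= HDMI20_DATA_RATE}")
print(f"10-bit: {hdr/1e9:.2f} Gb/s, fits={hdr <= HDMI20_DATA_RATE}")
```

That's why HDMI 2.0 cards have to drop to 4:2:2/4:2:0 chroma subsampling for 4K60 HDR, and why HDMI 2.1 on the new cards is the actual fix.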
Meteor2 - Tuesday, September 1, 2020 - link
The 3070 looks like exceptional value, doesn't it? 2080 Ti performance for $499. That's amazing.
Feels to me like the smallest card is the biggest deal.