The NVIDIA GeForce GTX 1650 Review, Feat. Zotac: Fighting Brute Force With Power Efficiency
by Ryan Smith & Nate Oh on May 3, 2019 10:15 AM EST

Following up on last week’s launch of NVIDIA’s new budget video card, the GeForce GTX 1650, today we’re taking a look at our first card, courtesy of Zotac. Coming in at $149, the newest member of the GeForce family brings up the rear of the GeForce product stack, offering NVIDIA’s latest architecture in a low-power, 1080p-with-compromises gaming video card with a lower price to match.
As the third member of the GeForce GTX 16 series, the GTX 1650 follows directly in the footsteps of its GTX 1660 predecessors. It’s built on a newer, smaller GPU made specifically for these sorts of low-end cards: the underlying TU117 is designed around the same leaner and meaner philosophy as TU116 before it, eschewing the dedicated ray tracing (RT) cores and AI-focused tensor cores in favor of a smaller, easier-to-produce chip that retains the all-important core Turing architecture.
The net result of this process, the GeForce GTX 1650, is a somewhat unassuming card if we’re going by the numbers, but an important one for NVIDIA’s product stack. Though its performance is pedestrian by high-end PC gaming standards, the card fills out NVIDIA’s lineup by offering a modern Turing-powered card under $200. Meanwhile for the low-power video card market, the GTX 1650 is an important shot in the arm, offering the first performance boost for this hard-capped market in over two years. The end result is that the GTX 1650 will serve many masters, and as we’ll see, it serves some better than others.
NVIDIA GeForce Specification Comparison

| | GTX 1650 | GTX 1660 | GTX 1050 Ti | GTX 1050 |
|---|---|---|---|---|
| CUDA Cores | 896 | 1408 | 768 | 640 |
| ROPs | 32 | 48 | 32 | 32 |
| Core Clock | 1485MHz | 1530MHz | 1290MHz | 1354MHz |
| Boost Clock | 1665MHz | 1785MHz | 1392MHz | 1455MHz |
| Memory Clock | 8Gbps GDDR5 | 8Gbps GDDR5 | 7Gbps GDDR5 | 7Gbps GDDR5 |
| Memory Bus Width | 128-bit | 192-bit | 128-bit | 128-bit |
| VRAM | 4GB | 6GB | 4GB | 2GB |
| Single Precision Perf. | 3 TFLOPS | 5 TFLOPS | 2.1 TFLOPS | 1.9 TFLOPS |
| TDP | 75W | 120W | 75W | 75W |
| GPU | TU117 (200 mm²) | TU116 (284 mm²) | GP107 (132 mm²) | GP107 (132 mm²) |
| Transistor Count | 4.7B | 6.6B | 3.3B | 3.3B |
| Architecture | Turing | Turing | Pascal | Pascal |
| Manufacturing Process | TSMC 12nm "FFN" | TSMC 12nm "FFN" | Samsung 14nm | Samsung 14nm |
| Launch Date | 4/23/2019 | 3/14/2019 | 10/25/2016 | 10/25/2016 |
| Launch Price | $149 | $219 | $139 | $109 |
Right off the bat, it’s interesting to note that the GTX 1650 is not using a fully-enabled TU117 GPU. Relative to the full chip, the version going into the GTX 1650 has had a TPC fused off, which costs the chip 2 SMs/128 CUDA cores. The net result is that the GTX 1650 is a rare case where NVIDIA isn’t putting its best foot forward from the start – the company is essentially sandbagging – which is a point I’ll loop back around to in a bit.
Within NVIDIA’s historical product stack, it’s somewhat difficult to place the GTX 1650. Officially it’s the successor to the GTX 1050, which itself was a similarly cut-down card. However the GTX 1050 launched at $109, whereas the GTX 1650 launches at $149, a hefty 37% generation-over-generation price increase. Consequently, you could be excused for thinking the GTX 1650 feels a lot more like the GTX 1050 Ti’s successor, as its $149 price tag is very comparable to the GTX 1050 Ti’s $139 launch price. Either way, generation-over-generation, Turing cards have been more expensive than the Pascal cards they replace, and at these low prices the difference is proportionally that much larger.
Diving into the numbers then, the GTX 1650 ships with 896 CUDA cores enabled, spread over 2 GPCs. On paper this is not all that big of a step up from the GeForce GTX 1050 series, but Turing’s architectural changes and effective increase in graphics efficiency mean that the little card should pack a bit more of a punch than the raw specifications suggest. The CUDA cores themselves are clocked a bit lower than usual for a Turing card, however, with the reference-clocked GTX 1650 boosting to just 1665MHz.
Rounding out the package are 32 ROPs, which are part of the card’s 4 ROP/L2/memory clusters. This means the card is being fed by a 128-bit memory bus, which NVIDIA has paired with GDDR5 memory clocked at 8Gbps. Conveniently enough, this gives the card 128GB/sec of memory bandwidth, about 14% more than the last-generation GTX 1050 series cards got. Thankfully, while NVIDIA hasn’t done much to boost memory capacities on the other Turing cards, the same is not true for the GTX 1650: the minimum here is now 4GB, instead of the very constrained 2GB found on the GTX 1050. 4GB isn’t particularly spacious in 2019, but the card shouldn’t be quite so desperate for memory as its predecessor was.
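For anyone who wants to sanity-check those figures, peak memory bandwidth falls straight out of the bus width and the per-pin data rate. Here's a minimal Python sketch using the numbers from the spec table above (the helper function is purely illustrative):

```python
def memory_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

gtx_1650 = memory_bandwidth_gbs(128, 8.0)  # 128-bit bus, 8Gbps GDDR5 -> 128 GB/s
gtx_1050 = memory_bandwidth_gbs(128, 7.0)  # 128-bit bus, 7Gbps GDDR5 -> 112 GB/s
print(f"GTX 1650: {gtx_1650:.0f} GB/s, GTX 1050: {gtx_1050:.0f} GB/s, "
      f"uplift: {(gtx_1650 / gtx_1050 - 1):.0%}")  # ~14% more bandwidth
```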
Overall, on paper the GTX 1650 is set to deliver around 60% of the performance of the next card up in NVIDIA’s product stack, the GTX 1660. And in practice, what we'll find is a little better than that, with the new card offering around 65% of a GTX 1660's performance.
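That on-paper figure comes straight from the shader throughput math: peak FP32 throughput for these cards works out to two FLOPs (one fused multiply-add) per CUDA core per clock. A quick, illustrative sketch using the core counts and boost clocks from the table above:

```python
def fp32_tflops(cuda_cores: int, boost_clock_mhz: int) -> float:
    """Peak FP32 throughput in TFLOPS: 2 FLOPs (one FMA) per CUDA core per clock."""
    return 2 * cuda_cores * boost_clock_mhz * 1e6 / 1e12

gtx_1650 = fp32_tflops(896, 1665)   # ~3.0 TFLOPS
gtx_1660 = fp32_tflops(1408, 1785)  # ~5.0 TFLOPS
print(f"GTX 1650: {gtx_1650:.2f} TFLOPS, GTX 1660: {gtx_1660:.2f} TFLOPS, "
      f"ratio: {gtx_1650 / gtx_1660:.0%}")  # ~59% of the GTX 1660 on paper
```

Real-world performance only loosely tracks this kind of peak math, of course, since memory bandwidth, ROP throughput, and actual clock behavior all factor in as well.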
Meanwhile, let’s talk about power consumption. With a (reference) TDP of 75W, the smallest member of the Turing family is also the lowest power. 75W cards have been a staple of the low-end video card market – in NVIDIA’s case, this is most xx50 cards – as a 75W TDP means that an additional PCIe power connector is not necessary, and the card can be powered solely off of the PCIe bus.
Overall these cards satisfy a few niche roles that add up to a larger market. The most straightforward of these roles is the need for a video card for basic systems where a PCIe power cable isn’t available, as well as low-power systems where a more power-hungry card isn’t appropriate. For enthusiasts, the focus tends to turn specifically towards HTPC systems, as these sorts of low-power cards are a good physical fit for those compact systems, while also offering the latest video decoding features.
It should be noted however that while the reference TDP for the GTX 1650 is 75W, board partners have been free to design their own cards with higher TDPs. As a result, many of the partner cards on the market are running faster and hotter than NVIDIA’s reference specs in order to maximize their cards’ performance, with TDPs closer to 90W. So anyone specifically looking for a 75W card to take advantage of its low power requirements will want to pay close attention to card specifications to make sure it’s actually a 75W card, like the Zotac card we’re reviewing today.
Product Positioning & The Competition
Shifting gears to business matters, let’s talk about product positioning and hardware availability.
The GeForce GTX 1650 is a hard launch for NVIDIA, and typical for low-end NVIDIA cards, there are no reference cards or reference designs to speak of. In NVIDIA parlance this is a "pure virtual" launch, meaning that NVIDIA’s board partners have been doing their own thing with their respective product lines. These include a range of coolers and form factors, as well as the aforementioned factory overclocked cards that require an external PCIe power connector in order to meet the cards' greater energy needs.
Overall, the GTX 1650 launch has been a relatively low-key affair for NVIDIA. The Turing architecture/feature set has been covered to excess at this point, and the low-end market doesn't attract the same kind of enthusiast attention as the high-end market does, so NVIDIA has been acting accordingly. On our end we're less than thrilled with NVIDIA's decision to prevent reviewers from testing the new card until after it launched, but we're finally here with a card and results in hand.
In terms of product positioning, NVIDIA is primarily pitching the GTX 1650 as an upgrade for the GeForce GTX 950 and its same-generation AMD counterparts, matching the upgrade cadence we’ve seen throughout the rest of the GeForce Turing family. As we'll see in our benchmark results, the GTX 1650 offers a significant performance improvement over the GTX 950, while the uplift over the price-comparable GTX 1050 Ti is similar to other Turing cards at around 30%. Meanwhile, one particular advantage it has over past-generation cards is that, with its 4GB of VRAM, the GTX 1650 doesn't struggle nearly as much in more recent games as the 2GB GTX 950 and GTX 1050 do.
Broadly speaking the GTX xx50 series of cards are meant to be 1080p-with-compromises cards, and GTX 1650 follows this trend. The GTX 1650 can run some games at 1080p at maximum image quality – including some relatively recent games – but in more demanding games it becomes a tradeoff between image quality and 60fps framerates, something the GTX 1660 doesn't really experience.
Unusually for NVIDIA this year, the company is also sweetening the pot a bit by extending its ongoing Fortnite bundle to cover the GTX 1650. The bundle itself isn’t much to write home about – some game currency and skins for a game that’s free to begin with – but it’s an unexpected move, since NVIDIA wasn’t offering this bundle on the other GTX 16 series cards when they launched.
Finally, let’s take a look at the competition. AMD is riding out the tail-end of the Polaris-based Radeon RX 500 series, so this is what the GTX 1650 will be up against. AMD’s most comparable card in terms of total power consumption is the Radeon RX 560, a card that is simply outclassed by the far more efficient GTX 1650. The GTX 1050 series already outperformed the RX 560 here, so the GTX 1650 largely serves to pile on NVIDIA’s efficiency lead, leaving AMD out of the running for 75W cards.
But this doesn’t mean AMD should be counted out altogether. Instead of the RX 560, AMD has set up the Radeon RX 570 8GB against the GTX 1650, which makes for a very interesting battle. The RX 570 is still a very capable card, especially against the lower performance of the GTX 1650, and its 8GB of VRAM is further icing on the cake. However, I’m not entirely convinced that AMD and its partners can hold 8GB card prices to $149 or less over the long run, in which case the competition may end up shifting towards the 4GB RX 570 instead.
In any case, AMD’s position is that while they can’t match the GTX 1650 on features or power efficiency – and bear in mind that the RX 570 is rated to draw almost twice as much power here – they can match it on pricing and beat it on performance. As long as AMD is willing to hold the line on pricing, this is a favorable matchup for them on a pure price/performance basis in current-generation games: the RX 570 is a last-generation midrange card, and the Turing architecture alone can’t help the low-end GTX 1650 completely make up that performance difference.
On a final note, AMD is offering their own bundle as well as part of their 50th anniversary celebration. For the RX 570, the company and its participating board partners are offering copies of both The Division 2 (Gold Edition) and World War Z, giving AMD a much stronger bundle than NVIDIA’s. So between card performance and game bundles, it's clear that AMD is trying very hard to counter the new GTX 1650.
Q2 2019 GPU Pricing Comparison

| AMD | Price | NVIDIA |
|---|---|---|
| | $349 | GeForce RTX 2060 |
| Radeon RX Vega 56 | $279 | GeForce GTX 1660 Ti |
| Radeon RX 590 | $219 | GeForce GTX 1660 |
| Radeon RX 580 (8GB) | $189 | GeForce GTX 1060 3GB (1152 cores) |
| Radeon RX 570 | $149 | GeForce GTX 1650 |
126 Comments
Marlin1975 - Friday, May 3, 2019
Not a bad card, but it is a bad price.

drexnx - Friday, May 3, 2019
yep, but if you look at the die size, you can see that they're kinda stuck - huge generational die size increase vs GP107, and even RX570/580 are only 232mm2 compared to 200mm2.

I can see how AMD can happily sell 570s for the same price since that design has been long paid for vs. Turing and the MFG costs shouldn't be much higher
Karmena - Tuesday, May 7, 2019
Check the prices of RX570, they cost 120$ on newegg. And you can get one under 150$.

tarqsharq - Tuesday, May 7, 2019
And the RX570's come with The Division 2 and World War Z right now.

You can get the ASrock version with 8GB VRAM for only $139!
0ldman79 - Sunday, May 19, 2019
Problem is on an OEM box you'll have to upgrade the PSU as well.

Dealing with normies for customers, the good ones will understand, but most of them wouldn't have bought a crappy OEM box in the first place. Most normies will buy the 1650 alone.
AMD needs 570ish performance without the need for auxiliary power.
Yojimbo - Friday, May 3, 2019
Depending on the amount of gaming done, it probably saves over 50 dollars in electricity costs over a 2 year period compared to the RX 570. Of course the 570 is a bit faster on average.

JoeyJoJo123 - Friday, May 3, 2019
Nobody in their right mind that's specifically on the market for an aftermarket GPU (a buying decision that comes about BECAUSE they're dissatisfied with the current framerate or performance of their existing, or lack of, a GPU) is making their primary purchasing decision on power savings alone. In other words, people aren't saying "Man, my ForkNight performance is good, but my power bills are too high! In order to remedy the exorbitant cost of my power bill, I'm going to go out and purchase a $150 GPU (which is more than 1 month of my power bill alone), even if it offers the same performance of my current GPU, just to save money on my power bill!"

Someone might make that their primary purchasing decision for a power supply, because outside of being able to supply a given wattage for the system, the only thing that matters is its efficiency, and yes, over the long term higher efficiency PSUs are better built, last longer, and provide a justifiable hidden cost savings.
Lower power for the same performance at a similar enough price can be a tie-breaker between two competing options, but that's not the case here for the 1650. It has essentially outpriced itself from competing viably in the lower budget GPU market.
Yojimbo - Friday, May 3, 2019
I don't know what you consider being in a right mind is, but anyone making a cost sensitive buying decision that is not considering total cost of ownership is not making his decision correctly. The electricity is not free unless one has some special arrangement. It will be paid for and it will reduce one's wealth and ability to make other purchases.

logamaniac - Friday, May 3, 2019
So I assume you measure the efficiency of the AC unit in your car and how it relates to your gas mileage over duration of ownership as well? since you're so worried about every calculation in making that buying decision?

serpretetsky - Friday, May 3, 2019
It doesn't really change the argument if he does or does not take into account his AC unit in his car. Electricity is not free. You can ignore the price of electricity if you want, but your decision to ignore it or not does not change the total cost of ownership. (I'm not defending the electricity calculations above, I haven't verified them)