ATI Mobility Radeon 9600 and NVIDIA GeForce FX Go5650: Taking on DX9
by Andrew Ku on September 14, 2003 11:04 PM EST, posted in Laptops
Conclusion
After what turned out to be nearly a six-month wait, we finally got the head-to-head we were looking for. With the scores in mind, we are extremely pleased with the way the Mobility Radeon 9600 turned out; it looks definitely ready for the next generation of games and benchmarks. During our various benchmark runs, we were also able to roughly gauge the difference in heat output between the Mobility Radeon 9600 and the GeForce FX Go5650. While we can’t release full results, we can state that in our Half-Life 2 benchmark runs, the Mobility Radeon 9600 generated noticeably less heat. We are still waiting for the battery consumption benchmarks to finish, and we will report back as soon as they are completed.

Results aside, it was a bit frustrating to see NVIDIA and ATI take so long to get these chips to market. After all, we first reported on these two solutions back in March, and it took almost six months before we started to see tangible retail systems. Granted, those appeared first in overseas markets, but the main technology market is still North America.
ATI isn’t completely without fault either, as this launch breaks with the precedent set by the Mobility Radeon 9000, which was touted as the first mobile graphics chip to ship within a week of its announcement. Hopefully, we will see the next generation of mobile graphics processors (M11 and NV36M) announced much closer to their full market release. (Of the two, we have only been able to see M11, which is definitely something to keep your eyes peeled for as we near its official announcement.) Ideally, each company’s marketing should hold off until the release date nears, rather than jump the gun to respond to the other.
With the GeForce4 4200 Go ultimately replaced by the Go56xx, NVIDIA is starting to head in the right direction. GeForce FX Go based notebooks succeed on power consumption and heat output in ways that the GeForce4 4200 Go did not. However, NVIDIA still has a fair way to go to bring its mobile graphics processors up to the speed of the Mobility Radeon 9600 in many of the next-generation games on the horizon.
Valve, the developer of Half-Life 2, is the first developer to voice its displeasure with the NV3x architecture this intensely, because the architecture forced it to write an additional codepath specifically for NVIDIA hardware, costing time, money, and extra resources. No such work was needed to run well on ATI hardware, which is why Valve entered into an agreement with ATI. The sequence ran from already existing hardware benchmark scores to a marketing agreement, not the other way around, as some have speculated.
Right now, the only way for NVIDIA hardware to run reasonably well in full DX9 titles such as Half-Life 2 and AquaMark 3 is to lower several image quality related settings: no fog, 32-bit precision dropped to 16-bit, low dynamic range, and so on. The current selection of older DX8 games may suit GeForce FX based systems (desktop and notebook) just fine, but we are on the cusp of a software shift to DX9, which is why we are in the process of revising our graphics benchmark suite. One result of benchmarking the GeForce FX in DX8 titles is that consumers are getting used to the high frame rates of UT2003 and Jedi Knight 2.

If Valve hadn’t programmed a special codepath for NVIDIA hardware, customers would be calling up its technical support and ultimately sending back the software title (RMA issues), which would cost Valve money. This leaves both the programmer and the NVIDIA customer dissatisfied, because neither side gets to see the full DX9 experience. Don’t forget that programmers are also artists, and on a separate level, it is frustrating for them to see their hard work go to waste as those high-end settings get turned off. We can’t even begin to speculate on how the Go5200, also a full DX9 part, would have performed had we sought to include it in this review.
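For readers curious what an additional vendor codepath actually involves, below is a minimal, hypothetical C++ sketch of how a Direct3D 9 game might pick a render path at startup from the reported pixel shader version and PCI vendor ID. It is not Valve’s actual implementation; the enum, function name, and path choices are our own illustrative assumptions.

#include <d3d9.h>

// Hypothetical render-path selector: the enum, function name, and choices
// below are illustrative assumptions, not code from any shipping engine.
enum RenderPath { PATH_DX8, PATH_DX9_MIXED_PRECISION, PATH_DX9_FULL };

RenderPath ChooseRenderPath(IDirect3D9* d3d, UINT adapter)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(adapter, D3DDEVTYPE_HAL, &caps)))
        return PATH_DX8;  // cannot query capabilities: fall back to the legacy path

    D3DADAPTER_IDENTIFIER9 ident;
    if (FAILED(d3d->GetAdapterIdentifier(adapter, 0, &ident)))
        return PATH_DX8;

    // Anything below pixel shader 2.0 cannot run the DX9 shaders at all.
    if (caps.PixelShaderVersion < D3DPS_VERSION(2, 0))
        return PATH_DX8;

    // NV3x parts expose PS 2.0 but run full-precision shaders slowly, so an
    // engine may steer them to a mixed-precision path (partial-precision hints,
    // simpler fog, reduced dynamic range) to keep frame rates playable.
    const DWORD VENDOR_NVIDIA = 0x10DE;  // PCI vendor ID for NVIDIA
    if (ident.VendorId == VENDOR_NVIDIA)
        return PATH_DX9_MIXED_PRECISION;

    return PATH_DX9_FULL;  // e.g. the Mobility Radeon 9600 takes the full DX9 path
}

In a layered scheme like this, the mixed-precision path is where the fog, precision, and dynamic-range compromises described above would live, which is exactly the extra development and testing burden Valve objected to.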
Update 9/17: We have finished the battery consumption runs, and we can report that there is no noticeable difference between the two mobile graphics parts in this respect. We ran both under the highest battery conservation settings (PowerPlay and PowerMizer) and the standard MobileMark settings. Due to NDA reasons, we cannot release the numbers, but the margin between the two results was negligible.
47 Comments
Anonymous User - Monday, September 15, 2003 - link
#25 those benchmarks were performed at 1280x1024 with 4x AA and 8x AF. They provide a better theoretical test than lower settings because the test will be more GPU-limited than CPU-limited; of course, if you wanted to play the game at a good frame rate you would not be using such high settings. However, I agree we need to see real-world numbers too: what settings are necessary to get reasonable frame rates out of this (say 40-50 FPS)?
Anonymous User - Monday, September 15, 2003 - link
#18 what are you talking about? Isn't that the opposite of what I (#17) said? Or did a post get deleted to cause the numbers to get out of sync?

#19 Andrew, so you admit these numbers don't actually tell you how the game will perform using the "appropriate" codepath for the 56x0? Even though that path reduces image quality and so shouldn't really be compared directly to the Radeon, it would still be nice to see real-world numbers for the sake of comparison.
Anonymous User - Monday, September 15, 2003 - link
"The scores that we achieved in AquaMark 3 are similarly reminiscent of our scores in Half-Life 2 but without such large margins. In AquaMark 3, the GeForce FX Go5650 achieves sub 10 fps scores in all but one of the scenarios. Meanwhile, the Mobility Radeon 9600 on the average is situated in the mid teens. Minimally, though, the Mobility Radeon 9600 shows its clear lead over the GeForce FX Go5650 with a 58% lead. At its best, the Mobility Radeon 9600 doubles the margin between its counterpart, and this just reinforces the GeForce FX Go5650’s trouble in true DX9 benchmarks."I really think there is a misinterpretation of the AquaMark 3 numbers. What is the point of being able to one gpu outperforms the other in up to 58% if none of them can push numbers above 24 fps? The reviewer should have noted that none of theses solutions will do when it comes to all DX9 games even in low quality setups.
The honest recomendation would better be: wait for the next gen DX9 mobile chips because there is not such thing as true DX9 mobile solution neither from Nvidia nor from ATI.
dvinnen - Monday, September 15, 2003 - link
<<<We are currently revising our graphics benchmark suite in the anticipation of future DX9 stuff. These two GPUs are full DX9 parts, and we are benchmarking them accordingly. UT2003 and our current line of benchmarking titles are DX8, and therefore aren't specifically appropriate for this context. Why are our choices of benchmark titles odd? The Mobility and Go mobile graphics parts are no more than mobile version of desktop processors (clocked down, better power management features and in the M10 case integrated memory package).>>>

I understand that. But the whole suite doesn't have to be DX9 to get an idea of how it will play. I agree with HL2, Warcraft 3, and Splinter Cell, because lots of people play them (or in the case of HL2, will play it). AquaMark 3 is also a good choice. But not all games are going to be DX9. OpenGL is still a viable choice: Doom 3 is going to use it, and many games will be using that engine in the coming years. I also brought up UT2003 because lots of people play it. Quake 3 is rather useless now, I agree; it became outdated long ago and people are now pushing 500 fps in it. But an OpenGL benchmark (like RtCW:ET, and yes I know it's still based on the Quake 3 engine, but you don't get the insane FPS) would be appreciated.
Andrew Ku - Monday, September 15, 2003 - link
#20: Please look at our test configuration page. The drivers we used are the newest drivers available for testing, as of the time of the head-to-head. Remember, mobile drivers need to be validated by the mobile system vendor, not the graphics part vendor.

Anonymous User - Monday, September 15, 2003 - link
No drivers are going to make up a 400% differential (!). Nvidia had better get their act together for NV4x.

Anonymous User - Monday, September 15, 2003 - link
Ugghh... I just purchased a Dell Inspiron 8600 with the 128MB GeForce Go 5650. I was looking forward to having a mobile platform to play some of the upcoming games. How disappointing to realize that the 5650 just won't be up to snuff. Nvidia should be ashamed of themselves...

I would have liked to wait until an ATI Radeon 9600 came out for a Dell system, but I got a good deal on the laptop, and the 9600 just doesn't seem to have wide distribution yet except in some very expensive custom laptops. Maybe I'll be able to switch out my Nvidia card for an ATI card when it becomes available from Dell?
Anonymous User - Monday, September 15, 2003 - link
I would like to know if this was done using the Rel 50 drivers that aren't "publicly released". I remember hearing comments that those drivers were made for DX9 games (I think) and that NVIDIA stopped DX9 work on the current drivers months ago. I really think that would skew the results, and these benchmarks do about as much good as the HL2 benchmarks posted a couple of days ago (Sept 12th, I think). Also, that comment on OpenGL makes me think.

Don't get me wrong, NVIDIA isn't looking too good, but if the drivers aren't the most recent ones for their card, I'd like to know how it's a head-to-head test.
Andrew Ku - Monday, September 15, 2003 - link
#17: The article was subtitled "Taking on DX9." Therefore we benchmarked in DX9 as we stated on the Half-Life 2 page.

Anonymous User - Monday, September 15, 2003 - link
#17: Yes you are correct. We should all run our monitors at 30Hz too, any more is a waste.