4.4 Pounds of Gaming Performance

In all of our synthetic metrics, the Alienware M11x R3's updated processor and GPU helped it put in a strong showing against even full-voltage chips from the Arrandale generation. Sandy Bridge's aggressive Turbo Boost appears to be paying big dividends in finally getting the line out from under its heavy CPU limitations, so let's see how that pans out once we put the machine to its intended use: gaming.

The combination of a faster processor and faster GPU seems to pay off handsomely for the M11x R3, but the big news is definitely that the CPU limitations that tended to plague performance on the last two generations are mostly ameliorated by the i7-2617M. It's difficult to gauge how deleterious an effect downgrading to the i5-2537M would have on the M11x R3's gaming performance, but at these settings we appear to be largely GPU-limited. Pay special attention to those Mafia II results: Mafia II has a tendency to stress every part of a system in a way few games do these days, and for that reason I actually use it to max out power consumption when I do desktop testing. Nothing else really comes close.

We have added a few titles since the M11x R2 review, specifically Mafia II, Metro 2033, and StarCraft II. That means we don't have the same set of laptops in those charts, so we added a couple of other laptops to flesh things out. You can get a full comparison of how the smaller M11x R3 stacks up against the Dell XPS 15 in Mobile Bench, or you can compare the M11x R3 with its big brother, the M14x we reviewed earlier this week. Considering the difference in chassis size and CPU performance, it's pretty clear that the i7-2617M isn't holding the GT 540M back much; the M14x, on the other hand, still gets a healthy boost from the GT 555M. Moving to our High settings will obviously stress the hardware more, but we've included the native 768p results for the M11x as well.

At our "High" preset the M11x R3 continues to put in a strong showing, and most of these games are actually quite playable at the notebook's native 1366x768 resolution. It appears that we're again heavily GPU-limited here, which is fantastic given the low voltage processor. The 17W TDP of the i7-2617M (and the i5-2537M) is as low as Sandy Bridge goes, one watt lower than the previous generation's i7-640UM.

Comments

  • JarredWalton - Friday, July 22, 2011 - link

    There are two major problems with your A8-3530MX plan:

    1) Judging by the results of the A8-3500M, it is unlikely that the 3530MX would actually be faster than the i7-2617M in most applications. Certainly the SNB CPUs will be faster in single-threaded performance.

    2) i7-2617M (and the i5-2537M) are 17W TDP parts, which means they should use at most around 17W. The A8-3500M by contrast is a 35W TDP part, while the 3530MX is a 45W part; good luck getting that into a sub-13" chassis. I know you hate Intel, but that doesn't make AMD's parts universally better.

    Assuming Alienware did go with Llano, pricing should at least drop $50 to $100 or so, which really isn't the point with a premium brand like Alienware. Perhaps Dell or HP will make a 13.3" laptop with an A6 (A8 TDP is still too high for most companies to try fitting it into anything smaller than 14"), and sell it for $600 without a discrete GPU. Performance will be lower, making high detail gaming a non-starter, but you can get two such laptops for the cost of the M11x R3 with i7 CPU.
  • Beenthere - Friday, July 22, 2011 - link

    Seems like there's a few Intel fanbois at Anandtech. The M11x is still ugly and they won't get my money until they sell AMD powered laptops. :)
  • JarredWalton - Friday, July 22, 2011 - link

    And that, my friend, is the definition of a fanboi: "I won't buy it unless it has brand X." It's not about being better; rather, it's about using a specific brand for no reason other than the brand.

    Fact: Intel currently has a faster CPU at every power level than AMD.
    Fact: AMD has a better integrated GPU.
    Fact: With a discrete GPU, Intel will be faster (see point one above).
    Fact: AMD costs less for their APU vs. comparable Intel CPU + dGPU.

    So, if you want to take that and say that AMD is better on price/performance, that's fine. They are. But if you need a specific level of performance, the pricing difference starts to erode. Consider:

    The Fusion 6620G graphics is about as fast as a GT 525M, slower than the GT 540M, and also slower than the HD 6630M. It's good for up to ~medium settings at 1366x768, but you wouldn't want higher resolution gaming on it. Add in a faster dGPU and you've added $100 to the price, and now the Llano CPU becomes more of a bottleneck. Heck, the Llano CPU is even a bottleneck for the fGPU in the 3500M, though I suspect that will largely go away with the 3530MX.

    So if you're looking at a 45W TDP Llano and adding in a dGPU, how would that be better than just going with a faster Intel CPU with the same dGPU? If the price difference is only $50 (which precludes Alienware-type hardware, obviously), and you're already paying $800+, the roughly 6% increase in total cost will be outweighed by a greater than 6% increase in overall performance (see the quick sketch at the end of this comment).

    Llano A8 laptops priced under $700 should sell quite well. That's not even remotely in Alienware/Dell's plans for the M11x, which is why they're not worried about Llano. Deliver better performance, charge more, and make more money -- that's what the M11x R3 is supposed to do. If you don't want to buy it because it costs too much, that's sensible, but to refuse to buy something "because it doesn't have an AMD APU"? That's brand loyalty, which is just a less offensive way of saying fanboi.
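
    Putting some rough numbers on that, here's a minimal Python sketch of the price/performance argument; the $800 base price and $50 delta come straight from the comment above, while the 15% performance uplift is a hypothetical placeholder rather than a measured figure.

        # Back-of-envelope price/performance check (illustrative only).
        base_price = 800.0     # assumed Llano + dGPU laptop price (USD)
        price_delta = 50.0     # assumed premium for the Intel CPU + same dGPU
        perf_gain = 0.15       # hypothetical overall performance uplift (15%)

        cost_increase = price_delta / base_price               # ~6%
        perf_per_dollar_ratio = (1.0 + perf_gain) / (1.0 + cost_increase)

        print(f"Cost increase: {cost_increase:.1%}")
        print(f"Perf per dollar, Intel combo vs. Llano: {perf_per_dollar_ratio:.2f}x")

    As long as the performance gain outpaces the cost increase, the Intel CPU plus dGPU combination comes out ahead on performance per dollar, which is the point being made above.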
  • SquattingDog - Saturday, July 23, 2011 - link

    Well said Jarred :)
  • redchar - Friday, July 22, 2011 - link

    The M11x and Llano parts seem to have a lot in common: both put more focus on the GPU than on the CPU, but not in a way that leaves the CPU crippled like Atom or Bobcat. I could see a Llano laptop the size of the M11x - sure, Llano parts are 35-45W, but the GT 540M in the R3 is around 30-40W too, and Llano is a combined CPU+GPU chip (rough numbers at the end of this comment). So I believe you could make a laptop similar to the M11x with Llano, and it would cater to more or less the same market. The only thing is, as this review shows, it would not perform as well. Close, but not as good. It's actually impressive how well Sandy Bridge and the NVIDIA discrete GPU work together with Optimus to deliver better performance and battery life than a Fusion part, given that Fusion's integration has its advantages. I'm not the kind of fanboy who would settle for an inferior machine just to avoid buying a competing product, so I would stick with the M11x, but Llano would be a pretty nice second choice, and presumably at a lower price, too.

    Certainly, as he mentioned though, Alienware's styling is not the best, and not everyone can put up with it. I believe in function over fashion, but that's just me.
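
    As a quick sanity check on the power argument, here's a minimal Python sketch; the 35W figure for the GT 540M is an assumption within the 30-40W range guessed above, not an official spec.

        # Rough power-budget comparison (the GPU TDP is an assumption, not a spec).
        snb_ulv_tdp_w = 17     # i7-2617M / i5-2537M TDP, per the review
        gt540m_tdp_w = 35      # assumed GT 540M TDP, within the 30-40W guess above
        llano_a8_tdp_w = 45    # A8-3530MX TDP, CPU + GPU on one chip

        intel_combo_w = snb_ulv_tdp_w + gt540m_tdp_w
        print(f"SNB ULV + GT 540M: ~{intel_combo_w}W vs. Llano A8: {llano_a8_tdp_w}W")

    The two power budgets land in the same ballpark, which is why a Llano machine in a similarly sized chassis seems plausible.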
  • Wolfpup - Friday, July 22, 2011 - link

    Disappointed it still uses Floptimus. But the GPU itself...is that really much of an upgrade? Yes, it's 96 cores instead of 72 (and DirectX 11), but the new architecture ends up performing somewhere between 2/3 and 100% the speed of the old architecture for a given number of cores, right?

    Is it really that big an upgrade?

    Total side note, I've got a 96 core part just like that in a desktop that runs Folding at Home 24/7. Wish I could put something better in there, but it's got a tiny power supply, and that part works (does a heck of a lot of work Folding too!)
  • JarredWalton - Friday, July 22, 2011 - link

    Using clever terms like "Floptimus" doesn't actually make the technology bad. What's wrong with Optimus, where you would actually prefer not having it?

    Let me see... you can get Intel and NVIDIA reference drivers, switching happens very quickly, and you pay perhaps a 3-5% performance hit in some cases (due to the copying of the frame buffer). You also lose out on 3D Vision (who cares?), and for some titles you have to create a custom profile.

    The alternative is the switchable graphics used in the original M11x. Switch and reboot, or switch and watch the flickering for five seconds; neither is a great experience (though SLI notebooks flicker just as badly). Now go ahead and ask how many driver updates such laptops have received since launch. I believe the correct answer is two updates since the March 2010 review, and that's two more than most other non-Optimus switchable graphics laptops have received. The current driver is actually relatively recent: 263.08, released in March of this year. Any games released since then have the potential for compatibility issues, though the GT 335M isn't likely to have issues since it's an older DX10.1 part.

    In short, Optimus isn't perfect, but it's a lot closer than the switchable graphics alternatives I've encountered.

    Concerning the GT 540M vs. GT 335M, it's really not even close as far as performance goes. Sure, each core on the 400M/500M may not always be faster than a 300M core, but the 540M comes clocked at 1344MHz on the shaders vs. 1080MHz with the 335M. We tested the ASUS N82Jv, which used a full i5-450M instead of a ULV CPU, and even with a ULV SNB part the M11x R3 is quite a bit faster in gaming (the closest the GT 335M gets is in STALKER at our Medium preset; everything else is about 20-40% faster for the 540M); a rough back-of-envelope on those numbers follows below. Here's the comparison: http://www.anandtech.com/bench/Product/246?vs=396
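
    For a rough sense of where those gains come from, here's a minimal Python sketch using the shader counts and clocks quoted in this thread; the 2/3-to-1.0 per-core scaling factor is Wolfpup's estimate above, not a measured number.

        # Rough theoretical shader-throughput comparison (illustrative only).
        gt540m_cores, gt540m_shader_mhz = 96, 1344    # figures quoted above
        gt335m_cores, gt335m_shader_mhz = 72, 1080

        raw_ratio = (gt540m_cores * gt540m_shader_mhz) / (gt335m_cores * gt335m_shader_mhz)

        # Scale by the estimated 2/3-to-1.0x per-core efficiency of the newer cores.
        low_estimate = raw_ratio * (2 / 3)
        high_estimate = raw_ratio

        print(f"Raw shader-clock ratio: {raw_ratio:.2f}x")
        print(f"Per-core adjusted range: {low_estimate:.2f}x to {high_estimate:.2f}x")

    That works out to roughly 1.1x-1.7x on paper, which comfortably brackets the 20-40% gains we measured.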
  • redchar - Friday, July 22, 2011 - link

    I'm still on the R1, so when the R2 launched I decided I didn't care for Optimus - the R2 lost some battery life, and I wondered if that was the NVIDIA GPU turning on when it didn't need to, i.e. imperfect sensing of which GPU a given task needs. So I thought I liked the switchable option best, since it puts the user in complete control of when the discrete GPU is used.
    But considering the R3 is better all around, I can't really hate Optimus now. It's certainly easier, especially for the casual user who wouldn't know how to work switchable graphics, or even that it existed. If the R3 can get the same battery life with pain-free Optimus that the R1 gets with manual switching, then it's all for the best.
  • Uritziel - Friday, July 22, 2011 - link

    I completely agree with Jarred.
  • ouchtastic - Friday, July 22, 2011 - link

    I think 1280x720 is the sweet-spot resolution for gaming on this laptop: it's a standard HD resolution, and your framerates can only be higher than what's already been benched.
