Alienware M11x R3: Portable Powerhouse
by Dustin Sklavos on July 22, 2011 1:15 AM EST
Introducing an Ultraportable Demon
We've been keeping track of Alienware's M11x series since the very first one landed and have had the privilege of testing each one. The move from Penryn to Arrandale in the R2 netted a substantial boost in performance at the cost of some battery life, though that issue was mitigated somewhat by the introduction of NVIDIA's Optimus graphics switching, replacing the more finicky software-based GPU switching in the first generation model. With the vastly improved power consumption and efficiency of Sandy Bridge, do we have a true successor to the last two models?
From first impressions, it certainly looks that way. Everything in the M11x R3 has gotten a healthy boost--everything, that is, except the screen. So spoiler alert there: the one big change we were hoping for, our last major complaint about the M11x in the R2, still remains present in the R3. Yet the move from Arrandale to Sandy Bridge has yielded dividends in other notebooks, and the GPU has received a stellar upgrade from the old GeForce GT 335M. And as a final bonus, Alienware is packing USB 3.0 in the R3.
Alienware M11x R3 Specifications

| Processor | Intel Core i7-2617M (2x1.5GHz + HTT, 32nm, 4MB L3, Turbo to 2.6GHz, 17W) |
| Memory | 2x4GB Hynix DDR3-1333 (Max 2x8GB) |
| Graphics | NVIDIA GeForce GT 540M 2GB DDR3 (96 CUDA cores, 672MHz/1344MHz/1.8GHz core/shader/memory clocks, 128-bit memory bus) |
| Display | 11.6" LED glossy 16:9 1366x768 (AU Optronics AUO305C panel) |
| Hard Drive(s) | Seagate Momentus 7200.5 500GB 7200-RPM HDD |
| Networking | Atheros AR8151 PCIe Gigabit Ethernet, Intel Centrino Advanced-N 6205 802.11a/b/g/n |
| Audio | Realtek ALC665 HD Audio, mic and dual headphone jacks |
| Battery | 8-cell, 14.8V, 63Wh |
| Ports | 2x USB 3.0, 1x USB 2.0 (chargeable), dual headphone jacks, mic jack, flash reader (MMC, SD/Mini SD, MS/Duo/Pro/Pro Duo) |
| Operating System | Windows 7 Home Premium 64-bit SP1 |
| Dimensions | 11.25" x 9.19" x 1.29" (WxDxH) |
| Extras | RGB configurable backlit 82-key keyboard |
| Warranty | 1-year limited warranty (extendable up to four years) |
| Pricing | Starting at $999; priced as configured: $1,419 |
Much like in our review of the Alienware M14x, right out of the gate I'll tell you that most of the upgrades to the base system aren't going to seem worth it. Our review unit comes equipped with the fastest processor Dell makes available in the M11x R3, the Intel Core i7-2617M. For just a 17W TDP it's a remarkably capable piece of kit, able to turbo up to 2.3GHz on both cores or 2.6GHz on a single core, and it promises to be a major improvement over the i7-640UM the previous generation sported. The alternative choice, for $200 less, is the i5-2537M, which takes a 300MHz hit to both turbo clocks, comes with a slightly slower 1.4GHz nominal clock, and has 1MB less L3 cache. Given the low resolution screen, it's hard to swallow a $200 upgrade to the faster i7.
That's especially true when you realize the CPU and GPU are tied together into two specific combinations: you can get either the i7-2617M with a 2GB DDR3 NVIDIA GeForce GT 540M, or the i5-2537M with a 1GB DDR3 NVIDIA GeForce GT 540M. That extra gigabyte of video memory is wasted on a part like the GT 540M, whose 96 CUDA cores and 128-bit memory bus are ill-equipped to take advantage of the extra space. The 540M ships at spec, with 672MHz on the core, 1344MHz on the shaders, and an effective 1.8GHz on the DDR3. This is a massive improvement on the GT 335M that the M11x R2 shipped with, running more than 200MHz faster on the core while offering an additional 24 shaders. It also brings support for DirectX 11 and offers performance on par with the AMD Mobility Radeon HD 5650, just as we requested in our review of the R2.
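To see why 2GB of video memory is overkill here, it helps to look at the bandwidth that 128-bit DDR3 bus actually delivers. The following is a back-of-the-envelope sketch using the spec-sheet numbers above; the formula (bus width times effective data rate) is the standard peak-bandwidth estimate, not a measured figure:

```python
# Rough peak memory bandwidth for the GT 540M's DDR3 configuration.
# Peak bandwidth (GB/s) = bus width in bytes * effective memory clock in GT/s.
bus_width_bits = 128
effective_clock_gts = 1.8  # effective DDR3 data rate from the spec table

bandwidth_gbs = (bus_width_bits / 8) * effective_clock_gts
print(f"Peak memory bandwidth: {bandwidth_gbs:.1f} GB/s")  # → 28.8 GB/s
```

At under 30GB/s, the card would choke on the resolutions and texture settings that could actually fill 2GB of frame buffer long before memory capacity became the limit.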
The last notable upgrade is the inclusion of USB 3.0: the two USB ports on the right side of the M11x R3 are now USB 3.0 instead of the 2.0 used in the last generation.
Essentially what we're left with is a very healthy improvement to the system itself along with better connectivity. Unfortunately we're still missing out on the better screen--something Alienware otherwise gets right with their M14x, M17x and M18x. Other than the heavy rejiggering of the M11x R3's insides, though, the shell itself remains unchanged and in line with the rest of Alienware's notebooks: glossy black accents on the speaker grilles along with edge-to-edge gloss for the screen, a backlit keyboard, and a smooth rubberized texture on the plastic shell. The design has gone largely unchanged from the very first iteration, so our thoughts there still apply. If it ain't broke, don't fix it, and while I personally still take some issue with the intake on the bottom of the notebook, at least the parts included in this version should generate less heat than the two previous generations.
Comments
JarredWalton - Friday, July 22, 2011
There are two major problems with your A8-3530MX plan:
1) Judging by the results of the A8-3500M, it is unlikely that the 3530MX would actually be faster than the i7-2617M in most applications. Certainly the SNB CPUs will be faster in single-threaded performance.
2) i7-2617M (and the i5-2537M) are 17W TDP parts, which means they should use at most around 17W. The A8-3500M by contrast is a 35W TDP part, while the 3530MX is a 45W part; good luck getting that into a sub-13" chassis. I know you hate Intel, but that doesn't make AMD's parts universally better.
Assuming Alienware did go with Llano, pricing should at least drop $50 to $100 or so, which really isn't the point with a premium brand like Alienware. Perhaps Dell or HP will make a 13.3" laptop with an A6 (A8 TDP is still too high for most companies to try fitting it into anything smaller than 14"), and sell it for $600 without a discrete GPU. Performance will be lower, making high detail gaming a non-starter, but you can get two such laptops for the cost of the M11x R3 with i7 CPU.
Beenthere - Friday, July 22, 2011
Seems like there's a few Intel fanbois at Anandtech. The M11x is still ugly and they won't get my money until they sell AMD powered laptops. :)
JarredWalton - Friday, July 22, 2011
And that, my friend, is the definition of a fanboi: "I won't buy it unless it has brand X." It's not about being better; rather, it's about using a specific brand for no reason other than the brand.
Fact: Intel currently has a faster CPU at every power level than AMD.
Fact: AMD has a better integrated GPU.
Fact: With a discrete GPU, Intel will be faster (see point one above).
Fact: AMD costs less for their APU vs. comparable Intel CPU + dGPU.
So, if you want to take that and say that AMD is better on price/performance, that's fine. They are. But if you need a specific level of performance, the pricing difference starts to erode. Consider:
The Fusion 6620G graphics is about as fast as a GT 525M, slower than the GT 540M, and also slower than the HD 6630M. It's good for up to ~medium settings at 1366x768, but you wouldn't want higher resolution gaming on it. Add in a faster dGPU and you've added $100 to the price, and now the Llano CPU becomes more of a bottleneck. Heck, the Llano CPU is even a bottleneck for the fGPU in the 3500M, though I suspect that will largely go away with the 3530MX.
So if you're looking at a 45W TDP Llano and adding in a dGPU, how would that be better than just going with a faster Intel CPU with the same dGPU? If the price difference is only $50 (which precludes Alienware type of hardware, obviously), and you're already paying $800+, the 6% increase in total cost will be outweighed by a greater than 6% increase in overall performance.
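Jarred's argument above is a simple price/performance tradeoff, and it can be made concrete with a quick calculation. The dollar figures and the 10% performance delta below are hypothetical round numbers chosen to illustrate the shape of the argument, not measured results:

```python
# Illustrative price/perf comparison: a small price premium for a faster CPU
# is worth it when the performance gain exceeds the cost increase.
# All numbers here are hypothetical examples, not benchmark data.
llano_price, intel_price = 800, 850  # assumed ~$50 difference
llano_perf, intel_perf = 1.00, 1.10  # normalized; assume ~10% faster overall

price_increase = intel_price / llano_price - 1  # 0.0625, i.e. 6.25%
perf_increase = intel_perf / llano_perf - 1     # 0.10, i.e. 10%

print(f"price +{price_increase:.2%}, perf +{perf_increase:.2%}")
print("better value:", perf_increase > price_increase)
```

The point is that once both systems are already $800+, a fixed $50 delta shrinks as a fraction of the total, so even a modest performance advantage tips the value equation.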
Llano A8 laptops priced under $700 should sell quite well. That's not even remotely in Alienware/Dell's plans for the M11x, which is why they're not worried about Llano. Deliver better performance, charge more, and make more money -- that's what the M11x R3 is supposed to do. If you don't want to buy it because it costs too much, that's sensible, but to refuse to buy something "because it doesn't have an AMD APU"? That's brand loyalty, which is just a less offensive way of saying fanboi.
SquattingDog - Saturday, July 23, 2011
Well said Jarred :)
redchar - Friday, July 22, 2011
The M11x and Llano parts seem to have a lot in common, I was thinking. They put more focus on the GPU than on the CPU -- but not in such a way as to leave the CPU crippled, as you would with Atom or Bobcat. I could see a Llano laptop the size of the M11x; sure, Llano parts are 35-45W, but the 540M GPU in the R3 is around 30-40W too, and Llano is a CPU+GPU chip. So I believe you could make a laptop similar to the M11x with Llano, and it would cater to the same market, more or less. The only thing is, as this review shows, it would not perform as well -- close, but not as good. It's actually impressive how well Sandy Bridge plus NVIDIA discrete graphics work together with Optimus to deliver better performance and battery life than a Fusion part gets, seeing as Fusion's integration has its advantages. I'm not the kind of fanboy who would get an inferior machine just to avoid buying a competing product, so I would stick with the M11x, but Llano would be a pretty nice second choice, and I would assume at a lower price, too.
Certainly, as he mentioned though, alienware's styling is not the best, and not everyone can put up with it. I believe in function over fashion, but that's just me.
Wolfpup - Friday, July 22, 2011
Disappointed it still uses Floptimus. But the GPU itself... is that really much of an upgrade? Yes, it's 96 cores instead of 72 (and DirectX 11), but the new architecture ends up performing somewhere between 2/3 and 100% the speed of the old architecture per core, right?
Is it really that big an upgrade?
Total side note, I've got a 96 core part just like that in a desktop that runs Folding at Home 24/7. Wish I could put something better in there, but it's got a tiny power supply, and that part works (does a heck of a lot of work Folding too!)
JarredWalton - Friday, July 22, 2011
Using clever terms like "Floptimus" doesn't actually make the technology bad. What's wrong with Optimus, such that you would actually prefer not having it?
Let me see... you can get Intel and NVIDIA reference drivers, switching happens very quickly, and you pay perhaps a 3-5% performance hit in some cases (due to the copying of the frame buffer). You also lose out on 3D Vision (who cares?), and for some titles you have to create a custom profile.
The alternative is the switchable graphics used in the original M11x. Switch and reboot, or switch and watch the flickering for five seconds; neither is a great experience (though SLI notebooks flicker just as badly). Now go ahead and ask how many driver updates such laptops have received since launch. I believe the correct answer is two updates since the March 2010 review, and that's two more than most other non-Optimus switchable graphics laptops have received. The current driver is actually relatively recent: 263.08, released in March of this year. Any games released since then have the potential for compatibility issues, though the GT 335M isn't likely to have issues since it's an older DX10.1 part.
In short, Optimus isn't perfect, but it's a lot closer than the switchable graphics alternatives I've encountered.
Concerning the GT 540M vs. GT 335M, it's really not even close as far as performance goes. Sure, each core on the 400M/500M may not always be faster than a 300M core, but the 540M comes clocked at 1344MHz on the shaders vs. 1080MHz with the 335M. We tested the ASUS N82Jv, which used a full i5-450M instead of a ULV CPU, and even with a ULV SNB part the M11x R3 is quite a bit faster in gaming (the closest the GT 335M gets is in STALKER at our Medium preset; everything else is about 20-40% faster for the 540M). Here's the comparison: http://www.anandtech.com/bench/Product/246?vs=396
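Jarred's shader-clock point can be quantified with the numbers he cites. This is a crude theoretical comparison (shader count times shader clock) that deliberately ignores the per-core architectural differences between the 400M/500M and 300M generations that Wolfpup raises, so treat it as an upper bound rather than an expected gaming delta:

```python
# Rough theoretical shader throughput: GT 540M vs GT 335M.
# Assumption: throughput scales with shader count * shader clock; this
# ignores per-core architectural differences between the generations.
cores_540m, clock_540m_mhz = 96, 1344
cores_335m, clock_335m_mhz = 72, 1080

ratio = (cores_540m * clock_540m_mhz) / (cores_335m * clock_335m_mhz)
print(f"GT 540M raw shader throughput: ~{ratio:.2f}x the GT 335M")  # ~1.66x
```

Even if the newer cores only managed 2/3 the per-core work, that ~1.66x raw advantage would still leave the 540M ahead, which lines up with the 20-40% gaming leads in the linked Bench comparison.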
redchar - Friday, July 22, 2011
I'm still on the r1, so with the launch of the r2 I thought I didn't care for Optimus -- the r2 lost some battery life, and I wondered whether that was because the NVIDIA GPU was turning on when it didn't need to, i.e. imperfect sensing of which GPU is needed for which task. So I thought I liked the switchable option best, since it puts the user in complete control to decide when to use the discrete GPU.
But considering the r3 is better all around, I can't really hate optimus now. It's certainly easier, especially to the casual user who wouldn't know how to work switchable, or that it even existed. If the r3 can get the same battery life with the pain-free optimus that the r1 can get with manual switching, then it's all for the best.
Uritziel - Friday, July 22, 2011
I completely agree with Jarred.
ouchtastic - Friday, July 22, 2011
I think 1280x720 is the sweet-spot resolution to game at on this laptop; it's a standard HD res, and your framerates can only be higher than what's already been benched.