Fermi Goes Mobile: AVADirect's Clevo W880CU with GTX 480M
by Dustin Sklavos on July 7, 2010 11:45 PM EST

AVADirect was kind enough to provide us with our testing unit, a specially equipped Clevo W880CU, and a refresher of the notebook's configuration is below:
AVADirect Clevo W880CU Specifications
Processor | Intel Core i7-820QM (4x1.73GHz, 45nm, 8MB L3, Turbo to 3GHz, 45W)
Chipset | Intel PM55
Memory | 2x2GB DDR3-1333 (Max 2x4GB)
Graphics | NVIDIA GeForce GTX 480M 2GB GDDR5 (352 CUDA cores, 425MHz/800MHz/2.4GHz core/shader/RAM clocks)
Display | 17.3" LED Glossy 16:9 1080p (1920x1080)
Hard Drive(s) | Seagate Momentus XT 500GB 7200 RPM hybrid drive (additional empty bay with RAID 0/1 capability)
Optical Drive | Blu-ray writer
Networking | Gigabit Ethernet, Intel Centrino Ultimate-N 6300 (a/b/g/n), Clevo Bluetooth, V.92 56K modem
Audio | Realtek ALC888/1200 HD Audio; 4.1 speakers with line-in, mic, optical, and headphone jacks; capable of 5.1
Battery | 3-cell, 12V, 48Wh
Operating System | Windows 7 Home Premium 64-bit
Pricing | $2936.80 as configured from AVADirect
We ran the W880CU through our usual lineup of Futuremark synthetic benchmarks, bouncing between four different versions of 3DMark and two different PCMarks. The matchup you'll want to watch is how the W880CU compares against the W860CUs; these three units are all equipped with an Intel Core i7-820QM processor and 4GB of DDR3, making them fairly ideal comparisons. The only difference that may affect scores is the use of the Corsair Nova SSDs in the W860s.
The first thing to notice is that the GeForce GTX 480M takes the W880CU to the top of the class in almost every 3DMark benchmark; in fact, the newer the 3DMark gets, the wider the 480M's lead. The only exceptions are units equipped with dual-GPU solutions. PCMark is much less favorable, but the reduced scores are very likely attributable to the SSDs used in the higher scoring test systems.
So how does the GeForce GTX 480M fare in actual gaming scenarios?
46 Comments
anactoraaron - Thursday, July 8, 2010 - link
I really don't care one bit anymore about DX9. Please stop putting this in your testing. I doubt anyone else cares about DX9 numbers anymore... I mean why not put in DX8 numbers too??? And why are DX9 numbers only shown for nVidia products? Are they asking you to do so?
anactoraaron - Thursday, July 8, 2010 - link
it downright tears past the competition in Far Cry 2 and DiRT 2 - yeah and people want to NOT play those in DX11... <sigh>
anactoraaron - Thursday, July 8, 2010 - link
And you say that in DIRT 2 it holds a 25% lead- yeah when comparing the DX9 numbers of the 480M to the DX11 numbers of the mobility 5870. The real difference is actually .1 fps (look at the W880CU-DX11 line)... yep I'm not reading the rest of this article. Not a very well written article... btw sorry bout the double post earlier...
JarredWalton - Thursday, July 8, 2010 - link
You're not reading the chart properly. We put DX11 in the appropriate lines and colored them differently for a reason. Bright green compares with blue, and dark green compares with purple. The yellow/orange/gold are simply there to show performance at native resolution (with and without DX11).
In DX9, 480M gets 79.6 vs. 5870 with 59.9. 480M wins by 33%.
In DX11, 480M gets 60.0 vs. 5870 with 48.1. 480M wins by 25%.
As for including DX9, it's more a case of using something other than DX11 for cards that don't support DX11, as well as a check to see if DX11 helps or hurts performance. DiRT 2 doesn't have a DX10 path, so we drop to DX9 if we don't have DX11 hardware. Another example, in Metro 2033 enabling DX11 results in a MASSIVE performance hit. So much so that on notebooks it's essentially useless unless you run at 1366x768 with a 480M.
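For anyone double-checking those percentages, here's a quick sketch of the arithmetic (plain Python, using only the DiRT 2 FPS figures quoted above; the helper name is just illustrative):

```python
# Relative lead = (faster / slower - 1) * 100, using the chart's DiRT 2 numbers.
def lead_pct(faster_fps, slower_fps):
    return (faster_fps / slower_fps - 1) * 100

print(f"DX9 lead:  {lead_pct(79.6, 59.9):.0f}%")  # ~33% for the 480M
print(f"DX11 lead: {lead_pct(60.0, 48.1):.0f}%")  # ~25% for the 480M
```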
Dustin Sklavos - Thursday, July 8, 2010 - link
While it's swell that you don't care about DX9 anymore, the fact is that a substantial number of games released today STILL use it. DX10 never really took off, and while DX11 is showing strong signs of adoption moving forward, a large number of games still run in DX9 mode.
GTVic - Thursday, July 8, 2010 - link
Is the author an NVIDIA fanboi? Apparently the 5870M is anemic while the 480M is the "fastest mobile GPU on the planet". Of course the more moderate comments are hidden in the details while "fastest on the planet" is screamed in bold letters. Never mind that unless you have an FPS counter on your display you couldn't tell the difference, apparently a few extra FPSs and a name that starts with "N" is all you need to get a glowing review complete with stupendous superlatives.
Also apparently it is OK to dismiss certain games because they are known to favour ATI hardware. But let's not mention anything about, cough, Far Cry, cough.
JarredWalton - Thursday, July 8, 2010 - link
I'd love to see what NVIDIA thinks of your comment, because I know they felt Dustin was overly harsh. He's also been accused of being an AMD fanboi, so apparently he's doing his job properly. ;-)
The gaming performance is a case of looking at what's required to play a game well, as well as focusing on the big picture. Sure, L4D2 is a Source engine game and favors AMD architectures traditionally. It also happens to run at 62 FPS at 1080p with 4xAA (which is faster than the 58 FPS the 5870 manages at the same settings). Mass Effect 2 performance has changed quite a bit between driver versions on the 5870, and it isn't as intensive as other games. Just because the 5870 leads at 1600x900 in two out of nine titles doesn't mean it's faster. At 1080p the margins generally favor the 480M, and with 4xAA enabled they favor it even more.
Even with that, we go on to state that the 480M doesn't deliver a resounding victory. It's the world's fastest mobile GPU, yes, but it ends up being 10-15% faster on average, which is hardly revolutionary. I said the same thing in the G73Jh review on the 5870, and it got an editor's choice while this doesn't. Seriously, read the conclusion (pages 6 and 7) and tell me that we come off as being pro-NVIDIA and anti-AMD in this review.
frozentundra123456 - Thursday, July 8, 2010 - link
I did re-read those parts also. I didn't even notice it myself on the first reading, but I can see how one would see some "ATI bashing" (although I would not use that strong a word), in that the article is about the 480M, but you spend a considerable amount of time criticizing (justifiably) the HD5870M. It just seems that you emphasized the shortcomings of the ATI part in an article primarily about the 480M, while being rather easy on the 480M itself in most sections.
That said, I don't think you are unfair in general or intentionally, I just think the article was somewhat skewed in that particular section.
And actually, as you are, I am quite disappointed in both GPUs, but more so in the 480M, in that it is more expensive and power hungry for a rather small performance increase.
erple2 - Thursday, July 8, 2010 - link
I disagree. The "bashing" that's done for the 5870M I think sets the tone for how "lame" the 480M ultimately is.
I found that the bashing of the 5870 really brought into perspective for me just how relatively uninteresting the 480M really is. I mean, if the 5870 was only marginally faster than an "archaic" G92 part, what does that say about NVidia's self-proclaimed ushering in of a "revolution" in graphical performance?
I see it as a giant "thud", much like the GTX 465 alluded to on page 5.
GTVic - Thursday, July 8, 2010 - link
As I mentioned, I did see the more moderate comments; what I was trying to get across was that the attention-grabbing headline was out of balance with the actual review.
And if you discount one game for being favoured by ATI then you should probably mention Far Cry being favoured by NVIDIA. Those types of issues are being highlighted again with recent revelations that NVIDIA is hobbling CPU benchmarks of PhysX performance with unoptimized code.
One additional comment, it is always difficult to compare graphs with long titles for the different configurations, especially when the colors and the order keep changing.