
ASUS ENGTX480/2DI/1536MD5 GeForce GTX 480
Reviews - Featured Reviews: Video Cards
Written by Olin Coles   
Friday, 01 October 2010
Table of Contents: Page Index
ASUS ENGTX480/2DI/1536MD5 GeForce GTX 480
Features and Specifications
Closer Look: ASUS ENGTX480
Video Card Testing Methodology
DX10: 3DMark Vantage
DX10: Crysis Warhead
DX11: Aliens vs Predator
DX11: Battlefield Bad Company 2
DX11: BattleForge
DX9 SSAO: Mafia II
DX11: Metro 2033
DX11: Unigine Heaven 2.1
ASUS ENGTX480 Overclocking
NVIDIA APEX PhysX Enhancements
NVIDIA 3D-Vision Effects
GeForce GTX480 Temperatures
VGA Power Consumption
Editor's Opinion: NVIDIA Fermi
ENGTX480/2DI/1536MD5 Conclusion

DX9+SSAO: Mafia II

Mafia II is a single-player, third-person action shooter developed by 2K Czech for 2K Games, and the sequel to 2002's Mafia: The City of Lost Heaven. Players assume the role of World War II veteran Vito Scaletta, the son of a small Sicilian family that immigrates to Empire Bay. Growing up in the slums of Empire Bay teaches Vito about crime, and he's forced to join the Army in lieu of jail time. After sustaining wounds in the war, Vito returns home and quickly finds trouble as he again partners with his childhood friend and accomplice Joe Barbaro. Vito and Joe combine their passion for fame and riches to take on the city, and work their way to the top in Mafia II.

Mafia II is a DirectX-9/10/11 compatible PC video game built on 2K Czech's proprietary Illusion game engine, which succeeds the LS3D engine used in Mafia: The City of Lost Heaven. In our Mafia-II Video Game Performance article, Benchmark Reviews explored the characters and gameplay while illustrating how well this game delivers APEX PhysX features on both AMD and NVIDIA products. Because APEX PhysX extensions can be processed by the system's CPU, Mafia II offers gamers equal access to high-detail physics regardless of video card manufacturer.

  • Mafia II
    • Extreme Settings: (Antialiasing, 16x AF, High Shadow Quality, High Detail, High Geometry, Ambient Occlusion)


Mafia II Extreme Quality Settings

Cost Analysis: Mafia II (1920x1200)

  • $140 Radeon HD 5770 1GB costs $9.21 per FPS
  • $220 GeForce GTX 460 1GB costs $6.03 per FPS
  • $260 Radeon HD 5850 1GB costs $5.71 per FPS
  • $295 GeForce GTX 470 1280MB costs $6.40 per FPS
  • $370 Radeon HD 5870 1GB costs $6.94 per FPS
  • $470 GeForce GTX 480 1536MB costs $8.13 per FPS
  • $650 Radeon HD 5970 2GB costs $8.59 per FPS
  • $940 GeForce GTX 480 SLI costs $10.48 per FPS
  • Test Summary: Of all the video games presently available for DirectX-11 platforms, Mafia II is by far one of the most detailed and feature-rich. The ASUS GeForce GTX 480 allows the Fermi GF100 GPU to produce impressive high-quality APEX PhysX effects while delivering frame rates superior to anything AMD's lineup could offer with CPU-processed PhysX. Although the game looks and feels its best with APEX PhysX enabled, this effect was disabled in our tests to ensure a fair performance comparison. With all settings equal in terms of graphics quality, the ASUS GTX 480 delivers a modest 8% improvement over the Radeon HD 5870, and expands to 89.7 FPS in SLI for 78% scaling efficiency.

    On a side note, Mafia 2 is absolutely phenomenal with 3D-Vision... and with built-in multi-monitor profiles and bezel correction already factored in, this game is also well suited for 3D-Vision Surround. Combining two GeForce GTX 480s in SLI allowed the game to run at 5760x1080 across three monitors at the highest settings with APEX PhysX enabled, delivering a thoroughly impressive experience. Mafia II was developed with 3D Vision in mind, so if you already own a 3D Vision kit and a 120Hz monitor, or if purchasing the equipment is within your reach (I suggest the ASUS VG236H, which comes with an NVIDIA 3D-Vision kit enclosed), you owe it to yourself to experience this game the way it was intended: in 3D.
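The cost-analysis arithmetic above is easy to reproduce. In this sketch, the FPS figures are back-derived from the listed prices and dollars-per-FPS ratios (an assumption on my part, since the raw frame rates appear on the benchmark chart rather than in the text):

```python
# Cost-per-FPS sketch. Prices in USD; FPS values are back-derived from
# the review's listed price/FPS ratios (a reconstruction, not raw data).
cards = {
    "Radeon HD 5770 1GB":     (140, 15.2),
    "GeForce GTX 460 1GB":    (220, 36.5),
    "Radeon HD 5850 1GB":     (260, 45.5),
    "GeForce GTX 470 1280MB": (295, 46.1),
    "Radeon HD 5870 1GB":     (370, 53.3),
    "GeForce GTX 480 1536MB": (470, 57.8),
    "Radeon HD 5970 2GB":     (650, 75.7),
    "GeForce GTX 480 SLI":    (940, 89.7),
}

for name, (price, fps) in cards.items():
    print(f"${price} {name}: ${price / fps:.2f} per FPS")

# SLI scaling efficiency: SLI frame rate versus twice the single-card rate
single = cards["GeForce GTX 480 1536MB"][1]
sli = cards["GeForce GTX 480 SLI"][1]
print(f"SLI scaling efficiency: {sli / (2 * single):.0%}")
```

Run as-is, this reproduces the $9.21-$10.48 per-FPS figures in the bullets and the roughly 78% SLI scaling quoted in the test summary.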

    Graphics Card     GPU Cores             Core Clock   Shader Clock   Memory Clock   Memory Amount   Memory Interface
    Radeon HD4890     800                   850 MHz      N/A            975 MHz        1024MB GDDR5    256-bit
    GeForce GTX285    240                   670 MHz      1550 MHz       1300 MHz       1024MB GDDR3    512-bit
    Radeon HD5850     1440                  725 MHz      N/A            1000 MHz       1024MB GDDR5    256-bit
    GeForce GTX470    448                   608 MHz      1215 MHz       837 MHz        1280MB GDDR5    320-bit
    Radeon HD5870     1600                  850 MHz      N/A            1200 MHz       1024MB GDDR5    256-bit
    GeForce GTX480    480                   700 MHz      1401 MHz       924 MHz        1536MB GDDR5    384-bit
    Radeon HD5970     3200 (1600 per GPU)   725 MHz      N/A            1000 MHz       2048MB GDDR5    512-bit (256-bit per GPU)



    # RE: ASUS ENGTX480/2DI/1536MD5 GeForce GTX 480 (RealNeil, 2010-10-01 04:40)
    It's good to see that they're getting a handle on the power usage and heat production issues that many have written about concerning GTX480 cards.
    I'll probably go for a pair of GTX460s in SLI myself. So far, two of them cost less than a single 480 does, and their combined performance is knocking on its door. Also, two 460s draw far less power than one 480 does, so I'll take a chance and assume they'll also produce less heat because of their lower power usage.

    Heat and power are important to many of us, as you said above. I feel it's a lot of money to buy one, and it will also cost a lot to run over the lifetime of the card. It is a truly impressive video card, though. Thanks for another detailed and informative review.
    # RE: ASUS ENGTX480/2DI/1536MD5 GeForce GTX 480 (Adam, 2010-10-01 08:20)
    Well, it's a helluva lot better than the first batch: still hungry and hot, but far more tolerably so.

    Surprised ASUS didn't do anything with the cooler, though.
    # 15 pro nVidia articles sins the last ATi cart got testted. (Michael, 2010-10-01 09:45)
    I am just wondering do got stock ore get paid by nVidia, as your last 15 articles have all bin very pro nVidia.

    And point out all the strong points of nVidia, not that i have anything aginst those strong points, I have my self a 3x SLI GTX480 + dedicated GTX280 for PhysX + 3D Vision, on a 3 screen setup.

    But not because the 480 is the best card, but because it scale's mouths better in SLI 3x then CF-X 3x.
    And even tho they are really fast, they ware also very irritating loud and hot before i installed water cooling on them.

    That said, if i would buy a single card i would for shore go for ATi as they run cooler and uses less power.

    Looking at all the articles it looks like they ware put together whit the help of a nVidia PR guy.

    And if BMR want to be taken seriously, more balanced reviews would be welcome.
    # Did you notice...? (BruceBruce, 2010-10-01 10:21)
    Have you noticed that the only new cards released in the last few months have been from NVIDIA? A year ago, ATI released the HD5xxx series and everyone complained that we were working for AMD, because we were praising the design and performance of the new Radeon cards. Well, six months later, NVIDIA releases new cards that are now the best performers; some at price/performance ratios that beat the ATI competition by a country mile (the GTX460). Those are facts - not PR, not marketing spin, not bias, just the facts.

    That's the way the computer industry works: when new designs get released, they are generally a LOT better than last year's model. Ever hear of Moore's law? When ATI brings out their next generation of video cards, I expect them to be better than what is available today. And when we test them, if that is indeed the case, that's what we will report. And then someone will complain again that we're getting paid off by ATI. As if...!!!!
    # Michael = paid by AMD? (Olin Coles, 2010-10-01 15:09)
    Michael, since your IP address resolved to Bergen, Norway, I'll forgive the horrible grammar and spelling in your comment. What I won't forgive is that you've come to this website and insulted me with the claim that I'm paid by NVIDIA to write these reviews, without so much as an example.

    How can you seriously expect me to review an AMD Radeon product when the last video card they offered was the Radeon HD 5550, almost five months ago? You might also go back over all of my articles and count up who has received more awards.

    If you want your remark taken seriously, perhaps you should be more constructive. Otherwise, you just come off as another fanboy troll with poor spelling.
    # Snerk! (ChrisW, 2010-10-07 07:55)
    Forgiving him for his grammar because his IP is in Norway... WTF!

    Of course he's a fanboy or a troll, but you don't need to be a Grammar Nazi about it!
    # It's hard to take you seriously... (Hank, 2010-10-01 12:00)

    It's very hard to take you seriously when your post is filled with simple spelling errors and grammatical mistakes. If you disagree with the results of the tests, then it is up to you to test for yourself. Many of the benchmarks used are free or have free versions. Anyone who can afford a setup with 3 GTX480s in SLI can certainly afford to do their own testing.

    # RE: It's hard to take you seriously... (Servando Silva, 2010-10-01 12:32)
    Additionally, that'd mean every other site publishing GTS450, GTX460, GTX470, etc. reviews is being paid by NVIDIA. That's just wrong. By that logic, AMD also paid everyone recently with their new CPUs, and Intel paid us back when they released their LGA1156 processors... As if!
    # RE: RE: It's hard to take you seriously... (Adam, 2010-10-01 12:56)
    Nvidia is controlling the market by bribing all of the reviewers! It's a conspiracy, people!

    Tinfoil hat time.
    # RE: RE: RE: It's hard to take you seriously... (dlb, 2010-10-01 19:58)
    Tinfoil hat time? That implies we haven't been wearing 'em before now... I haven't taken mine off since Reagan's "Star Wars" era. And I won't take it off either - not until BMR starts getting paid by AMD/ATI.

    # What was the voltage difference of the GPUs? (RS, 2010-10-03 19:53)

    Great review. I wanted to know if you guys measured the GPU voltage (in MSI Afterburner or the ASUS Voltage Tweak software) to check whether the more mature 40nm process resulted in lower GPU voltage at load. If so, what was the difference?

    Also, since this is only one representative sample, how can you be certain that the more mature 40nm manufacturing process is the reason for the reduced power draw? Could it be an outlier video card?
    # RE: What was the voltage difference of the GPUs? (Olin Coles, 2010-10-03 19:56)
    Hello RS:

    Yes, it could always be one lucky sample or it could be this way for every sample. It's very difficult to verify, but the mere fact that one card could ever reach temperatures or power consumption this low is surprising.

    I have not conducted the GPU voltage tests you mention.
    # Voltages (RS, 2010-10-03 20:11)
    Could you please check the voltages of this GTX480 if you still have it? I think that would give us a better indication of whether a more mature manufacturing process is indeed in play. What about testing a six-month-older GTX470 as well? Wouldn't the more mature process apply to the entire GF100 line?
    # 40nm GF100 GPU Voltage (Olin Coles, 2010-10-03 20:14)
    Sure, but it could be a few days since I'm on a deadline for another project. I will measure idle and loaded GPU voltage on this new GTX 480 and the original engineering sample.
    # Voltages (RS, 2010-10-03 20:18)
    Thank you very much Olin! No rush.
    # 40nm GF100 GPU Voltage (Olin Coles, 2010-10-42 20:42)
    Added into the article:

    NVIDIA GeForce GTX 480 Engineering Sample
    MSI Afterburner reported 0.962V GPU at idle, and 1.025V under load.
    GPU-Z reported 0.953V 15.0A 14.3W idle VDDC, and 0.980V 75.0A 74.0W at load.

    ASUS ENGTX480/2DI/1536MD5 Retail Sample
    MSI Afterburner reported 0.962V GPU at idle, and 1.075V under load.
    GPU-Z reported 0.955V 11.0A 10.5W idle VDDC, and 1.033V 70.0A 72.3W at load.
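As a quick sanity check, the GPU-Z figures above are internally consistent: the reported VDDC power is approximately voltage times current. A small sketch, with the (V, A, W) triples copied from the readings above (the second pair of readings is the new retail ASUS card, per the thread; small deviations come from GPU-Z rounding each field independently):

```python
# Verify GPU-Z's reported VDDC power against V * I for each reading.
readings = [
    (0.953, 15.0, 14.3),   # engineering sample, idle
    (0.980, 75.0, 74.0),   # engineering sample, load
    (0.955, 11.0, 10.5),   # retail ASUS card (inferred), idle
    (1.033, 70.0, 72.3),   # retail ASUS card (inferred), load
]

for volts, amps, watts in readings:
    computed = volts * amps
    # Allow ~1 W of slack for GPU-Z's per-field rounding
    assert abs(computed - watts) < 1.0, (volts, amps, watts)
    print(f"{volts:.3f} V x {amps:.1f} A = {computed:.1f} W (reported {watts} W)")
```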
    # Thanks (RS, 2010-10-04 08:01)
    Thanks for the voltage update Olin. I would have imagined that the more mature manufacturing process would have allowed the GPU to operate at the same frequency with lower voltages. In this case, not only does the Asus card operate cooler and quieter, but it does so at higher voltages. A great mystery indeed.

    Still what you measured is an observable result. I look forward to future GTX480/470 reviews, where you can provide more data points :)
    # Poor review, pull it. (Strafage, 2010-10-03 19:59)
    This is a very poor review. You cannot draw these sorts of conclusions and claim Fermi runs cooler now based on comparing just two cards.

    No mention of voltages either.

    In no way have your tests proven that these cards are running cooler now compared to before.

    I hope no one who wouldn't have bought one of these cards before makes the mistake of buying one now, thinking the heat and noise issues are resolved, because this 'review' proves nothing of the sort.
    # RE: Poor review, pull it. (Olin Coles, 2010-10-03 20:06)
    Your comment makes it seem like you didn't bother to read the article. I have tested three GTX 480s, including this one, and they keep getting cooler with each new release. Also, why would mentioning voltages validate our video card power consumption findings?

    I will pull the article offline and start taking orders from you when this becomes your website. Until then, your opinion matters as much as the next anonymous post.
    # RE: RE: Poor review, pull it. (hurleybird, 2010-10-03 20:29)
    Three data points is still way too small a sample to draw any kind of conclusion from.

    Voltage is absolutely something you need to give in this article, because voltage is directly related to heat output. When GTX 480 launched there was quite a bit of voltage binning (same thing with HD5870 even), where some cards were binned higher or lower. Specs remained the same, except for voltage, heat, and power consumption. Assuming absolutely no change, it's entirely possible to randomly get three cards successively binned with lower voltage.

    Now, that's not saying that TSMC isn't making advances on 40nm, I'm sure they are, and obviously any kind of decrease in defect density will be amplified by larger dies, but when you already had such variability in volts, heat, and power at launch, you need a lot more than three data points to draw a solid conclusion. Someone could have easily gotten the same data from three random cards at launch.
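hurleybird's sampling argument can be made concrete with a toy calculation. Note that the 30% low-voltage bin share below is an invented, purely illustrative number, not anything from the review or from TSMC:

```python
# Toy illustration of the sampling argument: if some fraction of launch
# cards were already binned at lower voltage, drawing three such cards
# in a row by pure chance is unlikely but far from negligible.
# The 30% bin share is an assumed, illustrative figure.
p_low_bin = 0.30
p_three_low = p_low_bin ** 3
print(f"P(three low-voltage cards by chance) = {p_three_low:.1%}")  # 2.7%
```

A few percent is small, but across the many review samples shipped at launch it would happen to someone, which is why three cards alone cannot separate a process improvement from lucky binning.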
    # RE: RE: RE: Poor review, pull it. (Olin Coles, 2010-10-03 20:40)
    So then what is the magic number of video cards that would prove it's a trend and not an anomaly? How many different manufacturers do you need samples from?
    # RE: ASUS ENGTX480/2DI/1536MD5 GeForce GTX 480 (hurleybird, 2010-10-03 21:50)
    Probably more than would be reasonable to buy ;) although three cards from launch versus three cards from today is the minimum I'd personally be comfortable drawing any sort of conclusion from (assuming they all showed the marked difference).

    Obviously, the more cards you use, the stronger the probability (again, assuming those added cards support your original data); however, as I said, you could have randomly taken three cards at launch and gotten the same results thanks to the large amount of voltage binning.
    # RE: ASUS ENGTX480/2DI/1536MD5 GeForce GTX 480 (hurleybird, 2010-10-03 21:52)
    Also, as I said, I have no doubt that TSMC has been making improvements, and a decrease in defect density has a larger positive effect on bigger dies. It's not really a matter of *if* yields (and by extension voltage binning) at TSMC are improving, it's a matter of *how much*. Another possible result of increased yields could involve changing the GTX 470 / GTX 480 mix, where more chips qualify as a GTX 480, but only at higher voltages. Thus, depending on what NVIDIA does with them, increased yields could conceivably lead to more high-voltage, hotter cards being produced.

    About the only way that I could see for easily proving that GTX 480's are getting cooler is if new chips start using lower voltage than the lowest voltage that was available at launch. Otherwise with this type of problem you're dealing with statistics. You don't "prove" anything so much as provide probability. With a high enough probability you can begin to draw conclusions.
    # GF100 always was powerful (Corpse, 2010-10-04 00:56)
    Not denying anything about how good GF100 is (though it is still less efficient). My one retort is this:

    "but it doesn't appear that AMD has any surprises for the upcoming holiday season"

    Umm, Cayman, Barts, etc. These are coming up in the next couple of months, and since we don't have benchmarks, they'll be a surprise (good or bad) regardless of performance. And if Cayman improves as it's 'rumoured' to, it will eclipse the 480.

    Also, there was a 512-core GTX 480 out in the wild, and it wasn't very impressive.

    GF100 good - yes, no doubt; but the comment about no surprises from AMD is quite ignorant.
    # RE: ASUS ENGTX480/2DI/1536MD5 GeForce GTX 480 (Trajan Long, 2010-10-10 18:52)
    480 is awesome and paves the way for great advances in the future. The next generation will solve whatever heat issues remain with a huge performance boost and Nvidia will rule on all fronts, not just tech.