EVGA GTX 460 SC Superclocked Video Card
Reviews - Featured Reviews: Video Cards
Written by Bruce Normann
Saturday, 04 September 2010
NVIDIA GTX 460 Final Thoughts
I wrote earlier this year that the first Fermi cards from NVIDIA were not really "competitors" for ATI, because they occupied different price and market segments than the existing series of Radeon HD 5xxx video cards. Well, all that's changed now with the introduction of the GF104 GPU. With 1.95 billion transistors and an estimated die size of 366 mm², it's in the same league as the ATI Cypress chip, introduced last September on the Radeon HD 5870. On second thought, maybe NVIDIA is in the National League and ATI is in the American League. They both play the same game, but by different rules, and once a year everyone gets together and pretends that they are all the same. Then it's football season, thank goodness.
If I allow myself to anthropomorphize these products, I thought it was a bit cruel for the GF104 to go gunning for the HD 5830, the crippled sister of the Radeon family. As fate would have it, she held on to the $200-$240 market with only a hope and a prayer by her side. There was no better point for NVIDIA to attack, with a product more clearly focused on gaming graphics, than this thinly populated market segment. Resistance was futile; there was no way the GTX 460 was going to lose this battle. That's because the GTX 460 is a wolf in sheep's clothing. To put it more plainly, and give away my conclusion to those who are reading this entire page, the GTX 460 is a 5850-class video card with a $230 price tag.
From a technology standpoint, the GTX 460 has a whole lot more in common with the Radeon HD 5850 than it does with the HD 5830. Let's compare. The HD 5850 disables one out of every ten (10%) possible stream processing units, while the HD 5830 disables three out of every ten (30%). The GTX 460 ships with one out of eight possible Streaming Multiprocessor blocks (12.5%) disabled. Match 'em up… looks like a 5850 to me. Now let's look at clock rates: the top clock rate that ATI specs out for the Cypress line is 850 MHz, and the HD 5850 ships with a 725 MHz stock clock. It's too early to guess what the highest clock will be on the GF104 chip, but Galaxy and Palit are already shipping cards with factory core clocks over 800 MHz, and almost every reviewer who bothered to overclock their GTX 460 sample got it easily up to the 850 MHz range. The base clock for the GTX 460 is 675 MHz. Once again, the similarity to the HD 5850 is pretty plain: chop off one (presumably dead) processing cluster and downclock the core significantly, so it doesn't compete with the top model (or the lame-duck GTX 465, in this case...).
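The unit-count comparison above boils down to simple arithmetic, which can be sketched out as follows. This is illustrative only; the enabled/total cluster counts (20 SIMD engines on a full Cypress die, 8 SMs on a full GF104 die) are the public specs behind the percentages quoted in the text, not figures unique to this review.

```python
def disabled_fraction(enabled, total):
    """Fraction of on-die processing clusters fused off for a given SKU."""
    return (total - enabled) / total

# (enabled clusters, total clusters on the die)
cards = {
    "HD 5850": (18, 20),  # one of ten disabled  -> 10%
    "HD 5830": (14, 20),  # three of ten disabled -> 30%
    "GTX 460": (7, 8),    # one of eight SMs disabled -> 12.5%
}

for name, (enabled, total) in cards.items():
    pct = disabled_fraction(enabled, total) * 100
    print(f"{name}: {pct:.1f}% of units disabled")
```

By this measure the GTX 460's 12.5% cut sits right next to the HD 5850's 10%, a long way from the HD 5830's 30%, which is the whole point of the comparison.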
Forgive me for dabbling in a bit of fairy-tale economics, but I can't help myself. First of all, I'm going to make a bold assumption that an HD 5830 chip costs exactly the same amount of money to produce as an HD 5870 or HD 5850. Same amount of silicon, same pinout, same package, same testing costs - all the production costs are equal. Next, I'll extend the same bold assumption and conclude that every GF104 chip costs almost exactly the same as the Cypress chips I just mentioned. Same number of transistors, same technology node, same supplier, same production lines, same die area, etc. The only difference is the R&D and SG&A costs that have to be amortized in to arrive at a fully burdened cost. (I wish I could add a survey button here: agree or disagree.) The pricing model, on the other hand, has you paying for performance, which seems realistic and fair for the consumer. That's where NVIDIA chose their battleground.
I've come to one inescapable conclusion: the GTX 460 is really comparable to an HD 5850 from a technology standpoint, and NVIDIA chose to sell it at a price point currently occupied by a lesser model, the HD 5830. Sounds like a good marketing plan to me, especially since I believe that every Cypress-based card and every GF104-based card share the same cost structure. Sure, you can add or subtract features, but the fundamental production costs are comparable, even if the performance is not. ATI has had a monopoly on DX11 hardware for what seems like ages, so you can't blame NVIDIA for throwing a spanner in the works and trying to disrupt the market. Finally, I can say, "Fermi = Competition". BTW, just like you, I can't wait to read the next chapter in this continuing battle saga.