|Year In Review: 2008 Computer Hardware Industry Failure|
|News - Featured Website News|
|Written by Olin Coles|
|Wednesday, 31 December 2008|
2008: The Reason PC Lost
Several key factors have combined to create a 'perfect storm' for computer hardware, but if I had to put it into a single sentence, it would be this: PC hardware lost because of PC software.
When Windows 3.0 launched a lifetime ago, it was heralded as a major breakthrough in computing for its 16-bit graphical interface and its reduced reliance on DOS. The world was in awe when Microsoft later released the 32-bit Windows 95, which marked the beginning of a very long and exciting era for computing. But what really changed between Windows 98, 98 Second Edition, Millennium Edition, 2000, XP, and Vista?
The truthful answer is: not much. Each release added small usability improvements and tightened security, but the persistence of the 32-bit architecture meant that 64-bit desktop processors ran ahead of the consumer software curve by more than four years. This is where my tale of woe begins: 2008 was a make-or-break year for many, and instead of improving their technology or offering real innovation, software developers kept their eyes on mainstream money and ignored 64-bit computing.
This doesn't affect very many enthusiasts, at least not yet, but the triple-channel platforms Intel recently introduced have made 6 GB or more of system memory the most common configuration for new systems. There's that writing again, right there on the wall: you can keep your old operating system and software, but if you try using them on new hardware, there will only be heartache.
But this is still only a narrow view of how component PC hardware, namely motherboards, system memory, and processors, has progressed so far beyond the software that its enhancements have become meaningless. There's still more damage to be done, and this time it's video games that are killing off the discrete graphics market.
NVIDIA and ATI have been fighting the good fight for as long as most can remember. But when was the last time you really needed a new video card to play your favorite game? For me, it was back when Battlefield 2 came out. My GeForce4 MX played older games just fine, but that new title demanded something like a Radeon X800 GT. To this day, that three-year-old technology can still push most of the newest games at high settings. Which leads to the real problem: software is not driving the need for better hardware.
If there's no reason to upgrade, there's no reason to buy. If nobody is buying, then the manufacturers have no reason to sell. Combine the lot, and you've got our present-day economic disaster. But wait, it gets better: while weak software development has stymied hardware sales, the biggest problems are just ahead... in the next section.