(A die-size comparison of Intel’s current generation of processors (Penryn) and the latest graphics processor from nVidia. Although size isn’t everything, the sheer computational might of nVidia’s GTX 200 series graphics processors is nothing short of impressive. Picture courtesy of AnandTech.)
There’s a really interesting transition in computing technology that’s starting to emerge from the fog of idle speculation into the light of real marketplace implementation. Or at least the pundits can finally point to some concrete design changes that lend the idea credibility.
You see, there are really two main camps in processor technology: the CPU manufacturers (like AMD, Intel, and VIA) and the GPU manufacturers (like ATI and nVidia). And for at least the past decade and a half, these two groups have been largely separate in their market segments. There were, of course, always integrated graphics options for low-power or low-budget systems. But by and large, GPU makers and CPU makers were complementary, not competitive.
That’s beginning to change, and it is mostly due to the road map of the microprocessor juggernaut Intel. For while CPUs have been Intel’s bread and butter for years, the sheer processing demands that modern operating systems (mostly Vista) place on graphics-related functions have increased the need for parallel processing power across the board of modern computers, not just for gamers.
It used to be that a “graphics card” or GPU (graphics processing unit) was an expensive, or at least superfluous, piece of additional hardware that helped the CPU (central processing unit) crank through complex calculations that required parallel processing. CPUs are great for general-purpose, sequential tasks, but GPUs excel at parallel operations, like rendering 3D graphics. That almost always translated directly into games, for until recently the only application (or justification) for a graphics card was playing games (unless you were a CAD designer).
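To make that CPU/GPU distinction concrete, here’s a minimal sketch in CUDA, nVidia’s programming model for its GPUs (the names and sizes below are purely illustrative, not drawn from any vendor documentation). Where a CPU would march through an array one element at a time, the GPU launches thousands of lightweight threads that each handle a single element simultaneously:

#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread computes exactly one element of the result, in parallel.
__global__ void add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                    // a million elements (illustrative)
    float *a, *b, *c;
    cudaMallocManaged(&a, n * sizeof(float)); // unified memory, for brevity
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&c, n * sizeof(float));
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // A CPU would do this serially: for (i = 0; i < n; ++i) c[i] = a[i] + b[i];
    // The GPU instead spreads the loop across thousands of threads at once.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    add<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();                  // wait for the GPU before reading

    printf("c[0] = %f\n", c[0]);              // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}

That one-thread-per-element pattern is exactly why GPUs shine at 3D rendering, where every pixel and vertex can be computed independently.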
Then came OS X. While OS X itself didn’t raise the graphics hardware requirements much (thanks to efficient OpenGL code), it set a very high visual standard for Microsoft to follow with its next OS, Vista. And as with much of Microsoft’s code, Vista was bloated, unnecessarily complex, and computationally demanding.
But it was music to GPU manufacturers’ ears. Now it wasn’t just the gamers who would need better graphics processors; it was anyone running Vista. Suddenly, a whole new market for graphics had opened, driven by the OS that would ship on virtually every new computer. Now, I don’t want to blow this out of proportion, for Vista has hardly created an explosion of discrete graphics sales. But what Vista has done is set a new baseline for acceptable graphics performance. No longer will any old integrated graphics solution be adequate, even for those who only use their computer for IM, email, and solitaire.
It should come as no surprise, though, that after decades of perfecting its ability to make general-purpose CPUs, Intel would want to extend its expertise into making graphics processors, which are, to gloss over a number of important differences, essentially parallel-oriented CPUs. Intel isn’t walking into an empty market, however, as nVidia and ATI have been in the business for quite a few years.
Intel’s maiden voyage into the waters of graphics processing will lead it into an all-out naval war between the industry leader nVidia (i.e. the British Royal Navy at its peak) and the underdog challenger ATI (i.e. the underpowered but determined American Navy in its early years). Intel has little experience in designing GPUs (limited to its largely impotent integrated graphics solutions, which are the laughingstock of gamers everywhere), but it brings to the table an army of fabs (fabrication plants, where silicon wafers are transformed into microprocessors). Think of Intel as the newcomer that starts with some pretty pathetic integrated GPU solutions (i.e. the German Navy of the 1800s) but within a very short period of time transforms into Larrabee (i.e. the German U-boat wolfpacks), a GPU to challenge even the mighty nVidia.
But will Intel sink or float in its GPU venture? Luckily for Intel (and unluckily for nVidia and ATI), Intel can afford to sink in its first few attempts at a dedicated graphics processor. Larrabee doesn’t need to demolish, or even keep pace with, nVidia or ATI. All Intel needs Larrabee to do is demonstrate that Intel can build a graphics processor with the potential to compete with nVidia and ATI. Why? Because Intel holds a massive share of the microprocessor market and has recently been besting its only real competition there, AMD. Intel is at the top of its game, with so much momentum that it can afford to divert some of its engineers and capital into a graphics processor project (with potentially big returns).
Time will tell, but this transition will certainly shuffle the traditional rivalries in the CPU/GPU markets (especially since AMD owns ATI and thus has a foothold in both).