In my last BunsenBlog entry, I briefly mentioned the AMD 780G chipset as being good enough for an HTPC (Home Theater PC) and capable of playing some modern computer games with decent quality. I originally found out about the drastic improvements in IGP (Integrated Graphics Processor) performance from an article at AnandTech.com. The article mainly focuses on power consumption (a good transition from my previous post on green computing), but it also does a good job of comparing and contrasting the top IGP chipsets out there.
Why is this at all important? I mean, serious computer users never use integrated graphics, right? Well, until recently that was indeed true. Integrated graphics were almost never able to handle the latest computer games, and even began to impact basic operating system functionality in Vista due to the Aero interface’s graphics requirements. To provide smooth OS operation, a discrete graphics solution was often necessary. (Laptops have IGP solutions as well, but for this post, I’m going to focus on desktop graphics).
Then came the AMD 780G and nVidia GeForce 8200 chipsets. Both chipsets are for AMD processors only, leaving anyone with an Intel processor without a comparable option. The AMD and nVidia solutions were roughly similar in terms of performance and power consumption. What really got a lot of attention (at least from computer geeks) was the incredible performance gap between the AMD/nVidia chipsets and Intel’s G35 IGP solution.
The AnandTech article (IGP Power Consumption – 780G, GF8200, and G35) ran a couple of tests with Blu-Ray playback that really benefited from the hardware decoding ability of the AMD and nVidia chipsets. “Hardware decoding” means that the graphics hardware of the chipset has the ability to process certain video codecs (like VC1 and H.264, the standards for Blu-Ray and HD video) all on its own. This means the heavy lifting of decoding 1080p video can be done by the graphics hardware and not the CPU.
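On Linux, one way to see this division of labor for yourself is to hand the decode work to the GPU through VA-API. This is just a minimal sketch, assuming an ffmpeg build with VA-API support and a working driver for the chipset; the filename movie.m2ts is a placeholder for whatever HD clip you have on hand:

```shell
# List the hardware acceleration methods this ffmpeg build supports
# (look for "vaapi" in the output).
ffmpeg -hide_banner -hwaccels

# Decode an H.264 stream on the GPU instead of the CPU;
# "-f null -" discards the output, so only decoding work is exercised.
ffmpeg -hwaccel vaapi -i movie.m2ts -f null -
```

Watching CPU usage (with top, say) while the second command runs makes the difference obvious: with hardware decoding the CPU stays mostly idle, while a pure software decode can peg the processor.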
Why does it matter which part of the computer does the decoding? First of all, a GPU (Graphics Processor Unit) is designed with very specific types of operations in mind. The CPU, by comparison, is the jack-of-all-trades, and can do just about any operation you ask of it. The trade-off is in how well a CPU can do a certain task. The GPU can do a few things (in this case, VC1 and H.264 playback) really well. The CPU can do a lot of general processing well, but isn’t very good at graphics (as one might infer).
The result is that for the AMD 780G and nVidia GeForce 8200 chipsets, HD video playback is a no-brainer. They do it virtually flawlessly, leaving the CPU to consume much less power (and never spike at 100%, which can ruin the viewing experience of HD video). The Intel G35 chipset, on the other hand (and the G45 isn’t much better, by the way), relies completely on the processor to do the heavy lifting. HD video playback is barely watchable, with regular hiccups.
So for those in the HTPC crowd (at which the AnandTech article was originally aimed), the AMD and nVidia chipsets stand out as the obvious winners. And that’s before we even get to gaming. While the nVidia and AMD chipsets provided a very playable experience in Unreal Tournament 3 (a recent first-person shooter), the Intel chipset provided, shall we say, a slide-show experience.
The other reason these recent IGP solutions interest me is their use in the mainstream budget box (budget systems far outnumber HTPCs). A lot of budget systems are sold with integrated graphics only, mostly to save costs. But for anyone using Vista, or wanting to play games (an ever-increasing percentage of the population), the graphics solution will make or break the user’s experience. So while I use my AMD 780G chipset to save on energy costs and avoid purchasing another discrete graphics card, the mainstream implications of these new, capable graphics solutions are certainly promising.