The life of a PC gaming enthusiast is a constant battle between saving some moolah and buying the latest and greatest graphics card. That card, from deep within its dark transistor-filled dungeon, always holds the promise of squeezing greater performance out of the system. Nvidia and AMD (which acquired ATI in 2006) are the two super giants of the GPU world, competing for the coveted space in your system's casing. Both companies offer extremely powerful cards capable of handling cutting-edge innovations and data processing, to provide us with jaw-dropping visual candy. And both have been at each other's throats for quite some time. Luckily, the years of friction have given us, the consumers, some drool-worthy products.
Nvidia started its journey in 1993, but truly picked up steam after introducing the famous “GeForce” series. Since then, it has capitalized on its success and gobbled up many smaller companies. Securing contracts to develop the graphics hardware for Microsoft’s Xbox console, and later Sony’s PlayStation 3, made it the leading independent GPU manufacturer. Now, though Nvidia is much more than a GPU producer, its core remains the same. It has also stepped into the smartphone and tablet market, providing processing power to a number of gadgets.
What about AMD?
AMD’s latest series, marketed as AMD Fusion, promises a one-die solution for both GPU and CPU. The Fusion lineup is the fruit of the merger between AMD and ATI, which AMD bought out in a huge 5.6 billion US dollar deal. ATI was never a GPU manufacturer per se, as it always headed the research and development side of the business while third-party partners mass-produced the GPUs, so AMD’s takeover didn’t cause a major overhaul of the company. ATI’s ventures into gaming consoles resulted in powering Nintendo’s GameCube and Wii, and most importantly Microsoft’s Xbox 360.
This is the section where I have to constantly look over my shoulder, because PC fanboys will tear this issue (or me) apart if they find this article leaning toward either side (deep breath!). Comparing AMD and Nvidia is hard, since the two take completely different approaches. AMD’s flagship is the Radeon lineup, a successor to its Rage series; Nvidia’s jewel in the crown is the GeForce series. Though both companies offer mobility solutions as well, we will only talk about their discrete graphics cards. These cards vary in memory, clock speed, visual technologies, pixel pipelines, form factor (heat sinks), price and architecture.
Nvidia offers a cocktail of innovations in its cards: SLI (Scalable Link Interface), which allows parallel processing by coupling two or more cards together; PhysX support; PureVideo; GPUDirect; 3D Vision Surround; and CUDA (Compute Unified Device Architecture, Nvidia’s parallel computing architecture) have all tilted the scales at one time or another. AMD, on the other hand, has its share of novelties, such as Vision Engine, CrossFireX (its answer to Nvidia’s SLI), Dual Graphics, Eyefinity and HD3D technology. All of these remarkable advancements require in-depth analysis and stress tests; as a user, you will find that some of these technologies cater to your needs more than others.
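Since CUDA crops up again and again in this comparison, a quick taste of what it actually looks like may help. Below is a minimal sketch of the customary first CUDA program, a vector-addition kernel in which each GPU thread handles one array element; the names and sizes are illustrative, not drawn from any card or benchmark in this article.

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each GPU thread computes one element of c = a + b.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main(void) {
    const int n = 1 << 20;                 // one million elements
    const size_t bytes = n * sizeof(float);

    // Allocate and fill host (CPU) buffers.
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Allocate device (GPU) buffers and copy the inputs over.
    float *da, *db, *dc;
    cudaMalloc((void **)&da, bytes);
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    // Copy the result back and spot-check it.
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f (expected 3.0)\n", hc[0]);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}

The same pattern of allocating memory on the device, copying data across the bus and launching a grid of threads underpins everything from GPU-accelerated PhysX effects to the GPGPU work discussed at the end of this article.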
The current lineup
Although it is not possible to declare either Nvidia or AMD the overall winner, I have dared to compare a few of their cards with respect to their pricing and ‘weight’.
ATI Radeon HD 5670
The Radeon HD 5670 is a clear winner in the budget category, as it offers remarkable performance for a card in this price range. Add to that high-definition audio bitstreaming, multi-display support and, most importantly, DirectX 11 support, and you have a fizzy little worker on your hands. This baby will cost you approximately $85.
Radeon HD 5770 vs GeForce GTS 450
While the Radeon HD 5770 (approximately $170) can drive up to three different displays from a single card, it falls short of the in-game graphical performance that the cheaper GTS 450 (approximately $120) can deliver. This is surprising, considering the 5770 has a faster core clock. Both support multichannel HDMI audio output, so choosing between them is a trade-off between performance and display support.
Radeon HD 6870 vs GeForce GTX 560
Again, this is a close call, as the two are neck-and-neck in terms of performance. Though the 560 is built on Nvidia’s Fermi architecture and offers PhysX support, that feature only matters in games that support it. On the other hand, the 6870 ($290) outperforms the 560 ($250) in some games, but ever so slightly. So again, personal preference is the tie-breaker.
GeForce GTX 580
Capable of maxing out nearly every game out there, the GeForce GTX 580 is a mammoth card of the Fermi series, with a price tag of around $560. Supporting CUDA, running quieter and offering better anti-aliasing and PhysX support, this powerhouse has no match in its price range.
Radeon HD 6990
Though the 580 is brilliant, the Radeon HD 6990 is in a league of its own, both in terms of pricing (a whopping $1,000!) and performance. Power hungry though it is, the card can churn out insane amounts of detail and drool-inducing eye candy. Offering unmatched DirectX 11 graphical performance and enough features to put any other card out there to shame, this is one card that even Chuck Norris would be proud of.
What does the future hold?
Exciting times lie ahead of us, as both Nvidia and AMD are investing heavily in GPGPU (general-purpose computing on graphics processing units), as well as in mobile and discrete GPUs. Nvidia’s next-generation CUDA-based GPUs, codenamed Kepler, are set to launch in 2012 and are speculated to offer better power efficiency than Fermi while improving on graphical output.
The HD 6000 series is nearing its end, and AMD has promised that the 7000 series will launch soon. It will feature AMD’s Next Generation Cores (NGC), which aim to provide better in-game performance and better GPGPU rendering. AMD has also streamlined its Fusion APUs (Accelerated Processing Units), and aims to bring the fight to Intel.
With no signs of slowing down from either camp, there is a good chance we will witness another round of give-and-take from both sides. As display resolutions keep increasing, the power to drive them will have to keep up, and an increased focus on research and development into new ways of rendering data ensures that this awesome race delivers better performance than ever.