Frankly, after the fiasco I had in 2003 with an ATI Radeon 9800 Pro (BIOS-modded to XT specs), I never thought I'd own an ATI video card again. After all, ATI's Catalyst drivers don't seem to be as stable as Nvidia's, most games run better on Nvidia silicon, and the proprietary PhysX effects in new games like Batman: Arkham Asylum only work on Nvidia graphics cards. But this year, after I'd owned eight different Nvidia products (16MB TNT, GeForce 3 Ti 200, GeForce 4 Ti 4400, GeForce 6800 GT, GeForce 7800 GT, GeForce 7950 GT, GeForce 8800 GTS and GeForce GTX 260), Nvidia dropped the ball.
In a move reminiscent of its 2002 unveiling of the first DirectX 9 card, the Radeon 9700, ATI beat Nvidia to market with the first DirectX 11 card, the 5800-series. Codenamed "Evergreen", its existence was first noticed at an AMD Technology Analyst Day in July 2007. Even more amazing is that Nvidia was apparently caught flat-footed and has no answer to ATI's new lineup. If recent reports are to be believed, it won't have a DX11 competitor in retail channels until March or April at the earliest.
But not everything's coming up roses for ATI. Problems with 40nm chip production at TSMC meant that demand for the new video cards far outstripped supply. And despite cards trading for $100 over MSRP at Christmas, the 5800-series was rarer than a straight guy at a Clay Aiken concert. In essence, it was a "perfect storm" of unavailability: competition from Nvidia not expected until Spring 2010, the production facility operating at reduced capacity, and everyone wanting one for the holidays. In my twenty-five years of working with computers, I've never seen people clamoring and fighting so desperately to buy one.
In fact, the state of affairs for an ATI card was so dire that I enlisted ATI's 5800-series launch partner, Dell, in hopes of securing one under my tree for Christmas. I contacted our Senior Accounts Manager with whom we annually spend $30,000 or more on computer-related purchases and inquired about the Radeons. The good news was I could order it at MSRP but the bad news was that even they were out of stock.
I placed my order on November 18th with shipment expected by November 23rd. However, on the 24th I received word that Dell's distributors, Tech Data and Ingram Micro, couldn't get stock either so the order was bumped back to December 4th with shipment to follow on the 8th. Finally, after the 8th had come and gone, I notified Dell to cancel my order as I was tired of waiting. Christmas came and went but on New Year's Eve I received an overnighted package from FedEx. I opened it to find a 5850 and a 5870.
I weighed my options and decided to keep the 5850 as I calculated the extra $100 premium of the 5870 didn't justify the minor 10-15% performance increase.
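A quick back-of-the-envelope check makes that call obvious. The dollar figures below are illustrative stand-ins (only the $100 gap comes from the text), and the performance gain uses the midpoint of the 10-15% range:

```python
# Value check on the 5870's $100 premium.
# Prices are illustrative stand-ins; only the $100 gap is from the article.
price_5850 = 300.0
price_5870 = price_5850 + 100.0

perf_gain = 0.125  # midpoint of the 10-15% range
price_increase = (price_5870 - price_5850) / price_5850

# Paying ~33% more money for ~12.5% more speed is a losing trade.
print(f"Price increase:   {price_increase:.1%}")
print(f"Performance gain: {perf_gain:.1%}")
```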
So what makes the new 5850 so much more desirable than my year-old Nvidia GTX 260?
Apart from the apples-to-oranges difference in stream processors (Nvidia's architecture is scalar, while AMD's is superscalar), the 5850 doesn't look that much faster on paper. Perhaps the most interesting item is the bus width and memory type, where Nvidia and ATI are clearly divided: Nvidia pairs slower GDDR3 memory with a wider 448-bit bus, while ATI runs faster GDDR5 on a narrower 256-bit bus.
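The wide-and-slow versus narrow-and-fast trade-off mostly washes out in the bandwidth math. Here's a rough sketch; the effective memory clocks are the approximate reference specs, not measured values:

```python
# Peak memory bandwidth = (bus width in bytes) x (effective clock rate).
# Effective clocks are approximate reference specs, not measured values.
def bandwidth_gbs(bus_bits: int, effective_mhz: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

print(f"GTX 260 (448-bit GDDR3 @ ~2000 MHz eff.): {bandwidth_gbs(448, 1998):.1f} GB/s")
print(f"HD 5850 (256-bit GDDR5 @ ~4000 MHz eff.): {bandwidth_gbs(256, 4000):.1f} GB/s")
```

The narrower GDDR5 bus actually comes out slightly ahead, which is why the spec-sheet difference matters less than it looks.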
LESS IS MORE
The GTX 260 sucks down power like the Rockefeller Center Christmas tree. At idle it consumes 160 watts, and under load that spikes to 260 watts. Unfortunately, my five-year-old APC 350 battery backup was not up to the task of powering it and my 22-inch LCD when the power unexpectedly went out during Hurricane Gustav in 2008; it took just three seconds to drain completely. In retrospect, the GTX 260 probably consumes nearly as much power by itself as my whole system did when I originally purchased the APC 350. Amazingly, ATI has developed the 5850 with Ferrari performance on a Prius appetite. It idles at 27 watts, and its full load of 151 watts is below what the GTX 260 consumes at rest. Considering how many hours a day my PC spends on, it's like swapping an incandescent bulb for a compact fluorescent.
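To put that idle-power gap in perspective, here's a rough annual savings estimate. The hours-per-day and electricity rate are assumptions I picked purely for illustration:

```python
# Annual savings from the idle-power gap (160 W vs. 27 W).
# Usage hours and electricity rate are illustrative assumptions.
gtx260_idle_w = 160
hd5850_idle_w = 27
hours_per_day = 8      # assumed usage
rate_per_kwh = 0.12    # assumed electricity price in $/kWh

saved_kwh = (gtx260_idle_w - hd5850_idle_w) * hours_per_day * 365 / 1000
print(f"Energy saved: {saved_kwh:.0f} kWh/year")
print(f"Money saved:  ${saved_kwh * rate_per_kwh:.2f}/year")
```

Under those assumptions it works out to nearly 390 kWh a year, which adds up over the life of the card.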
Although there are many brands of ATI 5850 cards, Dell was only able to get Visiontek cards. The first and last Visiontek card I owned was a GeForce 3 Ti 200 in 2001. It lasted maybe a year before it started artifacting heavily in games. I RMA'd it and received a new one promptly. About two years later, the replacement (then in my brother-in-law's PC) began turning the entire screen red during gaming. For whatever reason, Visiontek dropped off the map for a couple of years, only to emerge in 2005 selling Nvidia's chief competition: ATI video cards. And they seem to have done very well at it; in 2008 they were ranked #1 among all of ATI's North American channel partners.
Despite the fact that the 5850 is nearly as large as my outgoing GTX 260, it shipped in a very compact box with minimal packaging. There was a single CD with the Catalyst 9.11 drivers from November, and the included hardware consisted of a 6-pin PCI-E power adapter and a strange DVI adapter. In fact, I had to forgo my fancy DVI cable and revert to a standard VGA cable to use ATI's proprietary display output. Also, I was bummed that they didn't include a demo disc or a free copy of DiRT 2 or Battlestations: Pacific, as other 5850 vendors such as PowerColor and Sapphire are doing.
Visually speaking, the Visiontek 5850 doesn't sport any wild or flashy graphics and instead comes in a basic matte black sheathed cover with a striking red fan. On the underside, the card is bare and doesn't even include a backplate. I tested it on my system which consists of Windows 7 Ultimate 32-bit with an E8500 Core 2 Duo @ 3.66 GHz, GIGABYTE GA-EP45-UD3R mainboard, Antec 650W power supply, Seagate Barracuda 250GB 7200 RPM 16MB Cache SATA 3.0Gb/s Hard Drive and 4GB OCZ Fatal1ty DDR2 memory.
To give each card its best showing, I used Nvidia's latest beta drivers (195.81, released December 15th) for my GTX 260 and the December 9.12 Catalyst drivers for the 5850, and ran all benchmarks at my 22-inch LCD's native resolution of 1680x1050. After reading all the horror stories about ATI cards, I was expecting the worst when uninstalling my Nvidia drivers and loading the Catalyst drivers, but it went remarkably smoothly. Interestingly, on the Windows 7 Experience Index, the 5850 recorded a score of 7.7 (with 7.9 being the maximum). Previously, I had scored a 7.1 with the GTX 260.
BENCHMARKS
3DMARK VANTAGE: Two years ago, this was the first DirectX 10 benchmark available and it's still a very viable one, although the built-in PhysX support tends to skew the numbers in Nvidia's favor. Case in point: as soon as I loaded it on the Radeon, it locked up with a physxloader.dll error. After correcting that, the 5850 excelled by a healthy margin of 55%, pulling down a GPU score of 14,268 to the 260's 9,208. In the Texture Fill feature test, the Radeon really flexed its muscles, nearly tripling the 260's result with 1451 GTexels/s to 554 GTexels/s. And in the math-heavy Perlin Noise pixel shader test, which stresses the card's arithmetic power, the 5850 hit 122 FPS, nearly quadrupling the 260's 32 FPS.
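For anyone who wants to check my math, the 55% margin falls straight out of the two GPU scores:

```python
# Relative advantage computed from the 3DMark Vantage GPU scores quoted above.
gtx260_score = 9208
hd5850_score = 14268

margin = hd5850_score / gtx260_score - 1
print(f"5850 advantage: {margin:.0%}")  # ~55%
```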
BORDERLANDS: Not only was Borderlands my favorite game of 2009, it's also one of the few current games to ship with a built-in benchmarking tool (although, to be fair, it didn't work properly until the first patch). Unfortunately, the game still has quite a few bugs even after two updates, and this clearly affects the benchmarks. Like 3DMark Vantage, which relies heavily on the PhysX library, Borderlands also initially crashed, in this case with a physxcudart.dll issue. Perhaps as a result, the performance gap between the two cards was narrower here than anywhere else: the 260 managed a minimum framerate of 21.76, an average of 57.07, and a maximum of 130.91, while the 5850 hit slightly faster numbers of 25.03, 63.12 and 154.16. Additionally, Borderlands would periodically freeze with an error that the ATI display driver had stopped responding. Hopefully, future updates will dramatically improve the game's stability and performance on ATI cards.
DiRT 2: Having played DiRT 2 for nearly a month on my GTX 260 before the 5850 fell in my lap, I can attest that it looks quite good and performs really well in DX9, turning in a minimum of 55 FPS and a maximum of 71. But the icing on the cake is that the 5850 enables DX11's Shader Model 5, advanced lighting, and tessellated water effects with virtually no framerate penalty: the Radeon's minimum was 55 FPS and its maximum 69 in DX11. And trust me, once you've played it in DX11, there's no going back. However, I noticed that in actual gameplay (not the demo benchmark) the DX11 framerate sometimes dips into the thirties. This can cause an occasional hitch, particularly during crowded events, but fortunately isn't too distracting.
S.T.A.L.K.E.R.: CALL OF PRIPYAT: Personally, I love S.T.A.L.K.E.R.'s X-Ray engine (see my initial review from February 21, 2007), but the actual game was a disappointment. Despite that, developer GSC Game World has done a great job incorporating DX11 into version 1.6 of the X-Ray engine for Call of Pripyat. The full benchmark cycles through four tests (Day, Night, Rain and SunShafts) and can be run in DX9, DX10 or DX11. To stress-test the 260 as much as possible, I ran it head-to-head against the 5850 in DX10, then tested the 5850 on its own in DX11. The 260 hit a minimum frame rate of 29 and a maximum of 138, while the 5850 achieved 45 and 190. Impressively, the 5850's frame rates dipped only slightly, to 38 and 184, under DX11. However, this remains the only benchmark where I couldn't see a visual improvement with DX11.
UNIGINE HEAVEN: Unigine stole Futuremark's thunder by being the first to release a DirectX 11 benchmark, and Heaven is presently the best showcase for DX11's hardware tessellation: flat, two-dimensional walls and stairs magically gain depth, and a dragon suddenly sprouts spikes sharp enough to cut you. Unfortunately, there also seems to be an issue causing missing textures, as black bars randomly pop up throughout the DX11 demo. I e-mailed Unigine about it, and they explained that a newer version of Heaven with more content would be available March 1st. Fortunately, the glitch doesn't seem to hurt performance. Once again, the 5850 mopped the floor with the 260, pulling down better numbers under DX11 than the 260 could muster in DX9: the Radeon averaged 47.6 FPS for an overall score of 1198, while the GeForce trailed with 31.3 FPS and a score of 789. In DX11, the 5850 averaged 34.5 FPS for a score of 870.
CONCLUSION
Primarily because the 260 performed so well in games like DiRT 2, I was hesitant to upgrade to a 5850. My fears were unfounded: not only was the 5850 a breeze to install, but it offers roughly 50% faster frame rates, full DirectX 11 support, and draws a heckuva lot less power. Plus, as the Catalyst drivers mature, performance should only improve. I'm not sure what Nvidia has up its sleeve for its GF100 "Fermi" DX11 card, but I couldn't be happier with my 5850.