Monday, May 01, 2017

GeForce GTX 1080 Ti: A Video Card Named Desire

Last fall, I reviewed Nvidia’s $399 GTX 1070, a graphics card that I had been eagerly awaiting for two years. I settled on it because the $699 1080 cost 75% more and was only 25% faster. Despite that, neither card was able to provide the 4K (3840x2160) gaming experience I was looking for. Nvidia realized this also, and three months after milking the launch of the 1070 and 1080, slyly released the Pascal Titan X, a $1,200 sledgehammer and the only single-card solution capable of running the newest games at 4K. I fantasized about buying one and bludgeoning those 4K frame rates into submission. However, logic triumphed over passion, and I ended my article by observing that while the 1070 wasn’t quite powerful enough for 4K, it was the smart decision since I wouldn’t lose a fortune on a Titan X if Nvidia decided to release a 1080 Ti later on.
 
Well, it’s six months later and, as I’ve said before, patience is a virtue. With the threat of being dethroned by AMD’s upcoming Vega video card, Nvidia launched a preemptive strike with the long-rumored 1080 Ti. Competition does wonders for complacency, and the 1080 Ti arrives with Titan-topping performance for the price of a standard 1080. In my twenty years of buying discrete graphics cards, never have I seen such a powerful card priced so aggressively. Granted, to a casual observer $699 for a single graphics card might seem excessive when one can purchase both a complete PS4 Pro and an Xbox One S for that sum. However, it’s important to remember that the 1080 Ti is effectively a $1,200 card with a $699 price. It’s also a fiscal nightmare for those early adopters who bought a Titan at Christmas and have now seen their equity fall faster than the UK economy after Brexit.
 
After much anticipation, the 1080 Ti was scheduled to be available on Friday, March 10th at 12:00 PM CST, but I was able to sneak my order in at 11:57. It was fortuitous, as Nvidia’s stock sold out in the first five minutes. Many customers took to the message boards to complain that they had a 1080 Ti in their virtual shopping cart when it suddenly vanished and they were greeted with a disappointing “Sold Out” message. The same experience played out over and over again for other luckless shoppers hoping to find the cards on Amazon, Newegg and similar e-tailers. Who knew so many people would line up to drop $700 on a video card? Every day it seems there’s another headline forecasting the end of PC gaming, but it’s clear that there’s still a resoundingly active contingent not about to let that happen. To paraphrase Mark Twain, it would appear that the imminent demise of PC gaming has been greatly exaggerated.
 
Despite the flood of orders, I was surprised (and honestly overjoyed) when I received a shipping notification a few hours later. Given that it was a bustling Friday afternoon, I wasn’t expecting it to ship the same day and was simply satisfied that I had managed to get my order in. An e-commerce company called Digital River was fulfilling the orders and mailed it from Circle Pines, Minnesota. Although I’ve never heard of that place, it sounded suspiciously like it should border the fictitious town of Wayward Pines.
 
Unbeknownst to me, Lady Luck was again on my side as it left the Land of 10,000 Lakes late Friday. Had it not gone out then, it would have been delayed to Monday, when an unexpected blizzard blanketed the Northeast. Winter storm Stella closed thousands of schools, cancelled 8,800 flights and caused five weather-related deaths in three states and Canada. Eager buyers along the snow belt region, many of whom paid extra for expedited shipping and failed to see their cards materialize, furiously screamed the storm’s name like they were trapped in a Tennessee Williams play. Meanwhile, my 1080 Ti was due to arrive on Wednesday, but actually came a day early on Tuesday. Score one for living in the hot and sunny South!
 
However, when I received my package, it looked like Cristiano Ronaldo had used it for soccer practice over the intervening 1,300 miles. Both sides of the box were dented enough to have torn the cardboard, and its placement on my garage floor suggested it had simply been catapulted from the back of a FedEx truck. Naturally, if it’s something like an article of clothing or vitamins, I don’t stress over a package’s condition too much. However, when it’s a $700 piece of sensitive electronics equipment, it gives me pause.
 
Fortunately, the damage seemed to be relegated solely to the exterior, and the card inside was unharmed. I’ve gone through a lot of video cards, but never one that cost $700. Therefore, I was pleased (and a little amazed) when I saw how nice the presentation box looked and felt. Surely, I thought, if Tiffany’s ever sold a graphics card this would be it. Indeed, lifting it out of the box reinforced the idea that it felt as solid and expensive as a gold brick. I had my 1070 nearby and picked it up for an impromptu comparison. It appeared flimsy and cheap next to the 1080 Ti. Nvidia brags that the Ti boasts a die-cast aluminum body that is heat-treated for strength and rigidity, and I believe it.
   
Tucked alongside the card was a little black folder that housed several pamphlets. The first one proudly proclaimed “Welcome to GeForce GTX Gaming,” which would admittedly be more impressive if I owned an AMD card and had defected to Nvidia. However, given that I’m simply upgrading from a GTX 1070 to a GTX 1080 Ti, it seemed a little silly. The same could be said for the next piece of fluff, which looked to target 40-year-old virgins who still live with their parents. It was a “Special-Edition Premium Badge” that Nvidia smugly declared is “Unique Gamer Swag,” although I wonder how such a tiny sticker can be considered much of an upgrade over a regular one. Furthermore, it cautions not to “affix it to your notebook or desktop’s internals, your sibling’s hair, or your pet,” and I couldn’t decide if they were serious or trying to be funny. Lastly, “Stick Once. Stick Wisely.” was emblazoned at the bottom, and it sounded like something you’d either find in a fortune cookie or a PSA about one-night stands and STDs.
 
Also present were two small guides, one for “Support” and one for “Quick Start”. Interestingly, there was no driver disc, which is fine by me since those are always hopelessly outdated. Finally, a DVI to DisplayPort adapter cable was dubiously included. (See below.) Once powered up, “GEFORCE GTX” glowed menacingly in green and looked rather cool, even if it clashed with my red LED fans. I suppose I’ll switch to some green LED fans eventually to complete the alien motif.
 
Jealous Titan owners derisively refer to the 1080 Ti as a “cut-down” Titan, as if to imply that it’s somehow inferior. The truth of the matter is that Nvidia disabled one of the twelve 32-bit memory channels on the GP102 chip to justify the price drop. That results in a somewhat odd amount of VRAM (11 GB), a nonstandard 352-bit bus width, and 8 fewer ROPs. However, the 1080 Ti does sport higher memory frequencies than the Titan X (11,000 MT/s vs. 10,000 MT/s), which ultimately makes it a smidgen faster (484 GB/s vs. 480 GB/s). The king is dead, long live the king!
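For the curious, those bandwidth figures check out with a quick back-of-the-envelope calculation, sketched below using the bus widths and transfer rates quoted above:

```python
def bandwidth_gbs(bus_width_bits: int, transfer_rate_gts: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times transfers per second."""
    return bus_width_bits / 8 * transfer_rate_gts

# GTX 1080 Ti: 352-bit bus at 11 GT/s (11,000 MT/s)
print(bandwidth_gbs(352, 11))   # 484.0 GB/s
# Titan X (Pascal): 384-bit bus at 10 GT/s (10,000 MT/s)
print(bandwidth_gbs(384, 10))   # 480.0 GB/s
```

The narrower bus costs the Ti one 32-bit channel, but the faster memory clock more than makes up the difference.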
 
Visually speaking, it’s nearly impossible to distinguish the 1080 Ti from its lesser 1080 sibling, even though all the hardware underneath is pure Titan. The most obvious difference is that Nvidia has deleted the DVI connector, which truthfully shouldn’t affect many potential owners. Considering that even Dual-Link DVI maxes out at 2560x1600@60Hz, I suspect few customers will want to limit their pricey purchase to that mid-level resolution. Also evident is the inclusion of a new 6-pin power connector, which joins the standard 8-pin connector. This is so the power-hungry GP102 platform can suck down its full 250 watts, a solid 100 watts more than my 1070 draws. Sadly, I can remember when a 250-watt power supply was enough to power a whole computer, not just the video card. However, Nvidia claims that a new 7-phase power design with 14 dualFETs yields a 40% improvement in power efficiency. Despite that, I don’t expect the 1080 Ti to win any power conservation awards. Of course, all that extra juice makes a lot of heat, so the vapor chamber cooler has also been internally redesigned to double the airflow of previous models. And as I would later find out, it needs every iota of cooling available.

3DMARK TIME SPY
After the spanking my 1070 received in Time Spy, I was looking forward to a rematch with the 1080 Ti. Futuremark’s programs are always designed to be as future-proof as possible and as such are deliberately challenging. The 1070 scored a best of 37.7 FPS, though in certain areas it was decidedly choppy as the frame rate plummeted. With the muscular 1080 Ti, it was a whole new experience, and I even recruited my wife and daughter to witness the spectacle as it unfolded. It ran as beautifully as it looked, pulling down 60.26 FPS in the graphics test and achieving the Holy Grail of benchmarkers everywhere. I say this because 60 FPS is generally regarded as the optimum target for smooth game play, and to achieve that is impressive in any benchmark, much less one as demanding as Time Spy with a native resolution of 2560x1440.
  
FINAL FANTASY XIV: HEAVENSWARD
It seems Heavensward was baffled by my 1080 Ti, in much the same way I am by Trump’s bizarre tweets. It incorrectly reported my VRAM as 3072 MB, roughly a quarter of the total amount, and then spit out a composite score of 19,893, a figure just 15% higher than what my 1070 recorded. I knew that number wasn’t right and could only assume that the Ti was being held back by the relatively low 1920x1080 resolution. Much like a Top-Fuel dragster can’t completely put down its power in a brief eighth-mile sprint, I felt sure the Ti needed more room to stretch its legs. I bumped the resolution to 2560x1440 and was rewarded with a new tally of 16,649, which is within three percent of my 1070’s best effort (17,086) at 1920x1080. Of course, the solitary score doesn’t tell the whole story, and Heavensward simply looked better and ran smoother no matter what the synthetic number said.
      
UNIGINE VALLEY
At four years old, this is the patriarch of the group and, despite that advanced age, it still proves to be a viable exercise. Whereas the 1070 was only 50-percent faster (48.4 FPS) than the 970 (31.6 FPS), the 1080 Ti took no prisoners, nearly doubling the 1070’s score at 89 FPS. That victory, however, didn’t come free. Usually silent, the 1080 Ti audibly spooled up when the test began, and I watched nervously as the GPU temperature spiked like a life-threatening fever. While the 1070 peaked at 150 degrees, the 1080 Ti rocketed skyward until it plateaued at a scorching 185 degrees, just 10 degrees shy of its maximum thermal limit. Finally, Valley also failed to correctly recognize the Ti’s VRAM, though at 5505 MB, it was admittedly a closer estimate than Heavensward managed.
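For readers more accustomed to Celsius, the Fahrenheit readings above convert as sketched below; notably, the 195°F ceiling works out to roughly 91°C, which matches the thermal limit Nvidia publishes for the card:

```python
def f_to_c(temp_f: float) -> float:
    """Convert a temperature from Fahrenheit to Celsius."""
    return (temp_f - 32) * 5 / 9

readings = [("GTX 1070 peak", 150), ("GTX 1080 Ti plateau", 185), ("Thermal limit", 195)]
for label, temp_f in readings:
    print(f"{label}: {temp_f}F = {f_to_c(temp_f):.0f}C")
```

So the Ti was cruising around 85°C under Valley, uncomfortably close to its throttle point.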
     
SNIPER ELITE 4
I suppose you could say that I bought a $700 graphics card for a $40 game, because I primarily purchased the 1080 Ti for the sole purpose of being able to run SE4 at 4K. Rebellion did a fantastic job developing this game, and it improves upon the successes of Sniper Elite III and Sniper Elite V2, which were both outstanding in their own right. And after replaying the first level with all the eye candy maxed at 4K, I have to admit that it was unequivocally money well spent! Incredibly, the 1080 Ti renders SE4 at 3840x2160 better than my 1070 did at 2560x1440. Everything is smoother, the visual effects are crystal clear, and the game borders on photo-realism. No longer is there an uncanny valley, where we’re forced to wince at poorly rendered animations and facial expressions. But perhaps most amazing is that SE4 is not only the first game to enable DX12 with a performance advantage over DX11, it’s also the only one to currently offer the option of offloading some of the work to Async Compute. I’ve mentioned before that Async Compute is Nvidia’s Achilles heel, as its software scheduling is no match for AMD’s hardware-supported silicon. However, the 1080 Ti succeeds here primarily due to its massive raw processing power. Although benchmarks with a Ti were not available, a regular 1080 was tested and returned 117 FPS with DX11, 123 FPS with DX12 and 126 FPS with DX12+Async Compute. Granted, the big news here is not the small gains under DX12+ASC, but rather the lack of a penalty for using them, a penalty that had historically been the rule until now.
 
SHADOW WARRIOR 2
Like SE4, this one doesn’t have a benchmark either, but I wanted to include it since I encountered an unusual anomaly with it. Somehow, I missed the original version of this game when it was first released in 1997, but I had a blast playing the 2013 reboot. It’s an unapologetic throwback with a lot of crude juvenile humor, so naturally I loved it. Last fall, the sequel arrived and I ripped through it on my 1070 at 2560x1440. As fate would have it, the new 14-mission Bounty Hunt DLC landed at the same time as my 1080 Ti, so I wasted no time trying it out. With it running so well on my 1070, I envisioned my 1080 Ti carving up those 4K frame rates as easily as my katana sliced through a Yakuza soldier. But despite being nearly twice as powerful, the 1080 Ti visibly struggled, and during intense firefights, the FPS plunged into the 30s. Dismayed, I reluctantly dropped the effects from “Ultra” to “High,” but the problem persisted. Finally, I relented and accepted that it only ran smoothly at 2560x1440, acknowledging that it was a complete waste of money when it performed no better than a 1070. So, in a popular gaming forum, I casually mentioned how terribly SW2 was running on my 1080 Ti, when one member suggested that I make sure VSYNC and Triple Buffering were disabled, and instead enable Adaptive VSYNC in the Nvidia control panel. Truthfully, I was a little skeptical of this recommendation since I’ve never had to do it before in 20 years of gaming, but with nothing to lose (and everything to gain) I decided to try it. Sure enough, that fixed it and my 1080 Ti started performing like a 1080 Ti should, with me gleefully hacking and slashing enemies at 4K Ultra settings.
       
WHAT’S NEXT?
I’m not naïve enough to believe that I’ll have the fastest video card in the world for long. In fact, the same rumor mill that accurately predicted the 1080 Ti is already churning away with news of possible replacements. As mentioned above, the performance of AMD’s impending Vega video card could dramatically stretch or shrink Nvidia’s timetable for such a successor. If Vega raises the bar, we could see an immediate retaliatory effort based around two new models. The first (and most probable) is a full-core GP102 likely to be crowned as the new Titan. The original Titan in 2013 started out as a graphics processing unit for deep learning at the Oak Ridge National Laboratory. But when Nvidia realized that enthusiasts would pay top dollar for it, Titan was repositioned as the flagship of their consumer graphics card line. Following that, Nvidia deviously doled out upgraded versions almost annually to keep the revenue stream flowing and their coveted title of world’s fastest intact. The original Titan was succeeded by the Titan Black and Titan Z in 2014, the Titan X in 2015 and yet another Titan X in 2016, each based on that family’s current architecture. Presently, the $5,000 enterprise-level Quadro P6000 is the only graphics card utilizing the complete GP102 chip with 30 SMs and 3,840 CUDA cores. For that reason, it would be easy for Nvidia to repurpose it as the next Titan at a cheaper price.
 
A little further out on the roadmap is Volta, Nvidia’s next-generation graphics architecture. References to it have already been spied in the latest round of GeForce drivers under the codename GV100, and Nvidia recently placed a rather large order with TSMC for GV100 chips. Aside from that, little else is known; even the node size is subject to debate. Originally, Volta was to be a 10nm process, but the latest scuttlebutt is that yield problems have forced them to use a hybrid 16nm FinFET that is being billed as 12nm. What has been confirmed is that Nvidia will use HBM2 memory in Volta, after being criticized for using cheaper GDDR5X in Pascal while AMD enjoyed success with HBM1 in Fury.
 
CONCLUSION
I spend less than 10% of my time gaming, so the idea of using a $700 graphics card for emails and web-surfing is a lot like using a $3 million Patriot missile to shoot down a $200 drone. It’s overkill really, but display innovation keeps driving GPU development. Dell just debuted their new $5,000 8K (7680x4320) monitor, and AMD engineers have admitted that their ultimate goal is 16K (15360x8640) for graphics so real they are indistinguishable from reality. Given that, it looks like there are still a couple of graphics card upgrades in my future.
