Monday, February 01, 2021

Amped Up!

Testing Ampere, Nvidia's Newest Graphics Gamebuster

  Pirate Press         February 2021


2020 was admittedly the strangest year in modern history, fraught with such bizarre shortages as vaccines, food, toilet paper, and, um, video cards.

Wait. What?

Yep, as perplexing as it may sound, it was nearly impossible (unless you were friends with Bill Gates or Jeff Bezos) to purchase any of the new video cards from Nvidia or AMD that were released last fall. 

Even more dumbfounding is that with over 40 million Americans losing their jobs and unemployment at levels not seen since the Great Depression, economists would tell you that a discretionary purchase like a $700 video card should be not only callous, but unthinkable.

Yet leave it to a crazy pandemic to turn the world upside down where nothing makes any sense.

Back in October, I promised that the new Nvidia 3080 video card I needed to complete my recent PC upgrade was in the pipeline and would be arriving soon. I even boasted that with 30 teraflops of processing power, it would be a whopping three times faster than either the new PlayStation 5 or Xbox Series X. It actually achieves 31.33, but who's counting? 😎
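For anyone wondering where that oddly-precise 31.33 comes from, it's just the standard single-precision math: CUDA cores, times two operations per clock, times the boost clock. Here's a quick back-of-the-envelope sketch in Python; the 8,704-core count and the ~1.8 GHz clock are my assumed figures for the 3080, not something quoted from Nvidia:

```python
# Back-of-the-envelope FP32 throughput for the RTX 3080.
cuda_cores  = 8704    # assumed shader count for the 3080
ops_per_clk = 2       # a fused multiply-add counts as 2 FLOPs
boost_hz    = 1.8e9   # assumed ~1.8 GHz boost clock

tflops = cuda_cores * ops_per_clk * boost_hz / 1e12
print(f"{tflops:.2f} TFLOPS")   # -> 31.33 TFLOPS
```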

However, my assumptions about availability were based on forty years of buying PC peripherals, and that clearly meant jack shit in a year where it was easier to buy marijuana than toilet paper.

So, when 8 a.m. on September 17th rolled around, I naively believed that very few people would be lined up to drop $700 on something as silly and unnecessary as a new graphics card. 

My wife thinks I don't admit this often enough, so this one's for her: WOW, was I wrong!

The selection of 3080 inventory instantly went from available to sold-out before I could even click anything. Initially, I thought it was just poor luck on my end, until furious reports started trickling in from all over the world of kindred experiences where the cards were gone before they even appeared. In case you weren't aware, the collective "Angry Internet" is a very real thing.

There was speculation that Nvidia didn't have enough cards to meet demand, and that automated online robots (or "bots" for short) were scooping up the cards by the dozens. Shortly after the 3080 went on sale, one user shared a screenshot bragging on social media that his bot was able to snag forty-two of them from Nvidia's website. From that moment forward, it was clear that video cards had joined the shadowy domain of online scalping which was previously reserved for the likes of popular concert tickets and limited-edition athletic shoes. It also illustrated that it was big business, as your average computer enthusiast doesn't have $30,000 lying around to blow on video cards to resell.

I knew my odds of getting hit by a meteorite were better than me finding a 3080 for sale, so I enlisted the help of one of my employer's hardware vendors. I was told that they would receive their first shipment on October 2nd, so I gladly provided my credit card for an ASUS TUF 3080 that was priced at $785. It was nearly $100 more than the 3080's "suggested retail price" of $699, but given the lack of availability and the desire to have it in my sweaty little palms, I begrudgingly acquiesced.

But when there was no shipping notification on October 2nd, I inquired about the 3080 and discovered that the ETA had slipped almost two more months to November 30th! Even worse, our vendor had looped in the ASUS rep on the email and he was no help, simply stating the obvious: that there's always a video card shortage after every new launch. However, he conveniently neglected to acknowledge that this was not just a typical release deficiency, but rather a global shortage.

So, with no other viable options, I reluctantly waited until December. Only the supply crunch didn't let up, generating even more pent-up demand. The shortage was so bad that it spawned countless articles and launched a thousand memes. Meanwhile, my shipping date kept getting pushed back even further, and on December 14th, I was notified that the latest ETA was now December 30th. Other gamers desperate to get a 3080 under their Christmas tree had resorted to paying online scalpers over $1000, but my ethics prevented me from buying such a criminally-overpriced item, and I ruefully resigned myself to the fact that I would not be getting a 3080 in 2020.

MIRACLE ON 3080 STREET

But on December 22nd, I experienced the closest thing I've ever had to a Christmas miracle— it was an email from graphics card maker EVGA stating that a 3080 FTW3 Ultra was being held for me for 8 hours if I wanted to buy it. As if that wasn't unusual enough, I couldn't even remember the last time I bought anything from them. I was fairly certain that it was a graphics card, but when I recalled the ones I'd acquired, none of them were from EVGA. My most recent video card, a 1080 Ti, was procured in 2017 directly from Nvidia. Likewise, my 1070 was from Gigabyte and my 970 came from MSI. Going even further back to 2013, I had a PNY 670 which succeeded a PNY 570 in 2011. Finally, I traced an MSI 260 all the way back to 2009, but there was still no record of any EVGA orders.  

Needless to say, I was completely nonplussed, both elated and perplexed at how I received an email for such a coveted item just three days before Christmas. My only other remotely similar stroke of good fortune came last year when my Landry's dining card was improperly credited for a $226 meal at a Morton's in Boca Raton. I haven't been to South Florida in 30 years, but I wasn't about to let that $25 dining credit go to waste. I immediately applied it to dinner at Saltgrass Steakhouse before the accounting error was discovered and rescinded.


Much like the Morton's mix-up, I figured my 3080 order would probably be cancelled, but on the off chance it wasn't, I went ahead and paid for it. Funny enough, I also had a $25 "EVGA Bucks" coupon obviously left over from whatever I bought eons ago. But given that the 3080 was still a whopping $863 after shipping and tax, it gave me serious reservations about the Rubicon I had crossed. For almost twenty years (1997-2016), I'd been able to keep my GPU upgrades to around $300 every couple of years. But with the purchase of a 1080 Ti ($699) in 2017 and now a 3080, I'm afraid I've passed a point of no return. Or as my friend observed, I could have bought a brand-new Xbox Series X and a PlayStation 5 for almost the same amount of money.


RAY GUN 

So, what is compelling otherwise reasonable and rational people to pay over $1000 for a 3080? For me, the appeal was not only faster frame rates in games, but the application of a new technology called "Real Time Ray Tracing" which Nvidia abbreviates as RTX, an obvious notch above their previous top-tier GTX acronym.

It's true that my 1080 Ti was capable of some very low-level ray tracing, but the technology was formally introduced with the 2000-series cards and was most capable on the 2080 Ti. However, because the ray tracing functionality was still in its infancy, enabling it in games introduced a significant performance penalty. For that reason, I chose to bypass the 2080, which came out in 2018, and wait for the 3080.

In any other year not dominated by unprecedented social unrest, countless wildfires, a record number of hurricanes, and a lethal pandemic, it probably wouldn't have been a big deal to buy a 3080 on launch day. But 2020 became the biggest example we've ever seen of Murphy's Law, literally affecting everyone and everything on Earth.

Plus, there were admittedly selfish reasons I wanted (and needed) a 3080. As an avowed eye-candy whore, I crave the maximum fidelity available when playing a game. And with blockbuster new titles like Cyberpunk 2077 utilizing ray tracing, I didn't want to play it without getting the full, immersive experience.

Without getting too verbose, Real-Time Ray Tracing (RTX) simulates rays of light and the different ways they interact with the objects and surfaces around them. In short, everything casts a shadow, and things like glass and water offer photo-realistic reflections. It may sound and look deceptively simple, but it's immensely taxing and requires an enormous amount of processing power. After playing through the entire Wolfenstein: Youngblood game at 4K Ultra settings, with and without RTX enabled, I can attest to the realism and beauty that ray tracing provides. Also, I'd never seen a reflective puddle of blood before and it looks awesome!
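For the curious, the core idea is simple enough to sketch in a few lines of Python. This toy example casts a single "shadow ray" from a point on the ground toward a light and checks whether a sphere blocks it; it's purely illustrative and has nothing to do with how Nvidia's RT hardware actually works:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along a unit-length ray to the nearest sphere hit, or None."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None                          # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2           # nearest intersection
    return t if t > 0 else None

# Scene: one unit sphere floating at (0, 1, 0), one light up and to the right.
sphere_center, sphere_radius = (0.0, 1.0, 0.0), 1.0
light_pos = (5.0, 5.0, 0.0)
ground_point = (-2.0, 0.0, 0.0)              # the spot we're shading

# Cast a shadow ray from the ground point toward the light.
to_light = [l - p for l, p in zip(light_pos, ground_point)]
length = math.sqrt(sum(x * x for x in to_light))
to_light = [x / length for x in to_light]    # normalize to unit length

blocked = ray_sphere_hit(ground_point, to_light, sphere_center, sphere_radius)
print("in shadow" if blocked else "lit")     # -> "in shadow"
```

Now multiply that by millions of rays per frame, with reflections bouncing rays into yet more rays, sixty times a second, and it becomes obvious why it takes dedicated hardware.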

FUZZY MATH

Following Nvidia's tradition of naming its video cards after notable scientists, the 3000-series was christened "Ampere" after André-Marie Ampère, the French researcher who formulated the law of electromagnetism in 1820. Ironically, he passed away from pneumonia in 1836, the same affliction Covid-19 would be inflicting on the world nearly two-hundred years later when his namesake video card arrived. But if he were alive today, I'm sure he'd have a 3080 also.

And as with previous generations, Nvidia continues to improve on an already impressive architecture with their first single-digit lithography: That's right, Ampere is built on a Samsung 8nm process. Granted, it's not quite as compact as AMD's 7nm node for Big Navi, but it does manage to squeeze 52% more transistors than the 2080 Ti into a 17% smaller area. For reference, that's 28.3 billion transistors crammed into a 628.4 mm² die. With that kind of otherworldly compression, maybe I could hire Nvidia to assist with packing my wife's suitcases for our next vacation!
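Those percentages check out if you run them against Turing. A quick sanity check; the 2080 Ti reference figures (roughly 18.6 billion transistors on a ~754 mm² die) are my own numbers, not anything stated above:

```python
# Sanity-check Ampere's density claims against the 2080 Ti (Turing).
ampere_transistors, ampere_mm2 = 28.3e9, 628.4   # figures from above
turing_transistors, turing_mm2 = 18.6e9, 754.0   # assumed Turing reference

print(f"transistors: {ampere_transistors / turing_transistors - 1:+.0%}")  # ~ +52%
print(f"die area:    {ampere_mm2 / turing_mm2 - 1:+.0%}")                  # ~ -17%
print(f"density:     {ampere_transistors / ampere_mm2 / 1e6:.0f}M per mm^2")
```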

Living with high-performance computers that are routinely overclocked, I've become accustomed to buzzing fans, steep energy requirements and excessive heat, but the 3080 is another level entirely. On paper, it's 70% faster than the 2080 it replaces, but all that performance comes at the expense of power. I thought it was bad that my 1080 Ti had one 8-pin power connector, but the 3080 has THREE, drawing an observed 383 watts of juice! That's an insane amount usually reserved for running an entire desktop computer, not a single video card. Thank God my recent upgrade included a new 850 watt Thermaltake power supply with an assortment of modular cables, as my old OCZ 750 would have definitely flunked this portion of the test. And as expected, the side effect of that monstrous power consumption is heat— lots and lots of heat! Of course, that triggers the 3080's triple fans, which spin up and roar like a Cessna on takeoff. Given the 3080's insatiable energy appetite and thermal output, I'm afraid my home's extreme heat signature might be mistaken for an illegal cannabis farm. Even worse, the 3080 is so loud I'll never hear the police until they break my door down.
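To put that draw in perspective, here's a rough power budget for my build. Only the 383-watt GPU figure comes from my own observation above; the CPU and platform numbers are ballpark assumptions plugged in for illustration:

```python
# Rough power budget for the test rig with an 850 W supply.
gpu_w  = 383   # observed 3080 draw (measured above)
cpu_w  = 150   # Ryzen 9 3900X at 4.6 GHz under load (assumption)
misc_w = 60    # motherboard, RAM, drives, fans (assumption)
psu_w  = 850   # Thermaltake PF1

total_w = gpu_w + cpu_w + misc_w
print(f"{total_w} W load = {total_w / psu_w:.0%} of the PSU")   # ~70%
print(f"on the old OCZ 750: {total_w / 750:.0%}")               # ~79%
```

Comfortable headroom on the new supply; not so much on the old one once you account for transient spikes.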


HEAVY METAL

Did I mention that in addition to being ridiculously hot and noisy, it positively dwarfs my 1080 Ti, which I didn't think was physically possible? It's over an inch taller with the custom EVGA Tri-Fan Cooler and 1.3 inches longer. The 3080 takes up three card slots, and is so heavy with the solid metal backplate that an additional support bracket is provided. Fortunately, I didn't need to use it, but the ginormous girth did uncomfortably stretch some of my power supply cables as well as block all of my SATA ports. I suppose that's not MSI's best motherboard layout, and it means I'll need to remove the card if I ever want to hook up another drive. Aside from that, my Rogue M925 case accommodated the 3080 admirably, with more than enough room to fit future video cards even if they continue to swell like a contestant on My 600-lb Life.

 
Perhaps my biggest disappointment with EVGA's 3080 stems from the cardboard coffin it arrived in. I've long enjoyed the unboxing process of each new video card, taking in that boozy perfume of expensive, fresh plastic and silicon, eager to see what treats and surprises I might unearth. Some of my best discoveries have included a unique mouse pad or a power supply adapter. And since this was my costliest video card ever, I was particularly looking forward to the special goodies I felt should accompany it. It was, in my eyes, a tiny reward for purchasing one of their top-tier cards, instead of the more plebeian $329 3060 or $499 3070.
 
 
Alas, my optimistic merch haul was a bust. Aside from more foam cushioning than a Nike factory, the box was barren except for the aforementioned bracket, a puffy EVGA sticker, and a coupon for an X1 capture card so I could spend more money. Whoopee!

In the past, I've joyfully derided video card companies for including hopelessly outdated driver discs, but even that was absent this time, cruelly depriving me of that small thrill. Even so, I reminded myself that the box did include the main thing I desired: a 3080, which, for the estimated 99 out of every 100 people who cannot buy one, should be satisfactory enough.

So, without further delay, on with the benchmarks!

TEST SYSTEM: AMD RYZEN 9 3900X @ 4.6 GHz, Corsair Force MP600 M.2 2280 1TB, 32GB G.SKILL Ripjaws V Series (XMP Settings), MSI MPG X570 GAMING PLUS, Sound Blaster Z, Thermaltake PF1 850W. 

3DMARK PCI Express Feature Test: This is one of the newest benchmarking utilities in the 3DMark suite, and exists solely to measure the traffic between the PCI Express lanes which handle all the data sent to and received from the CPU. If PCI-E 3.0 is the Interstate, then PCI-E 4.0 is the Autobahn. Even though my X570 board is one of the first to offer PCI-E 4.0, the bandwidth was throttled by the 1080's older PCI-E 3.0 interface, topping out not far from its theoretical maximum of 16 GB/s. But with the 3080 optimized for PCI-E 4.0, it didn't suffer such constraints and more than doubled the 1080's bandwidth. Without the overhead of system processes, I'm confident it would have reached its 32 GB/s limit. This should settle the debate once and for all regarding the importance of pairing a PCI-E 4.0 graphics card with a PCI-E 4.0 motherboard. Finally, the test is eerily relaxing with the amber waves of grain (or whatever they're supposed to be) rolling hypnotically across the screen. It might be stressing the computer to the max, but it certainly looks serene!
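For anyone wondering where those theoretical ceilings come from, it's simple link math: per-lane transfer rate, times encoding efficiency, times sixteen lanes. A quick sketch (both generations use 128b/130b encoding, which is why the true ceilings land a hair under the round numbers):

```python
# Theoretical one-way bandwidth of an x16 PCI Express link.
def pcie_x16_gbytes(gt_per_s, lanes=16, encoding=128 / 130):
    # GT/s * encoding efficiency = usable gigabits per second per lane;
    # divide by 8 to convert bits to bytes.
    return gt_per_s * encoding * lanes / 8

print(f"PCI-E 3.0 x16: {pcie_x16_gbytes(8):.2f} GB/s")    # ~15.75 GB/s
print(f"PCI-E 4.0 x16: {pcie_x16_gbytes(16):.2f} GB/s")   # ~31.51 GB/s
```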

3DMARK Port Royal: Some of the more complex ray tracing benchmarks wouldn't run on my 1080 Ti, but thankfully Port Royal did, and it offers the clearest comparison to date of how much more powerful the 3080 is when it's allowed to fully flex its RT Cores and GDDR6X memory. This is the Ampere architecture strutting its stuff, pulling out a whopping difference of more than 5.5 times the performance of the 1080 Ti. Enjoying Port Royal's beautifully-generated space station in the sky is simply the icing on the cake.

3DMARK Time Spy: The last of the 3DMark showcases, Time Spy is the boomer of the group, having been around since 2016. But despite that advanced age, Futuremark knows how to torture-test a video card, and the well-worn demo of a futuristic female thief exploring a bizarre museum still seems as vivid as it did when it premiered half a decade ago. Rendered at a default setting of 2560 x 1440, it's heavy on the particle shadows and tessellation, but lacks any ray tracing, which it predates by two years. Truthfully, the resolution is a bit of a waste for a card of the 3080's caliber, but greedy Futuremark wants $9.99 to unlock the 4K version called Time Spy Extreme. Had that option been available, the 3080 would have likely scored considerably higher.

BASEMARK GPU: Basemark gets the questionable distinction of being the oddest benchmark in the group. Like an alcoholic having an identity crisis, it can't even decide on a graphics interface, offering not only DirectX 12, but also Vulkan 1.0 and OpenGL 4.5. I suppose being API-agnostic is good, but development on OpenGL has been stagnant since v4.6 in 2017, and Vulkan has taken up the slack for AMD owners. If 3DMark has Time Spy, then Basemark is modeled after Time Bandits, the wacky 1981 fantasy from Monty Python alum Terry Gilliam. Both the movie and the benchmark offer a 17th-century English three-masted galleon flying through the sky, and neither makes much sense. Without RTX or DLSS to stress the cards, the 1080 is just barely able to keep the 3080 from doubling its performance.

FINAL FANTASY XIV Shadowbringers: I've never tried any of the wildly-popular Final Fantasy games, but the demos are a staple of my video card benchmarks, with FINAL FANTASY XIV: HEAVENSWARD employed to test my 1070 in 2016. They're also quite a visual feast, with the demo unfolding like a bizarre bath salts trip: There's not only a giant insect that's half-man and half-praying mantis, but also a woman with giant rabbit ears who definitely didn't escape from the Playboy mansion. Described as "Japanese Science Fantasy," there's lots of fighting and flashy graphics, but again nothing to exploit the 3080's advanced architecture, so the delta between them is rather disappointing.

NEON NOIR: Designed by Crytek, the studio behind the perennial Crysis benchmark, Neon Noir is a gritty, Ray Traced extravaganza that looks absolutely gorgeous. Set in a dark, dystopian city, a police drone zooms around gathering surveillance info while interacting with a lot of reflective surfaces. A vignette of shiny, spent bullet casings scattered on the wet pavement appears real enough to reach out and touch. Given the demanding 4K resolution and full RTX effects, the 1080 performs commendably, but even the mighty 3080 fails to attain the lofty goal of a steady 60 FPS. Perhaps the ultra-rare $1,500 3090 could pull it off, but for now it looks like it will require the next cycle of graphics cards before that feat is achievable with more mainstream metal.

Street Fighter V Champion Edition: This was the most obtuse benchmark of the bunch, requiring a separate game pad just to navigate the menus, as well as a ridiculous cap of 60 FPS that defeats the purpose of the entire program. But after manually editing the engine.ini config file to remove the frame-rate cap, the 1080 jumped 50% higher and the 3080 scored almost 100 FPS faster. If nothing else, the benchmark underscores how much better Street Fighter looks now than it did when Street Fighter II was new in 1991. Despite the impressive visuals, I'm not a fan of fighting games so I won't be trying it out.
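For anyone wanting to replicate the tweak, the gist is just filtering one line out of the config. The snippet below is only a sketch of the idea; the engine.ini path and the FrameRateLimit key are placeholders, not the game's actual setting names, so check a proper guide before editing anything:

```python
# Sketch: strip a frame-rate cap line from an Unreal-style engine.ini.
# Both the path and the "FrameRateLimit" key are placeholders here.
from pathlib import Path

ini = Path("engine.ini")                      # hypothetical location
lines = ini.read_text().splitlines()

# Drop any line that pins the frame rate, e.g. "FrameRateLimit=60".
uncapped = [ln for ln in lines if not ln.strip().startswith("FrameRateLimit")]

ini.write_text("\n".join(uncapped) + "\n")    # back up the original first!
```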

Unigine Superposition: Unigine is a company that seemingly exists solely to develop awe-inspiring benchmarks. Of course, that's not true, but I've never actually played any of the few games, like 2011's Petshop, that employ their proprietary engine. Instead, 2009's Heaven benchmark was a staple of my testing arsenal, much like Valley, which supplanted it in 2013. Unigine's latest, Superposition, ditches the mythical and elemental sequences for a retro throwback, panning around a mid-century office that houses a mysterious and futuristic chamber. Its purpose is never revealed, so all we're left with is the knowledge that "superposition" is a fundamental principle of quantum mechanics.

 
CONCLUSION
 
As of this writing, I've spent a little over 30 days with my 3080 and thankfully haven't been affected by any of the overheating issues or hardware crashes experienced by other 3080 owners. In this case, patience does seem to be a virtue, and the early failures feel like a bit of karma for those who procured their cards through unscrupulous means.
 
From what I've read, the 3080 instability stemmed from issues with unsuitable capacitors and imprudent clock speeds. That is, aftermarket and user-overclocked cards that exceeded 2.0 GHz were prone to frequent failures. On the initial batch of EVGA cards, a set of POSCAPs (conductive polymer tantalum solid capacitors) located underneath the GPU were the culprit. And for once, the breakdown wasn't due to the manufacturer substituting cheap parts: POSCAPs are larger, more costly, and offer better heat resistance than similar capacitors, but are unfortunately ill-suited for extreme operating frequencies. As a result, MLCCs (multilayer ceramic capacitors) were substituted since, while cheaper, they tolerate high frequencies better and respond more quickly to voltage changes.
 
Nvidia attempted to band-aid this with a series of driver updates that down-clocked the speeds, and independent owners experimented with "under-volting" their cards to increase the stability. Personally, I feel that paying a premium for an aftermarket, overclocked 3080 and then having to throttle it is ridiculous. Luckily, my FTW3 Ultra is said to have the best cooling capacity of any 3080 model, so I suppose that living with the noisy fans and exorbitant power draw is a small price to pay for the associated reliability. 
 
The stock base clock for the reference Nvidia 3080 is 1440 MHz with a boost clock of 1710 MHz, while my EVGA's base clock is 1475 MHz with a boost clock of 1835 MHz. However, I've routinely seen it spike to 1950 MHz in gaming with no ill effects. TechPowerUp recently pushed an FTW3 Ultra like mine to 2032 MHz, while drawing 450 watts, to gain just 3.7 FPS. For my money, the accelerated wear-and-tear on the card isn't worth such a minimal gain.
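Running the numbers makes the case plainly. Using my card's 1835 MHz boost and the 383-watt draw observed earlier as the baseline against TechPowerUp's overclock:

```python
# Is a 2032 MHz overclock worth it? Baseline figures are my card's.
stock_mhz, oc_mhz = 1835, 2032
stock_w, oc_w     = 383, 450
fps_gain          = 3.7

print(f"clock: +{oc_mhz / stock_mhz - 1:.1%}")   # ~ +10.7%
print(f"power: +{oc_w / stock_w - 1:.1%}")       # ~ +17.5%
print(f"payoff: +{fps_gain} FPS")
```

Double-digit percentage increases in clock and power for a low-single-digit frame gain is a lousy trade.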
 
But while I'm trying to protect my investment, most people are still struggling to even buy one. Adding insult to injury, Trump's 25% tariff on Chinese imports means video cards are even more expensive if you're lucky enough to find one: The Asus TUF 3080 has jumped from $749 to $859, while my EVGA is up to $869 before tax and shipping. On the reseller market, Amazon's "best deals" show a 3080 like mine for $1479 and another for $1495, or roughly a 75% markup. If that's considered a great buy, I'd hate to see what the other ones are selling for!

This limited availability is also responsible for Nvidia indefinitely postponing the launch of the 3080 Ti as it works to increase supply of the existing models. And to that end, internet scuttlebutt is that they're even going so far as to start re-manufacturing the affordable 2060 model in hopes of getting those out to desperate gamers.
 
Whatever the case, I'm (obviously) happy with my 3080, even if I don't think it's worth all the hype and inflated prices. It's disappointing that it only has 10 GB of VRAM (albeit GDDR6X) instead of the 1080 Ti's 11 GB of GDDR5X. Currently, this isn't an issue, but a year or two down the road it could be. It's been reported that Cyberpunk 2077 at 8K chews up nearly all of the 3090's 24 GB frame buffer. Likewise, there's only a handful of games that presently support ray tracing, meaning a lot of the 3080's performance is sitting idle.
 
As with my 1080 Ti, I plan to keep my 3080 for the foreseeable future. And given the supply shortages, it may be even longer, as Nvidia optimistically predicts the video card drought won't ease until this summer. The production snafus have also reportedly delayed Nvidia's next-gen architecture dubbed "Hopper," which was to be a 5nm monster multi-chip module (MCM) GPU. Taking its place is a new project known as "Lovelace," though I'm fairly certain it's named after the British mathematician Ada Lovelace, and not Linda Lovelace, the 1970s adult film star. Regardless, it'll be funny if the 4080 picks up the "Deep Throat" nickname.
 
 
COMING SOON: How a 24-Year-Old PC Game Absolutely Crushed My 3080!