Nearly nine months ago, the RTX 3000 series of Nvidia graphics cards launched in a beleaguered world as a seeming ray of hope. The series’ first two GPUs, the RTX 3080 and 3070, were nearly all things to all graphics hounds. Nvidia built these cards upon the proprietary successes of the RTX 2000 series and added sheer, every-API-imaginable rasterization power on top.
An “RTX”-optimized game ran great on the line’s opening salvo of the RTX 3080, sure, but even without streamlined ray tracing or the impressive upsampling of DLSS, it tera’ed a lot of FLOPs. Talk about a fun potential purchase for nerds trapped in the house.
Even better, that power came along with more modest MSRPs compared to what we saw in the RTX 2000 series. As I wrote in September 2020:
RTX 3080’s impact on the market will hopefully push the average GPU value proposition into reasonable territory. Its $699 price may not be your cup of tea, but if prices for everything beneath the RTX 3080 (and its sibling, the RTX 3070, slated to launch in October at $499) adjust according to the below benchmarks, that means a rock-solid 1080p or 1440p GPU may finally land within your budgetary reach.
…so, yeah. About that…
At the time, I was too busy running benchmarks to ask my crystal ball about an imminent future of exploding crypto values and diminishing chip and silicon supplies. The graphics card market went kablooie, and that left me holding the bag of a ridiculous claim about future GPU prices. Anyone who’s been paying attention has witnessed many instant GPU sell-outs and staggering eBay listings.
Yet somehow, even though the series’ existing cards are already difficult to track down, Nvidia’s RTX 3000 series continues to expand—as seen in the recent announcement of two new models, the RTX 3080 Ti and RTX 3070 Ti, which start at MSRPs of $1,199 and $599, respectively. (When fans asked you to make more graphics cards, Nvidia, I’m not sure this is what they meant.)
If you’re surprised by that news, you’re not alone. Last week, I learned about the new models via an unannounced knock at the door and an 11-pound box, packed with one of each new GPU as provided by Nvidia. This is the first time I can recall getting a graphics card sample from a supplier without an email letting me know that I should prep my front porch’s netting to catch eager cryptomining package thieves.
An asteroid-sized asterisk, as usual
Today, the embargo has lifted on reviews of the 3080 Ti. And as has become increasingly common at Ars, this review comes with an asterisk the size of an asteroid that will likely smash your hopes of buying this card in the near future. Nvidia didn’t offer any briefings about company efforts to stabilize supply, nor did the company try to guarantee that actual humans will be able to buy the RTX 3080 Ti beginning tomorrow, June 3, “starting at” $1,199. It remains anyone’s guess if, when, or how a mild-mannered tech enthusiast like yourself will get a fair shake in the current market.
Should a genuine opportunity to buy this card surface, I can say that the RTX 3080 Ti finishes what the RTX 3080 started, at least for this GPU generation. Its gains over the 3080 are interesting: They’re substantial, yet they aren’t necessarily worth another $400 in MSRP. But on a pure gaming basis, this week’s new card renders the RTX 3090, and its $1,499 MSRP, absolutely moot.
Specs for the RTX 3080 Ti land closer to the 3090 than the 3080, with the biggest difference being a gulf in VRAM. This card has 2GB more than the 3080, yet a whopping 12GB less than the 3090. That’s a big VRAM differential, yet between the spec table above and the performance results below, the 3080 Ti is clearly a better choice than the 3090 if high-res gaming is your GPU priority.
That makes sense, as the 3090 was a showcase VRAM card, perfect as either an entry-level option for high-end video editing or something to attach to an 8K display. If you don’t fall into either of those camps, rest assured that the 3080 Ti is a better option for GPU overspending, especially when your system is tuned to rev games at 4K resolutions and solid frame rates with very few compromises. Meanwhile, if “only 12GB of GDDR6X VRAM” is a sentence that you might say out loud with a tear falling out of your video-processing tear duct, the RTX 3080 Ti likely won’t charm you.
Identical build, with one notable difference
Last year’s RTX 3080 certainly isn’t chopped liver. If you find anything between the 3080 and 3090 priced near MSRP and are eager to fill your favorite 4K display with as many pixels as possible, buy first and ask model-number questions later. Depending on your ideal use case, you may want to add AMD’s RX 6800XT to that list of ideal high-end GPUs, as it still has significant victories in our battery of tests. But the newer 3080 Ti does face off remarkably well against every high-end option listed in this paragraph (especially if price for you is a wholly arbitrary concept; in this marketplace, that’s sadly likely).
I received the Nvidia-produced “Founders Edition” of the 3080 Ti, and its physical case looks identical to the 3080’s, right down to its twin-fan build and “blow-through” cooling design. But one thing is definitely different this time around—the noise.
I’ve tested every RTX 3000-series Founders Edition up until this point, and most have done a tremendous job balancing heat, airflow, and fan speed to perform efficiently and quietly. These things rarely rev their fans to extremes. But the 3080 Ti’s physical structure accommodates a more densely packed board of chips, with default power draw jumping from 320 W to 350 W. That means push has come to decibel shove in terms of keeping this GPU performant and cool.
To clarify, the 3080 Ti’s decibel level doesn’t exceed that of its non-Ti sibling. Rather, the 3080 Ti’s gains come at the cost of an increased likelihood that its fans will rev up at default tunings, triggered by what appears to be an 81°C-ish threshold.
Average gains over 3080: 10-12%
For this review, my benchmarking tests compare the 3080 Ti against two of its nearest neighbors: the 3080 and AMD’s RX 6800XT (MSRP: $649). I enlisted Ars’ Senior Technology Editor Lee Hutchinson to chip in a few GPU-specific tests of the RTX 3090 FE, but these were on a different rig. As such, those RTX 3090 FE results only appear in selective tests.
Any weaker cards in my tables were tested on the same rig, which has an Intel i7-8700K CPU overclocked to 4.6 GHz, 32 GB of DDR4-3000 RAM, an 850 W power supply, and SSD storage. These tests were done in older reviews on older drivers, so their counts come with a mild margin-of-error warning.
The easy part of the review is confirming that the 3080 Ti outperforms the 3080 across the board. In rare cases, the gain over the 3080 is mild, with Gears 5 4K performance looking nearly identical between the two cards and smaller gains when resolutions drop from 4K to 1440p (as is to be expected, since lower resolutions shift the bottleneck toward the CPU). But usually, the gains approach 10-12 percent.
Additionally, the 3080 Ti’s beefy stats don’t come at the cost of frame rate volatility. I’ve tracked most benchmarks with MSI Afterburner’s “one percent lowest” frame rate count, and this time-intensive double-check answers a crucial question: whether the reported frame rate average comes with hidden frame time spikes. As the above stats show, this card holds up. (My testing rig’s CPU is getting a bit long in the tooth, so your one percent stats using any of these GPUs may look even better than mine.)
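For those curious how that double-check works, a “one percent lowest” figure can be approximated from a raw frame-time log. Here’s a minimal sketch in Python—the exact averaging method varies by tool, so treat this as illustrative rather than a reproduction of Afterburner’s internals:

```python
def one_percent_low(frame_times_ms):
    """Approximate a '1% low' fps figure from a frame-time log (ms per frame).

    The common approach: average the slowest 1% of frames, then convert
    that average frame time back into frames per second.
    """
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)          # the slowest 1% of samples
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

# A steady 16.7 ms cadence with a handful of 40 ms hitches: the average
# fps barely moves, but the 1% low exposes the hidden frame-time spikes.
log = [16.7] * 990 + [40.0] * 10
print(round(one_percent_low(log), 1))   # prints 25.0
```

A run like the one above would average out to roughly 59 fps, which is exactly why the 1% low matters: the headline number hides the stutter.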
AMD’s juice in Nvidia’s caboose
AMD’s line of Big Navi cards arguably put some juice in Nvidia’s caboose, because the 3080 Ti gets its pure rasterization numbers up higher in places where AMD had previously led (and sometimes still does). While AMD’s latest top-performing cards each beat the 3080 Ti with more L3 cache and higher VRAM capacity, the 3080 Ti has its own tale-of-the-tape leads.
Since I don’t have an AMD RX 6900XT (MSRP: $999) in my testing lab right now, I’m left assuming that the 3080 Ti still falls behind in a few AMD-versus-Nvidia battles. And on my i7-8700K testing rig, the 6800XT still drives better 1440p performance in select titles at a much lower MSRP. (Once again, in 1440p resolution, the 6800XT cranks on the newer Assassin’s Creed Valhalla, but it can’t replicate that performance with the older Assassin’s Creed Odyssey.) This new Nvidia card’s gains against Team Red are still significant, particularly in surpassing the 6800XT in punishing 4K benchmarks like Borderlands 3.
In either case, the raw rasterization battle remains a neck-and-neck case of performance varying on a game-by-game basis—and that sometimes makes the 3080 Ti’s performance-per-dollar proposition shaky. But once again, Nvidia has a massive leg up in cases where its proprietary cores kick in. “Second-generation” tensor cores fuel impressive DLSS upscaling, and “third-generation” RT cores directly power ray tracing APIs.
The $400 price gap in action: Impressive ray tracing
My favorite part of testing the RTX 3080 Ti came when I booted Quake II RTX, whose name is now deceptive. It’s no longer an Nvidia RTX exclusive. Earlier this year, the game received a GPU-agnostic Vulkan API update for its full illumination system. Quake II RTX remains the ultimate ray tracing software test on the market, since its “no light-bounce unturned” system can make the classic game’s frame rates plummet from the 1,000s to the teens. It’s that extreme.
If you downscale Quake II RTX to somewhere around 900p on AMD’s RX 6800XT and disable one or two of its RT toggles, you can get near 60fps performance, which still looks and plays great. But while the RTX 3080’s dedicated RT cores already had a lead over AMD in this department, the RTX 3080 Ti takes that lead even further: a whopping 33 percent performance jump over its non-Ti sibling. (There’s your $400 MSRP price gap in action.)
As a result, I went to bed a few hours later than expected after one evening of testing. I wound up ripping and tearing through familiar levels in Quake II RTX on the 3080 Ti, enjoying pure 1440p resolution and all RT effects maxed out while frame rates hovered between 75-90 fps. This version makes the 1997 classic feel very, very new, especially when dramatic lighting moments play out. Imagine a monster closet opening in nearly complete blackness, save a few realistically bouncing ground lights and a single light from an open door on the other side of the room. I’ve talked a lot about the game-changing wow factor of ray tracing, and support for the standard is now trickling into even more games, including this year’s handsome Metro Exodus re-release and the upcoming Doom Eternal RTX patch. Ray tracing may not be your gaming priority, but as every month passes, you’re more likely to own a game where you can see the standard in action.
This Quake II RTX test result, by the way, comes without DLSS support. Add DLSS to the formula, especially in ray tracing-intensive fare like Watch Dogs Legion, Minecraft, and Cyberpunk 2077, and the 3080 Ti’s lead grows all that much more.
Can AMD’s FSR unseat DLSS?
Speaking of DLSS: AMD has finally moved forward with announcements about its own DLSS rival, FidelityFX Super Resolution (FSR). AMD says this will boost frame rates on both AMD and Nvidia GPUs, since it’s an open API and part of the company’s open-sourced GPUOpen initiative. That’s great news, but until FSR launches within a select number of games on June 22, we’re left with a brief, blurry, and unimpressive first salvo as revealed during AMD’s Monday Computex keynote.
Conversely, DLSS’s earliest iterations had their own issues at launch, albeit less blurry than what FSR looks like right now. Nvidia eventually got its standard up to incredible refinement—so much so that DLSS now fully upscales to clean lines and small-text details in ways that generally surpass temporal anti-aliasing (TAA). Some of this review’s charts include comparisons of Nvidia DLSS performance to lower-resolution AMD test results, but at this point, DLSS’s upscale of 1440p content is so crisp that it’s nowhere near an apples-to-apples comparison with AMD’s native 1440p tests. And Nvidia is marching ahead with even more DLSS-supported games in the near future, particularly Red Dead Redemption 2. Buying an Nvidia card (or, heck, an Nvidia-fueled laptop) to cash in on DLSS is becoming a better idea every month.
While I’d love to see AMD blow us all away with a non-proprietary DLSS alternative, what AMD has shown thus far is not DLSS. It’s smudgy, quite frankly. Until AMD shows us something more refined, Nvidia will likely enjoy a lead on upscaling tech for the foreseeable future. Still, we’re glad to see this sector heat up.
Premiere Pro perusal
In terms of pushing VRAM to its limits, I admittedly don’t have a robust suite of non-gaming apps to test. In this review, at least, I wanted to add a new test to my usual gauntlet: Adobe Premiere Pro. Its series of GPU-intensive filters can shrink hours of 4K video processing into mere minutes, so I slapped six GPU-specific filters onto a 2-minute, 50-second 4K video, then exported the results as both H.264- and HEVC-formatted video on a few graphics cards.
VRAM matters for video processing, and the RX 6800XT’s 16GB arguably helps keep it in the lead over all three Nvidia cards I tested it against in HEVC encoding. Specifically, it was 6.25 percent faster than the RTX 3080 Ti. Yet in the realm of H.264 encoding, the RTX 3080 Ti surges ahead at a rate 28.5 percent faster than the RX 6800XT.
No LHR sticker, yet all the LHR pain for miners
Ars Technica is not the place you want to go for insight on a GPU’s cryptomining potential, but a certain audience is probably wondering whether the RTX 3080 Ti exceeds its non-Ti sibling in mining rates. The short answer is that the higher-powered RTX 3080 Ti is considerably worse at mining than the older RTX 3080, at least in my cursory tests.
I had a hunch this would happen after seeing Nvidia’s new “Lite Hash Rate” series of GPUs, all tweaked at the firmware, BIOS, and hardware ID level to run crypto-related algorithms at a lower-than-maximum average. All future versions of existing GPUs will ship with “LHR” stickers on their boxes, indicating to consumers what they’re in for, but Nvidia didn’t put such a sticker on the 3080 Ti’s box. When asked for comment, an Nvidia rep confirmed that this is not by accident, as the brand-new 3080 Ti model “doesn’t need the LHR designation to set it apart from a previous version to eliminate confusion.” The rep then confirmed that this will be the case for all newly announced Nvidia GPUs going forward: “It’s all limited hash rate.”
To confirm the performance gap on the 3080 Ti, I installed NiceHash, a simplified mining ecosystem that runs computations on a number of cryptocurrency algorithms. In short, NiceHash quietly tells your computer which currency it wants farmed. Once that’s done, NiceHash dumps those funnymoneys into its own marketplace, then gives the user a given amount of Bitcoin in exchange while skimming some amount off the top for running the NiceHash marketplace.
Ultimately, this means that NiceHash isn’t the most controlled environment for comparisons, but it was also easy for me to activate, test briefly, and disable—particularly after tuning my GPU via MSI Afterburner (clock speeds down, power level down, fans way up) so that its VRAM wouldn’t catch on fire.
The number you want is 61.5 MH/s, which is my testing rig’s unoptimized mining hash rate using the RTX 3080 Ti for one hour. That’s a small jump up from the RTX 3070’s 59 MH/s rate on the same PC, but it’s far lower than the 81 MH/s I measured on this machine’s RTX 3080 test. In other words: Spend $400 more, get 24 percent lower hashing rates.
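For anyone double-checking that math, the gap works out like so, using the review’s measured rates:

```python
# Hash rates measured on the same test rig (unoptimized, MH/s).
rtx_3080_mhs = 81.0
rtx_3080_ti_mhs = 61.5

# Percentage drop relative to the older, cheaper RTX 3080.
drop_pct = (rtx_3080_mhs - rtx_3080_ti_mhs) / rtx_3080_mhs * 100
print(f"The 3080 Ti hashes {drop_pct:.0f}% slower than the 3080")
# prints: The 3080 Ti hashes 24% slower than the 3080
```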
Nvidia’s new cat-and-mouse game
Could miners unlock this card’s fuller potential down the road? Maybe! It happened with the RTX 3060, thanks to a leak of beta firmware that fully opened its mining potential, though Nvidia insists it has taken additional preventative measures for all commercial-grade GPUs going forward. We’re under the impression that crafty miners have a decent amount of work ahead of them, since we’ve yet to hear that the non-beta RTX 3060 firmware or BIOS has been independently cracked.
But it’s hard to make firm predictions. Nvidia only recently entered this game of cat-and-mouse against miners, which it kicked off under the guise of protecting average customers from price spikes. Really, it seems just as likely that Nvidia wants miners to pay top dollar for Cryptocurrency Mining Processors (CMPs), which are basically RTX 3000-series GPUs without any video-out capabilities (and, thus far, such cards look like they’re priced much higher than their unlocked GPU peers). So long as there’s a big-money bottom line for squashing miners’ attempts to unlock consumer-level GPUs, Nvidia will likely keep that fight up. That could send ripples through the auction-listing ecosystem.
That is, it could whenever stock stabilizes. A 61.5 MH/s GPU isn’t earth-rattling for a high-end miner, but if Nvidia can’t stabilize prices, performance, or availability for its CMPs, miners will probably still elbow gamers out of the way to buy the crypto-hampered 3080 Ti.
When “$1,199” doesn’t mean $1,199
Within the vacuum of comparisons to other cards and the $1,199 MSRP, the RTX 3080 Ti is a heckuva GPU. It’s not the “wow, that’s the right price” stunner of the 3080’s original $699 MSRP, but it’s also not the clearly overpriced $1,199 MSRP originally attached to 2018’s RTX 2080 Ti. If all of these cards existed at retail at their listed prices, I’d say the 3080 Ti is priced for a certain kind of PC power user without gouging buyers 3090-style, while the 3080 and RX 6800XT make more sense on a power-per-dollar basis. (Will the upcoming 3070 Ti, launching on June 10 for $599, shake that $600-700 range up significantly? Stay tuned.)
If you can safely go to a store at this point in 2021, you might have a shot at lining up and buying one this week at MSRP, since brick-and-mortar retailers have more incentive to get you into their doors and limit purchases to one per customer. Recent card launches have seen continued movement in that direction. No retailer benefits from bot exploitation.
Yet Nvidia has been coy about loudly addressing the reality of GPU availability, and at this point, that sucks. Maybe they’re in an uncomfortable position as a publicly traded company and can’t admit what a mess GPU sales have become in the past year-plus, and maybe they’d rather dump cards into an unregulated market, watch them all sell out, and report the good news to shareholders. There’s also the reality of third-party vendors, who produce the majority of Nvidia GPUs, pricing the 3080 Ti however they see fit. Nvidia declined to offer a list of how its partner vendors were pricing their 3080 Ti models ahead of launch.
So “$1,199” doesn’t really mean $1,199. And while I can slap performance benchmarks onto charts and break down the ins and outs of graphical performance, I am not nearly as well positioned to do the same thorough evaluation with the traveling fair funhouse that is the modern GPU economy. If you’ve made it this far in my review, you’re clearly invested in high-end computing and in the dream of ever buying into it at a reasonable price. In that journey, dear reader, I wish you all of the luck.
Listing image by Sam Machkovech