Many of Nvidia's latest cards are paired with a worryingly low amount of memory, and that shortfall could become a real problem for users.
People have criticized Nvidia for a few things: high GPU prices, bad names for graphics cards, creating a closed ecosystem, and so on. I recently wrote a piece about how RTX 40 pricing is damaging desktop PC gaming, so I'm no stranger to calling out Nvidia (or any company, for that matter). Even so, there was one complaint I didn't always agree with: that Nvidia GPUs don't come with enough VRAM. I always figured Nvidia would know better than to skimp on VRAM. It should know what it's doing, right?
Well, it seems the one time I gave Nvidia the benefit of the doubt, it was a bad idea. Test data from reviewers shows that RTX 30 series cards can run out of VRAM in some of the latest AAA games, resulting in worse-than-expected performance. It's not just a terrible present for RTX 30 owners, but perhaps a look into the future for people who own RTX 40 GPUs, most of which I worry don't have enough memory for graphically intense games, high resolutions, and ray tracing.
A history of Nvidia GPUs and VRAM, and why RTX 40 cards look so concerning
Just by looking at a spec sheet, it's not exactly easy to tell whether a GPU has enough VRAM. Benchmarking would obviously reveal that information, but while results might look fine and dandy on review day, that can quickly change as more modern games with higher VRAM requirements launch. By the time that happens, millions of people have already bought new GPUs. For context, let's take a look at the past decade and evaluate Nvidia's track record. I've compiled the data below on how much memory Nvidia has put on its GPUs by generation and class, from the top-end 80 Ti cards down to the more midrange 60 Ti models.
| Class | 10 Series | 20 Series | 30 Series | 40 Series |
|---|---|---|---|---|
| 80 Ti | 11GB | 11GB | 12GB | 20GB* |
| 70 Ti | 8GB | N/A | 8GB | 12GB |
| 60 Ti | N/A | N/A | 8GB | 8/16GB |
* Rumored specification
What may surprise you is that from the 10-series to the 30-series, Nvidia hardly changed the amount of VRAM per class, with only the RTX 3060 being a notable exception.
You can also see that the RTX 40 series actually brings a big bump in VRAM across the board, so you might think those cards will be fine even if older RTX 30 series cards are struggling. Sure, it's bad for people who own a 30-series card, but at least the 40-series won't have the same problem thanks to all that extra VRAM. Well, I'm not so sure, especially when you factor in generation-to-generation performance gains.
| Class | 10 Series | 20 Series | 30 Series | 40 Series |
|---|---|---|---|---|
| 80 Ti | 108% | 84% | 59% | N/A |
| 70 Ti | 105% | N/A | 54% | 57% |
| 60 Ti | N/A | N/A | 67% | N/A |
The table above relies on the TechPowerUp GPU performance database (which is admittedly imperfect for comparing such a wide variety of cards) to derive the memory-to-performance ratio for each card. Because higher performance justifies adding more memory, you'd expect this ratio to go up, stay flat, or decline only occasionally for each class. But it's pretty clear that the ratio has been falling for years, and the RTX 30 series is at a low point. Worse still, the RTX 40 series isn't much better.
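To make the idea behind the ratio concrete, here's a minimal sketch of the calculation. The performance figures below are hypothetical placeholders for illustration, not numbers from the TechPowerUp database; only the VRAM amounts echo the spec table earlier in the article.

```python
# Sketch of a "memory-to-performance ratio": how much VRAM a card carries
# relative to how fast it is. Performance values here are hypothetical.

def memory_to_performance(vram_gb: float, relative_perf: float) -> float:
    """GB of VRAM per unit of relative performance (1.0 = baseline card)."""
    return vram_gb / relative_perf

# Hypothetical example: a successor that is 1.8x as fast as an 11GB card
# but only steps up to 12GB of VRAM.
old = memory_to_performance(11, 1.0)   # 11.0 GB per performance unit
new = memory_to_performance(12, 1.8)   # ~6.67 GB per performance unit

# Expressed as a percentage of the older card's ratio, the way the table
# does, the newer card sits at about 61%: memory isn't keeping pace.
print(round(100 * new / old))
```

The point of the exercise: when performance nearly doubles while VRAM grows by a single gigabyte, the ratio collapses, which is exactly the downward trend the table shows.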
Testing data already strongly indicates that the 3070, 3070 Ti, and 3080 don't have enough VRAM for modern titles. TechSpot's 2023 retest of the 3070 and 3080 revealed that they can run out of VRAM in AAA games with ray tracing enabled, and the 3070 was running out of VRAM even with it disabled. And it's not exclusively about games that came out in 2023, either; launch-day 3070 Ti reviews from Ars Technica and ExtremeTech show poor performance in a couple of games that came out in 2020.
By no means are GPUs like the RTX 4070 Ti guaranteed to run into memory issues, but the data isn't looking good. Additionally, the scenarios where 30-series cards run into issues are when using graphically intensive settings, which 40-series owners are more likely to want to enable since their cards have more horsepower than older models.
Nvidia might regret cutting corners on VRAM for once
GDDR6 memory came out in 2018, and at five years old, it's cheaper than it has ever been. Even AMD can afford to put more memory on its competitively performing cards while also selling them for less, and it's not like Radeon is such a lucrative business that poor old Nvidia just can't budget an extra couple of gigabytes per card. Nvidia deliberately decided to skimp on memory with the 20, 30, and 40 series, and this time I'm not sure it's going to get away with it.
Nvidia already has a bad history when it comes to cards and VRAM (to the point of losing a lawsuit over one particularly bad instance), and AMD has been banging the VRAM drum for most of 2023. On top of that, some of the latest AAA games aren't exactly well optimized, as TechSpot's benchmarks illustrate, and users are understandably getting skeptical about Nvidia cards. Maybe that's why the RTX 4060 Ti comes with a 16GB option, but Nvidia should offer the same for other GPUs too.
The 40 series is already somewhat unpopular for being extremely expensive, and if it runs into VRAM issues in the future, it could do serious damage to Nvidia's credibility. Charging more while offering only a little extra memory is bad enough, but performance problems caused by a lack of VRAM would be a PR nightmare. We'll just have to wait and see whether "GDDRmageddon" ever happens, and for the sake of RTX 40 GPU owners, I really hope it doesn't.