What’ll be the best Lovelace GPU? Nvidia RTX 4080 rumor suggests it’ll be… the RTX 4070

RTX 4080 graphics card may not be quite as powerful as we thought

Nvidia’s RTX 4080 graphics card could be a little less beefy than has been previously rumored, which follows swiftly on the heels of news that the RTX 4070 might be a peppier GPU than anticipated.

This fresh Lovelace speculation again comes from Kopite7kimi, a regular hardware leaker on Twitter who has been supplying rumors at quite a pace of late (though that’s to be expected as the RTX 4000 range nears its launch timeframe, with an expected debut in or around October, assuming no delays).

I’m not a chatterbox, but I have to make some updates. I hope you don’t mind.
a possible RTX 4080,
PG136/139-SKU360
AD103-300-A1
9728FP32
256bit 16G 21Gbps GDDR6X
total power ~420W
TSE ~15000
Now I have completed the latest update for 4090, 4080 and 4070.
— Kopite7kimi, August 5, 2022

The theory is that the ‘possible’ RTX 4080 – a reminder that nothing is necessarily set in stone, even if this is Nvidia’s current thinking with the spec of the GPU – could run with 9,728 CUDA Cores, which is fewer than the previously rumored 10,240.

The leaker believes that the RTX 4080 will come equipped with 16GB of GDDR6X (at 21Gbps) and draw around 420W of power, all of which remains in line with what was floated previously – the only change at this point is that trimming down of the core count.
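For a rough sense of what the rumored memory spec implies, here’s a back-of-envelope calculation – note that the 256-bit bus width comes from the tweet above, and the resulting figure obviously depends on the spec Nvidia actually ships:

```python
# Back-of-envelope memory bandwidth for the rumored RTX 4080 spec:
# 21Gbps GDDR6X on a 256-bit bus (both figures from the leak above).
data_rate_gbps = 21    # per-pin data rate in Gbps
bus_width_bits = 256   # rumored memory bus width

bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8  # bits -> bytes
print(f"Theoretical memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # ~672 GB/s
```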

Analysis: Nvidia’s positioning of RTX 4000 models

The first thing to note here is that this isn’t a major change. Dropping down 512 CUDA Cores – which means reducing the streaming multiprocessors on-board by four, from 80 to 76 – is not a massive move, but it does still represent a slight reduction in raw grunt for the GPU.
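For anyone curious where those numbers come from, here’s a minimal sketch of the arithmetic, assuming the 128 FP32 CUDA Cores per streaming multiprocessor that both rumored core counts imply:

```python
# Quick sanity check of the rumored core counts, assuming 128 FP32 CUDA Cores
# per streaming multiprocessor (consistent with both rumored figures).
cores_per_sm = 128
old_cores, new_cores = 10_240, 9_728

print(old_cores // cores_per_sm)                 # 80 SMs
print(new_cores // cores_per_sm)                 # 76 SMs
print((old_cores - new_cores) // cores_per_sm)   # 4 SMs trimmed
```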

Let’s bear in mind that this is just a rumor, and the leak landscape around Lovelace is seemingly shifting quite regularly these days – which may be a reflection of Nvidia tinkering with and adjusting relative specs as the firm moves towards a finalized bunch of products.

As mentioned at the outset, Kopite7kimi also recently brought us news on the RTX 4070, with the apparent idea for that GPU being to up its performance level at this stage – to the spec that was previously rumored for the RTX 4070 Ti, in fact. A pretty big step up, for sure. So, assuming this is all correct, what might Nvidia’s reasoning be here?

Bringing down the CUDA Core count of the RTX 4080 a bit, while seriously beefing up the RTX 4070 – not just for CUDA Cores, but the loadout of VRAM as well – means, obviously enough, that the relative performance of the RTX 4070 and RTX 4080 is theoretically being brought more closely in line.

With less differentiation between the two – and with Nvidia’s xx80 model generally being a good chunk more expensive than the more mainstream xx70 card, the relatively affordable high-end GPU, if you will – are we going to be looking at a Lovelace RTX 4070 that’s even more in the sweet spot for most buyers?

Or to put it another way, if the RTX 4080 isn’t all that much faster, who will fork out for one rather than a 4070? Unless stock is an issue for the latter, of course; that could come into play, particularly if it’s really popular.

Also, we need to remember that CUDA Cores aren’t the full story for performance, with many other factors coming into play, like clock speeds. It’s here that there could be a clue in the RTX 4070 supposedly sticking with around 300W for power consumption, whereas the RTX 4080 will purportedly hit 420W or thereabouts – this could point to Nvidia seriously beefing up clock speeds (and therefore performance) with the higher-end GPU compared to the 4070.

Furthermore, Kopite7kimi has maintained the previous guesstimate for a 3DMark Time Spy Extreme score for the RTX 4080 (around 15,000) despite those CUDA Cores being shaved off – so, interestingly enough, that could also suggest Nvidia might be pushing harder with clocks for the 4080, although the leaker doesn’t mention clock speeds specifically.
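As a purely illustrative back-of-envelope – and assuming, very crudely, that the benchmark score scales linearly with CUDA Core count multiplied by clock speed, which it won’t exactly – holding that score steady with fewer cores would only require a modest clock bump:

```python
# Hypothetical: how much higher would clocks need to be to offset the trimmed
# core count, if performance scaled linearly with cores x clock?
# (A crude simplification - this just illustrates the rough magnitude.)
old_cores, new_cores = 10_240, 9_728

clock_uplift = old_cores / new_cores - 1
print(f"Clock increase needed: ~{clock_uplift:.1%}")  # ~5.3%
```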

Surely it doesn’t make sense to have the RTX 4080 as a relatively unattractive proposition compared to the 4070 – a much higher power usage won’t help either, in terms of forcing PSU upgrades, and energy costs in general – so for now, we’ll assume that if this rumor is true, we can’t yet see the full picture of what Nvidia is trying to do with the positioning of Lovelace GPUs. Or that there’ll be another rejigging of specs yet to come…

Relative pricing could be a factor here, too – we haven’t heard much on that front from the rumor mill yet, but we can’t imagine the RTX 4000 series will be pitched with affordability in mind. Particularly not if Nvidia has a lot of excess RTX 3000 stock to deal with, as the grapevine contends, and shifting all that inventory will overlap with the Lovelace launch (which is likely only a couple of months away now).

Whatever the case, you’d imagine Team Green would’ve learned from its mistakes in terms of the RTX 3080 getting off on the wrong foot, coming with only 10GB of VRAM in its initial incarnation – an unpopular move compounded by the later release of a 12GB variant. The latter was beefier not just in terms of the additional video memory, either, leaving some buyers of the original RTX 3080 with a serious case of remorse.

Via VideoCardz

Darren is a freelancer writing news and features for TechRadar (and occasionally T3) across a broad range of computing topics including CPUs, GPUs, various other hardware, VPNs, antivirus and more. He has written about tech for the best part of three decades, and writes books in his spare time (his debut novel - ‘I Know What You Did Last Supper’ - was published by Hachette UK in 2013).
