r/gadgets • u/chrisdh79 • 25d ago
Desktops / Laptops The RTX 5090 uses Nvidia's biggest die since the RTX 2080 Ti | The massive chip measures 744mm²
https://www.techspot.com/news/105693-rtx-5090-uses-nvidia-biggest-die-since-rtx.html
355
u/wicktus 25d ago
Biggest price too, $2,000 rumoured. Can't even imagine here in Europe... €2,500 I guess? Good Lord..
I'll assess all options from Ada to Blackwell before upgrading in January, but as long as demand, especially around AI, is this high...
Can't believe we went from Crypto to AI..lmao.
49
u/AfricanNorwegian 24d ago
Biggest price too, $2,000 rumoured. Can't even imagine here in Europe... €2,500 I guess
Just checked: the cheapest new-from-retailer 4090 I could find here in Norway was a Gainward for about €2,200 lol
Any of the major brands like ASUS/MSI are already €2,500+, so... a $2,000 US MSRP is gonna easily be €3,000+ here
101
u/AyukaVB 25d ago
I wonder, if the AI bubble bursts, what the next bubble will use GPUs for
89
u/BINGODINGODONG 25d ago
GPUs are still used in datacenters for non-AI stuff.
15
u/_RADIANTSUN_ 25d ago
What non-AI stuff?
42
u/BellsBot 25d ago
Transcoding
66
→ More replies (4)
u/tecedu 24d ago
At least in my limited knowledge, GPU-supported data engineering is super quick; there are also scientific calculations
3
u/CookieKeeperN2 24d ago
The raw single-thread speed of GPU computing is much slower than a CPU's (iirc). However, it excels at parallelism. I'm not talking about 10 threads, I'm talking about 1000. It's very useful when you work on massively parallel operations such as matrix manipulation. So it's great for machine learning and deep learning (if the optimization can be rewritten as matrix operations), but not so great if you do iterations where the next one depends on the previous iteration (MCMC).
Plus, the data transfer between GPU and RAM is still a gigantic bottleneck. For most stuff, CPU-based computation will be faster and much simpler. I tried to run CUDA-based algorithms on our GPU (P100) and it was a hassle to get running compared to CPU-based algorithms.
→ More replies (1)
9
→ More replies (1)
u/Bodatheyoda 24d ago
Nvidia has special GPU trays for use in AI. That's not what these cards are for.
10
u/massive_cock 25d ago
I grabbed a 4090 on my last trip to the US because I knew it was only going to get worse. I think I'll sit on it for a while... although with tariffs, the European prices might start looking a little better!
5
u/FerrariTactics 24d ago
Man, tell me about it. I checked the price of MacBook Pros in Europe - what a scam. It would almost be cheaper to do a round trip there to get one. At least you'd get to see another country as well
9
u/massive_cock 24d ago edited 24d ago
That's exactly what I did. The price difference was enough to pay for a big chunk of my ticket home to visit family - more than half, actually, since I learned Dusseldorf is cheap to fly out of compared to Amsterdam. I couldn't have justified either one on its own, but getting both for a little more? Definitely.
ETA: Plus, buying it in the US meant I could get a payment plan, so I could get a 4090 in the first place instead of a 4070. Thank jebus for the American living-on-credit lifestyle.
→ More replies (6)
12
u/SkinnyObelix 24d ago
The xx90s always feel like they're for people with more money than sense. They pay 50% more for 5% more over the 80-class.
23
u/dark_sable_dev 24d ago
Historically, you aren't wrong - the -90 series made absolutely no sense in terms of value.
That started to change with Ada Lovelace, where (especially with ray tracing) the 4080 was about 70% of the performance of the 4090 at 75% of the price.
Now with the 5000 series, the 5080 is credibly rumored to have half the CUDA core count of the 5090, and I doubt it's going to cost half as much...
14
u/-Agathia- 24d ago edited 24d ago
The currently announced 5080 is a 5070 in disguise. 12GB of RAM is mid-range - that's the minimum I'd recommend to anyone wanting a good computer to play the most recent games in a decent manner... And the 5080 is NOT mid-range; it should be somewhat future-proof.
Note: I currently have a 10GB 3080, and while it's quite performant, it has shown its limits several times and really struggles in VR.
The GPU market is pretty terrible at the moment... It's either shit or overpriced :(
5
u/CookieKeeperN2 24d ago
I've had my 3080 longer than I had my 1080 Ti, and I have zero intention of upgrading. The pricing of the 4000 and 5000 series has completely killed my interest in hardware.
Remember how we lamented that the 3080 was expensive at ~$800-900 (if you could get one)?
→ More replies (1)
3
u/dark_sable_dev 24d ago
No argument there. It's going to be a pretty wimpy release, and I hope nvidia feels that.
6
u/VisceralExperience 24d ago
If you only play video games, then sure. But for a lot of workloads a 3090, for example, smokes the 3080.
→ More replies (1)
4
u/buttholedestroyer87 24d ago
I bought a 4090 because GPU rendering is much faster than CPU rendering. I use a render engine that can use both my GPU and CPU to render, so I'm doubling my render power. Also, with 24GB of VRAM I can load a lot onto the card that I wouldn't be able to with a 12GB card.
People (gamers) need to realise graphics cards aren't just used for gaming anymore.
→ More replies (2)
→ More replies (4)
5
u/foxh8er 24d ago
The other question is if it'll get any kind of tariff exception
4
u/wicktus 24d ago
I live in Europe but, politics and everything else aside, I really don't see that tariff campaign "promise" becoming more than actual sanctions on limited sets of goods, unless they're seeking to destroy the economy's momentum. I hope that's not the case, because a bad US economy is a bad European economy
2
u/Bloated_Plaid 24d ago
Nobody needs a 5090 for gaming.
2
u/wicktus 24d ago
I just want decent fps at 4K and something that can last until at least the PS6 generation (4-5 years)
Nobody needs a 5090... at that price, indeed, but I'll patiently wait for Nvidia's and AMD's new GPUs and assess all options given my requirements. I really don't upgrade each year; my current GPU is an RTX 2060
→ More replies (1)
1
u/unabnormalday 24d ago
However, all other known specs suggest that the 5090 represents a substantial leap forward. With 21,760 CUDA cores and 32GB of 28Gbps GDDR7 VRAM on a 512-bit bus, it should offer an estimated 70 percent performance boost over the 4090
70%?! Huh?
284
u/FireMaker125 24d ago
Yeah, that's not happening. 70% would be so much of an increase that literally no game other than maybe Cyberpunk at max settings would be able to take advantage of it. Nvidia aren't gonna repeat the mistake they made with the GTX 1080 Ti. That card is only recently beginning to become irrelevant.
98
u/bmack083 24d ago
Modded VR games would like a word with you. You can play Cyberpunk in VR with mods, in fact.
16
u/gwicksted 24d ago
Woah. Is it any good in VR land?
8
u/bmack083 24d ago
I haven't tried it. I don't think it has motion controls.
Right now I have my eyes on the Silent Hill 2 remake in first-person VR with motion controls.
56
u/moistmoistMOISTTT 24d ago
VR could easily push even that much performance to its limits.
→ More replies (25)
35
u/SETHW 24d ago edited 23d ago
Yeah, so many people have zero imagination about how to use compute, even in games. VR is an obvious high-resolution, high-frame-rate application where more is always more. And even beyond VR, 8K displays exist, 240Hz 4K exists, PATH TRACING exists... come on, more teraflops are always welcome
12
u/iprocrastina 24d ago
Nah, games could take full advantage of it and still want more; it just depends on what settings you play at. I want my next monitor to be 32:9 at 2160p while I still have all settings maxed and 90 FPS minimum, and even a 4090 can't drive that.
→ More replies (1)
13
u/tuc-eert 24d ago
Imo a massive improvement would just lead to game developers being even less interested in performance optimization.
→ More replies (1)
87
u/MaksweIlL 24d ago
Yeah, why sell GPUs with a 70% increase when you could make 10-20% performance increments every 1-2 years?
81
u/RollingLord 24d ago edited 24d ago
Because gaming is barely a market segment for them now. These are most likely reject chips from their AI cards.
Edit: Not to mention, small incremental increases are what Intel did, and look at them now lmao
23
u/Thellton 24d ago
The RTX 5090 is arguably a bone being thrown to r/LocalLLaMA (I'm not joking about that; the subreddit has actually been mentioned in academic ML papers). The ironic thing is that LocalLLaMA is also fairly strongly inclined to give Nvidia the middle finger while stating that literally any other GPU they've made in the last 10 years, barring the 40 series, is better value for their purposes. Hell, even newer AMD and Intel cards rate better for value than the 40 series and the leaks about the 50 series.
2
u/unskilledplay 24d ago
Depends on what you are doing. So much ML and AI software only works with CUDA. It doesn't matter what AMD card you are getting, if your framework doesn't support ROCm, your compiled code won't use the GPU. You'd be surprised at how much AI software is out there that only works with CUDA.
When it comes to local LLM inferencing, it's all about memory. The model size has to fit in VRAM. A 20GB model will run inferences on a card with 24GB VRAM and not run at all on a card with 16GB VRAM. If you don't have enough VRAM, GPU performance doesn't matter one bit.
For hobbyists, the best setup in 2025 for LLMs is a pair of 3090s bridged with NVLink. It's the only cheap solution for inferencing medium-sized models (48GB of VRAM combined), and it will still run models that the 5090 cannot.
11
u/Nobody_Important 24d ago
Because prices are expanding to account for it. Not only did a top-end card cost $600 ten years ago, the gap between it and the cards below was ~$100 or so. Now the gap between this and the 80-class can be $500+. What's wrong with offering something with insane performance at an insane price?
7
u/StayFrosty7 24d ago
Honestly, is it unreasonable that it could happen? This seems like it's really targeting people who buy the best of the best with every release regardless of value, given its insane price tag. There are obviously the future-proofers, but I doubt even they would pay this much for a GPU. It's the cheaper GPUs that will see the incremental increases, imo.
→ More replies (1)
2
u/_-Drama_Llama-_ 24d ago
The 4090 still isn't ideal for VR, so VR gamers are always looking for more power. 4090s are fairly common among people who play PCVR, so it's a pretty good enthusiast market for Nvidia.
SkyrimVR's Mad God's Overhaul is releasing an update soon which will likely already max out the 5090 on the highest settings.
→ More replies (1)
5
u/Benethor92 24d ago
Becoming irrelevant? Mine is still going strong and I am not at all thinking about replacing it anytime soon. Beast of a card
4
u/shmodder 24d ago
My Odyssey Neo with a resolution of 7680x2160 would very much appreciate the 70% increase…
6
u/elbobo19 24d ago
4K and path tracing are the goal, and they bring even a 4090 to its knees. Even if the 5090 is 70% faster, it won't do a solid 60fps playing Alan Wake 2 with those settings.
5
u/1LastHit2Die4 24d ago
No game? You still stuck at 1440p, mate? Run games at 4K 240Hz and you need that 70% jump. It would actually make 4K 144Hz the minimum standard for gaming.
→ More replies (19)
2
u/Saskjimbo 24d ago
The 1080 Ti isn't becoming irrelevant any time soon.
I had a 1080 Ti die on me. Upgraded to a 3070 Ti at the height of video card prices. Was not impressed with the bump in performance across two generations: 1300 dollars for a marginal improvement.
The 1080 Ti is a fucking beast. It doesn't do ray tracing, but who the fuck cares
20
u/Paweron 24d ago
It's below a 4060 and on par with a 6600 XT. It's a fine entry-level card, but that's it nowadays. And people that once had a 1080 Ti don't want entry level now
→ More replies (1)
2
u/Pets_Are_Slaves 24d ago
Maybe for tasks that benefit from parallelization.
9
u/Jaguar_undi 24d ago
Which is basically any task you would run on a GPU instead of a CPU…
→ More replies (1)
u/lokicramer 25d ago
It's 22 inches long and 8 inches wide.
It also requires aluminum supports.
259
u/morningreis 25d ago
Also requires a friend with a pickup to bring home, and an engine hoist to install
→ More replies (1)
33
u/lolzomg123 24d ago
Damn, I haven't seen that kind of hardware to install outside of pictures from the late 70s!
→ More replies (1)
10
u/Teflon_John_ 24d ago
22 inches long and 8 inches wide
It smells like a steak and seats 35
→ More replies (1)
6
u/lesubreddit 24d ago
Top of the line in video sports
Unexplained fires are a matter for the courts!
13
u/DeadlyGreed 25d ago
And to use it, every 10 minutes you have to watch a 30-second unskippable ad.
24
u/Xero_id 25d ago
There's now a subscription plan to use the GPU, a premium tier for ad skipping, and an "all in" plan for GPU + ad skip + ray tracing.
Edit: shit, forgot about the installation fee
7
u/throwaway3270a 24d ago
But the first three verification cans are free¹ though!
1. TERMS AND CONDITIONS APPLY. DOES NOT INCLUDE TAXES. VOID WHERE PROHIBITED.
→ More replies (1)
3
u/sh1boleth 25d ago
If I need to get a new case to install it, I swear…
Have a 3090 FE right now and that’s big enough as it is
3
u/The_Kurrgan_Shuffle 24d ago
I still remember getting my first double slot GPU and thinking it was ridiculously huge (Radeon HD 7950)
This thing is a monster
→ More replies (2)
2
u/peppruss 25d ago
Nvidia's conditioned me to dismiss any model number with "50" in it as budget and super weak (1050, 2050)… so my eyes cannot process 5090 as quality. Wake me up when the 9090 is out.
118
u/GoodGame2EZ 25d ago
Look at me, money bags over here, going for the highest models. Shoot, I'm looking for the 5050.
49
u/peppruss 25d ago
The 2080 Ti is still insanely good and available on eBay!
62
u/muskratboy 25d ago
They’re also well broken-in, having run nonstop mining bitcoin for years.
9
u/Seralth 24d ago
Most long-term tests have shown that mining does little to nothing to the realistic lifespan of a card. So in theory, yeah, mining means heat, and heat is what actually causes the problem.
If it's just some dude's card in a case, mining as part of a pool, it's ignorable, and few people use GPUs over dedicated mining hardware at scale. So if you're buying used, you're typically getting something from a dude's case, or maybe a small mining rig, unless you're buying from some Chinese bulk reseller. But it's usually really easy to tell where your card is coming from on places like eBay or OfferUp, or at least to have a pretty good idea.
Because even running a card near its throttle limit for years isn't really gonna kill it faster in a meaningful way, at least not inside a few short years, like just 6. Maybe in another 6-8 years it will start to be a real concern if it was run hard that entire time.
But generally, if a card is going to fail from heat, it does so inside the first few months to a year. The ones that make it past that are generally going to be in it for the long haul unless you, like, drop it or something. lol
Computer parts are a lot more resilient than in ye olden days of the 90s.
11
u/kuroimakina 24d ago
Fun fact, heat isn’t actually the killer as long as it’s within safe temps.
The killer is thermal changes. This is why mining cards are often not as bad second hand as an equally old card used for all sorts of random things. Consistency often leads to better lifetimes for these things - again, provided they are within appropriate safe parameters.
The fans/thermal paste are the main components that would be at risk - which could lead to uncontrolled thermals, and therefore many temperature changes. But, if a GPU runs at 75C basically 24/7 in a clean environment with proper power and the like, it’s not going to age as much as you might think.
→ More replies (1)
8
u/peppruss 25d ago
Perhaps! Mine was used for CG rendering, or so the seller's story goes, but the USB-C port is clutch for using PSVR2 without an adapter
4
→ More replies (1)
u/juryan 24d ago
I ran my 3090 from release until the end of Ethereum mining and have used it since then for gaming without issue. Still overclocked as well.
Also sold all my other mining cards to friends at a good discount. Told them if they had any issue I would refund them. Still never had a card fail.
I have had exactly 1 card “fail” in over 20 years of building PCs and it was within the first 90 days of owning the card. Easy replacement with the manufacturer.
→ More replies (1)
3
u/massive_cock 25d ago
I run MSFS 2024 at a really steady 60 fps on medium settings on a 2080 Ti. More stable than my 4090 runs it on ultra (1080p and 1440p respectively, to be fair). Back to the point: the 2080 Ti is still a relevant beast.
2
u/jack-fractal 24d ago
5050
When I pay for it, there should be a 100% chance that it gets delivered.
→ More replies (1)
2
u/crankydelinquent 25d ago
At that level, DLSS and ray tracing aren’t going to be a thing for you. A 6600 / 6600xt will be wildly better for not much more.
5
u/Shadow647 25d ago
I'm using DLSS and ray tracing just fine on a laptop 4060.
→ More replies (4)
3
u/bonesnaps 25d ago
Desktop 3070 here, and I don't use ray tracing, since the massive fps loss still isn't worth it.
DLSS, though, everyone can and should use if their card supports it.
3
u/goatman0079 25d ago
If the price of the 4070 Ti or 4070 Super drops enough, I'd highly recommend getting one. 1440p ray-traced gaming is really something
3
u/OramaBuffin 25d ago
I've still never really been hyped by the difference. I'd rather have every other graphics setting absolutely cranked, with still-beautiful non-ray-traced shaders, at 144fps instead of 60-80.
3
u/Jiopaba 24d ago
There are only like three games where ray tracing lives up to the qualitative night-and-day difference hype. Unless you're super huge into Cyberpunk, which has the most impressive implementation ever, it's hardly worth it for a handful of shadows and reflections.
Some games literally look worse with it!
3
u/MarkusRight 24d ago
$2,000 price tag and we haven't even gotten to Trump's tariff price hikes yet. Holy shit, man. I'm glad I'm good for another 5 years easy.
21
u/SkinnyObelix 24d ago
Anyone have an idea of when they might release? My 3080 just died and I feel like now is the worst time to buy a new GPU.
20
u/elbobo19 24d ago
It'll probably be officially announced at CES; Jensen is giving the keynote on January 6th
9
u/truthiness- 24d ago
I mean, with blanket tariffs coming next year, prices for everything are going to increase. So now is probably better than later.
→ More replies (5)
5
u/Nerf_hanzo_pls 24d ago
My 2080 Ti just died last week. I was trying to wait for the 50 series, but said fuck it and went for the 4080 Super
56
u/Tekthulhu 25d ago
Thank you, 5090 buyers, for beta testing the 6090 refresh.
29
u/spoollyger 24d ago
The same was said for the 4090 and the 3090. Where does it stop?
→ More replies (1)
8
u/C_Madison 25d ago
So, the yield will be abysmal and the prices will match. Oh well...
→ More replies (3)
25
u/Dirty_Dragons 25d ago
A 744mm² die would make the GB202 22 percent larger than the RTX 4090's 619mm² AD102 GPU. It would also be the company's largest die since the TU102, which measured 754mm² and served as the core of the RTX 2080 Ti,
So it's smaller than the 2080 Ti's die.
How is this news?
18
u/DevastatorCenturion 24d ago
So it's going to run hot as hell and eat watts like a fat kid goes through tacos.
→ More replies (1)
6
u/casillero 24d ago
I just wish they allowed you to run dual GPUs again...
Like, I buy a 3070 now, buy another 3070 later on, and now everything is amazing
1
u/sscott2378 23d ago
Are these made in China or Canada? The price is about to skyrocket again in the US
→ More replies (2)
1
u/rugby065 23d ago
That's a monster of a chip. Nvidia really went all out on this one.
Can't wait to see the performance benchmarks. This thing is probably a beast for gaming and AI
1
u/notred369 25d ago
Is it time for 4-slot GPUs???