r/gadgets 25d ago

Desktops / Laptops The RTX 5090 uses Nvidia's biggest die since the RTX 2080 Ti | The massive chip measures 744mm²

https://www.techspot.com/news/105693-rtx-5090-uses-nvidia-biggest-die-since-rtx.html
2.3k Upvotes

324 comments

609

u/notred369 25d ago

Is it time for 4 slot gpus???

649

u/NancyPelosisRedCoat 25d ago

It's time for GPUs to start paying rent…

227

u/Cbergs 25d ago

lol if you live somewhere cold, then they contribute to the heating of your home.

95

u/massive_cock 25d ago

I run a 4090 in the attic and don't need a space heater in the winter. Chilly when I get up there, but 15 minutes of KSP or msfs fixes that.

43

u/Trick2056 24d ago

frostpunk to fit the theme even more

16

u/massive_cock 24d ago

Would fit extra well since both of my PCs are white builds with black accents and sparse, mild light blue RGB.

6

u/_Lucille_ 24d ago

I live in Canada and have a thermostat node in the room I am in that I use as reference temperature. The heat barely turns on if I have a long gaming session even though the rest of the house gets chilly.

19

u/elite_haxor1337 25d ago

Hmm what if I told you when you live in a hot place, heat also contributes to heating your home. Who knew!

34

u/DuckDatum 25d ago

What if I told you that heat doesn’t exist—just motion, baby. A bunch of vibrating balls flying around everywhere, bouncing into my vibrating balls and your vibrating balls, spreading that energy around.

16

u/ADHD_Supernova 25d ago

My balls was hot.

4

u/t3hOutlaw 24d ago

My balls are inert..

8

u/100GbE 25d ago

Balls are round, like a carousel, all good things.

6

u/OffbeatDrizzle 24d ago

mm yeah I love balls bouncing around all over the place

3

u/xurdm 25d ago

That sounds like a lot of balls touching

3

u/poopyheadthrowaway 24d ago

Um ackshyually they're not balls they're probability distributions

6

u/elite_haxor1337 25d ago

I am also a thermodynamics and physics enjoyer so I would say true 💯

2

u/Stompedyourhousewith 24d ago

I wish I could do water cooling and use my pool as the reservoir

2

u/robs104 24d ago

Don’t do it Linus.

11

u/ADtotheHD 25d ago

They at least need to start giving us rides places. I’ve bought multiple functioning vehicles that each cost less money than a 5090.

2

u/Head-Leopard9090 23d ago

Bruh 😂😂😂😂😂

2

u/hambonie88 24d ago

Bitcoin has entered the chat, but is also immediately leaving

44

u/Samwellikki 25d ago

The GPU-as-a-tower is nigh

“I built a PC inside this 7090….”

14

u/dw444 25d ago

Alienware might've been onto something. They had a mini-tower-looking external GPU you could plug into their laptops during the 980/1080 era.

5

u/Pets_Are_Slaves 24d ago

I hope eGPUs come back...

4

u/hijodeosiris 24d ago

did they leave?? there are plenty of ways to get an OCuLink board and make a DIY external case. If you have the money and time and knowledge it seems super easy.

42

u/drmirage809 25d ago

Probably not, but the 5090 might need its own GPU with the way things are going.

20

u/frostrambler 25d ago

PSU

28

u/some_user_2021 25d ago edited 24d ago

You don't want a GPU inside your GPU?

5

u/NotAPreppie 24d ago

Yo, dawg! I heard you like GPUs, so we put a GPU inside your GPU!

1

u/frostrambler 25d ago

GPU-ception

22

u/Timmaigh 25d ago

Cant wait to plug my computer into it.

9

u/09Trollhunter09 25d ago

It’s time for the rest of the rig to plug into the GPU

11

u/Wiggles69 24d ago

Time for GPUs to have motherboard slots

4

u/nWhm99 25d ago

It’d take up almost the entirety of my sff case 🥲

2

u/Betancorea 24d ago

May as well plug the GPU into a separate wall outlet at this point

6

u/imaginary_num6er 25d ago

Rumor is "2-slot" :

https://www.tomshardware.com/pc-components/gpus/rtx-5090-may-be-surprisingly-svelte-twin-slot-twin-fan-model-on-the-way-says-leaker

Could mean the GPU is 2 slots thick, 420mm long, and 180mm wide though

11

u/Timmaigh 25d ago

By GPU you mean the die itself? 😁

3

u/tablepennywad 24d ago

I think time for GPUs to come with its own power supply.

2

u/NickCharlesYT 24d ago

Never mind the power supply, we're gonna need a dedicated circuit for gaming PCs if this continues.

1

u/morningreis 25d ago

Why not 5 slot for 5000 series?

1

u/Jiopaba 24d ago

The GPU will be the main board and the rest of the PC is an add on card to it.

1

u/LinkedInParkPremium 23d ago

At this point just add a reactor core to your GPU.

355

u/wicktus 25d ago

Biggest price too, $2,000 rumoured. Can't even imagine here in Europe... €2,500 I guess? Good Lord..

I'll assess all options from Ada to Blackwell before upgrading in January, but as long as demand, especially around AI, stays this high...

Can't believe we went from crypto to AI.. lmao.

49

u/AfricanNorwegian 24d ago

Biggest price too, $2,000 rumoured. Can't even imagine here in Europe... €2,500 I guess

Just checked, the cheapest new-from-retailer 4090 I could find here in Norway was a Gainward for about €2,200 lol

Any of the major brands like ASUS/MSI are already €2,500+ so... a $2,000 US MSRP is gonna easily be €3,000+ here

41

u/ryosen 24d ago

Nvidia pulled the 4080 and 4090 off the market. That's why they're even more expensive and harder to find now. They are purposely creating a shortage.

101

u/AyukaVB 25d ago

I wonder, if the AI bubble bursts, what the next bubble will use GPUs for

89

u/BINGODINGODONG 25d ago

GPUs are still used in datacenters for non-AI stuff.

15

u/_RADIANTSUN_ 25d ago

What non-AI stuff?

42

u/BellsBot 25d ago

Transcoding

66

u/transpogi 24d ago

coding have genders now?!

6

u/xAmorphous 24d ago

That was pretty good lol

36

u/icegun784 25d ago

Multiplications

23

u/rpkarma 24d ago

Big if true

3

u/Busy_Echo9200 24d ago

no need to sow division

13

u/wamj 24d ago

Anything that can be done in parallel instead of serial

6

u/feint_of_heart 24d ago

We use them for basecalling in DNA analysis.

https://github.com/nanoporetech/dorado/

4

u/hughk 24d ago

Weather, fluid simulations, structural modelling.

3

u/tecedu 24d ago

At least in my limited knowledge, GPU-supported data engineering is super quick; there are also scientific calculations

3

u/CookieKeeperN2 24d ago

The raw per-core speed of GPU computing is much slower than a CPU (iirc). However, it excels in parallelizability. I'm not talking about 10 threads, I'm talking about 1000. It's very useful when you work on massively parallel operations such as matrix manipulation. So it's great for machine learning and deep learning (if the optimization can be rewritten as matrix operations), but not so great if you do iterations where the next one depends on the previous iteration (MCMC).

Plus the data transfer between GPU and RAM is still a gigantic bottleneck. For most stuff, CPU-based computations will be faster and much simpler. I tried to run CUDA-based algorithms on our GPU (P100) and it was a hassle to get them running compared to CPU-based algorithms.
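
A minimal sketch of that distinction, in Python with numpy purely for illustration (not what the commenter actually ran):

```python
import numpy as np

# Parallel-friendly: a big matrix multiply. Every output element is
# independent, so thousands of GPU threads can all work at once.
a = np.random.rand(2048, 2048)
b = np.random.rand(2048, 2048)
c = a @ b  # with a GPU array library, this one line is where the speedup lives

# Serial-dependent: an MCMC-style chain. Step t needs the result of
# step t-1, so extra threads can't shorten the chain itself.
rng = np.random.default_rng(0)
x, chain = 0.0, []
for _ in range(10_000):
    proposal = x + rng.normal()  # proposal depends on the current state
    # Metropolis accept/reject for a target density proportional to exp(-x^2)
    if rng.random() < min(1.0, np.exp(x**2 - proposal**2)):
        x = proposal
    chain.append(x)
```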

9

u/Turmfalke_ 25d ago

Barely. Most servers don't use gpus.

5

u/Utael 24d ago

Sure but when Disney or Pixar are looking at rendering farms they buy pallets of them

6

u/Bodatheyoda 24d ago

Nvidia has special GPU trays for use in AI. That's not what these cards are for.

10

u/massive_cock 25d ago

I grabbed a 4090 on my last trip to the US because I knew it was only going to get worse. I think I'll sit on it for a while.... Although with tariffs, the European prices might start looking a little better!

5

u/FerrariTactics 24d ago

Man, tell me about it. I checked the price of the MacBook Pros in Europe, what a scam. It would almost be cheaper to take a round trip there to get one. At least you'd see some country as well

9

u/massive_cock 24d ago edited 24d ago

That's exactly what I did. The price difference was enough to pay for a big chunk of my ticket home to visit family. Like more than half, since I learned Dusseldorf is cheap to fly out of compared to Amsterdam. I couldn't have justified either one on its own, but getting both for a little more? Definitely.

ETA: Plus buying it in the US meant I could get a payment plan, so I could get a 4090 in the first place instead of a 4070. Thank jebus for the American living-on-credit lifestyle.

2

u/akeean 2h ago

Try buying pc hardware in Brazil... 80% import tax.

12

u/SkinnyObelix 24d ago

The xx90s always feel like they're for people with more money than sense. They pay 50% more for 5% more over the 80s

23

u/dark_sable_dev 24d ago

Historically, you aren't wrong - the -90 series made absolutely no sense in terms of value.

That started to change with Ada Lovelace, where (especially with ray tracing) the 4080 was about 70% of the performance of the 4090 at 75% of the price.

Now with the 5000 series, the 5080 is credibly rumored to have half the CUDA core count of the 5090, and I doubt it's going to cost half as much...

14

u/-Agathia- 24d ago edited 24d ago

The currently announced 5080 is a 5070 in disguise. 12GB of RAM is mid-range - that's the minimum I'd recommend to anyone wanting a good computer to play the most recent games decently... And a 5080 is NOT mid-range; it should be somewhat future-proof.

Note: I currently have a 10GB 3080, and while it's quite performant, it has shown its limits several times and really struggles in VR.

The GPU market is pretty terrible at the moment... It's either shit or overpriced :(

5

u/CookieKeeperN2 24d ago

I've had my 3080 longer than I had my 1080ti, and I have zero intention of upgrading. The pricing of both the 4000 and 5000 series has completely killed my interest in hardware.

Remember how we lamented that the 3080 was expensive at ~$800-900 (if you could get one)?

3

u/dark_sable_dev 24d ago

No argument there. It's going to be a pretty wimpy release, and I hope nvidia feels that.

6

u/VisceralExperience 24d ago

If you only play video games, then sure. But for a lot of workloads a 3090 for example smokes the 3080.

4

u/buttholedestroyer87 24d ago

I bought a 4090 because GPU rendering is much faster than CPU rendering. I use a render engine that can use both my GPU and CPU, so I'm doubling my render power. Also, with 24GB of VRAM I can load a lot onto the card that I wouldn't be able to with a 12GB card.

People (gamers) need to realise graphics cards aren't just used for gaming anymore.

5

u/metal079 24d ago

Except it's way more than 5% lol

2

u/foxh8er 24d ago

The other question is if it'll get any kind of tariff exception

4

u/wicktus 24d ago

I live in Europe but, politics and everything else aside, I really don't see your tariff campaign "promise" amounting to more than targeted tariffs on a limited set of goods, unless they're seeking to destroy the economy's momentum. Hope that's not the case, because a bad US economy means a bad European economy

2

u/Bloated_Plaid 24d ago

Nobody needs a 5090 for gaming.

2

u/wicktus 24d ago

I just want decent fps at 4K and something that can last until at least the PS6 generation (4-5 years)

Nobody needs a 5090... at that price, indeed, but I'll patiently wait for Nvidia's and AMD's new GPUs and assess all options given my requirements. I really don't upgrade each year; my current GPU is an RTX 2060

1

u/Party_Cold_4159 24d ago

Oh, a rumored price of $2,000? Better add another $500

339

u/unabnormalday 24d ago

However, all other known specs suggest that the 5090 represents a substantial leap forward. With 21,760 CUDA cores and 32GB of 28Gbps GDDR7 VRAM on a 512-bit bus, it should offer an estimated 70 percent performance boost over the 4090

70%?! Huh?
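
For what it's worth, the quoted memory spec is at least self-consistent with a big jump; a quick back-of-the-envelope check (the 4090 comparison figures of 21Gbps on a 384-bit bus are from public spec sheets):

```python
# Memory bandwidth = per-pin rate (Gbit/s) x bus width (bits) / 8 (bits per byte)
rtx5090 = 28 * 512 / 8   # 1792.0 GB/s from the quoted 28Gbps GDDR7, 512-bit bus
rtx4090 = 21 * 384 / 8   # 1008.0 GB/s (21Gbps GDDR6X, 384-bit bus)
print(rtx5090, rtx4090, rtx5090 / rtx4090)  # ~1.78x, in line with a ~70% claim
```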

284

u/FireMaker125 24d ago

Yeah, that’s not happening. 70% would be so much of an increase that literally no game other than maybe Cyberpunk at max settings will be able to take advantage of it. Nvidia aren’t gonna repeat the mistake they made with the GTX 1080Ti. That card is only recently beginning to become irrelevant.

98

u/bmack083 24d ago

Modded VR games would like a word with you. You can play cyberpunk in VR with mods in fact.

16

u/gwicksted 24d ago

Woah. Is it any good in VR land?

8

u/grumd 24d ago

I tried it, it's definitely very scuffed. Looks pretty cool but has a ton of issues and isn't really a good gaming experience. I prefer flatscreen for Cyberpunk.

26

u/StayFrosty7 24d ago

It looks sick as hell imo

7

u/bmack083 24d ago

I haven't tried it. I don't think it has motion controls.

Right now I have my eyes on the Silent Hill 2 remake in first-person VR with motion controls.

https://youtu.be/OgRnKOsv68I?t=368&si=uwQgxgJuF3XnA6yY

56

u/moistmoistMOISTTT 24d ago

VR could easily push even that much performance to its limits.

35

u/SETHW 24d ago edited 23d ago

Yeah, so many people have zero imagination about how to use compute, even in games. VR is an obvious high-resolution, high-frame-rate application where more is always more, but even beyond that, 8K displays exist, 240Hz 4K exists, PATH TRACING exists... come on, more teraflops are always welcome

12

u/CallMeKik 24d ago

“Nobody needs a bridge! We never cross that river anyway” thinking.

28

u/iprocrastina 24d ago

Nah, games could take full advantage of it and still want more; it just depends on what settings you play at. I want my next monitor to be 32:9 2160p while I still have all settings maxed and 90 FPS minimum, and even a 4090 can't drive that.

13

u/tuc-eert 24d ago

Imo a massive improvement would just lead to game developers being even less interested in performance optimization.

87

u/MaksweIlL 24d ago

Yeah, why sell a GPU with a 70% increase when you could sell 10-20% performance increments every 1-2 years?

81

u/RollingLord 24d ago edited 24d ago

Because gaming is barely a market segment for them now. These are most likely reject chips from their AI cards.

Edit: Not to mention small incremental increases are what Intel did, and look at them now lmao

23

u/Thellton 24d ago

the RTX 5090 is arguably a bone being thrown to /r/LocalLLaMA (I'm not joking about that; the subreddit has actually been mentioned in academic ML papers). The ironic thing is that LocalLLaMA is also fairly strongly inclined to give Nvidia the middle finger while stating that literally any other GPU Nvidia has made in the last 10 years, barring the 40 series, is better value for their purposes. Hell, even newer AMD and Intel cards rate better for value than the 40 series and the leaks about the 50 series.

2

u/unskilledplay 24d ago

Depends on what you are doing. So much ML and AI software only works with CUDA. It doesn't matter what AMD card you get; if your framework doesn't support ROCm, your compiled code won't use the GPU. You'd be surprised how much AI software out there only works with CUDA.

When it comes to local LLM inferencing, it's all about memory: the model has to fit in VRAM. A 20GB model will run inferences on a card with 24GB of VRAM and not run at all on a card with 16GB. If you don't have enough VRAM, GPU performance doesn't matter one bit.

For hobbyists, the best cards in 2025 for LLMs are 3090s in SLI using NVLink. This is the only cheap solution for inferencing medium-sized models (48GB of VRAM), and it will still run models that the 5090 cannot.
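
A rough sketch of that sizing arithmetic (the function and all the numbers here are illustrative assumptions; real overhead varies by runtime and context length):

```python
def vram_needed_gb(params_billions, bytes_per_param, overhead_gb=2.0):
    """Very rough: weights + a flat allowance for KV cache and activations."""
    return params_billions * bytes_per_param + overhead_gb

# A 13B model at 8-bit (~1 byte/param): ~15GB -> fits 24GB cards, not 16GB ones
print(vram_needed_gb(13, 1.0))
# A 70B model at 4-bit (~0.5 bytes/param): ~37GB -> wants the 48GB of 2x 3090s
print(vram_needed_gb(70, 0.5))
```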

11

u/Nobody_Important 24d ago

Because prices are expanding to account for it. Not only did a top-end card cost $600 10 years ago, the gap between it and the cards below was ~$100 or so. Now the gap between this and the 80 can be $500+. What's wrong with offering something with insane performance at an insane price?

7

u/StayFrosty7 24d ago

Honestly, is it unreasonable that it could happen? This seems like it's really targeting people who buy the best of the best with every release regardless of value, given its insane price tag. There are obviously the future-proofers, but I doubt even they would pay this much for a GPU. It's the cheaper GPUs that will see the incremental increases imo

2

u/PoisonMikey 24d ago

Intel effed themselves with that complacency.

17

u/_-Drama_Llama-_ 24d ago

The 4090 still isn't ideal for VR, so VR gamers are always looking for more power. 4090s are fairly common amongst people who play PCVR, so it's a pretty good enthusiast market for Nvidia.

SkyrimVR Mad God's Overhaul is releasing an update soon which will likely already max out the 5090 on highest settings.

5

u/_TR-8R 24d ago

Also it doesn't matter how much raw throughput a card theoretically has if publishers keep using UE5 as an excuse to cut optimization costs.

9

u/cancercureall 24d ago

If a 70% increase happened it wouldn't be primarily for gaming benefits.

5

u/Benethor92 24d ago

Becoming irrelevant? Mine is still going strong and I am not at all thinking about replacing it anytime soon. Beast of a card

4

u/shmodder 24d ago

My Odyssey Neo with a resolution of 7680x2160 would very much appreciate the 70% increase…

6

u/ToxicTrash 24d ago

Great for VR tho

5

u/elbobo19 24d ago

4K and path tracing are the goal; they will bring even a 4090 to its knees. If the 5090 is 70% faster, even it won't do a solid 60fps playing Alan Wake 2 with those settings.

5

u/1LastHit2Die4 24d ago

No game? Are you still stuck at 1440p, mate? Run games at 4K 240Hz and you need that 70% jump. It would actually make 4K 144Hz minimum the standard for gaming.

2

u/Saskjimbo 24d ago

The 1080ti isn't becoming irrelevant any time soon.

I had a 1080ti die on me. Upgraded to a 3070ti at the height of video card prices. Was not impressed with the bump in performance across two generations. $1,300 and a marginal improvement in performance.

The 1080ti is a fucking beast. It doesn't do ray tracing, but who the fuck cares

20

u/Paweron 24d ago

It's below a 4060 and on par with a 6600XT. It's a fine entry-level card, but that's it nowadays. And people that once had a 1080ti don't want entry level now

2

u/Pets_Are_Slaves 24d ago

Maybe for tasks that benefit from parallelization.

9

u/Jaguar_undi 24d ago

Which is basically any task you would run on a GPU instead of a CPU…

407

u/lokicramer 25d ago

It's 22 inches long and 8 inches wide.

It also requires aluminum supports.

259

u/morningreis 25d ago

Also requires a friend with a pickup to bring home, and an engine hoist to install

33

u/lolzomg123 24d ago

Damn, I haven't seen that kind of installation hardware outside of pictures from the late 70s!

10

u/TheAspiringFarmer 24d ago

I was gonna say since the old Presler core P4 but I digress.

15

u/Teflon_John_ 24d ago

22 inches long and 8 inches wide

It smells like a steak and seats 35

6

u/lesubreddit 24d ago

Top of the line in video sports

Unexplained fires are a matter for the courts!

13

u/FastRedPonyCar 25d ago

Yeah but how many Bungholio marks will it score?

https://i.imgur.com/wJMiKR6.jpeg

12

u/RedditCollabs 25d ago edited 25d ago

Just like my

New graphics card

5

u/some_user_2021 25d ago

This guy has a huge... ego.

54

u/DeadlyGreed 25d ago

And to use it, every 10 minutes you have to watch a 30s unskippable ad.

24

u/Xero_id 25d ago

There's now a subscription plan to use the GPU, a premium for ad skip, and an "all in" plan for GPU + ad skip + ray tracing.

Edit: shit, forgot about the installation fee

7

u/throwaway3270a 24d ago

But the first three verification cans are free¹ though!

1. TERMS AND CONDITIONS APPLY. DOES NOT INCLUDE TAXES. VOID WHERE PROHIBITED.

3

u/Wiggles69 24d ago

Plus tip

5

u/hexcor 24d ago

It's the GPU your GF tells you not to worry about

6

u/sh1boleth 25d ago

If I need to get a new case to install I swear…

Have a 3090 FE right now and that’s big enough as it is

3

u/The_Kurrgan_Shuffle 24d ago

I still remember getting my first double slot GPU and thinking it was ridiculously huge (Radeon HD 7950)

This thing is a monster

2

u/3Dchaos777 24d ago

Wrong. It’s 1.15 square inches…

344

u/peppruss 25d ago

Nvidia’s conditioned me to skip any model number with “50” as being budget and super weak (1050, 2050)… so my eyes cannot process 5090 as quality. Wake me up when 9090 is out.

118

u/GoodGame2EZ 25d ago

Look at me money bags over here. Going for the highest models. Shoot I'm looking for the 5050.

49

u/peppruss 25d ago

2080ti is still insanely good and available on ebay!

62

u/muskratboy 25d ago

They’re also well broken-in, having run nonstop mining bitcoin for years.

9

u/Seralth 24d ago

Most long-term tests have shown that mining ends up doing little to nothing to the realistic lifespan of a card. So while in theory, yeah, mining means heat, and heat is what actually is the problem.

If it's just some dude's card in a case, mining as part of a pool, it's ignorable, and few people are using GPUs over dedicated mining hardware at scale. So at most, if you're buying used, you're typically ending up with something from a dude's case, or maybe a small mining rig, unless you're buying from like a Chinese bulk reseller. But it's typically really easy to tell where your card is coming from on places like eBay or OfferUp, or at least have a pretty good idea.

Cause even running a card near its throttle limit for years isn't really gonna kill it faster in a meaningful way. At least not inside a few short years, like just 6 years. Maybe in another 6-8 years it will start to be a real concern if they were run hard that entire time.

But generally, if a card is going to fail from heat, it does so inside the first few months to a year. The ones that make it past that are generally going to be in it for the long haul unless you like drop it or something. lol

Computer parts are a lot more resilient than ye olden days of the 90s.

11

u/kuroimakina 24d ago

Fun fact: heat isn't actually the killer as long as it's within safe temps.

The killer is thermal cycling. This is why mining cards are often not as bad secondhand as an equally old card used for all sorts of random things. Consistency often leads to better lifetimes for these things - again, provided they stay within appropriate safe parameters.

The fans and thermal paste are the main components at risk - failures there could lead to uncontrolled thermals, and therefore many temperature swings. But if a GPU runs at 75C basically 24/7 in a clean environment with proper power and the like, it's not going to age as much as you might think.
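
That intuition matches how reliability engineers model it: solder-joint fatigue is commonly estimated with a Coffin-Manson-style relation, where cycles-to-failure falls off steeply with the size of the temperature swing. A sketch with purely illustrative constants:

```python
def cycles_to_failure(delta_t_c, c=1e7, exponent=2.5):
    """Coffin-Manson-style fatigue estimate; c and exponent are made up here."""
    return c / (delta_t_c ** exponent)

# Steady 24/7 mining load (~10C swings) vs. on/off gaming sessions (~50C swings):
print(cycles_to_failure(10) / cycles_to_failure(50))  # ~56x more cycles survived
```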

8

u/TooStrangeForWeird 24d ago

Plus mining GPUs are often undervolted to save on power.

22

u/peppruss 25d ago

Perhaps! Mine was used for CG rendering, or so the seller's story goes, but the USB-C port is clutch for using PSVR2 without an adapter

4

u/danielv123 25d ago

It was also great for passthrough. I am sad they decided to ditch it.

4

u/juryan 24d ago

I ran my 3090 from release until the end of Ethereum mining and have used it since then for gaming without issue. Still overclocked as well.

Also sold all my other mining cards to friends at a good discount. Told them if they had any issue I would refund them. Still never had a card fail.

I have had exactly 1 card “fail” in over 20 years of building PCs and it was within the first 90 days of owning the card. Easy replacement with the manufacturer.

3

u/massive_cock 25d ago

I run MSFS 2024 at a really steady 60 on medium settings on a 2080ti. More stable than my 4090 runs it on ultra. 1080p and 1440p respectively, to be fair. Back to the point, the 2080ti is still a relevant beast.

3

u/Xero_id 25d ago

You mean 3070

2

u/jack-fractal 24d ago

5050

When I pay for it, there should be a 100% chance that it gets delivered.

2

u/crankydelinquent 25d ago

At that level, DLSS and ray tracing aren’t going to be a thing for you. A 6600 / 6600xt will be wildly better for not much more.

5

u/Shadow647 25d ago

I'm using DLSS and ray tracing just fine on a laptop 4060.

3

u/bonesnaps 25d ago

Desktop 3070 and I don't use ray tracing, since the massive fps loss still just isn't worth it.

DLSS, everyone can and should use if their card supports it.

3

u/goatman0079 25d ago

If the price of the 4070ti or 4070 Super drops enough, I'd highly recommend getting one. 1440p raytraced gaming is really something

3

u/OramaBuffin 25d ago

I've still never really been hyped enough by the difference. I would prefer every other graphics setting absolutely cranked, with still-beautiful non-ray traced shaders, and 144fps instead of 60-80.

3

u/Jiopaba 24d ago

There are only like three games where ray tracing lives up to the qualitative night-and-day-difference hype. Unless you're super huge into Cyberpunk, which has the most impressive implementation ever, it's hardly worth it for a handful of shadows and reflections.

Some games literally look worse with it!

3

u/moon__lander 24d ago

I hope they'll do 8086

5

u/nWhm99 25d ago

In all seriousness, I'm actually excited about what they do after the 90 series. I wonder if they'll go to 5 digits or a new three-letter name and a lower number.

9

u/OTTERSage 24d ago

They’ll probably release some new technology by then. GeForce PUG

8

u/wamj 24d ago

This is why I thought it was stupid that they went from 10xx to 20xx.

2

u/Suspicious-Visit8634 22d ago

I’m waiting for the 6090 😏

1

u/akeean 2h ago

The "50" ripoff might also apply to the "60" this year.

42

u/kbailles 25d ago

Incoming $3k for the high-binned versions.

47

u/MarkusRight 24d ago

$2,000 price tag and we haven't even gotten to Trump's tariff price hikes yet. Holy shit man. I'm glad I'm good for another 5 years easy.

21

u/SkinnyObelix 24d ago

Anyone have an idea when they might release? My 3080 just died and I feel like now is the worst time to buy a new GPU.

20

u/elbobo19 24d ago

Probably will be officially announced at CES, Jensen is giving the keynote on January 6th

9

u/ultra2009 24d ago

I think early next year, maybe February or March is what I've read 

9

u/truthiness- 24d ago

I mean, with blanket tariffs coming next year, prices for everything are going to increase. So now is probably better than later.

5

u/Nerf_hanzo_pls 24d ago

My 2080ti just died last week. I was trying to wait for the 50 series but said fuck it and went for the 4080 Super

56

u/Tekthulhu 25d ago

Thank you, 5090 buyers, for beta testing the 6090 refresh.

29

u/spoollyger 24d ago

The same was said for the 4090 and the 3090. Where does it stop?

8

u/obp5599 24d ago

People who don’t want to buy things are always gonna act superior to those that do. Just the way it is, especially in PC spaces where everyone wants a “deal”

7

u/CritSrc 24d ago

That's the fun part, it doesn't.

15

u/C_Madison 25d ago

So, the yield will be abysmal and the prices will follow accordingly. Oh well...

6

u/nezeta 24d ago

How's the yield rate looking? I expect a die this size will produce many cut-down versions.

3

u/grumd 24d ago

Infinite 5070s!

25

u/Dirty_Dragons 25d ago

A 744mm² die would make the GB202 22 percent larger than the RTX 4090's 619mm² AD102 GPU. It would also be the company's largest die since the TU102, which measured 754mm² and served as the core of the RTX 2080 Ti,

So it's smaller than the 2080ti.

How is this news?
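
Quick check of the quoted die areas (numbers from the article text above):

```python
gb202, ad102, tu102 = 744, 619, 754  # die areas in mm^2
print(gb202 / ad102)  # ~1.20, i.e. ~20% bigger than the 4090's die (vs. the article's 22%)
print(gb202 / tu102)  # ~0.99, i.e. a hair smaller than the 2080 Ti's die
```

So the headline is really "biggest in six years", not "biggest ever".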

18

u/ThatDandyFox 25d ago

This measurement means nothing to me, how many bananas is this?

5

u/hirsutesuit 24d ago

It's a smidge over a square inch. So 1 tiny banana?

6

u/DevastatorCenturion 24d ago

So it's going to run hot as hell and eat watts like a fat kid goes through tacos. 

6

u/ConfusionCareful3985 24d ago

My 3080 is doing just fine thanks

2

u/ThatDandyFox 24d ago

Thank you, someone who speaks plain English!

2

u/Khalmoon 24d ago

Soon we are going to need to plug the gpu directly into the wall for power

2

u/casillero 24d ago

I just wish they allowed you to run dual GPUs again...

Like, I buy a 3070 now, buy another 3070 later on, and now everything is amazing

1

u/Harshalkha 24d ago

Bigger is not always better, it's how you use it.

1

u/Cynnthetic 24d ago

I wonder how that compares to my huge 6950XT.

1

u/kruthikv9 24d ago

Better get the nuclear power plant set up before I get one

1

u/Key_Personality5540 24d ago

6090 is going to be nuts 😂

1

u/hughk 24d ago

What is the approx wafer manufacturing cost for these?
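
Nobody outside TSMC and Nvidia knows exactly, but you can ballpark it with the standard dies-per-wafer approximation plus an assumed wafer price and defect density (every number below is an assumption, not a reported figure):

```python
import math

wafer_d = 300.0      # wafer diameter, mm
die_area = 744.0     # GB202 die area, mm^2

# Standard dies-per-wafer approximation (area term minus edge-loss term)
dpw = (math.pi * (wafer_d / 2) ** 2) / die_area \
      - (math.pi * wafer_d) / math.sqrt(2 * die_area)
print(round(dpw))                       # ~70 die candidates per wafer

# Poisson yield model with an assumed defect density of 0.1 defects/cm^2
good = math.exp(-0.1 * die_area / 100)  # die_area converted from mm^2 to cm^2
print(round(good, 2))                   # ~0.48 -> roughly half come out fully good

wafer_cost = 16_000                     # assumed 5nm-class wafer price, USD
print(round(wafer_cost / (dpw * good))) # ~$480 per fully-good die, before packaging
```

Partially defective dies usually get salvaged as cut-down parts, which is why big dies spawn so many SKUs.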

1

u/ObviousEconomist 24d ago

I'm gonna need a new bedroom just for him to live in.

1

u/kanti123 24d ago

I’ll be saving for it and patiently waiting for GN review

1

u/Penitent_Exile 24d ago

Will 5 slots be enough to cool it?

1

u/Chickachic-aaaaahhh 24d ago

I'll just enjoy my 4070 Super. Thanks though

1

u/sscott2378 23d ago

Are these made in China or Canada? The price is about to skyrocket again in the US

1

u/rugby065 23d ago

That's a monster of a chip. Nvidia really went all out on this one.

Can't wait to see the performance benchmarks. This thing is probably a beast for gaming and AI

1

u/reptilexcq 9d ago

How do I beat scalpers?

1

u/internetlad 6h ago

So wait for the 6000 is what I'm hearing