FP91G (OP)
Legendary
Offline
Activity: 2296
Merit: 1428
✅ NO KYC
|
 |
December 26, 2025, 02:01:31 PM |
|
The next-generation RTX 6000 series could be the most expensive in history. NVIDIA's next-generation Rubin-architecture graphics cards, the GeForce RTX 6000 series, will launch amid fierce competition for resources driven by the AI boom. Massive purchases of memory for neural networks, the growth of data centers, and a shortage of new capacity at TSMC and Samsung fabs have led to a sharp rise in the cost of key components. The price of GDDR7, the main memory type for the new GPUs, has increased 3-4 times compared to the GDDR6 and GDDR6X generations. While a single 2 GB chip previously cost $2-3, prices now fluctuate around $8-12, with some shipments reaching up to $15 per chip. Due to limited supply, memory manufacturers are prioritizing shipments to major AI players such as OpenAI, Meta, and Amazon, creating a shortage for the consumer market.

Because of this, NVIDIA is already planning higher MSRPs for the RTX 6000 series:
RTX 6090 — $2,999
RTX 6080 — $1,699
RTX 6070 Ti — $999
RTX 6060 Ti — $599

However, retail prices will be even higher. Historically, high-demand graphics cards in limited supply have carried a 25-40% premium at launch. Taking 35% over MSRP as the benchmark:
RTX 6090 — $4,048
RTX 6080 — $2,293
RTX 6070 Ti — $1,349
RTX 6060 Ti — $808

Thus, Rubin will be the first generation where the cost of memory and the fight for AI infrastructure directly affect the final price. The official announcement of the graphics cards is expected at the end of 2026, with the release in the first half of 2027. Source
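Just to sanity-check the arithmetic being quoted: the retail estimates are simply MSRP multiplied by the assumed 35% launch premium. A minimal sketch (the MSRPs and the 25-40% premium range are the article's; the exact rounding is mine):

Code:
# Quick check of the quoted retail estimates: MSRP plus a 35% launch premium.
MSRP = {
    "RTX 6090": 2999,
    "RTX 6080": 1699,
    "RTX 6070 Ti": 999,
    "RTX 6060 Ti": 599,
}
PREMIUM = 0.35  # assumed launch markup, the benchmark taken from the 25-40% range

for card, msrp in MSRP.items():
    retail = msrp * (1 + PREMIUM)
    # Rounded to the nearest dollar; the article's figures differ by at most $1.
    print(f"{card}: MSRP ${msrp:,} -> ~${retail:,.0f} at launch")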
|
|
|
|
1971ECPT
Jr. Member
Offline
Activity: 68
Merit: 2
|
 |
December 27, 2025, 08:57:44 PM |
|
A graphics card as expensive as a good car? No thanks, I'll stick with my 2080 Ti, it's enough for what I need in gaming. Anyway, GPU mining is dead and buried. Thanks to that skinny MF and his PoS instead of leaving ETH on PoW.
|
|
|
|
|
FP91G (OP)
Legendary
Offline
Activity: 2296
Merit: 1428
✅ NO KYC
|
 |
December 29, 2025, 05:05:04 PM |
|
Next year we're predicted to see a major shortage of memory chips and video cards. Miners mostly use top-end RTX 30- and 40-series cards; buying the latest models for mining is very unprofitable. A 15-20% increase in video card prices is expected in the first quarter of 2026.
|
|
|
|
GEMIN_M4
Member

Offline
Activity: 269
Merit: 26
|
 |
December 29, 2025, 05:35:29 PM |
|
Must it be Nvidia? AMD graphics cards are still far more affordable and they handle gaming and editing very well; the only real difference is that AMD uses FSR4, which is a bit weaker than Nvidia's counterpart. It's better to skip this GPU cycle and wait for the RTX 7000 series if Nvidia is your favourite GPU maker.
As for crypto mining, graphics cards aren't even profitable anymore. I would rather spend that money on a BitAxe than a graphics card at this point. Since Ethereum mining stopped, everything good about PoW faded very fast; how many PoW coins are even worth mining and holding long term now? If you have the money, it makes far more sense to choose Bitcoin mining over altcoins right now.
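For anyone who wants to check that claim against their own setup, profitability really just comes down to daily coin revenue minus the power bill. A minimal sketch with placeholder figures (every value below is hypothetical; plug in your own card and electricity price):

Code:
# Every number here is a hypothetical placeholder -- substitute real figures
# from your pool or a profitability calculator before drawing conclusions.
daily_revenue_usd = 0.60    # hypothetical: coins mined per day * current coin price
power_draw_watts = 300.0    # hypothetical power draw of the card while mining
electricity_usd_kwh = 0.15  # hypothetical electricity price per kWh

daily_power_cost = (power_draw_watts / 1000) * 24 * electricity_usd_kwh
daily_profit = daily_revenue_usd - daily_power_cost

print(f"power cost per day: ${daily_power_cost:.2f}")
print(f"net profit per day: ${daily_profit:.2f}")
# With these placeholders the card mines at a loss, which is the point above:
# after the Ethereum Merge, GPU mining rarely covers the power bill.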
|
|
|
|
|
FP91G (OP)
Legendary
Offline
Activity: 2296
Merit: 1428
✅ NO KYC
|
 |
December 30, 2025, 11:44:08 AM |
|
Must it be Nvidia? AMD graphics cards are still far more affordable and they handle gaming and editing very well; the only real difference is that AMD uses FSR4, which is a bit weaker than Nvidia's counterpart. It's better to skip this GPU cycle and wait for the RTX 7000 series if Nvidia is your favourite GPU maker.
As for crypto mining, graphics cards aren't even profitable anymore. I would rather spend that money on a BitAxe than a graphics card at this point. Since Ethereum mining stopped, everything good about PoW faded very fast; how many PoW coins are even worth mining and holding long term now? If you have the money, it makes far more sense to choose Bitcoin mining over altcoins right now.
Mining with graphics cards isn't currently profitable, but renting out top-end graphics cards delivers good results. That market requires a different hardware setup and a high-speed internet connection, though. Running an ASIC at home or in an apartment is difficult, but a powerful PC is quite feasible.
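Whether renting out a card actually pays off comes down to the same back-of-the-envelope math as mining. A rough payback sketch, where every number is a hypothetical placeholder rather than a quote from any real marketplace:

Code:
# Back-of-the-envelope payback estimate for renting out a high-end GPU.
# All figures are hypothetical placeholders, not quotes from any marketplace.
card_price_usd = 3000.0     # hypothetical purchase price of the card
hourly_rate_usd = 0.40      # hypothetical rental rate per GPU-hour
utilization = 0.50          # hypothetical fraction of hours the card is actually rented
overhead_per_month = 25.0   # hypothetical electricity + internet overhead per month

gross_per_month = hourly_rate_usd * 24 * 30 * utilization
net_per_month = gross_per_month - overhead_per_month
payback_months = card_price_usd / net_per_month

print(f"net income per month: ${net_per_month:.2f}")
print(f"payback period: {payback_months:.1f} months")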
|
|
|
|
aysha9853
Full Member
 
Offline
Activity: 751
Merit: 101
Rainbet #1 Non KYC Crypto Casino & Sportsbook
|
 |
January 29, 2026, 02:08:53 PM |
|
The next-generation RTX 6000 series could be the most expensive in history. NVIDIA's next-generation Rubin-architecture graphics cards, the GeForce RTX 6000 series, will launch amid fierce competition for resources driven by the AI boom. Massive purchases of memory for neural networks, the growth of data centers, and a shortage of new capacity at TSMC and Samsung fabs have led to a sharp rise in the cost of key components. The price of GDDR7, the main memory type for the new GPUs, has increased 3-4 times compared to the GDDR6 and GDDR6X generations. While a single 2 GB chip previously cost $2-3, prices now fluctuate around $8-12, with some shipments reaching up to $15 per chip. Due to limited supply, memory manufacturers are prioritizing shipments to major AI players such as OpenAI, Meta, and Amazon, creating a shortage for the consumer market.

Because of this, NVIDIA is already planning higher MSRPs for the RTX 6000 series:
RTX 6090 — $2,999
RTX 6080 — $1,699
RTX 6070 Ti — $999
RTX 6060 Ti — $599

However, retail prices will be even higher. Historically, high-demand graphics cards in limited supply have carried a 25-40% premium at launch. Taking 35% over MSRP as the benchmark:
RTX 6090 — $4,048
RTX 6080 — $2,293
RTX 6070 Ti — $1,349
RTX 6060 Ti — $808

Thus, Rubin will be the first generation where the cost of memory and the fight for AI infrastructure directly affect the final price. The official announcement of the graphics cards is expected at the end of 2026, with the release in the first half of 2027. Source

Wow, looking at these prices, high-demand graphics cards will probably hold their value, if not go up even more, since between AI and everything else the chips and cards are very hard to get. A price prediction? I'd say 200% over the next 2 years (just a guess, btw).
|
|
|
|
|
Xylber
|
 |
January 29, 2026, 03:27:54 PM |
|
My recommendation is the same one we've been giving on AI-related Reddit since 2023: get any high-VRAM card, even if it's a used 3090 (24 GB VRAM). Don't wait for newer models; they don't want GPUs or AI in the hands of common people.
If you want to run AI models locally, VRAM is king, even on an older card. You'll be able to generate images and run some LLMs for free, on your own computer, with no internet connection and without sharing your prompts with OpenAI, Google, etc.
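The usual rule of thumb for "will it fit" is weights ≈ parameters × bytes per parameter, before KV cache and framework overhead. A minimal sketch of that estimate (the model sizes and quantisation levels below are just examples):

Code:
# Rule of thumb: the weights alone need roughly parameters * bytes-per-parameter
# of VRAM, before KV cache and framework overhead are added on top.
BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def weights_gb(params_billion: float, dtype: str) -> float:
    """Approximate size of the model weights alone, in GB."""
    return params_billion * BYTES_PER_PARAM[dtype]

for name, size_b in [("7B", 7), ("13B", 13), ("70B", 70)]:
    sizes = ", ".join(f"{d}: {weights_gb(size_b, d):.1f} GB" for d in BYTES_PER_PARAM)
    print(f"{name} model -> {sizes}")
# A 24 GB card like a used 3090 fits a 13B model at 8-bit with room to spare;
# a 70B model needs aggressive quantisation plus CPU offloading. That is why
# VRAM, not raw speed, is the limiting factor for local inference.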
|
|
|
|
|
FP91G (OP)
Legendary
Offline
Activity: 2296
Merit: 1428
✅ NO KYC
|
 |
January 31, 2026, 02:08:53 PM |
|
My recommendation is the same one we've been giving on AI-related Reddit since 2023: get any high-VRAM card, even if it's a used 3090 (24 GB VRAM). Don't wait for newer models; they don't want GPUs or AI in the hands of common people.
If you want to run AI models locally, VRAM is king, even on an older card. You'll be able to generate images and run some LLMs for free, on your own computer, with no internet connection and without sharing your prompts with OpenAI, Google, etc.
An RTX 3090 is hard to find in good condition; most of them were used for mining. On the rental market, it's the latest high-end graphics cards that are in demand. Rumor has it the RTX 6090 will have up to 48 GB of GDDR7 memory, but memory is currently the major bottleneck.
|
|
|
|
safar1980
Legendary
Offline
Activity: 2380
Merit: 2070
✅ NO KYC
|
 |
February 09, 2026, 05:39:52 PM |
|
No NVIDIA GeForce RTX 50 "SUPER" GPUs This Year, RTX 60-Series Also Pushed Back

Artificial intelligence may be eating the world of software, but gamers are suffering. According to The Information, NVIDIA has reportedly postponed the launch of its GeForce RTX 50 "SUPER" refresh, as the company's executives are prioritizing AI accelerators over the gaming sector, which consumes precious cutting-edge GDDR7 memory. The GeForce RTX 50 "SUPER" refresh was originally scheduled for an announcement at CES 2026, with shipping in Q1 or Q2 of 2026. However, the GDDR7 memory used in the SUPER lineup was the high-capacity 3 GB version, which NVIDIA managers in December deemed too valuable to spend on gamers, postponing the refresh entirely. The "SUPER" series was planned around denser GDDR7 memory modules offering 3 GB of capacity per chip, increasing the memory configurations of the standard GeForce RTX 5070, RTX 5070 Ti, and RTX 5080. The RTX 5070 SUPER was planned with an upgrade to 18 GB, while the RTX 5070 Ti SUPER and RTX 5080 SUPER would each get 24 GB of GDDR7 memory. As NVIDIA's AI GPU portfolio also uses the high-density GDDR7 memory, such as the RTX PRO 6000 "Blackwell" and "Rubin CPX", the company has decided to prioritize this higher-margin business instead, leaving gamers with inflated prices on the regular GeForce RTX 50 series.
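The 18 GB and 24 GB figures fall straight out of the bus-width arithmetic: each GDDR7 package sits on a 32-bit slice of the memory bus, so capacity is (bus width ÷ 32) × capacity per chip. A quick sketch using the commonly reported bus widths for these cards (assumed here, not stated in the article):

Code:
# capacity = (memory bus width / 32) * GB per GDDR7 package
CARDS = {"RTX 5070": 192, "RTX 5070 Ti": 256, "RTX 5080": 256}  # assumed bus width in bits

for card, bus_bits in CARDS.items():
    chips = bus_bits // 32
    now_gb = chips * 2    # today's 2 GB GDDR7 modules
    super_gb = chips * 3  # the denser 3 GB modules planned for the SUPER refresh
    print(f"{card}: {chips} chips -> {now_gb} GB now, {super_gb} GB with 3 GB modules")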
|
|
|
|
FP91G (OP)
Legendary
Offline
Activity: 2296
Merit: 1428
✅ NO KYC
|
 |
February 18, 2026, 11:40:45 AM |
|
Report: Nvidia's RTX 6000 Graphics Cards Might Not Launch Until 2028
We're potentially facing a three-year wait between gaming GPUs as Nvidia focuses on AI chips. Link

In some unwelcome news for PC builders, Nvidia might delay its next-generation RTX 6000 graphics cards because of the ongoing memory shortage. Mass production of the RTX 6000 series was originally slated for late 2027, but the global memory shortage caused by the race to build AI data centers has derailed those plans, according to The Information, which suggests that RTX 6000 GPUs won't arrive until 2028. That would mean a three-year gap without a new gaming GPU generation, since the RTX 5000 series was announced in January 2025.
|
|
|
|
|