Topic: New NVIDIA GeForce RTX 30 series GPUs  (Read 3867 times)
niksdt101 (Hero Member | Activity: 803 | Merit: 501)
September 02, 2020, 11:24:51 AM  #21

I currently have RX 480s in two rigs of four each, which are already past the ROI mark. AMD Big Navi will be the next step for me, but these 3070s are tempting, since by the time Big Navi reaches the consumer market, I guess ETH may switch to PoS.
philipma1957 (Legendary | Activity: 4130 | Merit: 7925)
September 02, 2020, 03:08:33 PM  #22

Quote from: niksdt101 on September 02, 2020, 11:24:51 AM
I currently have RX 480s in two rigs of four each, which are already past the ROI mark. AMD Big Navi will be the next step for me, but these 3070s are tempting, since by the time Big Navi reaches the consumer market, I guess ETH may switch to PoS.

ETH has been switching to PoS since 2017.

Once it switches, it will die.

SalvajeX (Newbie | Activity: 20 | Merit: 0)
September 02, 2020, 03:21:15 PM  #23

I believe once ETH switches, GPU mining as it exists today will be gone.
Anyway, back to the topic: not sure about the hashrate, but these cards are monsters. Good move from Nvidia.
waggy459 (Jr. Member | Activity: 41 | Merit: 2)
September 02, 2020, 03:25:05 PM  #24

Quote from: philipma1957 on September 02, 2020, 03:08:33 PM
Quote from: niksdt101 on September 02, 2020, 11:24:51 AM
I currently have RX 480s in two rigs of four each, which are already past the ROI mark. AMD Big Navi will be the next step for me, but these 3070s are tempting, since by the time Big Navi reaches the consumer market, I guess ETH may switch to PoS.
ETH has been switching to PoS since 2017.

Once it switches, it will die.

I don't really understand why you say it'll die on PoS. It won't be the only coin to have gone PoS, and IMO if you read up on the work going on at that end, there is some very weighty legitimacy to the project. I do think that if a person is "in crypto", it makes sense to keep an eye not only on mining, but also on staking, liquidity, and validating. It's all part of the system and you can make money on all parts.
chafer99 (Member | Activity: 99 | Merit: 10)
September 02, 2020, 06:26:29 PM  #25

I think that, at this point, it's not very likely that ETH switches to PoS (about as likely as Bitcoin switching to PoS).
jsanzsp (Newbie | Activity: 72 | Merit: 0)
September 02, 2020, 06:55:52 PM  #26


I hope they will be on sale in stores soon; that way the prices of the 5700 will drop a lot. AMD WILL ALWAYS BE BETTER FOR MINING.
JayDDee (Full Member | Activity: 1397 | Merit: 221)
September 02, 2020, 07:10:19 PM  #27

Quote from: chafer99 on September 02, 2020, 06:26:29 PM
I think that, at this point, it's not very likely that ETH switches to PoS (about as likely as Bitcoin switching to PoS).

Except that Bitcoin never said it would go PoS. Maybe it's a perpetual threat to keep miners interested.

Anyway, back on topic...

The VRAM size gap between the 3090 and 3080 is way out of line with the CUDA core count difference.
I wonder why.

Not an issue for mining, but the 3090 is the only one that supports SLI, and it's a 3-slot card, which leaves no room in the case for ventilation.

Ampere uses CUDA 11 and compute capability 8, but I don't see anything that helps mining. Significant software improvements look unlikely.
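
Since compute capability comes up here: a quick way to check what your own card reports (Ampere should be 8.x). This is a sketch assuming the optional numba package and a working NVIDIA driver are installed; it only reads device properties.

Code:
# Query the compute capability through the CUDA driver via numba
# (assumed installed); Ampere cards should report (8, x).
from numba import cuda

dev = cuda.get_current_device()
name = dev.name.decode() if isinstance(dev.name, bytes) else dev.name
print(name, "-> compute capability", dev.compute_capability)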

sxemini (Member | Activity: 1558 | Merit: 69)
September 02, 2020, 07:43:58 PM  #28

Quote
I've jotted down some quick napkin math. Hashrate isn't necessarily perfectly linear in total memory bandwidth, but it's a somewhat reasonable analogue for round numbers.

RTX 3090        ~935-1008 GB/s
RTX 3080          760 GB/s
RX 5700 XT        480 GB/s - 58 MH/s (ETH)
RTX 2080 Ti       616 GB/s - 52 MH/s (ETH)
GTX 1660 Super    336 GB/s - 30 MH/s (ETH)
RX 580            256 GB/s - 30 MH/s (ETH)

I think it would be somewhat reasonable to guess at 3080 hashrates in the 60s or low 70s, and the 3090 in the 90s. They are neither low-TDP nor cheap, so to me this doesn't represent a value proposition. But YMMV.

Quote
Source for the RTX 3000 cards: https://wccftech.com/nvidia-geforce-rtx-3070-8-gb-official-launch-price-specs-performance/

RTX 3090          936 GB/s - 75~80 MH/s (ETH) ??
RTX 3080          760 GB/s - 63~67 MH/s (ETH) ??
RTX 3070          512 GB/s - 40~45 MH/s (ETH) ??
RX 5700 XT        480 GB/s - 58 MH/s (ETH)
RTX 2080 Ti       616 GB/s - 52 MH/s (ETH)
GTX 1660 Super    336 GB/s - 30 MH/s (ETH)
RX 580            256 GB/s - 30 MH/s (ETH)

If that's the case, it looks like sticking with the RX 5700 XT would be better.

Quote
We can't really just linearly extrapolate hashing power like that. Factors such as the new GDDR6X, compared to the previous generation's GDDR6, could offer a boost to hash power and much more, or not. If I had to guess, the hashing-power boost should be much better.

Anyway, I'm convinced to get a 3080 at the new price range. At least one.

GDDR6X is the same story as GDDR5X was in the past: it won't boost anything by itself, it just clocks higher than GDDR6, and that's it.
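
A minimal Python sketch of the napkin math above: theoretical bandwidth from per-pin data rate times bus width, with Ethash hashrate scaled linearly from the 2080 Ti. The pin speeds and bus widths are the figures circulating at launch (the 3070's was still unconfirmed), and the linear scaling is the thread's own rough assumption, not a measurement.

Code:
def bandwidth_gbs(pin_speed_gbps: float, bus_width_bits: int) -> float:
    """Theoretical memory bandwidth in GB/s: per-pin rate x bus width / 8."""
    return pin_speed_gbps * bus_width_bits / 8

cards = {                        # (per-pin Gbps, bus width in bits)
    "RTX 3090":    (19.5, 384),  # GDDR6X
    "RTX 3080":    (19.0, 320),  # GDDR6X
    "RTX 3070":    (14.0, 256),  # GDDR6; 16 Gbps was also rumored
    "RTX 2080 Ti": (14.0, 352),  # GDDR6, used as the reference card
}

ref_bw = bandwidth_gbs(*cards["RTX 2080 Ti"])  # 616 GB/s
ref_mhs = 52.0                                 # ~52 MH/s Ethash, per the table

for name, spec in cards.items():
    bw = bandwidth_gbs(*spec)
    print(f"{name:12} {bw:5.0f} GB/s -> ~{ref_mhs * bw / ref_bw:5.1f} MH/s (guess)")

Scaling from the 2080 Ti lands right in the 63~67 and 75~80 MH/s ranges guessed above; scaling from the RX 5700 XT instead would predict far higher numbers, which is one reason bandwidth-only estimates are shaky.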
Beyerd17 (Full Member | Activity: 826 | Merit: 103)
September 02, 2020, 09:20:54 PM  #29

Quote
The 3080 seems reasonably priced for the performance. Everyone who sold their 2080 in the last few months made the right call; the 2080 just lost a lot of value. I'm talking gaming performance here; for mining we should wait for some tests/optimizations.

These prices also make me think that Nvidia expects a strong showing from AMD, so that's a good thing too.

Yes, I believe so as well. There was talk from AMD of an "Nvidia killer", so perhaps Nvidia is gearing up with top performance from the 3000 series to meet the threat from Big Navi. It will be exciting to see AMD's answer to this surprisingly big move in performance from the 3080. Competition is great.
ZeeeN (Jr. Member | Activity: 297 | Merit: 3)
September 02, 2020, 10:19:54 PM  #30

The RTX 3090 has 10,496 NVIDIA CUDA cores; maybe mining some other algo is better than ETH?
Beyerd17 (Full Member | Activity: 826 | Merit: 103)
September 03, 2020, 11:37:05 AM  #31


Quote from: jsanzsp on September 02, 2020, 06:55:52 PM
I hope they will be on sale in stores soon; that way the prices of the 5700 will drop a lot. AMD WILL ALWAYS BE BETTER FOR MINING.

That will depend on what coin you want to mine. AMD is more efficient on some algos, while Nvidia GPUs are more efficient on others. I've already seen a lot of mining hardware come down in price on eBay and other sites. Right now you can buy a used 2080 Ti for around 500 dollars, which is a lot lower than just recently.
aesma (Hero Member | Activity: 2408 | Merit: 921)
September 03, 2020, 02:30:09 PM  #32

Quote from: JayDDee on September 02, 2020, 07:10:19 PM
Quote from: chafer99 on September 02, 2020, 06:26:29 PM
I think that, at this point, it's not very likely that ETH switches to PoS (about as likely as Bitcoin switching to PoS).
Except that Bitcoin never said it would go PoS. Maybe it's a perpetual threat to keep miners interested.

Anyway, back on topic...

The VRAM size gap between the 3090 and 3080 is way out of line with the CUDA core count difference.
I wonder why.

Yeah, it's always annoying, although you pay the price for that VRAM. 10GB should be enough to play in 4K with all details; having a bit extra (12GB or 16GB) would be nice, but I don't really see the point of 24GB, for gaming at least.
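
As a back-of-envelope check on why resolution alone doesn't demand 24GB: even several full-resolution buffers at 4K or 8K total well under 1GB. The four-buffer count below is an illustrative assumption; real engines keep many more intermediate targets, and textures are what actually fill VRAM.

Code:
def framebuffer_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one full-resolution buffer in MiB."""
    return width * height * bytes_per_pixel / 2**20

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440),
               "4K": (3840, 2160), "8K": (7680, 4320)}

for label, (w, h) in resolutions.items():
    total = framebuffer_mb(w, h) * 4  # e.g. triple-buffered color + depth
    print(f"{label:5} ~{total:6.0f} MB for 4 full-res buffers")

Even at 8K that comes to roughly half a gigabyte.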
Metroid (Sr. Member | Activity: 2142 | Merit: 353)
September 03, 2020, 09:35:42 PM  #33

Quote from: aesma on September 03, 2020, 02:30:09 PM
Yeah, it's always annoying, although you pay the price for that VRAM. 10GB should be enough to play in 4K with all details; having a bit extra (12GB or 16GB) would be nice, but I don't really see the point of 24GB, for gaming at least.

The 3090 is targeted at AI, i.e. deep learning. For gaming, 24GB makes no sense: not bad to have, but there's no point in that amount for games right now. 16GB should be the norm for 2020 (2016 was 8GB), and only 16GB makes sense for a flagship like the RTX 3080 in this day and age. It seems Micron is having a hard time producing enough GDDR6X, and they don't want to shortchange the 3090, so they want the 3080 to take the hit. I guess the 3090 has a higher profit margin, so the good GDDR6X memory modules will all go to the 3090 and the leftovers to the 3080. I guess Nvidia will launch a 3080 Super with 16GB later on, once Micron's factories reach full mass production.

JayDDee (Full Member | Activity: 1397 | Merit: 221)
September 03, 2020, 10:04:20 PM  #34

Quote from: Metroid on September 03, 2020, 09:35:42 PM
Quote from: aesma on September 03, 2020, 02:30:09 PM
Yeah, it's always annoying, although you pay the price for that VRAM. [...]
The 3090 is targeted at AI, i.e. deep learning. For gaming, 24GB makes no sense [...] I guess Nvidia will launch a 3080 Super with 16GB later on, once Micron's factories reach full mass production.

I thought the 3090 was targeted at 8K gaming. Maybe the big VRAM is related to SLI support. My understanding is that only one GPU's VRAM is used in SLI.

It's also interesting that the 3080 is marketed as the flagship, suggesting the 3090 is experimental or special.

Metroid (Sr. Member | Activity: 2142 | Merit: 353)
September 03, 2020, 10:20:34 PM  #35

Quote from: JayDDee on September 03, 2020, 10:04:20 PM
I thought the 3090 was targeted at 8K gaming. Maybe the big VRAM is related to SLI support. My understanding is that only one GPU's VRAM is used in SLI.

It's also interesting that the 3080 is marketed as the flagship, suggesting the 3090 is experimental or special.

Yes, only one card's VRAM is used in SLI, and the 3090 is the only GPU that can do SLI, which is why they claim 8K gaming at 60 fps. But the majority of these cards are for deep learning, which is why they left the 3080 for dead: people would rather buy a 16GB+ 3080 for deep learning. So basically what they did here is: want deep learning? The 3090 is the only card that does it. Want 8K gaming? The 3090 is the only GPU that can, because of SLI support. And they priced the 3090 at 2x the 3080. I believe the 3090 is not even 50% faster than the 3080; more like 25% more performance, and even that is going overboard.

Looking back, the last x90 we had was the GeForce GTX 690, priced at $1000 in 2012.
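
Quick arithmetic on that claim. The announced launch MSRPs were $699 and $1499; the 1.25x speedup is the poster's guess, not a benchmark.

Code:
msrp = {"RTX 3080": 699, "RTX 3090": 1499}       # announced launch prices, USD
rel_perf = {"RTX 3080": 1.00, "RTX 3090": 1.25}  # assumed, per the post above

for card in msrp:
    per_kilodollar = rel_perf[card] / msrp[card] * 1000
    print(f"{card}: {per_kilodollar:.2f} units of performance per $1000")

That prints ~1.43 for the 3080 versus ~0.83 for the 3090: under these assumptions the 3080 delivers roughly 70% more performance per dollar.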

arielbit (Legendary | Activity: 3416 | Merit: 1059)
September 04, 2020, 04:55:13 AM  #36

Quote from: Metroid on September 03, 2020, 09:35:42 PM
Quote from: aesma on September 03, 2020, 02:30:09 PM
Yeah, it's always annoying, although you pay the price for that VRAM. [...]
The 3090 is targeted at AI, i.e. deep learning. For gaming, 24GB makes no sense [...]

Or... the 3080 Ti/Super is an ace up Nvidia's sleeve, to be released if AMD manages to pull off a 3080 killer, hehe.
arielbit (Legendary | Activity: 3416 | Merit: 1059)
September 04, 2020, 05:10:04 AM (last edit: September 04, 2020, 05:59:08 AM)  #37

Quote from: Metroid on September 03, 2020, 09:35:42 PM
Quote from: aesma on September 03, 2020, 02:30:09 PM
Yeah, it's always annoying, although you pay the price for that VRAM. [...]
The 3090 is targeted at AI, i.e. deep learning. For gaming, 24GB makes no sense [...]

Monitor size and screen resolution, plus the other graphics quality settings in PC games, tend to eat a lot of VRAM. My 1080 Ti's 11GB gets filled when I max out game settings on my 24-inch 1920x1200 monitor; a ~34-inch monitor will need at least 16GB. The 3080's "10GB" is a teaser, and gamers will hate it later on.

1080p gaming is dead. When I played Far Cry New Dawn, in the hazy/hallucinogenic part of the game, the difference between 1080p and 1440p/4K in enemy visibility was huge.

10GB of VRAM for mining, I think, will be fine for quite a long time, unless some mining software tweak or new or existing algorithm hashes a lot more when more than 10GB is utilized.

Besides, check this out: https://www.youtube.com/watch?v=EGzsYCRMVu4 (Doom Eternal 4K gaming). The fps dips as low as 110. For gamers who run a 144Hz-or-above monitor and want fps that never dips below 144 (Hz and fps in sync), the 3090 is the answer.
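
On the 10GB-for-mining point, here is a rough sketch using the Ethash spec constants (the dataset starts at 2^30 bytes and grows by 2^23 bytes every 30000-block epoch, ignoring the small prime-size adjustment); the blocks-per-day figure is an approximate 2020 average, and this assumes the algorithm never changes.

Code:
EPOCH_BLOCKS = 30_000
DAG_INIT = 2**30        # ~1 GiB at epoch 0
DAG_GROWTH = 2**23      # ~8 MiB per epoch
BLOCKS_PER_DAY = 6_500  # rough 2020 average

def dag_gib(epoch: int) -> float:
    """Approximate Ethash DAG size in GiB at a given epoch."""
    return (DAG_INIT + DAG_GROWTH * epoch) / 2**30

epoch = 0
while dag_gib(epoch) < 10.0:
    epoch += 1

years = epoch * EPOCH_BLOCKS / BLOCKS_PER_DAY / 365
print(f"DAG crosses 10 GiB at epoch {epoch}, ~{years:.0f} years of blocks after genesis")

Around September 2020 Ethereum was near epoch 360 (a DAG of about 3.8 GiB), so under these assumptions a 10GB card has roughly a decade of headroom.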
aesma (Hero Member | Activity: 2408 | Merit: 921)
September 04, 2020, 06:23:18 AM  #38

According to this article: https://www.guru3d.com/articles-pages/far-cry-new-dawn-pc-graphics-performance-benchmark-review,6.html

Far Cry New Dawn is very far from filling 11GB, not even half that, in 4K.
arielbit (Legendary | Activity: 3416 | Merit: 1059)
September 04, 2020, 06:59:07 AM (last edit: September 04, 2020, 07:11:03 AM)  #39


I didn't specifically cite Far Cry for the VRAM size issue; Far Cry was about 1080p vs 1440p and 4K gaming.

The textures (another game setting that eats VRAM besides resolution) in Resident Evil 2 are a good example, LOL:

https://linustechtips.com/main/topic/1141516-wtf-my-2080-ti-doesn%E2%80%99t-have-enough-vram-for-resident-evil-2-%3F/

Other games, if you Google around, can consume 7-9GB of VRAM, and those are 2017-2019 titles. Upcoming games can easily double that, so 16GB is the sweet spot for GPU graphics.

The 3080 Ti or 3080 Super (16GB or 20GB) is supposed to be the best bang for the buck, but Nvidia must be saving that ace card for AMD's release, in case AMD manages to pull off a 3080 killer.
Metroid (Sr. Member | Activity: 2142 | Merit: 353)
September 04, 2020, 01:08:20 PM  #40

Quote from: arielbit on September 04, 2020, 06:59:07 AM
Other games, if you Google around, can consume 7-9GB of VRAM, and those are 2017-2019 titles. Upcoming games can easily double that, so 16GB is the sweet spot for GPU graphics.

I have a feeling that is exactly what Nvidia is trying to do. I mean, most people don't plan to upgrade from their GTX 1080 Ti because it has 11GB, and if they have a 1440p or 4K monitor they can still play many games at 4K with that amount of memory. 10GB for a 3080 in this day and age is low; I'd say 16GB is the minimum for a card like that, though even 12GB would be doable. It's funny: 24GB on the 3090 vs 10GB on the 3080. Nvidia really wants people to buy the 3090. Nvidia simply killed the 3080 even before launching it: dead on arrival.
