Got 2 bottom-rung Gigabyte 3060 Ti Eagles. Both can only achieve +1025 on the mem (effective speed 7825 MHz), probably due to the cheap cooler design. 60 MH/s @ 140 W.
|
|
|
63 MH/s at 140 W is not an RX 5700 killer. More efficient, OK, but not by that much.
In the sense that it's 24% better than the RX 5700 for gaming at the same price, it will hold its resale value better, but that's it. On efficiency I'm not sure yet; I need to see power at the wall, since 130 W in software is usually 160 W real. My 2080 Supers and 3070 have been OCed to +1500 and have been running fine without problems.
As long as you keep it cool, yeah. The problem is you can't see memory temperature on the 3xxx series; on the RX 5700 you can, and that is very important. This is the reason I've avoided the 3080s after seeing people experience throttling: some of my 5700s that throttle due to high mem temps, such as the Sapphire Pulse and Asus Dual Evos, have seen their max mem OC degrade. New, they started fine at 1800, but now some can only run at 1724.
|
|
|
All depends on whether you can buy them, and whether the ETH price stays high.
"Retailers say Nvidia RTX 3060 Ti launch stock is like all the 3080, 3090, and 3070 cards combined" https://www.pcgamer.com/nvidia-rtx-3060-ti-stock-levels-3080-3090-3070-combined/

I guess if that's true, it will help offload all the drama somewhere else, hehe.

If the price is close to MSRP, I will buy one. I need a replacement for my GTX 1070; I was going to buy a 3080, but with all the nonsense I might settle for something that is easier to find and uninteresting to scalpers, hehe.

Sounds about right. Saw stock for about 3 seconds instead of 1.
|
|
|
Yesterday I saw a video test of the Gigabyte GPU: 63 MH/s in Windows, PL 80%, memory +1500, 140 W. At stock it gives about 51 MH/s and 200 W. I don't know about price, but it seems it would be about the same as a 3070.
Do you have a source for this test? I saw some people talking about overclocking +1500 MHz on the memory of the new RTXs, and now on the 3060 Ti, but is that level of overclock sustainable? On previous generations I was able to achieve +1000 MHz with good (Samsung) memory and be totally stable; beyond that it got a little messy, sometimes +1100, but nothing like +1500 MHz...
|
|
|
50x for 35k? Should have jumped on it and become a GPU reseller ![Grin](https://bitcointalk.org/Smileys/default/grin.gif)
|
|
|
Hmmm, 61 MH/s @ 120 W makes it about 20% more efficient than a 5700 and 10% more efficient than a 5600. Definitely worth considering if the price is right.
I agree AMD rigs require a little TLC, but once you've done a couple it isn't _that_ bad. I used up the last few amps of capacity building RX 5600 rigs: 44 MH/s @ 90+ W. Managed to cram 8 of these into a server case with a single 1000 W PSU, so in terms of density it's about on par with the 3060 Ti.
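The efficiency comparisons in this thread boil down to hashrate per watt. A quick sketch in Python; the 3060 Ti and 5600 figures are the software-reported numbers quoted in these posts, while the RX 5700 tune is an assumed typical figure, not from this thread:

```python
# Hashrate-per-watt sketch. These are software-reported GPU watts;
# wall power will be noticeably higher once PSU efficiency is factored in.
def mh_per_watt(mh: float, watts: float) -> float:
    """Mining efficiency in MH/s per watt."""
    return mh / watts

cards = {
    "RTX 3060 Ti": mh_per_watt(61, 120),  # "61MH @ 120W" from this post
    "RX 5600 XT":  mh_per_watt(44, 90),   # "44mh @ 90+W" from this post
    "RX 5700":     mh_per_watt(54, 130),  # assumed typical tune (placeholder)
}
for name, eff in sorted(cards.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {eff:.3f} MH/W")
```

With those inputs the 3060 Ti comes out on top, roughly matching the percentages claimed above, though the exact gap depends heavily on which tune you assume for each card.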
|
|
|
Hi,
Is it possible to get Awesome Miner to display GPU stats when connected to lolMiner/PhoenixMiner using the external miner function? Awesome Miner displays the hashrate but nothing else.
When using the External Miner concept, Awesome Miner can only display the mining details provided by the mining software. For GPUs that can typically include the temperature, but not all the details you get if the mining is fully controlled by Awesome Miner or Remote Agent.

Hi Patrike, lolMiner, for example, provides most of the important GPU stats in JSON format:

  "Index": 0,
  "Name": "AMD Radeon RX 5600 XT",
  "Performance": 43.13,
  "Consumption (W)": 72.001000000000005,
  "Fan Speed (%)": 51,
  "Temp (deg C)": 57,
  "Mem Temp (deg C)": 88,
  "Session_Accepted": 52,
  "Session_Submitted": 52,
  "Session_HWErr": 0,
  "Session_BestShare": 930349369113,
  "PCIE_Address": "3:0"

Awesome Miner already displays these stats when using the Remote Agent. Could we get them included when used as an external miner too?

Awesome Miner was already reading a number of these properties for lolMiner, but was missing a few important ones like fan speed, temperature, and power usage. They may have been added to the lolMiner API at some point. I will improve this in the next release. Thanks for your feedback. For PhoenixMiner via an External Miner I can already see the fan speed, hashrate, and temperature reported for each GPU. Power usage isn't provided by the API.

Thanks Patrik, it would be awesome if you could also add lolMiner's stats via external miner too.
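For anyone who doesn't want to wait for the Awesome Miner release, the same stats can be read directly from lolMiner's HTTP API. A minimal sketch; the port, URL path, and top-level "GPUs" key are assumptions about the lolMiner build that produced the JSON quoted above, so check your `--apiport` setting and actual response layout:

```python
# Sketch: reading per-GPU stats from lolMiner's HTTP API, matching the
# JSON fields quoted in the post above. Port/path/"GPUs" key are assumed.
import json
import urllib.request

def parse_gpu_stats(data: dict) -> list:
    """Pick out the per-GPU fields from one API response dict."""
    return [
        {
            "name": gpu.get("Name"),
            "mhs": gpu.get("Performance"),
            "watts": gpu.get("Consumption (W)"),
            "fan_pct": gpu.get("Fan Speed (%)"),
            "core_temp": gpu.get("Temp (deg C)"),
            "mem_temp": gpu.get("Mem Temp (deg C)"),
        }
        for gpu in data.get("GPUs", [])
    ]

def fetch_gpu_stats(host: str = "127.0.0.1", port: int = 8080) -> list:
    """Poll the miner's HTTP endpoint and parse the GPU list."""
    with urllib.request.urlopen(f"http://{host}:{port}/") as resp:
        return parse_gpu_stats(json.load(resp))
```

This keeps the fetch and the parsing separate, so the field mapping can be adjusted without touching the network code.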
|
|
|
Hi,
I'm unable to get Awesome Miner to link up with TRM running on HiveOS. I have --api_listen=4028 in the HiveOS config, but Awesome Miner is unable to see it. Is there something else I should be doing, or is my config wrong?
The correct port for TeamRedMiner on HiveOS is 65078. Thanks! Works perfectly.
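If you want to verify the API is reachable before pointing Awesome Miner at it, a quick script helps. A sketch assuming TRM's sgminer-compatible JSON API with a "summary" command and a null-terminated response; the exact framing and response layout can vary by version:

```python
# Quick connectivity check for TeamRedMiner's sgminer-style API
# (port 65078 on HiveOS as noted above, commonly 4028 elsewhere).
# The JSON "summary" command and null-terminated reply are assumptions
# based on the sgminer API convention that TRM implements.
import json
import socket

def trm_summary(host: str, port: int = 65078, timeout: float = 5.0) -> dict:
    """Send a 'summary' command and return the parsed JSON reply."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(json.dumps({"command": "summary"}).encode())
        chunks = []
        while True:
            chunk = sock.recv(4096)
            if not chunk:  # miner closes the socket after replying
                break
            chunks.append(chunk)
    # sgminer-style APIs often null-terminate the response.
    raw = b"".join(chunks).rstrip(b"\x00").decode()
    return json.loads(raw)
```

If this times out or refuses the connection, the port or the rig's firewall is the problem, not Awesome Miner.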
|
|
|
Yes, I recommend replacing cards. It's a numbers game: the going is good now, and if I can boost ROI for minimal outlay, I will go for it. I paid £280 for the 5700 and sold the 1080 Ti for £350. IMO it is not a good idea to sell the 1080 Ti and hold cash while you hope for the RTX 30s and hope for the market to stay good. Hoping is not my strategy, but you do you.
Good luck buying a 3060 Ti anytime soon. I use stockinformer and I'm never quick enough to get 3080s for my gaming PC. I can get the 3090, but I wouldn't buy that even with your money.
|
|
|
Normally I'd say just do it and go big. But with ETH going PoS soonish, I think it's a really bad time unless your breakeven point can be reached within a year. If you have existing cards, I think you should sell them off and replace them with newer gear.
I disagree with this; you should not buy anything until you know what the hell is going to happen in a few months. Hence why I said it's a bad time to buy unless your breakeven is short. When I said in my previous post to buy new cards, it was to replace old, inefficient cards without adding more capital, unless that additional capital can reach breakeven before ETH goes PoS on its current timeline. For example, I sold 2 1080 Tis and replaced them with 3 5700s.
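The "only buy if breakeven beats the PoS timeline" argument above is just arithmetic. A sketch; all the input figures here are hypothetical placeholders, not numbers from this thread:

```python
# Breakeven sketch: days until a card pays for itself, net of electricity.
# All inputs below are made-up placeholders for illustration.
def breakeven_days(card_cost: float, daily_revenue: float,
                   daily_power_kwh: float, kwh_price: float) -> float:
    """Days to recover card_cost given daily mining profit after power."""
    daily_profit = daily_revenue - daily_power_kwh * kwh_price
    if daily_profit <= 0:
        return float("inf")  # never breaks even
    return card_cost / daily_profit

# e.g. a £280 card earning £2.50/day, drawing 140 W around the clock
# (0.140 kW * 24 h = 3.36 kWh/day) at £0.15/kWh:
days = breakeven_days(280, 2.50, 0.140 * 24, 0.15)
```

With those placeholder numbers the card pays for itself in roughly 140 days, which is the kind of figure you'd then compare against your best guess for the PoS switchover date.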
|
|
|
Just swapped a couple of rigs over to lolMiner. Looking good so far!
By the way, what efficiency numbers are you seeing for the 6800 vs the 5700?
|
|
|
From your screenshot, you could reduce your GPU voltage to 740 mV and run the core much lower. With timing mods and mem at 1800, the core can go as low as 1250 before the hashrate starts taking a hit. It looks like you didn't do any timing mods, so you could run the core at 1150-1200.
|
|
|
It's not fast, but it's cool. I can up the mem a bit, but it crashes. I can lower the voltage a bit, but it crashes. I can lower the core clock, but the hashrate degrades a lot. I wonder how many watts it pulls at the wall (120 V, on an 80+ Bronze 1000 GQ PSU). Any guesses before I measure? Any ideas to improve? This rig is a bit weird. Edit: it's the Asus 5700 OC Evo Dual. I wouldn't say very stable, but they were 340 USD equivalent, so very cheap to me. ![](https://ip.bitcointalk.org/?u=https%3A%2F%2Fi.imgur.com%2FamHPUpa.jpg&t=663&c=d2Aj-TCfAByRyw)

Your memory temps are causing it to throttle and crash. I have two of these cards and have replaced thermal pads, added pads to the back plate, and increased mounting pressure, but nothing works. The only way to make these cards behave is to blast the fans (I have mine at 75%) and keep them as far apart as possible.
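On the wall-power question: a rough guess is the software-reported GPU watts plus the rest of the system, divided by PSU efficiency. A sketch; the 85% efficiency figure and the system overhead are assumed ballpark values (80+ Bronze is roughly 82-88% efficient depending on load), not measurements:

```python
# Wall-draw estimate from software-reported GPU watts.
# psu_efficiency=0.85 and system_overhead_watts=60 are ballpark assumptions.
def wall_watts(gpu_software_watts: float,
               system_overhead_watts: float = 60,
               psu_efficiency: float = 0.85) -> float:
    """Estimate AC wall draw: total DC load divided by PSU efficiency."""
    return (gpu_software_watts + system_overhead_watts) / psu_efficiency

# e.g. five 5700s reporting ~110 W each in software:
est = wall_watts(5 * 110)
```

This is also why "130 W in software" posts earlier in the thread translate to roughly 160 W at the wall once PSU losses and the rest of the rig are counted.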
|
|
|
Yeah, can't go wrong with EVGA. You should get these instead ![Grin](https://bitcointalk.org/Smileys/default/grin.gif)
|
|
|
Hi guys,
I have the same problem here. I'm using a Gigabyte RTX 3080 Gaming OC.
The only settings where I don't get thermal throttling are: TDP 52%; Mem -500; Core -500; Fans 90%. This gives me 72 MH/s at 190 W.
It took me days to find this thread and finally discover why my RTX 3080 doesn't reach 100 MH/s.
Has anyone found a solution that doesn't involve voiding the warranty? I cannot disassemble the cooler without voiding the warranty (at least here in my country).
Also, I have a friend with an Asus RTX 3080 TUF OC with the same problem.
Never ever go cheap with cards and expect wonders... If you can't disassemble the crap coolers, and lowering the clocks is no longer an option, increase the fan speed further and/or add external 3000+ RPM high-pressure fans (like the Noctua Industrials). But this will hurt the lifespan of the original fans even more, so expect to replace them in the near future.

Go cheap? It's a bloody RTX 3080; even the entry-level RTX 3080 is a far cry from cheap. It SHOULD work properly. I don't really care about the lifespan of the fans, since this card has a 4-year warranty here.

I haven't watched all of Hardware Unboxed's videos to see their review of this specific Gigabyte model, but if you're dropping TDP down to half and lowering the clocks across the board, you're basically only using half the card. I would just sell the card locally for a nice markup and find a different model, or just buy the crypto directly. The Gigabyte doesn't have mixed caps, so it was probably going to have issues right off the bat.

A friend got the Asus RTX 3080 TUF OC with the fancy MLCC caps and has the same problem. I don't think it's a cap issue; Gigabyte uses 470u caps instead of the 220u caps (ZOTAC-like), more than double what Nvidia specified. It's revolting that they have done something like that.

As Phil noted, this is a case where mixed caps seem to filter multiple frequencies so that the heat buildup doesn't get to extremes (see further back in the thread for a chart he linked). Both the Asus and the Gigabyte are thus bad models to use for mining because they go with a single cap type. Obviously Nvidia will remedy this with Rev 2 cards from the AIBs, just like Asus had to RMA all those bad RX 5700 XTs that made poor thermal contact. Honestly, this is all Nvidia's desire to say they were first to market even if it was a paper launch. Their Founders Editions had 6+ months to test out configs that worked, and it looks like the AIBs got screwed by having to rush to production.
Sometimes you get lucky buying the first batch because you might get better-binned VRAM or higher-binned chips (whereas lower-yield chips would be saved for some cut-down refresh card). Sometimes you get screwed by design defects and get to be the guinea pig. In this case I couldn't even be a guinea pig, because I couldn't get my hands on one. It still games well, though, as no game is going to run a constant 300 W TDP through it. If you want the hashes, sell the card for a profit and just pick up a Founders Edition. Phil sitting on 94 MH/s sounds decent.

I have trouble getting my head around this explanation. The caps issue had to do with the cards being unstable when boosting to high clocks; a remedy was to reduce the max clock speed. Igor'sLab has done a test showing high memory temps on the FE. The slowdown in hashrate is very similar to my Sapphire RX 5700 when the memory temp reaches 104-106 C. It wouldn't surprise me if the AIB models with this hashrate-drop issue cheaped out on thermal pad quality. Perhaps the reason you don't see hashrate drops on the higher-end, mixed-cap models is that they didn't cheap out on the memory cooling.
|
|
|
So 170 W at the system level to pull in an extra 10 MH/s over a 5700 XT that can do almost the same with 100-110 W. Looks like miners are going to need some tweaking to get anywhere near 70 MH/s without melting the memory. No stock on Amazon, and Microcenter had low amounts, so meh thus far.

It was running at 880 mV with only a small overclock on the memory. 70 MH/s @ 150 W might be achievable.
|
|
|
|