Given it hasn't even been released yet (later this week), how are we going to share hashrates?
|
|
|
I seriously doubt it would be worth using as a space heater, the heating efficiency is going to be super low
The "heating" efficiency out of any ASIC-based cryptocoin miner is actually very high - directly competative with any "space heater". The PRICE efficiency though might be a different story, depending on the cost for the old miner - and temperature control might be an issue unless it's only a SMALL part of the total heat input to the room or you set the room with a termostatically-controlled damper/fan setup (or the room is on some sort of zone-controlled heat setup already).
|
|
|
I ran several machines for years on a 500 kbps (on a GOOD night, more commonly 200-300) 3G cell connection.
10-20kbps per machine should handle the load reasonably well for most cryptocoin mining.
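To put rough numbers on that (the link and per-machine figures are just the ones from this post; real overhead varies by pool protocol), a quick sketch:

```python
# Rough capacity check: how many miners fit on a slow link if each
# machine needs 10-20 kbps of sustained bandwidth (protocol overhead ignored).

def max_machines(link_kbps: float, per_machine_kbps: float) -> int:
    """Whole number of miners a link can carry at the given per-machine rate."""
    return int(link_kbps // per_machine_kbps)

# Worst case: a "bad night" 200 kbps link, 20 kbps per machine:
print(max_machines(200, 20))   # 10 machines
# Best case: a "good night" 500 kbps link, 10 kbps per machine:
print(max_machines(500, 10))   # 50 machines
```

Even the pessimistic case covers a decent-sized home farm.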
|
|
|
14 nm and 16 nm from a PERFORMANCE standpoint are essentially identical - and are the current semiconductor state-of-the-art.
Bitmain CAN'T release a miner that is significantly more efficient than the S9 - they're having enough issues making the S9 work reliably at the MIDrange of the chip's design specs.
When the next node (be it 10nm or 7nm) finally hits production is the SOONEST Bitmain (or anyone else) will be able to come up with a significantly more efficient miner - and current indications are "2019 or 2020" for that timeframe, even *IF* the new node has design tools available to small designers well before actual production arrives.
The days of a miner only having 6-10 months as the "most efficient" are OVER - there is no more "catch up TO the general semiconductor state of the art" left.
At this point we're probably looking at 3-5 YEAR generation cycles - possibly LONGER as all of the major chipmakers have stated that "10nm is the end of the road for pure silicon".
Yeah, your points are good. 7 nm will use a material other than silicon, maybe carbon.
Carbon isn't a semiconductor, so it's not an option. Germanium is possible - IBM did state at some point that their 10nm process would use a germanium/silicon hybrid wafer, and germanium was widely used for a long time but doesn't handle high power as well as silicon, so it started losing out. In any event, Moore's Law is in trouble - the next couple of generations might see it finally come to an end and process advancements slow down a LOT, as quantum effects have made it increasingly difficult to reduce feature size over the last 10-15 years....
|
|
|
Did you try ETH or ZEC?
With a 2 GB card, forget even trying ETH (though ETC might still be possible, as it has a smaller DAG file) - the DAG file is too big to fit on the card any more.
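For anyone wondering when 2 GB cards fell off: a back-of-the-envelope sketch using the Ethash spec constants (1 GiB initial DAG, ~8 MiB of growth per 30000-block epoch; the real size is rounded down to a prime number of entries, which I'm ignoring here):

```python
# Approximate Ethash DAG size per epoch, from the spec constants.
DATASET_BYTES_INIT = 2**30      # 1 GiB at epoch 0
DATASET_BYTES_GROWTH = 2**23    # ~8 MiB added per epoch
EPOCH_LENGTH = 30000            # blocks per epoch

def approx_dag_gib(epoch: int) -> float:
    """Approximate DAG size in GiB (ignores the prime-size adjustment)."""
    return (DATASET_BYTES_INIT + epoch * DATASET_BYTES_GROWTH) / 2**30

def first_epoch_exceeding(gib: float) -> int:
    """First epoch at which the approximate DAG outgrows the given GiB."""
    epoch = 0
    while approx_dag_gib(epoch) <= gib:
        epoch += 1
    return epoch

# A 2 GiB card stops fitting the DAG around this epoch:
print(first_epoch_exceeding(2.0))   # epoch 129
```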
|
|
|
I suspect it's about existing marketing leverage, while moving to the *current* CPU platform - which has quite a few inexpensive CPUs available; you don't HAVE to use a high-end i5 or i7, after all.
|
|
|
I just saw an article about that Alienware Amplifier thing last night - interesting solution but for the $300+ you pay JUST for the chassis (no card included), you can easily build a complete MACHINE (not counting cost of the GPU to be fair to both sides) and a 2-port KVM switch and end up with a MORE powerful machine than your laptop that you can run at the same time.
Also, if you dig into it, its connection is either 2x or 4x PCI-E (not an issue for *most* mining, or riser-based rigs running off 1x slots wouldn't work at all - but it's a definite handicap for Folding).
|
|
|
what do you think is the ideal size and specs of a mining hdd guys?
On new drives, the Seagate 8 TB Archive drives (if they have decent longevity) are the current king. What you want for BURST mining is max capacity at lowest cost with reasonable reliability - write speed doesn't matter except while plotting, and read speed on ANYTHING Serial ATA is plenty to keep up. Even the later PATA drives were plenty fast enough to handle the capacity they have, so the plots get read in a timely way.
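Rough numbers on why read speed doesn't matter for Burst (constants from memory - treat as approximate): each round the miner reads one 64-byte scoop out of the 4096 in every nonce, i.e. about 1/4096 of the plotted capacity, ideally well inside the ~240 s average block time:

```python
# Sketch of per-round read requirements for a Burst plot.
SCOOPS_PER_NONCE = 4096   # one 64-byte scoop per nonce is read each round
TARGET_SECONDS = 240      # approximate average Burst block time

def read_mb_per_round(plot_tb: float) -> float:
    """MB that must be read each round for a plot of the given size (TB)."""
    return plot_tb * 1e12 / SCOOPS_PER_NONCE / 1e6

def required_mb_per_s(plot_tb: float) -> float:
    """Sustained read rate needed to finish a round within one block time."""
    return read_mb_per_round(plot_tb) / TARGET_SECONDS

print(round(read_mb_per_round(8), 1))   # ~1953.1 MB per round for 8 TB
print(round(required_mb_per_s(8), 2))   # ~8.14 MB/s - trivial even for PATA
```

Even an 8 TB farm only needs single-digit MB/s sustained reads, which is why the capacity/cost ratio dominates drive choice.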
|
|
|
In my experience the SP20 was not quiet, but it was a LOT quieter than my Antminer S5 and Innosilicon A2 units - definitely at the LOW end of vacuum-cleaner noise levels.
|
|
|
The current 8.3 cent offer I got is for 580,000 kWh consumption per year; if I head towards 2 MW I will get 3 cents.
If your actual "ALL costs included" price per kWh of electric is in the 8.3 cents (US equivalent) range, that's not going to work well IF at all for a major mining facility. 3c/kWh on the other hand IS a good price range to run a major facility. Make sure you are looking at ALL of the costs though, not just the base rate charge. In many cases the "added charges" and surcharges total up to MORE than the base rate does. 2 MW is not "substation" power draw range, but it DOES take some substantial investment in infrastructure. Check out the Great Northern Data thread for some information - they're bigger than that, but not in a range where they're completely different on infrastructure requirements. 675 TH by current standards isn't all that big of a farm - it's definitely not SMALL, but by the standards of the Major Farms it's definitely on the low end.
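For a rough sense of scale (rates from the posts above; assuming the quoted price really is "all costs included" and the load runs 24/7):

```python
# Annual electricity cost for a continuous load at a flat all-in rate.

def annual_cost_usd(load_kw: float, usd_per_kwh: float, hours: float = 8760) -> float:
    """Cost of running load_kw continuously for a year (8760 hours)."""
    return load_kw * hours * usd_per_kwh

# A 2 MW facility at the two quoted rates:
print(round(annual_cost_usd(2000, 0.083)))  # $1,454,160/yr at 8.3 c/kWh
print(round(annual_cost_usd(2000, 0.03)))   # $525,600/yr at 3 c/kWh
```

Nearly a million dollars a year of difference between the two rates - which is why the all-in rate makes or breaks a facility at that scale.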
|
|
|
Same reason I've got about 8 TB running Burst - already had the drives, they weren't doing anything else, and the CPUs in those machines are underutilised.
|
|
|
Are you guys finding that different makes/speed of ram make any difference to mining rigs ?
Not that much on ZEC, which seems to be more compute-bound than memory bound. It DID make a big difference on Ethereum, which is pretty heavily memory-limited on most cards.
|
|
|
^You aren't the only one with Tonga.
It's true that this chip is inferior to Tahiti, but man, 160 hash/s is pretty low. Make it at least 220-230 h/s.
It looks like there are at least 4 of us with Tonga chipsets who have replied to this thread. Claymore, can you please see about making some improvements for this chipset? Sending a little love towards the "budget" miners using Tonga cards would be much appreciated. These cards really are capable of a lot more speed. Thanks again for all the hard work you do for the community!
I wish folks would talk about card models, not chipset names - some of us do NOT have the HUGE number of chipset names memorised.
|
|
|
This is good news. I hope they don't stop selling them like last time Bitmain put out a Scrypt ASIC Miner.
Bitmain AFAIK never actually sold their previous Scrypt design - they announced they were working on one, then dropped it.
|
|
|
There's no point buying 8GB for mining purposes only. Draws more power, doesn't perform better than 4GB.
8GB cards usually have 8000 MHz RAM vs 7000 or slower on 4GB cards - which IS a significant speed factor on Ethereum.
|
|
|
16 = green 15 = black
should work
Thanks Phil. Using the unpopulated pin hole as the marker, pins 15 and 16 should be the correct ones. I should really get a proper jumper kit for this, but for now the quick and dirty way will be sufficient for testing purposes. NOTE: working great, tested with a fan.
The 1300 G2, when I bought my pair, included a black plastic plug-in piece that acted as a jumper.
|
|
|
HDD data speeds matter a little, but are probably only a significant factor on large arrays or if you're using old PATA interface drives. I suspect that what you're using to process the blocks (CPU or GPU) is quite a bit more of a factor, especially if you optimised your plots.
Also, you're not using all THAT much internet bandwidth - plotting uses NO bandwidth at all, and I'm pretty sure that even the "found good plots" don't send a TON of data. Definitely not even in the ballpark of 1 TB per block.
|
|
|
You guys do realize that 90% of all people mining ZEC are using AMD cards.
Even if the new version is 10x as fast, the hashrate will just go up ~9x and there won't be much more profit anyway.
Our competition is Nvidia GPUs and CPU botnets however I am sure the total hashrate of all those devices with ZEC is less than 10%.
Most Nvidia's went back to ETH mining and most botnets went back to XMR.
We are all on "borrowed time" pretty much when it comes to mining. Our only shot at success is to sell ZEC, buy BTC and hold it, and hope the SEC approves the ETF and bitcoin goes to $10,000/coin.
I'd guess closer to 80%, but definitely a large majority are AMD - and quite a bit of it is likely 2GB cards that CAN'T do ETH any more. NVidia has NOT moved to ETH - 1070s in particular are better off on ZEC than ETH and have been since the Nicehash NVidia miner got up to speed. There HAS been some NVidia movement elsewhere though - stuff like LBRY, which NVidia dominates and where AMD is a non-factor.
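The dilution argument in the quoted post can be sketched as: if a fraction f of the network gets a k-times-faster miner, total hashrate scales by (1 - f) + f*k, and everyone's share of the rewards shrinks accordingly:

```python
# Network hashrate multiplier when a fraction of miners gets a speedup.

def network_multiplier(f_upgraded: float, speedup: float) -> float:
    """Total hashrate multiplier: upgraded fraction runs `speedup`x faster,
    the rest stay at 1x."""
    return (1 - f_upgraded) + f_upgraded * speedup

# 90% of miners (AMD, per the post) getting a 10x miner:
print(round(network_multiplier(0.9, 10), 2))   # 9.1 - close to the "9x" in the post
```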
|
|
|
I hope Claymore made it work with the new AMD 17.1.1 drivers - at the moment it's the best driver for speed.
That would be wrong - 17.1.1 does not work with modded BIOSes (even for ETH), and such cards are the vast majority. So 16.11.5 (or .3) is your best shot. 16.10.1 was the last WHQL-qualified driver that allowed a modded BIOS.
|
|
|
|