AMD Chief Executive Lisa Su said on her company's earnings call last month: "But it's important to say we didn't have cryptocurrency in our forecast, and we're not looking at it as a long-term growth driver." Hmm, that is clearly a strategic mistake. Bye bye AMD.
I strongly suspect they are being realistic about it. The massive surge in mining over the last 4 months is NOT sustainable unless major altcoins like ETH start climbing hard on price again.
|
|
|
First tests and benchmarks were available for the Vega yesterday. In gaming the card landed in the middle between the 1080 and the 1080 Ti, and at a price of 599 USD I don't think that is appealing; for 700 you can get a 1080 Ti nowadays. We have to see it in action in mining, to find out whether the HBM2 type of memory can make any difference. I am also curious to see a few videos, or at least a few pics, of this card (legit version, not Chinese copies) running in mining software.
The VEGA 56 is widely reported to have a 399 price point, the air-cooled Vega 64 499. As I recall the WATER cooled Vega 64 is 599, but water-cooled cards are ALWAYS more expen$$$ive.
|
|
|
Sure, but competitors will not sleep forever (there are already quite a few good ASICs for altcoins) and may overtake the market of bitcoin miners, should they release a more powerful and economical miner for BTC sooner than Bitmain.
The only folks I can see managing that would be BitFury (their chip WAS in production before Bitmain's was on the current node, but BitFury sells to BIG farms and manufacturers only, so it flew under the radar for quite a while). Nobody else to date has managed to MATCH the S9, much less beat it, and I don't see any significant probability of that happening on the next node (very long shot if Intel, AMD, Samsung, or NVidia decided to enter the ASIC miner business, but I don't think that's likely as it's such a SMALL business compared to what any of those folks do NOW). The reason the L3+ isn't as close to saturation as the S9 is threefold:
(1) The S9 has been selling a LOT longer, so there are a TON more of them in use.
(2) The S9 has competition (Canaan/Avalon 721/741, whatever BW.COM has been using internally, BitFury) that is fairly CLOSE to its efficiency, has been selling for almost a year now (bit over, perhaps), and is HELPING it saturate the Bitcoin market.
(3) The price rise of Litecoin over the last 4 months is over 10 to 1; a comparable 10-to-1 price rise of Bitcoin took most of the last *2 years*, which allowed a lot more time for folks to buy hashrate up to the same ballpark as the price rise.
|
|
|
Is there any similar GPU-mineable coin? (with similar purposes)
EDIT - I've found Foldingcoin.
I also discovered yesterday that there is a "dogeathome" project that pays out DOGE to members of their team. Returns, though, are a LOT less than for CURE or FLDC. The total WEEKLY PPD for the team is less than my DAILY PPD - so it would be trivially easy for a "big producer" to dominate the very small returns.... It would be possible to merge CURE and "dogeathome", but not FLDC and "dogeathome", as the latter requires you to use a DOGE address as your name.
|
|
|
One issue is that if the 3-phase 480 is in the USA, it is typically Wye (3 lines and a Neutral), meaning that any one line-to-Neutral voltage is 277V, and PSUs won't like that.
At one Dell data center I used to work at, they had a few "small" autotransformers to step down the 277V to 240V to neutral. Much cheaper and more efficient than converting to 208V phase to phase. Additionally, PSUs that directly accept 277V exist, but are pretty hard to find. I am fairly sure that PFC is required as part of the 80Plus specification. At least one of Bitmain's power supplies is specifically designed for 277 volt input, might be all of them. Most server or ATX supplies however are NOT rated quite that high.
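The 277V figure falls straight out of the Wye geometry: line-to-neutral voltage is line-to-line voltage divided by the square root of 3. A minimal sketch:

```python
import math

def line_to_neutral(line_to_line_volts):
    """Phase-to-neutral voltage of a balanced 3-phase Wye service."""
    return line_to_line_volts / math.sqrt(3)

print(round(line_to_neutral(480)))  # 277 -- the voltage most PSUs dislike
print(round(line_to_neutral(208)))  # 120 -- ordinary US 208Y/120 service
```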
|
|
|
The period starting around March of this year has been one of the 2 most profitable cryptocoin mining periods EVER.
Its only real rival was the big Litecoin surge in the late 2013/early 2014 timeframe. That lasted 3-4 months, was followed by a few weeks where X11 (DarkCoin, now DASH) was profitable until a bunch of the GPUs that HAD been mining Litecoin shifted over, then pretty much died unless you had SUPER cheap electric.
Up side this time around - a LOT more "profitable" coins to choose from, and network hashrates and profitability that could soak a LOT more hashing power.
Down side - a lot more folks who already knew about cryptocoins and could jump ON the bandwagon sooner, plus more and more newbies, since cryptocoins have quite a bit more "legit" perception to them this time around.
|
|
|
Since ALL of the large supercomputers are massively parallel machines (some massively GPU-parallel), they might actually manage to beat an S9 at Bitcoin mining.
Tradeoff - the ELECTRIC usage of the things is commonly measured in MEGAwatts.....
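A rough back-of-envelope efficiency comparison shows why the power bill kills the idea. The S9 numbers below are Bitmain's published specs; the GPU figure is a loose assumption for illustration:

```python
def joules_per_gigahash(watts, gh_per_s):
    """Energy efficiency for SHA-256d mining: lower is better."""
    return watts / gh_per_s

# Antminer S9: ~14 TH/s at ~1375 W (Bitmain's published figures)
s9 = joules_per_gigahash(1375, 14_000)
# A single modern GPU on SHA-256d: very roughly 1 GH/s at ~150 W (assumed)
gpu = joules_per_gigahash(150, 1)

print(f"S9:  {s9:.3f} J/GH")
print(f"GPU: {gpu:.0f} J/GH, roughly {gpu / s9:.0f}x worse")
```

A megawatt-scale machine built from general-purpose silicon can win on raw hashrate while losing by orders of magnitude on joules per hash.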
|
|
|
I have MSI GeForce GTX 1070 GAMING X 8G and I achieve 450-460 Sol/s easily at 85% +90 Core +645 Mem with EWBF's CUDA Zcash miner. This card can take a lot more punishment than my other 1070s from ASUS but there are diminishing returns in further overclocking.
85% on THOSE cards though is 204 watts - which is a lot more than 100% TDP on the EVGA 1070 SC (which only has a 151 watt TDP). As I recall, the EVGA FTW models also have a higher TDP (180 watts I think), but I don't have any of those myself so not sure there. My SC cards don't handle +550 memory worth beans long-term, but they do handle +500 well - I have the originals that didn't have the added thermal pads though. Probably depends on the memory in a specific card.
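The wattage figures are just the power-limit percentage times the card's power-limit ceiling. A quick sketch (the 240 W ceiling for the Gaming X is implied by the 204 W figure above; the 151 W TDP is from the post):

```python
def board_power(ceiling_watts, power_limit_pct):
    """Approximate board draw at a given power-limit slider setting."""
    return ceiling_watts * power_limit_pct / 100

# MSI GTX 1070 Gaming X at 85% of its ~240 W ceiling
print(board_power(240, 85))   # 204.0
# EVGA 1070 SC at 100% of its 151 W TDP
print(board_power(151, 100))  # 151.0
```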
|
|
|
Even the "buy 5" offer is WAY too expensive.
This miner needs to be under 1 Bitcoin at current Bitcoin pricing to be worth even THINKING about.
|
|
|
Go into the benchmarks page and unclick the algorithms you do NOT want to mine, then restart mining.
**************** BUT *******************
IF you want to mine a specific CURRENCY, you need to mine through a pool for that currency or solo mine, NOT use Nicehash. You don't mine CURRENCIES through their service; you provide hashrate that is paid for via their market for that ALGORITHM, and whoever is buying your hashrate uses it to mine whatever THEY choose to mine with it.
|
|
|
Let's say I buy a power supply rated 2400W (2560W at the wall), and I connect an Antminer L3+ to it, which is 800W. Will the actual consumption of electricity be 2560W, or 800W (plus a 10-15% power supply inefficiency factor)?
That power supply rating is a MAXIMUM - the L3+ will only cause the power supply to pull however much power the L3+ actually uses plus the inefficiency factor of the power supply, so ballpark 900 watts. It's like a car engine rated for 300 horsepower max - but most of the time you are only using 10-20 HP when driving down the freeway at the speed limit.
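In other words, the wall draw is the load divided by the PSU's efficiency, not the PSU's rating. A minimal sketch (the 89% efficiency is an assumed figure for a decent PSU at that load):

```python
def wall_draw(load_watts, efficiency):
    """Power pulled at the wall for a given DC load and PSU efficiency."""
    return load_watts / efficiency

# Antminer L3+ at ~800 W on a PSU assumed ~89% efficient at that load:
print(round(wall_draw(800, 0.89)))  # 899 -- ballpark 900 W, not 2560 W
```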
|
|
|
AWG 14 is NOT going to be safe for operating a device 24/7 that close to its RATED capacity - that capacity is an INTERMITTENT rating; like any other electrical wiring, you need to derate it at least 20% for continuous usage.
AWG 12 or don't do it - and IMO it's not a good idea to run ANY high power miner on an extension cord.
Also, the COLOR of a cord means nothing - orange is common in cords intended for outdoor usage but the actual capacity of orange cords varies a LOT.
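The derating math above is simple; here is a sketch using the NEC-style 80% continuous rule (the 15 A rating for a 14 AWG cord and the ~1400 W miner are typical, assumed figures):

```python
def continuous_ampacity(rated_amps, derate=0.80):
    """Continuous-duty limit: NEC-style 80% of the nameplate rating."""
    return rated_amps * derate

print(continuous_ampacity(15))  # 12.0 -- a 15 A-rated 14 AWG cord, derated
print(round(1400 / 120, 1))     # 11.7 -- amps a ~1400 W miner pulls at 120 V
```

An S9-class load sits right at the derated limit of a 14 AWG cord, which is why 12 AWG is the safer call.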
|
|
|
BTC hasn't averaged 10% jumps for a while - it gets 1 or 2 in a row when Bitmain ships out a batch of S9 units, then flattens out for a while.
Also, once profitability for folks in the 3c/kwh electric range drops far enough, SALES of new miners will start dropping and diff increases on average will drop quite a bit.
We're in the middle of a transition phase right now. The current-generation miners like the S9 are the FIRST ever built on a then-current process node, and folks are still getting used to the fact that "new generation of miners every 6-9 months" isn't going to HAPPEN any more. ROI timeframes are also likely to extend out a lot more than the old-time norm of "better ROI in 6 months or less, or the new stuff is going to kill your profitability".
The big price surge on almost all cryptocoins this spring confused things a lot, but the NORM in the future is going to be an expectation of 1-3 YEARS to achieve ROI for most coins most of the time.
Unfortunately, most folks haven't wrapped their mind around that yet, they're too used to the OLD model where you never HAD 1-3 years to achieve ROI because the generations were so short.
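The 1-3 year ROI point can be sketched with a toy payback model. All numbers here are purely illustrative assumptions: flat power cost, and gross revenue shrinking as network difficulty grows:

```python
def days_to_roi(hardware_cost, daily_revenue, daily_cost,
                diff_growth_per_day, max_days=3 * 365):
    """Toy payback model: gross revenue shrinks as network difficulty
    grows, while power cost stays flat. Returns days to break even,
    or None if it never happens within max_days."""
    earned = 0.0
    revenue = daily_revenue
    for day in range(1, max_days + 1):
        earned += revenue - daily_cost
        if earned >= hardware_cost:
            return day
        revenue /= 1 + diff_growth_per_day
    return None

# Illustrative numbers only: a $1200 miner grossing $12/day with $2/day
# power, against difficulty growth of ~0.5%/day.
print(days_to_roi(1200, 12.0, 2.0, 0.005))
```

Even modest steady difficulty growth pushes breakeven out far past the naive cost-divided-by-daily-profit estimate.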
|
|
|
Win 7 and Win 10 both have their places; I prefer Linux for uptime.
All three suck at times. All three work well at times.
My Win10 boxes run for 3 weeks no problem. I bounce them at around 3 weeks mainly to proactively avoid crashes. They could probably go longer, but there's no reason to risk it. My "BigTrio" mining box (LINUX) routinely went MONTHS without downtime - and I don't think it's EVER had downtime unless I had a power outage, had to shut it down to move it, or was doing some sort of work on it. I've had more than a few LINUX machines go YEARS with no downtime except for power outages and hardware failures while doing crypto work - in fact that's been the NORM for me, not the exception. I tend to laugh at the idea of "weeks of uptime" meaning high reliability.
|
|
|
look at market cap
Market cap has ZERO direct effect on mining profitability. PRICE does, but price is only one of the components of market cap. Keep in mind that the coin with the SECOND highest number of GPUs pointed at it is probably ZEC (at ballpark 1/2 to 1/3 as many cards as ETH has), yet it's only #17 on your market cap list. How long a coin has been around has a major effect on market cap, as a coin that has been around a long time like ETH will have a lot more coins in existence than one that is less than a year old like ZEC.
I don't see ANY chance of a lawsuit over "lost mining potential" due to poor drivers getting traction against AMD or NVidia - they do NOT make their cards for mining, they make NO guarantee of their cards for mining, etc. The only possible tort would be "breach of contract", and there IS NO SUCH CONTRACT for them to breach. Might have a faint prayer against the AFTERMARKET makers that are making "mining cards", but I doubt even THAT would get anywhere in a court.
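To make the market-cap-vs-price distinction concrete, here is a small sketch (all numbers hypothetical): market cap is price times circulating supply, while per-hash mining revenue depends on price, block reward, and network hashrate - supply never enters it:

```python
def market_cap(price, circulating_supply):
    """Market cap is just price times circulating supply."""
    return price * circulating_supply

def revenue_per_hash(price, block_reward, blocks_per_day, network_hashrate):
    """Daily miner revenue per unit of hashrate; supply never appears."""
    return price * block_reward * blocks_per_day / network_hashrate

# Two hypothetical coins at the same price on identical networks:
old_coin_cap = market_cap(300, 95_000_000)  # long-lived coin, big supply
new_coin_cap = market_cap(300, 2_000_000)   # young coin, small supply
per_hash_pay = revenue_per_hash(300, 5, 5760, 1e9)

print(old_coin_cap, new_coin_cap)  # wildly different market caps...
print(per_hash_pay)                # ...same per-hash revenue for both
```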
|
|
|
Some bench from vega 64 with nicehash 1.8.1.0
Keccak: 0.724 GH/s
DaggerHashimoto: 31.275 MH/s
DaggerDecred: 31.317 MH/s / 0.940 GH/s
DaggerPascal: 30.993 MH/s / 0.930 GH/s
DaggerSia: 30.718 MH/s / 0.922 GH/s
Decred: 2.659 GH/s
CryptoNight: 800.000 H/s
Lbry: 0.192 GH/s
Pascal: 1.668 GH/s
X11Gost: 13.400 MH/s
Claymore 9.8 ETH + SIA 33MH/s - 1011MH/s
Fake. Could have been from the Frontier Edition, which IS a "Vega 64"-based card.
|
|
|
There have been quite a few scam sites that "spoofed" the real Bitmain site. That's not BITMAIN'S fault, and if you got taken, that's YOUR fault for not watching the address (that hyphen is NOTICEABLE). Bitmain themselves have a very good reputation for not shafting folks on orders - the bad part of their rep is about pathetic support on broken miners, and somewhat poor reliability on recent designs.
|
|
|
Funny, I've got the opposite: I dropped my mem clock by almost 150 on each card, +40 core, +25% power, and I am consistently getting around 780 sol/s per card - 2x Aorus GTX 1080 Ti Xtreme Edition. I'm using GFE, since AfterBurner doesn't let me adjust it properly.
What is GFE? He is probably talking about GeForce Experience, the add-on program (which I don't recommend). The best and simplest OC tool to use is Nvidia Inspector by Orbmu2k - portable, and it does not clutter up the system. I make a point of NEVER installing GeForce Experience. Horrid excuse for a program, even more worthless than Wattman (and I have no use for Wattman)....
|
|
|
So an ASIC chip could be developed to mine ZEC, as has been the case for BTC?
Possible, and I would not be shocked to see one show up before the end of the year, but I'm pretty sure the huge memory usage of ZEC compared to those algos that HAVE an ASIC available would make the cost and performance not all that great compared to GPU mining. I don't see it likely an ASIC could manage 100x the performance, and iffy if one could manage 10x the performance of current GPUs - and it would take something in the ballpark of 10x the performance to make a ASIC-based mining chip viable to create.
|
|
|
Quite a few older cards work well on ZEC mining, though nowhere near as efficiently as modern cards.
For perspective, my HD 7750 cards managed ballpark 60 sol/s, but were probably eating around half as much power to do so as a GTX 1070 does to mine 400+
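Putting rough numbers on that efficiency gap (the wattages are assumptions based on the post: ~75 W for the HD 7750, about half of a 1070's ~150 W while mining):

```python
def sols_per_watt(sols, watts):
    """Equihash efficiency metric: solutions per second per watt."""
    return sols / watts

# Assumed draws: HD 7750 at ~75 W, GTX 1070 at ~150 W while mining
old = sols_per_watt(60, 75)
new = sols_per_watt(400, 150)

print(f"HD 7750:  {old:.2f} sol/W")
print(f"GTX 1070: {new:.2f} sol/W, about {new / old:.1f}x more efficient")
```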
|
|
|
|