I'm mining ETN fine with CCminer 2.2.2, but when I try to mine MONA I always get the same error right at startup: Segmentation fault (core dumped)
Any idea what that might be?
Thanks!
|
|
|
I have tried cryptonight and my hash rate is barely a third of what I used to get with the previous version. I have also tried lyra2rev2 and I couldn't even start the miner; I get: Segmentation fault (core dumped)
|
|
|
Thanks, it was right in my face. I tried compiling it and everything went fine; now trying it with cryptonight to compare against the previous version...
|
|
|
Hello all,
I'm facing a similar issue. I'm running Ubuntu 16.04 with CUDA 9. When I try to compile the latest version of ccminer I get the following errors:
/usr/local/cuda/bin/nvcc -I. -I/usr/local/cuda/include -O3 -lineno -Xcompiler -Wall -D_FORCE_INLINES - gencode=arch=compute_20,code=\"sm_50,compute_50\" -o scrypt/salsa_kernel.o -c scrypt/salsa_kernel.cu
nvcc fatal : A single input file is required for a non-link phase when an outputfile is specified
Makefile:2253: recipe for target 'scrypt/salsa_kernel.o' failed
make[2]: *** [scrypt/salsa_kernel.o] Error 1
make[2]: Leaving directory '/home/laurent/Desktop/Mining/ccminer-1.8.2-tpruvot'
Makefile:1757: recipe for target 'all-recursive' failed
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory '/home/laurent/Desktop/Mining/ccminer-1.8.2-tpruvot'
Makefile:578: recipe for target 'all' failed
make: *** [all] Error 2
In my Makefile.am I have:

nvcc_ARCH = -gencode=arch=compute_50,code=\"sm_50,compute_50\"
nvcc_ARCH += -gencode=arch=compute_52,code=\"sm_52,compute_52\"
#nvcc_ARCH += -gencode=arch=compute_35,code=\"sm_35,compute_35\"
#nvcc_ARCH += -gencode=arch=compute_30,code=\"sm_30,compute_30\"
#nvcc_ARCH += -gencode=arch=compute_20,code=\"sm_21,compute_20\"
The compute_20 line is not enabled, but later on I see:

scrypt/salsa_kernel.o: scrypt/salsa_kernel.cu
	$(NVCC) $(JANSSON_INCLUDES) -I. @CUDA_INCLUDES@ @CUDA_CFLAGS@ - gencode=arch=compute_20,code=\"sm_21,compute_20\" -o $@ -c $<

Somehow it seems to force compute_20 even though it's commented out at the start of the file.
Any idea what I should do?
Thanks
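Two things stand out here. First, CUDA 9 dropped support for compute_20 entirely, so any rule that still passes arch=compute_20 to nvcc will fail. Second, the quoted command shows a space between "-" and "gencode"; if that space really is in the Makefile (and not just forum line-wrapping), nvcc treats "gencode=..." as a second input file, which is exactly what the "A single input file is required" fatal error means. A possible fix, assuming the hardcoded rule in Makefile.am looks like the one quoted above and that the nvcc_ARCH variable is defined before it, is to let salsa_kernel.o use the configured arch list instead of the hardcoded compute_20:

scrypt/salsa_kernel.o: scrypt/salsa_kernel.cu
	$(NVCC) $(JANSSON_INCLUDES) -I. @CUDA_INCLUDES@ @CUDA_CFLAGS@ $(nvcc_ARCH) -o $@ -c $<

Then re-run ./configure (or the project's autogen/build script) so the generated Makefile picks up the change.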
|
|
|
At this point, on both the 1070 ti and the 1080 ti my "go to" card is the EVGA SC version.
Agreed, but at $572 USD for the 1070 Ti and $798 USD for the 1080 Ti, I am forced to consider all the other options. In my country:

GTX 1060 3GB = $194 USD
GTX 1060 6GB = $281 USD
GTX 1070 = $417 USD
GTX 1070 Ti = $505 USD
GTX 1080 = $535 USD
GTX 1080 Ti = $764 USD
Vega 56 = $535 USD
Vega 64 = $558 USD

Not really liking any of those options for a new rig. Sadly, the 3GB 1060 seems the most attractive.

Sadly? Why? $194 is really good.
|
|
|
It's a hunt for the best price, but it's critical for ROI. I'm collecting EU prices here: GTX 1060, but I haven't done it for outside the EU. There isn't a single GTX 1060 below 200€ right now; it's the first time I've seen that in the EU since I started mining a couple of months ago.
|
|
|
It depends on WHAT you are mining.
The 1070 Ti is the most efficient card for ZEC mining right now, with the 1080 a close second (but worse on hash/$ at any setting; the 1070 and 1080 Ti are both fairly close on best efficiency OR hash/$).
For ETH the 1070 matches the 1070 Ti hashrate exactly (same memory system on a VERY memory-hard algorithm), and the 1070 blows the 1080 completely out of the water; the 1080 Ti beats the 1070, but not by much, making the 1070 by far the hash/$ leader of those four cards on ETH. SOME 1070 cards will mine ETH at 32 Mhash/s, but many of them do well to reach 30; even at stock clocks, though, they pretty much all do more than 28. But at $400 OR MORE they are not cost-effective vs a 1060 3GB card that does 22+ for about half the cost, and they are not even CLOSE to cost-effective vs RX 470/480/570/580 cards in the UNDER $250 range that can generally get to 28-30 Mhash (with BIOS mods) and are very close on efficiency.
Then there is Monero, where the Vega 56 is the current king of hash/watt (though some of the NVidia cards can argue that) and, *WHEN* you can get one semi-close to MSRP, the hash/$ winner by a LOT. The Vega 56 is even beating those old "open-compute" refurb Intel servers on hash/$....
If you can find a Vega 56 for under $500, it currently has an ROI in the 85-day ballpark, and has been consistently under 105 days for the last couple of weeks.
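Putting that hash/$ claim into numbers, here's a quick sketch using the ballpark prices and hashrates from the paragraphs above (rough thread figures, not live data):

# Hash-per-dollar on ETH with the ballpark figures quoted above.
cards = [
    ("GTX 1070", 30, 400),               # ~30 MH/s at $400 or more
    ("GTX 1060 3GB", 22, 200),           # 22+ MH/s at about half the cost
    ("RX 570/580 (BIOS mod)", 29, 250),  # 28-30 MH/s in the under-$250 range
]
for name, mh, usd in cards:
    print(f"{name}: {mh / usd:.3f} MH/s per $")
# GTX 1070: 0.075, GTX 1060 3GB: 0.110, RX 570/580: 0.116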
I fully agree with you. In theory those RX cards should be better than the GTX 1060, but they are either overpriced or unavailable, so the GTX 1060 has become the best alternative for ETH. For ZEC or XMR it's a different story: I tried my GTX 1060 rig on those currencies and, even if the result is not bad, these are not high-performing cards for them. Somehow my whole rig consumed far less power on ZEC and XMR than on ETH; for some reason those algorithms don't get all the juice out of the GTX 1060. The EVGA GTX 1060 (I have a couple of those) just went under $200 on Amazon: GTX 1060 on Amazon
|
|
|
My experience is the following:
For ETH, the GTX 1060 3GB seems to be the best: I get 24MH/s at 75-80W (for around $200), while with a 1070 I get 32MH/s at 90W (for around $400). Now if you want to mine ZEC, for example, the result will be different; a 1080 Ti would probably be a better option.
You seem foolish, buddy. A GTX 1070 8GB card can produce 32 MH/s, so you'll earn more income from that card alone, and if you can get cheaper electricity you'll make even more money with it. Performance-wise, the 1080 8GB card is better than the 1070. Check any site and you'll find information saying the 1070 Ti is the best card to mine with.

I'm not foolish, it's pure math. You'll have a faster ROI with the GTX 1060; in the long run you'll earn more with the GTX 1070. Today one GTX 1060 can generate $1.65/day on ETH while a GTX 1070 will deliver $2.20 (in the current conditions). At that rate you need 121 days to break even on a GTX 1060, while it would take 181 days for the GTX 1070 (that's almost 50% more!), and this is without taking power consumption into account, which would widen the gap further since the GTX 1060 has better MH/s per W. When mining you need the fastest ROI, because nobody knows how long mining will stay profitable. The time to break even only grows as difficulty increases, so it's not getting any better.
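Here is that break-even math spelled out as a small sketch (the card prices and $/day figures are the ones assumed just above; they move with the market):

def break_even_days(price_usd, revenue_per_day_usd):
    # Days until the card's purchase price is covered by daily revenue;
    # power cost and difficulty growth are ignored, as in the post above.
    return price_usd / revenue_per_day_usd

print(break_even_days(200, 1.65))  # GTX 1060 -> ~121.2 days
print(break_even_days(400, 2.20))  # GTX 1070 -> ~181.8 days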
|
|
|
you should buy from a site with a good return policy: you buy the cards, you test if the RAM is Samsung, and if not, you send them back. That's how I got Samsung models for all my cards.

But this way you'll never get to keep a single card, since all 1070s come with Micron now. Have you recently bought any 1070s? Or are you talking about 1060s? It's a lot easier to get Samsung RAM with 1060s; there are still plenty of them out there. But not with 1070s, not anymore.

I got Samsung RAM on a GTX 1060 recently; I bought my GTX 1070 last year, so I don't know.
|
|
|
Why buy such an old GPU? A Vega will give approximately the same income, but that GPU is the future. Next year production capacity will increase, and the old GPU you can throw in the trash, while a Vega will keep working for a very long time. It seems to me that my option is more promising and cheaper.

Vega is indeed newer (and probably better in some ways), but it's much more expensive and consumes more power. And don't count too much on resale value: all the GPUs currently used for mining will flood the market at some point (if they aren't broken before then). There are way too many cards out there because of mining; the gamer market is not big enough to absorb all the second-hand cards that will hit circulation in the coming months and years. If you mine, you need to accept the risk of not being able to sell your GPU (or only for a very low price).
|
|
|
For that budget you'll need to go cheap on everything in order to buy as many GPUs as possible. You could probably spend around $300 on everything but the GPUs, leaving $700 for them. For that price you get 3x GTX 1060 at 24MH/s each, which puts you around $4-5/day at current prices (see the sketch below).
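A rough sanity check on that estimate (the per-card hashrate and the per-1060 $/day figure are the ballpark numbers used earlier in this thread):

# Rough totals for the 3x GTX 1060 build sketched above.
cards = 3
mh_each = 24         # MH/s per card, as assumed above
usd_day_each = 1.65  # ballpark ETH revenue per GTX 1060 from earlier posts
print(f"{cards * mh_each} MH/s total")        # 72 MH/s
print(f"{cards * usd_day_each:.2f} USD/day")  # ~4.95 USD/day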
|
|
|
you should buy from a site with a good return policy: you buy the cards, you test if the RAM is Samsung, and if not, you send them back. That's how I got Samsung models for all my cards.

The GTX 1070 is a good card (I have one in my desktop), but it's twice the price of a GTX 1060 for only 33% more hashrate. With my GTX 1070 I get 32MH/s for a card around 400€; my GTX 1060 reaches 24MH/s for 200€ (even less if you're in the US) and draws less power (75W!).
|
|
|
AMD cards are much better for CryptoNight; 516 h/s is a good result for your card. You can go to http://monerobenchmarks.info/ and look at what hashrates people are getting with different cards.

Thanks for the link. Do you think it would be better to mine something else? Is there somewhere I can find info on this?

Anything with Ethash should be better with Nvidia cards.
|
|
|
For 8000 USD you can build 3 rigs like mine (worth $2400 each) plus some additional cards. With 3 rigs you would have a hashpower of 690MH/s and a power consumption of around 2550W. With your budget you could add 4 more GPUs and end up with 34 GPUs for a total hashpower of 782MH/s. That hashpower means about $37/day, a bit more than $1100 per month, for a total power draw of around 2700W. That would be $194 of electricity per month if you pay $0.10/kWh. If you go for an Nvidia rig you would pay less up front and less for power.
Like the others are saying here, that's without taking difficulty into account, so in the end it will always be lower than what the calculators say today.
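The electricity figure checks out; as a sketch (the wattage and rate are the assumptions from the post above):

# Monthly electricity cost for a rig at constant draw.
watts = 2700         # total rig draw assumed above
usd_per_kwh = 0.10
kwh_per_month = watts / 1000 * 24 * 30
print(f"{kwh_per_month:.0f} kWh -> {kwh_per_month * usd_per_kwh:.0f} USD/month")  # 1944 kWh -> 194 USD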
|
|
|
Electricity is key, but you need to choose your hardware carefully too, as it defines how much power you will need. You'll see a lot of debate here around AMD vs Nvidia, but from my experience Nvidia is the best in terms of power consumption. For $200 (or less) you get a GTX 1060 3GB that can go up to 23-24MH/s at 75W. The rest is simple math; take other examples and compare. AMD has better hashrates, but purchase price and power consumption make those cards less profitable. I have posted the hardware I chose here: https://bitcointalk.org/index.php?topic=2289941.0
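That "simple math" as a rerunnable sketch (the numbers are the GTX 1060 figures from this post; swap in your own local price and measured wall power for any other card):

# Hashrate per watt and per dollar for the card described above.
mh, watts, price_usd = 24, 75, 200  # GTX 1060 3GB figures from this post
print(f"{mh / watts:.2f} MH/s per W")      # 0.32
print(f"{mh / price_usd:.3f} MH/s per $")  # 0.120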
|
|
|
Can somebody help me with a GPU load problem? I have an 8x GTX 1080 rig. The miner shows an anomaly in GPU1's performance: it sits at some number in the 300-500 Sols/s range and stays there until the next run of the miner. Meanwhile, Afterburner shows partial load on that GPU (in the 50-95% range), also stable until the next run. What's wrong with my GPU or the miner, and how do I load it to 100% of its power? The other GPUs are working fine, and changing risers or PCIe slots has no effect.

With ZEC you don't have 100% GPU load all the time, and it also consumes less power. There is nothing wrong here.
|
|
|
24.3 MH/s stable on Claymore with an MSI GTX 1060 6GB Armor OCV1:
Power limit: 57%
Core: -197
Memory: +870
Fan: 65%
Temp: 54-56°C

Why do you guys reduce the core clock? Can't seem to get a response.

I'm wondering the same.

It's because the core clock's impact on performance matters less than the memory clock, and by lowering the core clock you draw less power and generate less heat. It's all a question of trying things out. I see that many people achieve better results with negative core clocks. Same for me; my cards crash faster when using positive core offsets.
|
|
|
P4ndoraBox: you could probably earn more by sticking to ETH alone. With that type of card you could reach 24MH/s on ETH instead of 20; that's probably worth more than the other currencies you're planning to mine.
|
|
|
The Bitcoin Gold site went down today, a DDoS attack apparently...