Bitcoin Forum
  Show Posts
141  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][BURST] Burst | Efficient POC Mining | Update to 1.0.3 before block 6500 on: August 31, 2014, 01:21:53 PM
It's faster to use each HDD on its own

Also, it would be nice to get some performance tips for mining (NOT for generating plots; there are lots of tips about generating plots in this thread already) ... are there any factors that affect mining performance besides the total storage space in the plots?

Does it matter if pocminer runs on a slow or fast computer?

Does it matter if pocminer runs on a computer with fast (latency- and bandwidth-wise) access to the disks (e.g. directly attached storage vs. USB storage)?
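
To frame the question, here's my own back-of-the-envelope sketch (assumptions mine: one 64-byte scoop out of 4096 is read from every plotted nonce each block, and a roughly 4-minute block time), which would suggest disk throughput matters far more than the CPU:

Code:
# Rough per-block read load for PoC mining -- a sketch under the assumptions above.
plot_bytes = 40 * 10**12       # ~40 TB of plot files (example figure)
scoops_per_nonce = 4096        # each 256 KiB nonce holds 4096 scoops of 64 bytes
block_time_s = 240             # approximate BURST block target

bytes_per_block = plot_bytes / scoops_per_nonce
print(f"read per block : {bytes_per_block / 1e9:.1f} GB")                   # ~9.8 GB
print(f"sustained rate : {bytes_per_block / block_time_s / 1e6:.0f} MB/s")  # ~41 MB/s total

If that's roughly right, seek latency and aggregate disk bandwidth would be the dominant factors, not the computer the miner runs on.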
142  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][BURST] Burst | Efficient POC Mining | Update to 1.0.3 before block 6500 on: August 31, 2014, 01:14:55 PM
Hi HDD Miners,

is it better to mine on a 16 TB RAID 0 (4x4 TB) than on each disk separately (plots from a to b, b+1 to c, c+1 to d)?

Thanks :-)

It's faster to use each HDD on its own


dcct, can you please elaborate on why this is faster? What kind of disk stress is present during mining? Also, are you suggesting one should run 4 pocminer instances instead of only one miner?
143  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][BURST] Burst | Efficient POC Mining | Update to 1.0.3 before block 6500 on: August 31, 2014, 11:27:58 AM
In the meantime, we should be trying to get listed on other exchanges.

The dev should contact support (at) cryptsy.com to request that BURST be added to https://www.cryptsy.com/coinvotes/
144  Economy / Service Announcements / Re: [ANN] NiceHash.com - innovative professional cryptocurrency cloud mining service on: August 31, 2014, 11:25:08 AM
Now I've tried to mine only on port 3335, without all the other algos, as always with the Linux NiceHash build of sgminer 5 from July 29th, but I still get only hardware errors.
X11 through X15 work fine.

#!/bin/sh
export DISPLAY=:0
export GPU_MAX_ALLOC_PERCENT=100
export GPU_USE_SYNC_OBJECTS=1
cd /home/xxx/sgminer5-0729
./sgminer --algorithm nscrypt -o stratum+tcp://stratum.nicehash.com:3335 -u xxx -p x --gpu-powertune 20 --gpu-fan 88 --thread-concurrency 8192 -g 2 -w 64 -I 17

Any ideas?

If you have AMD 14.x drivers, you'll get hardware errors on scrypt and scrypt-N; for those algorithms you should use the 13.x drivers.
145  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][BURST] Burst | Efficient POC Mining | Update to 1.0.3 before block 6500 on: August 31, 2014, 09:51:31 AM
I'm still generating plots (several plot files, each 1933 GB in size), and I'm also already solo mining (for approximately 24h now), but I still haven't found any block. Here is my latest output from pocminer:

Code:
{"baseTarget":"17597142","height":"6959","generationSignature":"ee821348dd2a62c031fe59dbb51f8f1c8c01639057ba4cef57516e4764acb07d"}
{"baseTarget":"17597142","height":"6959","generationSignature":"ee821348dd2a62c031fe59dbb51f8f1c8c01639057ba4cef57516e4764acb07d"}
{"baseTarget":"17597142","height":"6959","generationSignature":"ee821348dd2a62c031fe59dbb51f8f1c8c01639057ba4cef57516e4764acb07d"}
{"baseTarget":"17597142","height":"6959","generationSignature":"ee821348dd2a62c031fe59dbb51f8f1c8c01639057ba4cef57516e4764acb07d"}
{"baseTarget":"17597142","height":"6959","generationSignature":"ee821348dd2a62c031fe59dbb51f8f1c8c01639057ba4cef57516e4764acb07d"}
{"baseTarget":"17597142","height":"6959","generationSignature":"ee821348dd2a62c031fe59dbb51f8f1c8c01639057ba4cef57516e4764acb07d"}
{"baseTarget":"17597142","height":"6959","generationSignature":"ee821348dd2a62c031fe59dbb51f8f1c8c01639057ba4cef57516e4764acb07d"}
{"baseTarget":"17597142","height":"6959","generationSignature":"ee821348dd2a62c031fe59dbb51f8f1c8c01639057ba4cef57516e4764acb07d"}
{"baseTarget":"17597142","height":"6959","generationSignature":"ee821348dd2a62c031fe59dbb51f8f1c8c01639057ba4cef57516e4764acb07d"}
{"baseTarget":"17597142","height":"6959","generationSignature":"ee821348dd2a62c031fe59dbb51f8f1c8c01639057ba4cef57516e4764acb07d"}
{"baseTarget":"17597142","height":"6959","generationSignature":"ee821348dd2a62c031fe59dbb51f8f1c8c01639057ba4cef57516e4764acb07d"}
Error reading file: 172584643045544809_237680001_7920000_120000
{"baseTarget":"17597142","height":"6959","generationSignature":"ee821348dd2a62c031fe59dbb51f8f1c8c01639057ba4cef57516e4764acb07d"}
{"baseTarget":"15975501","height":"6960","generationSignature":"c92ac734253d2d21a80c81ad08bbfd7b5b769d9913ebcd6c2c95271e2c4d1b28"}
Error reading file: 172584643045544809_269360001_7920000_120000
Error reading file: 172584643045544809_213920001_7920000_120000
Error reading file: 172584643045544809_166400001_7920000_120000
{"baseTarget":"15975501","height":"6960","generationSignature":"c92ac734253d2d21a80c81ad08bbfd7b5b769d9913ebcd6c2c95271e2c4d1b28"}
Error reading file: 172584643045544809_158480001_7920000_120000
Error reading file: 172584643045544809_206000001_7920000_120000
{"baseTarget":"15975501","height":"6960","generationSignature":"c92ac734253d2d21a80c81ad08bbfd7b5b769d9913ebcd6c2c95271e2c4d1b28"}
New best: 172584643045544809:214311855
Submitting share
{"result":"success","deadline":359852}
Error reading file: 172584643045544809_110960001_7920000_120000
Error reading file: 172584643045544809_293120001_7920000_120000
{"baseTarget":"15975501","height":"6960","generationSignature":"c92ac734253d2d21a80c81ad08bbfd7b5b769d9913ebcd6c2c95271e2c4d1b28"}
New best: 172584643045544809:160097947
Submitting share
{"result":"success","deadline":50564}
Error reading file: 172584643045544809_1_8000000_500000
{"baseTarget":"15975501","height":"6960","generationSignature":"c92ac734253d2d21a80c81ad08bbfd7b5b769d9913ebcd6c2c95271e2c4d1b28"}
New best: 172584643045544809:206628251
Submitting share
{"result":"success","deadline":47208}
Error reading file: 172584643045544809_118880001_7920000_120000
{"baseTarget":"15975501","height":"6960","generationSignature":"c92ac734253d2d21a80c81ad08bbfd7b5b769d9913ebcd6c2c95271e2c4d1b28"}
Error reading file: 172584643045544809_79280001_7920000_120000
Error reading file: 172584643045544809_71360001_7920000_120000
Error reading file: 172584643045544809_198080001_7920000_120000
{"baseTarget":"15975501","height":"6960","generationSignature":"c92ac734253d2d21a80c81ad08bbfd7b5b769d9913ebcd6c2c95271e2c4d1b28"}
Error reading file: 172584643045544809_277280001_7920000_120000
Error reading file: 172584643045544809_253520001_7920000_120000
{"baseTarget":"15975501","height":"6960","generationSignature":"c92ac734253d2d21a80c81ad08bbfd7b5b769d9913ebcd6c2c95271e2c4d1b28"}
Error reading file: 172584643045544809_182240001_7920000_120000
Error reading file: 172584643045544809_174320001_7920000_120000
Error reading file: 172584643045544809_95120001_7920000_120000
{"baseTarget":"15975501","height":"6960","generationSignature":"c92ac734253d2d21a80c81ad08bbfd7b5b769d9913ebcd6c2c95271e2c4d1b28"}
Error reading file: 172584643045544809_301040001_7920000_120000
Error reading file: 172584643045544809_15920001_7920000_120000
Error reading file: 172584643045544809_103040001_7920000_120000
{"baseTarget":"15975501","height":"6960","generationSignature":"c92ac734253d2d21a80c81ad08bbfd7b5b769d9913ebcd6c2c95271e2c4d1b28"}
Error reading file: 172584643045544809_39680001_7920000_120000
Error reading file: 172584643045544809_221840001_7920000_120000
{"baseTarget":"15975501","height":"6960","generationSignature":"c92ac734253d2d21a80c81ad08bbfd7b5b769d9913ebcd6c2c95271e2c4d1b28"}
Error reading file: 172584643045544809_47600001_7920000_120000
Error reading file: 172584643045544809_150560001_7920000_120000
{"baseTarget":"15975501","height":"6960","generationSignature":"c92ac734253d2d21a80c81ad08bbfd7b5b769d9913ebcd6c2c95271e2c4d1b28"}

What are these "Error reading file" messages? Are they caused by the plots still being generated?

Currently I have almost 40 TB of plot files ... is there something wrong with my setup/mining, or shouldn't I have found some block already? Is there any option within the BURST client to actually verify that it is accepting shares?
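
For reference, the file names in the "Error reading file" lines look like they encode the plot parameters; here is a quick sketch of decoding one of them, assuming the usual <accountID>_<startNonce>_<nonceCount>_<stagger> naming convention:

Code:
# Decode a pocminer plot file name -- a sketch, assuming the
# <accountID>_<startNonce>_<nonceCount>_<stagger> naming convention.
name = "172584643045544809_237680001_7920000_120000"
account_id, start_nonce, nonce_count, stagger = (int(x) for x in name.split("_"))

size_gib = nonce_count * 256 / 1024 / 1024    # one nonce = 256 KiB
print(account_id, start_nonce, nonce_count, stagger)
print(f"expected size: {size_gib:.0f} GiB")   # ~1934 GiB, i.e. one of the 1933 GB files above

If that reading is right, the errors all point at files whose expected size matches the still-growing 1933 GB plots, which would be consistent with the miner touching plots that are not finished yet.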
146  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][BURST] Burst | Efficient POC Mining | Update to 1.0.3 before block 6500 on: August 30, 2014, 08:10:20 PM
Just wanted to ask if anybody has tested pocminer with shared storage. What happens if I run multiple instances of pocminer on different computers using the same shared storage for plot files? Do I get a multiplied "hashrate" (X = hashrate from one computer, N*X if running on N computers)?

Also, what happens if I run multiple instances of pocminer on the same computer: is the "hashrate" the same as if I ran a single instance? Is there any protection against a single plot file being used simultaneously by multiple miners (either on the same computer or from different ones)?
147  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][BURST] Burst | Efficient POC Mining | Update to 1.0.3 before block 6500 on: August 30, 2014, 10:38:25 AM
Question regarding the "staggnation" parameter when generating plots:

"staggnation : is the number of plots to group together per run, which causes less disk reads / seeks the higher it is, the lower the more disk reads/ seeks"

Does this parameter only affect performance when generating plots, or does it also affect performance later on, when mining with the generated plots?
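
My rough understanding of why it could matter for mining too (my assumption: the plot layout stores the same scoop of a group of `stagger` consecutive nonces contiguously, so the miner can read them in one go):

Code:
# Sketch: contiguous bytes the miner could read per seek for a given stagger,
# assuming one 64-byte scoop per nonce is read each block and that scoops of
# `stagger` consecutive nonces are stored back to back.
scoop_bytes = 64

for stagger in (8000, 120000, 500000):
    print(f"stagger {stagger:>6}: ~{stagger * scoop_bytes / 1024:.0f} KiB per seek")
# A larger stagger would mean larger sequential reads and fewer seeks per block,
# so it would affect mining throughput as well, not just plot generation.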
148  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][BURST] Burst | Efficient POC Mining | Update to 1.0.3 before block 6500 on: August 30, 2014, 05:21:37 AM
start_plot_num: the starting plot number; for every plot file you make, just start the plot at the previous run's start_plot + num_plots.
total_num_plots: the number of plots you wish to generate; each plot is 256 kB.

I have a few questions regarding plot sizes and the number of plots. Let's say I have 10 TB of free space in a RAID volume (Linux). Does it affect performance if I create one big 10 TB plot file, or is it better to create, for example, 5 plot files of 2 TB each? Is mining more efficient with one big plot file, or does it not matter?

Also, will my miner automatically use all plot files if there are many of them, or is any special configuration needed?

When I create multiple plots, is this the correct way:

1st plot:

./plot [myid] 1 2048000 8000 2
Creating plots for nonces 1 to 2048001 (500 GB) using 2000 MB memory and 2 threads

2nd plot:

./plot [myid] 2048002 2048000 8000 2
Creating plots for nonces 2048002 to 4096002 (500 GB) using 2000 MB memory and 2 threads

Do I start the 2nd plot at nonce 2048001 or at 2048002 (I'm not sure if nonce 2048001 is included in the 1st plot)?
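
For what it's worth, here is how I read the "start_plot + num_plots" rule quoted above, as a small sketch (the overlap checker linked earlier in the thread should be able to confirm the resulting ranges):

Code:
# Chain plot ranges using the quoted rule: next start = start_plot + num_plots.
def next_start(start_plot: int, num_plots: int) -> int:
    return start_plot + num_plots

start_1, count = 1, 2048000
start_2 = next_start(start_1, count)
print(start_2)   # 2048001 by that rule; whether the plotter's printed
                 # "to 2048001" upper bound is inclusive is worth checking
                 # with the range checker before committing 500 GB to disk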
149  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][BURST] Burst | Efficient POC Mining | Update to 1.0.3 before block 6500 on: August 29, 2014, 06:00:10 PM
How can I set up the wallet to listen not only on localhost, but also on a specified local IP address?
150  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][BURST] Burst | Efficient POC Mining | Update to 1.0.3 before block 6500 on: August 29, 2014, 03:10:07 PM
number (starting nonce) (number of nonces to plot) (memory allocation) (number of cores)
Take that and combine it with the guide and I'm sure you'll figure it out.

OK, thanks, I didn't scroll down to the bottom of the guide to see the 256kb*X formula ... sorry
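
For anyone else searching later, a quick sketch of that formula (256 KiB per nonce) applied to the run above:

Code:
# Plot size from nonce count -- each nonce is 256 KiB.
nonces = 2048000
size_bytes = nonces * 256 * 1024
print(f"{size_bytes / 2**30:.0f} GiB")   # 500 GiB, matching the plotter's "500 GB" message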

OK, I've started the C version of plot (on Linux). I can see the "plot" process running at 100% on one core (I have 12 cores and started plot with 12 cores/threads) ... is this normal? Also, I don't see any file being created; does it first fill the memory and then flush to disk?

*edit: yes, this turned out to be true (I have lots of RAM: 144 GB), so the "initial" phase obviously took a while; now it runs at full speed with all 12 cores utilized and I can see the file being created under the "plots" directory...
151  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][BURST] Burst | Efficient POC Mining | Update to 1.0.3 before block 6500 on: August 29, 2014, 02:14:46 PM
A few questions I couldn't find answers for:

How can I specify the maximum space that BURST takes? Or how exactly does this work: does it create a big file on the hard drive?

Also, does the speed of mining (hashrate) depend on how powerful the CPU is? Or is hard drive free space the only performance parameter?

You set the number of nonces; you can get this info from the guide. The CPU only matters during plotting; after that it barely gets used.

Hmm, "nonces" are not mentioned neither in windows or linux guide ... can you please point me to the guide where it shows how to set the size I want to use for mining BURST?
152  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][BURST] Burst | Efficient POC Mining | Update to 1.0.3 before block 6500 on: August 29, 2014, 01:57:55 PM
A few questions I couldn't find answers for:

How can I specify the maximum space that BURST takes? Or how exactly does this work: does it create a big file on the hard drive?

Also, does the speed of mining (hashrate) depend on how powerful the CPU is? Or is hard drive free space the only performance parameter?
153  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][BURST] Burst | Efficient POC Mining | Update to 1.0.3 before block 6500 on: August 27, 2014, 10:22:37 AM
I have a couple of questions:

a) Does the effectiveness/performance of mining BURST depend on disk speed, disk size, both, or something else as well? If yes, are there any reference/baseline performance numbers?

b)

dcct's plot generator (twice the speed, Linux only): https://bchain.info/plotgenerator.tgz
Plot range checker: https://bchain.info/BURST/tools/overlap (thanks to dcct)
Plot merge tool (significantly reduces disk stress, Linux only): https://bchain.info/merge.c (thanks to dcct)

Can somebody explain how I can actually use these performance tips? Where and how do I use these files (on CentOS Linux)?

Thanks!
154  Alternate cryptocurrencies / Mining (Altcoins) / Re: [ANN] sgminer v5 - new unified multi-algorithm on-the-fly kernel switching miner on: August 22, 2014, 01:31:15 PM
lol what a little fucking bitch you are kenshirothefist.

Offer $100 to finish what so many people have worked on and tested for the last 3 months, and you've benefited from nearly all the hashpower.

FUCK YOU.

Take it easy, @platinum4 ... I guess you misunderstood my message. What I was trying to say is that these are difficult days for GPU miners, and I wanted to strike a positive note by helping to get the official sgminer_v5 released. So, take it easy, @platinum4, and sorry if you felt offended as a GPU miner ... fingers crossed for some new profitable GPU-based coin to come out soon!

And remember: we were the original sponsors of sgminer_v5 (the initial development was paid for), we put a lot of effort into testing and helping with the development, and by releasing multi-algo mining we enabled many miners (including you) to earn more than they would have without multi-algo. Sure, we also made some profit, but it was a win-win situation for everybody. Now, let's see if we can get sgminer_v5 finally released...
155  Alternate cryptocurrencies / Mining (Altcoins) / Re: [ANN] sgminer v5 - new unified multi-algorithm on-the-fly kernel switching miner on: August 22, 2014, 06:50:30 AM
Hey, sgminer devs. I don't like partly finished software projects, so I would really like to see sgminer_v5 finally released as stable and merged into the master tree with a version bump to 5.0.0. The community is waiting for the v5 release; all those depressed GPU miners with miserable payouts need some good news.

Anyway, I'm putting up a 0.2 BTC bounty for whoever finalizes this: releasing v5.0 as stable by moving it to master and doing the version bump, as I would really like to see this job finished. Payment will be made to the BTC address listed in https://github.com/sgminer-dev/sgminer/blob/v5_0/AUTHORS.md for the author who does the finalization and release process.

See also: https://github.com/sgminer-dev/sgminer/issues/375

*edit: bounty valid until 31/08/2014; that is, the job must be completed by the end of August 2014
156  Alternate cryptocurrencies / Mining (Altcoins) / Re: [ANN] sph-sgminer: multi-coin multi-algorithm GPU miner | added MaruCoin on: August 20, 2014, 03:58:11 PM
FYI: this thread is a dead end.

shm is no longer actively involved in sph-sgminer development (https://bitcointalk.org/index.php?topic=475795.msg6731253#msg6731253)

sph-sgminer is not being developed anymore

sph-sgminer has been merged into the new sgminer_v5; please take a look at this thread: https://bitcointalk.org/index.php?topic=632503.0

I do believe that you can consider this thread closed.
157  Bitcoin / Mining software (miners) / Re: OFFICIAL CGMINER mining software thread for linux/win/osx/mips/arm/r-pi 4.5.0 on: August 14, 2014, 08:49:18 AM
Hi,

Many cgminer users are already using NiceHash.com, especially scrypt and SHA-256 ASIC miners; therefore it would be very nice if a small stratum protocol extension called mining.extranonce.subscribe (as described here: https://www.nicehash.com/software/#devs) were implemented in cgminer.

I understand that this is a custom feature, and I know you are not very fond of custom features. Nevertheless, there are already many custom features implemented in cgminer to support various ASIC devices, and it would be really nice if you could add this one, since it would improve efficiency for cgminer users mining on NiceHash: no disconnects would be needed when we switch buyers' orders internally in our system (there is no other way for us to delegate the work than to cut off part of the extranonce for ourselves). Of course, this can be implemented as an optional feature, enabled by a parameter (e.g. "extranonce-subscribe" : true).
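
To make the request more concrete, here is a rough sketch of the two messages involved, as I understand the extension described in the linked doc (the id and extranonce values below are illustrative placeholders, not anything cgminer-specific):

Code:
# Sketch of the mining.extranonce.subscribe exchange (illustrative values only).
import json

# Client -> pool, sent once after the normal mining.subscribe/authorize:
subscribe = {"id": 3, "method": "mining.extranonce.subscribe", "params": []}
print(json.dumps(subscribe))

# Pool -> client, pushed when the proxy reassigns work internally instead of
# dropping the connection; params carry the new extranonce1 and extranonce2 size:
set_extranonce = {"id": None, "method": "mining.set_extranonce",
                  "params": ["08000002", 4]}
print(json.dumps(set_extranonce))

On receiving mining.set_extranonce the miner would simply start using the new extranonce for subsequent work, which is what avoids the disconnect.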

It is especially important for this feature to be implemented in cgminer because the majority of ASIC device manufacturers take your cgminer and burn it into their devices, and in some cases it is hard to put a custom (patched) version onto such devices.

This feature has already been included, for example, in the latest sgminer (search for "extranonce" here: https://github.com/nicehash/sgminer/commits/master) and in the official sgminer Git source code at https://github.com/sgminer-dev/sgminer/commits/v5_0

P.S.: I do realize that this is not an optimal implementation; it would be better to extend the stratum protocol in a way that makes extensions easily auto-detectable. However, the stratum protocol is probably not worth spending too much time on, since it is a dead end as a centralized protocol ... that's why we proposed this kind of simplified patch.

Thanks and let me know if you need any help!
158  Economy / Service Announcements / Re: [ANN] NiceHash.com - innovative professional cryptocurrency cloud mining service on: August 01, 2014, 05:39:13 PM
A few comments about WestHash and motivation behind it:

- due to the nature of our service (a hash power proxy) we can't just set up geographically distributed servers and point all hash power at one system; it is technically impossible due to network latency and overall network stability (it would be very inefficient for providers as well as for buyers). Thus it makes sense to bring our service as close as possible to miners and as close as possible to pools (to be effective for buyers)
- WestHash is, in a way, an experiment: it brings opportunities for providers and buyers with a new market that extends NiceHash; we'll see how it goes and adjust accordingly

About the instability in the past few days: it was partly related to expanding with WestHash, but mainly due to yet another specialized attack with tens of thousands of fake miners (bots). It was a tough one, but we've resolved it now; sorry for the inconvenience.
159  Economy / Service Announcements / Re: [ANN] NiceHash.com - innovative professional cryptocurrency cloud mining service on: August 01, 2014, 04:14:43 AM
Providers and Buyers, we are pleased to announce WestHash.com: a geographically separate system, running as a whole under the global NiceHash.com master system.

WestHash.com brings attractive opportunities for providers as well as buyers. Its market is independent from NiceHash.com; however, accounting, payments and everything else are fully integrated into a single system. We built WestHash.com to come closer to all those miners and end-target pools located in the western and central USA, to provide better efficiency and also to shuffle the market a bit. And yes, you guessed it: in the near future we will also cover Asia by introducing AsiaHash.com. Stay tuned!



We welcome you to try our new WestHash.com system:

Buyers: you can place orders just the same as you do on NiceHash, with the same accounts and the same online wallet.

Providers: you can point your miners to stratum.westhash.com; the same rules apply as on NiceHash (same ports, multi-algo, etc.).

We wish you lots of fun and profit using our global professional cryptocurrency cloud mining service! Thank you!
160  Economy / Service Announcements / Re: [ANN] NiceHash.com - innovative professional cryptocurrency cloud mining service on: July 31, 2014, 10:23:58 PM
Providers: as part of our server upgrade we have changed the IP address of our main proxy. This is a long-term setting, as we have now enabled dynamic scaling.

If your miner can't connect, please do the following (not necessarily all of it):
- flush the DNS cache on your rig (Windows: "ipconfig /flushdns")
- restart your mining software
- restart your rig (in case DNS won't refresh)

In the following days we will add even more capacity to our proxy servers and, most importantly, introduce an additional USA proxy. Stay tuned.

Pool operators: please whitelist our new IP (contact us for details if you need assistance).

Thanks for using NiceHash!