Vorksholk
Legendary
Offline
Activity: 1713
Merit: 1029
|
|
September 03, 2014, 02:41:36 AM |
|
Quote: So what happens to me? I did mine in 400GB plots, 1-1800000 and 1800001-3600000. Am I OK to mine, or should I have done it differently?
What are the names of your plot files?
|
|
|
|
neite99
|
|
September 03, 2014, 02:42:04 AM |
|
Someone just destroyed a 2 BTC buy wall.
2014-09-02 19:22:12 Sell 273369.80112995 2.41932274 0.00000885
It's been happening like that all day.
|
|
|
|
uray
|
|
September 03, 2014, 02:42:04 AM |
|
80 BTC volume on CCEX.
Wow...
It will keep rising until we surpass Nxt with their shitty Proof of Stake.
|
|
|
|
Razerglass
Legendary
Offline
Activity: 1036
Merit: 1000
https://bmy.guide
|
|
September 03, 2014, 02:43:13 AM |
|
Quote: So what happens to me? I did mine in 400GB plots, 1-1800000 and 1800001-3600000. Am I OK to mine, or should I have done it differently?
Quote: What are the names of your plot files?
16036893174223139082_1_1800000_1000 and 16036893174223139082_1800001_3600000_1000. Nobody explained to me that it was important, just to not have duplicate plots!
Edit: could somebody explain the importance of staggering and RAM and such? I have 8GB of RAM and can't set it over 1200 without it erroring, and I can't set the stagger higher than 2400 without an error, so I just left them both at 1000.
|
|
|
|
neite99
|
|
September 03, 2014, 02:49:43 AM |
|
Quote: 16036893174223139082_1_1800000_1000 and 16036893174223139082_1800001_3600000_1000. Nobody explained to me that it was important, just to not have duplicate plots! Edit: could somebody explain the importance of staggering and RAM and such? I have 8GB of RAM and can't set it over 1200 without it erroring, and I can't set the stagger higher than 2400 without an error, so I just left them both at 1000.
You're fine with those two. Stagger is nothing but how big each section is. You can use up to 8191 MB of RAM to "hold" the plots in RAM as they are created. It doesn't matter whether they are 500 or 8191 big; in the end they are the same, as long as they don't overlap. He asked for the names of your files because they show everything about how big the plots are and whether they overlap.
|
|
|
|
cyberspacemonkey
Legendary
Offline
Activity: 1288
Merit: 1002
|
|
September 03, 2014, 02:58:48 AM |
|
Quote: You're fine with those two. Stagger is nothing but how big each section is. You can use up to 8191 MB of RAM to "hold" the plots in RAM as they are created. It doesn't matter whether they are 500 or 8191 big; in the end they are the same, as long as they don't overlap.
Any tips on how to speed up plot generation? Does stagger size matter for speeding it up?
|
|
|
|
Vorksholk
Legendary
Offline
Activity: 1713
Merit: 1029
|
|
September 03, 2014, 02:59:41 AM |
|
Quote: 16036893174223139082_1_1800000_1000 and 16036893174223139082_1800001_3600000_1000. Nobody explained to me that it was important, just to not have duplicate plots! Edit: could somebody explain the importance of staggering and RAM and such? I have 8GB of RAM and can't set it over 1200 without it erroring, and I can't set the stagger higher than 2400 without an error, so I just left them both at 1000.
Yeah, those files are fine. With 8GB of RAM, either not enough of it is free, you are on a 32-bit version of Java (or OS!), or your -Xmx value is too small. Either way, stagger isn't too important; it just makes parsing the files a tad faster.
|
|
|
|
Vorksholk
Legendary
Offline
Activity: 1713
Merit: 1029
|
|
September 03, 2014, 03:00:25 AM |
|
Quote: Any tips on how to speed up plot generation? Does stagger size matter for speeding it up?
Up to a point, a larger stagger will speed up plot generation: a stagger of 100 will be far faster than a stagger of 10. However, once you get over 1000 or so, the benefit of higher staggers mostly lies in the time it takes to parse each file.
|
|
|
|
Razerglass
Legendary
Offline
Activity: 1036
Merit: 1000
https://bmy.guide
|
|
September 03, 2014, 03:01:18 AM |
|
I'm doing pooled mining and so far I have 160GB plotted. It says there are no valid shares to submit to the pool. Is this normal? I was told to verify I'm mining by checking the pool for my user name, but it hasn't shown up.
|
|
|
|
cyberspacemonkey
Legendary
Offline
Activity: 1288
Merit: 1002
|
|
September 03, 2014, 03:04:27 AM |
|
Quote: Up to a point, a larger stagger will speed up plot generation: a stagger of 100 will be far faster than a stagger of 10. However, once you get over 1000 or so, the benefit of higher staggers mostly lies in the time it takes to parse each file.
So I guess it depends on the speed of the CPU, right?
|
|
|
|
Vorksholk
Legendary
Offline
Activity: 1713
Merit: 1029
|
|
September 03, 2014, 03:08:43 AM |
|
Quote: So I guess it depends on the speed of the CPU, right?
Yup, plot generation all hinges on CPU power.
|
|
|
|
cyberspacemonkey
Legendary
Offline
Activity: 1288
Merit: 1002
|
|
September 03, 2014, 03:12:48 AM |
|
Quote: Yup, plot generation all hinges on CPU power.
Is that Linux plot generator in the OP really twice as fast? Is it only for Ubuntu, or would it work on openSUSE? I'd really like to use it; are there any instructions?
|
|
|
|
Razerglass
Legendary
Offline
Activity: 1036
Merit: 1000
https://bmy.guide
|
|
September 03, 2014, 03:14:28 AM |
|
Quote: Is that Linux plot generator in the OP really twice as fast? Is it only for Ubuntu, or would it work on openSUSE? I'd really like to use it; are there any instructions?
https://bitcointalk.org/index.php?topic=731923.msg8299637#msg8299637
I'm curious as well: is it faster?
|
|
|
|
Vorksholk
Legendary
Offline
Activity: 1713
Merit: 1029
|
|
September 03, 2014, 03:16:54 AM |
|
Quote: I'm curious as well: is it faster?
It's significantly faster, and will work on any Linux system which uses makefiles and gcc, openSUSE included.
wget https://bchain.info/plotgenerator.tgz
tar zxvf plotgenerator.tgz
make
You can edit the Makefile to add -O3 and -march=native, though I've personally seen no performance boost from doing so. On a 2x Intel E5-2560v2 (so 16 cores, 32 threads), this plot generator gets around 14000 nonces/minute, while the traditional Linux plotter gets a bit over 7000.
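Those nonce rates translate into disk throughput roughly as follows, assuming the standard Burst plot format of 4096 scoops of 64 bytes per nonce (256 KiB per nonce); the rates are the ones quoted in the post above:

```python
# 4096 scoops * 64 bytes = 262144 bytes (256 KiB) written per nonce,
# assuming the standard Burst plot layout.
NONCE_BYTES = 4096 * 64

def gib_per_hour(nonces_per_minute):
    """Convert a plotting rate in nonces/minute to GiB written per hour."""
    return nonces_per_minute * 60 * NONCE_BYTES / 2**30

print(f"{gib_per_hour(14000):.0f} GiB/h")  # rate quoted for this plotter
print(f"{gib_per_hour(7000):.0f} GiB/h")   # rate quoted for the traditional plotter
```

At 14000 nonces/minute that works out to roughly 200 GiB of plot data per hour, which puts the "twice as fast" claim in concrete terms.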
|
|
|
|
zhaohui1985
Newbie
Offline
Activity: 34
Merit: 0
|
|
September 03, 2014, 03:20:45 AM |
|
Is pool v1 or v2 faster once the plots are ready?
|
|
|
|
paradigmflux
|
|
September 03, 2014, 03:21:27 AM |
|
In the process of bringing an additional 1680 TB of usable space online.
|
|
|
|
cyberspacemonkey
Legendary
Offline
Activity: 1288
Merit: 1002
|
|
September 03, 2014, 03:21:47 AM |
|
Quote: It's significantly faster, and will work on any Linux system which uses makefiles and gcc, openSUSE included. (wget https://bchain.info/plotgenerator.tgz; tar zxvf plotgenerator.tgz; make)
Thanks! So after I enter those commands, what do I do? Will it make a text file somewhere, and where do I find it? Where do I edit the details of the plots? Also, how do I make it run? If you could please provide me with the command (sorry, I'm a Linux newbie over here).
|
|
|
|
bathrobehero
Legendary
Offline
Activity: 2002
Merit: 1051
ICO? Not even once.
|
|
September 03, 2014, 03:32:45 AM Last edit: September 03, 2014, 03:44:33 AM by bathrobehero |
|
Quote: I'm doing pooled mining and so far I have 160GB plotted. It says there are no valid shares to submit to the pool. Is this normal? I was told to verify I'm mining by checking the pool for my user name, but it hasn't shown up.
Same here. I moved my plots from pocminer to pocminer_pool. Also, my numerical address is not showing on the pool's list.
|
Not your keys, not your coins!
|
|
|
SpeedDemon13
|
|
September 03, 2014, 03:34:32 AM |
|
Hopefully the Windows port of the Linux version is out soon. I don't mind doing it on Linux; it's just that I prefer Windows in general. It does plot faster on Linux, at least on my Ubuntu machine.
|
CRYPTSY exchange: https://www.cryptsy.com/users/register?refid=9017 BURST= BURST-TE3W-CFGH-7343-6VM6R BTC=1CNsqGUR9YJNrhydQZnUPbaDv6h4uaYCHv ETH=0x144bc9fe471d3c71d8e09d58060d78661b1d4f32 SHF=0x13a0a2cb0d55eca975cf2d97015f7d580ce52d85 EXP=0xd71921dca837e415a58ca0d6dd2223cc84e0ea2f SC=6bdf9d12a983fed6723abad91a39be4f95d227f9bdb0490de3b8e5d45357f63d564638b1bd71 CLAMS=xGVTdM9EJpNBCYAjHFVxuZGcqvoL22nP6f SOIL=0x8b5c989bc931c0769a50ecaf9ffe490c67cb5911
|
|
|
IncludeBeer
Legendary
Offline
Activity: 1164
Merit: 1010
|
|
September 03, 2014, 03:36:53 AM |
|
I'm having a problem generating a 2nd plot on my HDD. I started pool mining, so I moved run_generate.bat and the pocminer jar file over to my poolminer folder, then edited the file to this:
C:\Windows\SysWOW64\java -Xmx1000m -cp pocminer.jar;lib/*;lib/akka/*;lib/jetty/* pocminer.POCMiner generate 4163282010088137402 4000000 5434000 1000 6%*
PAUSE
...the same as I used for my first plot file, except the ranges. My two files are named: 4163282010088137402_0_3890000_1000 and 4163282010088137402_4000000_5434000_1000.
The problem I'm seeing is this: the run_generate script keeps going past nonce 5434000, and keeps doing so until I run out of room, at which point I get an ugly error message and the generated plot is deleted. To work around it, I figured I could just kill it after it wrote nonce 5434000 (once it started generating 5435000). However, after restarting my miner, I still get an error saying it can't read the newly generated plot file (Error reading file: 4163282010088137402_4000000_5434000_1000). What's wrong with my run_generate? How can I kill it safely? I had no problems generating my first plot file. :/
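One way to sanity-check whether a plot like the one above finished writing cleanly is to compare its on-disk size with what the filename implies. This is my own sketch, assuming the standard 256 KiB per nonce (4096 scoops of 64 bytes) and that the third filename field is the nonce count:

```python
import os

# Assumed: 4096 scoops * 64 bytes = 262144 bytes per nonce.
NONCE_BYTES = 4096 * 64

def expected_size(plot_name):
    """Expected byte size of a plot file, given the assumed naming
    convention <account>_<start nonce>_<nonce count>_<stagger>."""
    _, _, count, _ = (int(x) for x in os.path.basename(plot_name).split("_"))
    return count * NONCE_BYTES

name = "4163282010088137402_4000000_5434000_1000"
print(expected_size(name))
# Compare against os.path.getsize(name): a killed-mid-write plot will
# usually come up short, which would explain the "Error reading file".
```

If the actual size is smaller than the expected size, the plot was truncated and re-plotting that range is the safe fix.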
|
|
|
|
|