Isokivi
|
|
September 08, 2013, 08:21:30 AM |
|
Thanks! Is there a logfile for each mining startup? I think it's much easier to find the best settings from such a logfile than from 5-minute samples.
A workaround cron job has been in progress for two days, but it isn't working yet.
|
Bitcoin trinkets now on my online store: btc trinkets.com <- Bitcoin Tiepins, cufflinks, lapel pins, keychains, card holders and challenge coins.
|
|
|
jlsminingcorp
|
|
September 08, 2013, 09:13:05 AM |
|
Thanks! Is there a logfile for each mining startup? I think it's much easier to find the best settings from such a logfile than from 5-minute samples.
A workaround cron job has been in progress for two days, but it isn't working yet. Yes, there's been some long-distance debugging going on (thanks to Isokivi for trying stuff out on the real miner), as things are not always working as expected on the miner. I'd have thought that we'll have something sorted later today.
|
|
|
|
Conqueror
Legendary
Offline
Activity: 1354
Merit: 1020
I was diagnosed with brain parasite
|
|
September 08, 2013, 12:31:33 PM |
|
Hi,
I want to buy one starter kit and 2 additional boards. In the shop it is marked "October delivery". So if I order now, when should I expect to have it at home (Czech Republic)? Any real estimate will be very helpful.
Thanks!
|
|
|
|
shade88
Member
Offline
Activity: 119
Merit: 10
|
|
September 08, 2013, 12:36:16 PM |
|
Hi,
I want to buy one starter kit and 2 additional boards. In the shop it is marked "October delivery". So if I order now, when should I expect to have it at home (Czech Republic)? Any real estimate will be very helpful.
Thanks!
End of October, they're currently busy with the 200TH farm.
|
|
|
|
Conqueror
Legendary
Offline
Activity: 1354
Merit: 1020
I was diagnosed with brain parasite
|
|
September 08, 2013, 12:44:14 PM |
|
Hi,
I want to buy one starter kit and 2 additional boards. In the shop it is marked "October delivery". So if I order now, when should I expect to have it at home (Czech Republic)? Any real estimate will be very helpful.
Thanks!
End of October, they're currently busy with the 200TH farm. You're saying they use the hardware for their own enrichment instead of shipping to customers? That is not very nice at all...
|
|
|
|
dani
|
|
September 08, 2013, 12:51:50 PM |
|
You're saying they use the hardware for their own enrichment instead of shipping to customers? That is not very nice at all...
Nothing new. When you order something with October delivery, you know it's going to be the end of the month; if you don't like it, don't order.
|
Hai
|
|
|
jlsminingcorp
|
|
September 08, 2013, 12:57:39 PM |
|
Hi,
I want to buy one starter kit and 2 additional boards. In the shop it is marked "October delivery". So if I order now, when should I expect to have it at home (Czech Republic)? Any real estimate will be very helpful.
Thanks!
End of October, they're currently busy with the 200TH farm. You're saying they use the hardware for their own enrichment instead of shipping to customers? That is not very nice at all... Actually, that's not quite right. I think the 200TH mine (formerly the 100TH mine) was supposed to be built from August-delivery kit. My impression was that the plan was always to split August hardware between customers and the 100TH mine. In fact, megabigpower seem to have prioritised their customers (rather than their mining operation) by using 100TH mine hardware to fulfil their customers' orders (not sure quite where bitfurystrikesback fit in with this). The October delivery of chips and subsequent hardware is something slightly different: bitfurystrikesback need to receive chips (late Sept/early Oct apparently), build the hardware, and ship it out to customers already in the order queue before they get to new orders, which is why the current delivery estimate for new orders is late October. I think this is about right; somebody please correct me if I'm talking rubbish.
|
|
|
|
Meizirkki
|
|
September 08, 2013, 02:00:44 PM |
|
You're saying they use the hardware for their own enrichment instead of shipping to customers? That is not very nice at all... I think you're getting it wrong. They aren't mining with customers' hardware, but with their own slice of the cake, which was never promised to customers in the first place. Nothing wrong with that.
|
|
|
|
Sitarow
Legendary
Offline
Activity: 1792
Merit: 1047
|
|
September 08, 2013, 02:10:27 PM |
|
You're saying they use the hardware for their own enrichment instead of shipping to customers? That is not very nice at all... I think you're getting it wrong. They aren't mining with customers' hardware, but with their own slice of the cake, which was never promised to customers in the first place. Nothing wrong with that. The project financing was done with IPO funds, not with the orders that were placed in August. The IPO for that mine is also open to the public.
|
|
|
|
BBQKorv
|
|
September 08, 2013, 02:13:00 PM |
|
If they say it ships in October, then it does. The package will spend some time in transit depending on the destination and whether it's expedited or regular speed. This is completely different from Avalon or BFL, don't worry.
|
|
|
|
Isokivi
|
|
September 08, 2013, 02:16:06 PM |
|
The project financing was done with IPO funds, not with the orders that were placed in August. The IPO for that mine is also open to the public.
Are you referring to https://picostocks.com/stocks/view/19 or something else?
|
|
|
|
Sitarow
Legendary
Offline
Activity: 1792
Merit: 1047
|
|
September 08, 2013, 02:16:54 PM |
|
The project financing was done with IPO funds, not with the orders that were placed in August. The IPO for that mine is also open to the public.
Are you referring to https://picostocks.com/stocks/view/19 or something else? That's it.
|
|
|
|
Conqueror
Legendary
Offline
Activity: 1354
Merit: 1020
I was diagnosed with brain parasite
|
|
September 08, 2013, 02:31:31 PM |
|
If they say it ships in October, then it does. The package will spend some time in transit depending on the destination and whether it's expedited or regular speed. This is completely different from Avalon or BFL, don't worry. Well, it is a lot of money and a question here costs nothing... so I asked first.
|
|
|
|
xstr8guy
|
|
September 08, 2013, 03:43:58 PM |
|
Hi,
I want to buy one starter kit and 2 additional boards. In the shop it is marked "October delivery". So if I order now, when should I expect to have it at home (Czech Republic)? Any real estimate will be very helpful.
Thanks!
In Europe you need to buy from BSB... http://www.bitfurystrikesback.com/
|
|
|
|
jlsminingcorp
|
|
September 08, 2013, 03:54:45 PM |
|
Hi All, Isokivi and I managed to hack together a script for logging data from the chainminer output. This might well be useful for those tuning chips by hand, or for those that just like to gather data and make pretty graphs in Excel.

Please change the user-configurable variables at the beginning of the script to suit your system and to set how long you would like to log for. The script will generate a log file for all of the H-boards ("boards.log") and then separate log files for all of the chips in each H-board. Output is comma-separated for easy import into your favourite spreadsheet. If you find any bugs then let me know, but it seems to be working OK on Isokivi's miner.

#!/bin/bash
# Bitfury chainminer logfile consolidation script
# Jlsminingcorp and Isokivi, September 2013
# Version 1.3

# User-configurable variables
# $logfile is the path to the bitfury chainminer log file
# $output is the path to the board-data output file that you would like to write to
# $outputdir is the directory to store output in
# $logtime is the time (in minutes) to collect data for
# $numboards is the number of H-boards in your miner
logfile="/run/shm/.stat.log"
output="./boards.log"
outputdir="."
logtime="20"
numboards="2"

# Timestamps
datestamp=$(ls --full-time "$logfile" | awk '{print $6}')
timestamp=$(ls --full-time "$logfile" | awk '{print $7}' | awk -F"." '{print $1}')

# If the log file or output files don't exist then take appropriate action
if [ ! -e "$logfile" ]; then
    echo "$(date) : Logfile does not exist in the specified location"
    echo "$(date) : Logfile does not exist in the specified location" >> "$output"
    exit 1
fi

if [ ! -e "$output" ]; then
    echo "Date,Time,Board Position,Speed,Noncerate [GH/s],Hashrate [GH/s],Good,Errors,SPI-Errors,Miso-Errors" > "$output"
fi

for (( i=1; i<="$numboards"; i++ )); do
    chipout="$outputdir/chips_board_$i"
    if [ ! -e "$chipout" ]; then
        echo "Chip stats for board: $i" > "$chipout"
        echo "Date,Time,Chip,ProgParams,Speed,Noncerate,Hashrate,Nonces/round,False nonce,SPIerr,MISOerr,Jobs/5min (hash rate),ChipID,CoresOK" >> "$chipout"
    fi
done

echo "Starting to log data"
echo "Time to collect data for: $logtime minutes"

# During the data collection period (set by $logtime) parse data from the logfile to the output files
let countdown="$logtime"*"60"
while [ "$countdown" -ge "0" ]; do

    # If the timestamp in the log file is the same as the timestamp on the last entry
    # in the output file then sleep for a while. Should make sure that we're somewhere
    # in the middle of the 5 minute chainminer logging period. Could use "while" here,
    # but risk getting stuck in a never-ending loop if the log file is not being updated.
    prevtimestamp=$(tail -n 1 "$output" | awk -F"," '{print $2}')
    if [ "$timestamp" == "$prevtimestamp" ]; then
        echo "Chainminer log file not yet updated. Will now sleep for a short while."
        echo "Chainminer log file not yet updated. Will now sleep for a short while." >> "$output"
        sleep 60
        timestamp=$(ls --full-time "$logfile" | awk '{print $7}' | awk -F"." '{print $1}')
    fi

    # Strip board data out of the chainminer log file and copy it to the output file
    IFS=$'\r\n'
    datalines=($(grep -A "$numboards" record "$logfile" | tail -n "$numboards"))
    for i in "${datalines[@]}"; do
        echo -ne "$datestamp","$timestamp", >> "$output"
        echo "$i" | tr ":" " " | awk '{$1=$1}1' OFS="," >> "$output"
    done

    # Strip chip data out of the chainminer log file and copy it to the chip output files (one for each H-board)
    for (( i=1; i<="$numboards"; i++ )); do
        chipout="$outputdir/chips_board_$i"
        let startline="$i"*"16"-"15"
        let endline="$i"*"16"
        while read line; do
            echo -ne "$datestamp","$timestamp", >> "$chipout"
            echo "$line" | awk '{for (i=1; i<=12; i++) printf("%s%s", $(i), i<12 ? OFS="," : "\n"); }' >> "$chipout"
        done < <(awk 'NR==v1,NR==v2' v1="${startline}" v2="${endline}" "$logfile")
    done

    echo "Time remaining: $countdown seconds"
    if [ "$countdown" -gt "0" ]; then
        sleep 300
    fi
    let countdown="$countdown"-"300"
    timestamp=$(ls --full-time "$logfile" | awk '{print $7}' | awk -F"." '{print $1}')

done

echo "Finished logging data"
exit 0
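Once a run has completed, the comma-separated boards.log imports straight into a spreadsheet, but it can also be summarised from the shell. This is just a rough sketch, not part of the script: the column numbers below (3 = board position, 5 = noncerate, 6 = hashrate) match the header the script writes, and the sample file is made-up data purely for illustration.

```shell
# Made-up sample in the layout the script writes (illustration only)
cat > boards.log <<'EOF'
Date,Time,Board Position,Speed,Noncerate [GH/s],Hashrate [GH/s],Good,Errors,SPI-Errors,Miso-Errors
2013-09-08,14:00:01,1,54,20.10,21.50,500,4,0,0
2013-09-08,14:05:01,1,54,19.90,21.30,498,6,0,0
2013-09-08,14:00:01,2,54,18.50,20.10,470,9,0,0
EOF

# Average noncerate and hashrate per board position
awk -F"," 'NR > 1 && NF >= 6 { n[$3] += $5; h[$3] += $6; c[$3]++ }
END { for (b in n)
          printf("Board %s: avg noncerate %.2f GH/s, avg hashrate %.2f GH/s (%d samples)\n",
                 b, n[b]/c[b], h[b]/c[b], c[b]) }' boards.log
```

Swap in your real boards.log (and skip the sample-file step) to get per-board averages over a whole run rather than eyeballing 5-minute samples.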
|
|
|
|
dani
|
|
September 08, 2013, 04:28:19 PM |
|
Hi All, Isokivi and I managed to hack together a script for logging data from the chainminer output. This might well be useful for those tuning chips by hand, or for those that just like to gather data and make pretty graphs in Excel. +1. I managed to get it running, I guess. Reporting back when it's finished (I set the total runtime to 180 min, so don't expect a reply shortly).
|
Hai
|
|
|
Isokivi
|
|
September 08, 2013, 06:16:14 PM Last edit: September 09, 2013, 03:02:59 AM by Isokivi |
|
Here's how you use the makeshift stat collector with the tools currently available. Please note that this is a work in progress, but as far as I can see it works:

ssh in to your bitfury-pi and make a folder /home/pi/logs. In the folder:
nano logger.sh
Paste the following code in, modify the board count and desired data collection time, and save the file.

#!/bin/bash
# Bitfury chainminer logfile consolidation script
# Jlsminingcorp and Isokivi, September 2013
# Version 1.3

# User-configurable variables
# $logfile is the path to the bitfury chainminer log file
# $output is the path to the board-data output file that you would like to write to
# $outputdir is the directory to store output in
# $logtime is the time (in minutes) to collect data for
# $numboards is the number of H-boards in your miner
logfile="/run/shm/.stat.log"
output="./boards.log"
outputdir="."
logtime="20"
numboards="2"

# Timestamps
datestamp=$(ls --full-time "$logfile" | awk '{print $6}')
timestamp=$(ls --full-time "$logfile" | awk '{print $7}' | awk -F"." '{print $1}')

# If the log file or output files don't exist then take appropriate action
if [ ! -e "$logfile" ]; then
    echo "$(date) : Logfile does not exist in the specified location"
    echo "$(date) : Logfile does not exist in the specified location" >> "$output"
    exit 1
fi

if [ ! -e "$output" ]; then
    echo "Date,Time,Board Position,Speed,Noncerate [GH/s],Hashrate [GH/s],Good,Errors,SPI-Errors,Miso-Errors" > "$output"
fi

for (( i=1; i<="$numboards"; i++ )); do
    chipout="$outputdir/chips_board_$i"
    if [ ! -e "$chipout" ]; then
        echo "Chip stats for board: $i" > "$chipout"
        echo "Date,Time,Chip,ProgParams,Speed,Noncerate,Hashrate,Nonces/round,False nonce,SPIerr,MISOerr,Jobs/5min (hash rate),ChipID,CoresOK" >> "$chipout"
    fi
done

echo "Starting to log data"
echo "Time to collect data for: $logtime minutes"

# During the data collection period (set by $logtime) parse data from the logfile to the output files
let countdown="$logtime"*"60"
while [ "$countdown" -ge "0" ]; do

    # If the timestamp in the log file is the same as the timestamp on the last entry
    # in the output file then sleep for a while. Should make sure that we're somewhere
    # in the middle of the 5 minute chainminer logging period. Could use "while" here,
    # but risk getting stuck in a never-ending loop if the log file is not being updated.
    prevtimestamp=$(tail -n 1 "$output" | awk -F"," '{print $2}')
    if [ "$timestamp" == "$prevtimestamp" ]; then
        echo "Chainminer log file not yet updated. Will now sleep for a short while."
        echo "Chainminer log file not yet updated. Will now sleep for a short while." >> "$output"
        sleep 60
        timestamp=$(ls --full-time "$logfile" | awk '{print $7}' | awk -F"." '{print $1}')
    fi

    # Strip board data out of the chainminer log file and copy it to the output file
    IFS=$'\r\n'
    datalines=($(grep -A "$numboards" record "$logfile" | tail -n "$numboards"))
    for i in "${datalines[@]}"; do
        echo -ne "$datestamp","$timestamp", >> "$output"
        echo "$i" | tr ":" " " | awk '{$1=$1}1' OFS="," >> "$output"
    done

    # Strip chip data out of the chainminer log file and copy it to the chip output files (one for each H-board)
    for (( i=1; i<="$numboards"; i++ )); do
        chipout="$outputdir/chips_board_$i"
        let startline="$i"*"16"-"15"
        let endline="$i"*"16"
        while read line; do
            echo -ne "$datestamp","$timestamp", >> "$chipout"
            echo "$line" | awk '{for (i=1; i<=12; i++) printf("%s%s", $(i), i<12 ? OFS="," : "\n"); }' >> "$chipout"
        done < <(awk 'NR==v1,NR==v2' v1="${startline}" v2="${endline}" "$logfile")
    done

    echo "Time remaining: $countdown seconds"
    if [ "$countdown" -gt "0" ]; then
        sleep 300
    fi
    let countdown="$countdown"-"300"
    timestamp=$(ls --full-time "$logfile" | awk '{print $7}' | awk -F"." '{print $1}')

done

echo "Finished logging data"
exit 0

Then:
chmod +x logger.sh
./logger.sh

Once it has finished:
less chips_board_1
Copy the contents and paste them to http://anduck.net/bfsb/# and press Go. Repeat the above steps for the next board. Enjoy!

If you feel like tipping the people who contributed, jlsminingcorp (the script) 1JNeDQsT6Jh9XGqhcQPHZkpKzA9YASvNTT and Anduck (the web-app) 1Anduck6bsXBXH7fPHzePJSXdC9AEsRmt4 are the proper recipients.
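With more than a couple of boards, the view-and-copy step can be repeated with a small loop instead of running less on each file by hand. This is just a hypothetical convenience sketch: it assumes the chips_board_N files the script produces and a numboards value matching your setup.

```shell
numboards=2   # set to your board count

# Print each per-board chip log with a separator, ready to copy
# into the web app one board at a time
for (( i=1; i<=numboards; i++ )); do
    echo "===== chips_board_$i ====="
    cat "chips_board_$i" 2>/dev/null || echo "(no log for board $i yet)"
done
```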
|
|
|
|
jlsminingcorp
|
|
September 08, 2013, 06:58:13 PM |
|
Here's how you use the makeshift stat collector with the tools currently available. Please note that this is a work in progress, but as far as I can see it works: ssh in to your bitfury-pi, make a folder /home/pi/logs, in the folder: nano logger.sh, paste the code in, modify the board count and desired data collection time, and save the file. chmod +x logger.sh ./logger.sh Once it has finished: less chips_board_1 Copy the contents and paste them to http://anduck.net/bfsb/# and press Go. Repeat the above steps for the next board. Enjoy! If you feel like tipping the people who contributed, jlsminingcorp (the script) [address pending] and Anduck (the web-app) 1Anduck6bsXBXH7fPHzePJSXdC9AEsRmt4 are the proper recipients. That's really cool, congrats Anduck!
|
|
|
|
jlsminingcorp
|
|
September 08, 2013, 07:00:07 PM |
|
Hi All, Isokivi and I managed to hack together a script for logging data from the chainminer output. This might well be useful for those tuning chips by hand, or for those that just like to gather data and make pretty graphs in Excel. +1. I managed to get it running, I guess. Reporting back when it's finished (I set the total runtime to 180 min, so don't expect a reply shortly). Great stuff, would be interested to hear how it goes. We can tweak if necessary.
|
|
|
|
jlsminingcorp
|
|
September 08, 2013, 07:14:13 PM |
|
Here's how you use the makeshift stat collector with the tools currently available. Please note that this is a work in progress, but as far as I can see it works: ssh in to your bitfury-pi, make a folder /home/pi/logs, in the folder: nano logger.sh, paste the code in, modify the board count and desired data collection time, and save the file. chmod +x logger.sh ./logger.sh Once it has finished: less chips_board_1 Copy the contents and paste them to http://anduck.net/bfsb/# and press Go. Repeat the above steps for the next board. Enjoy! If you feel like tipping the people who contributed, jlsminingcorp (the script) [address pending] and Anduck (the web-app) 1Anduck6bsXBXH7fPHzePJSXdC9AEsRmt4 are the proper recipients. Just a quick note for those who may not be linux natives: you can run the script in the background by typing "./logger.sh &". You will then be able to read the consolidated log files while they are being generated, using the same terminal window (no need for the script to complete its run), and copy and paste them as per Isokivi's instructions into Anduck's great web app. Oh, and just a word of warning: be careful of logging for too long, as I don't know how much free storage space there is on the Pi, and in principle you could log yourself out of free space. Keep an eye on the size of the log files, as data from previous runs is not deleted on subsequent runs (you would need to delete the log files between runs to get rid of the previous data).
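The size warning above can be turned into a quick check before (or between) runs. A rough sketch, assuming the file names the script uses; the 10 MB threshold is an arbitrary number I picked for illustration, not anything chainminer enforces.

```shell
# Total size (in KB) of the logger's output files, if any exist
total_kb=0
for f in boards.log chips_board_*; do
    [ -e "$f" ] || continue
    total_kb=$((total_kb + $(du -k "$f" | awk '{print $1}')))
done

# Warn above an arbitrary 10 MB threshold; old data is only ever
# removed by hand, e.g. with: rm -f boards.log chips_board_*
if [ "$total_kb" -gt 10240 ]; then
    echo "Log files are using ${total_kb} KB; consider deleting old runs."
else
    echo "Log files are using ${total_kb} KB."
fi
```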
|
|
|
|
|