paradigmflux (OP)
July 19, 2014, 03:17:30 PM Last edit: July 19, 2014, 05:07:50 PM by paradigmflux
In this thread I am going to go through the steps required to build your very own POS multipool, complete with pretty front-end stats and automatic exchange trading. This will only work for bitcoin-related coins. I don't promise to share all the secrets, but enough to get a site up and running. I am going to tell you all of this completely free, and I am not going to pollute this with donation solicitations - if you would like to send me some BTC for writing this once you're done reading it, PM me and I'll send you my address. I will make some effort to help with the implementation if anyone has trouble. It might not be the prettiest, but it works - and I won't blackmail coins out of any of you.

This is a big screw-you to all the MP operators who have forgotten where this entire scene came from. Satoshi released the bitcoin client open source - quit charging people a bitcoin to set them up a multipool and quit trying to screw everyone over. This implementation, by the way, does not fudge any of the stats like some of the other major pools do, and it offers complete visibility into the stats. It also doesn't do any of that "everybody contributes shares to an algo-specific bucket that pays out once per day" crap where the pool operator can steal 1/3 of the money off the top before anyone can tell. This uses a combination of python, node.js and even basic cronjobs to get the job done.

Step #1 - Start with a basic NOMP install. git clone from the NOMP repo: https://github.com/zone117x/node-open-mining-portal.git

Step #2 - Set up your coin daemons as if this was a basic NOMP build, with a few exceptions. In the /coins/ directory you are going to be limited to coins listed on either Cryptsy or Mintpal - I could easily have added some other markets to this, so feel free.
For each coin in the /coins directory, include an additional line:

"ID": "<either the market ID from Cryptsy, or the word "mintpal" if the coin is only on Mintpal>"

IE:
root@blackcoinpool# cat feathercoin.json
{
    "name": "Feathercoin",
    "ID": "5",
    "symbol": "FTC",
    "algorithm": "scrypt"
}
The pool is going to use this to look up values for the coin when it calculates some of the stats later. Also, when setting up the coin daemons, we aren't going to use the NOMP payment processor for anything more than calculating the balances tables in redis. So make sure to set every coin's minimum payout to 999999999999999, and in the main config.json disable worker name authentication. Main config.json...
"defaultPoolConfigs": {
    "blockRefreshInterval": 1000,
    "jobRebroadcastTimeout": 55,
    "connectionTimeout": 600,
    "emitInvalidBlockHashes": false,
    "validateWorkerUsername": false,
    "tcpProxyProtocol": false,
    "banning": {
        "enabled": true,
        "time": 600,
        "invalidPercent": 30,
        "checkThreshold": 500,
        "purgeInterval": 300
    },
    "redis": {
        "host": "127.0.0.1",
        "port": 6379
    }
},
"website": {
    "enabled": true,
    "host": "0.0.0.0",
    "port": 80,
    "stratumHost": "everypool.com",
    "stats": {
        "updateInterval": 60,
        "historicalRetention": 86400,
        "hashrateWindow": 300
    },
This is from inside one of the /pool_config/ coin configurations:

"paymentProcessing": {
    "enabled": true,
    "paymentInterval": 75,
    "minimumPayment": 99999999999,
    "daemon": {
        "host": "127.0.0.1",
        "port": <whatever port your coin daemon is listening on>,
        "user": "birdsofafeather",
        "password": "flocktogether"
    }
},
"There should not be any signed int. If you've found a signed int
somewhere, please tell me (within the next 25 years please) and I'll
change it to unsigned int." -- Satoshi
paradigmflux (OP)
July 19, 2014, 03:17:48 PM Last edit: July 19, 2014, 05:06:24 PM by paradigmflux
Next up, we are going to replace the stats.js file to extend the API with some new functionality. This is going to be a long post. In the /libs/ directory, rename stats.js to stats.old. Open up a new stats.js and paste in the following three snippets - modify the SECOND section. If your target coin is only on Mintpal, uncomment the lines calling the Mintpal API and fill in your ticker symbol. If your coin is on Cryptsy, fill in the appropriate market ID and the Cryptsy ticker symbol. This file will extend your API in a few ways. Most importantly, it will make <your url>/api/payout/<worker name> return the estimated number of coins a worker has earned during the current shift (in the payout coin of your choice). It will also extend the 'my miner' page so that every worker can see a complete list of exactly how many coins of each type they have earned during the current shift.

var zlib = require('zlib');
var redis = require('redis');
var async = require('async');
var request = require('request');
var os = require('os');
var algos = require('stratum-pool/lib/algoProperties.js');
module.exports = function(logger, portalConfig, poolConfigs){
var _this = this;
var logSystem = 'Stats';
var redisClients = [];
var redisStats;
this.statHistory = [];
this.statPoolHistory = [];
this.stats = {};
this.statsString = '';
setupStatsRedis();
gatherStatHistory();
var canDoStats = true;
Object.keys(poolConfigs).forEach(function(coin){
if (!canDoStats) return;
var poolConfig = poolConfigs[coin];
var redisConfig = poolConfig.redis;
for (var i = 0; i < redisClients.length; i++){
    var client = redisClients[i];
    if (client.client.port === redisConfig.port && client.client.host === redisConfig.host){
        client.coins.push(coin);
        return;
    }
}
redisClients.push({
    coins: [coin],
    client: redis.createClient(redisConfig.port, redisConfig.host)
});
});
function setupStatsRedis(){
    redisStats = redis.createClient(portalConfig.redis.port, portalConfig.redis.host);
    redisStats.on('error', function(err){
        logger.error(logSystem, 'Historics', 'Redis for stats had an error ' + JSON.stringify(err));
    });
}
function gatherStatHistory(){
var retentionTime = (((Date.now() / 1000) - portalConfig.website.stats.historicalRetention) | 0).toString();
redisStats.zrangebyscore(['statHistory', retentionTime, '+inf'], function(err, replies){
    if (err) {
        logger.error(logSystem, 'Historics', 'Error when trying to grab historical stats ' + JSON.stringify(err));
        return;
    }
    for (var i = 0; i < replies.length; i++){
        _this.statHistory.push(JSON.parse(replies[i]));
    }
    _this.statHistory = _this.statHistory.sort(function(a, b){
        return a.time - b.time;
    });
    _this.statHistory.forEach(function(stats){
        addStatPoolHistory(stats);
    });
});
}
function addStatPoolHistory(stats){
    var data = {
        time: stats.time,
        pools: {}
    };
    for (var pool in stats.pools){
        data.pools[pool] = {
            hashrate: stats.pools[pool].hashrate,
            workerCount: stats.pools[pool].workerCount,
            blocks: stats.pools[pool].blocks
        }
    }
    _this.statPoolHistory.push(data);
}
this.getCoins = function(cback){ _this.stats.coins = redisClients[0].coins; cback(); };
this.getPayout = function(address, cback){
async.waterfall([
function(callback){
_this.getBalanceByAddress(address, function(){
callback(null, 'test'); });
},
function(msg, callback){
var totaltargetcoin = 0;
async.each(_this.stats.balances, function(balance, cb){
    _this.getCoinTotals(balance.coin, balance.balance, function(targetcoin){
        if (typeof(targetcoin) != "undefined"){
            totaltargetcoin += targetcoin;
        }
        cb();
    });
}, function(err){
    callback(null, totaltargetcoin);
});
}
], function(err, total){
cback(total.toFixed());
}); };
this.getBalanceByAddress = function(address, cback){
    var client = redisClients[0].client,
        coins = redisClients[0].coins,
        balances = [];
    payouts = []; // deliberately left global - getCoinTotals reads this later via _this.stats.payout
    client.hgetall('Payouts:' + address, function(error, txns){
        if (error) {
            // no payouts found for this address - leave payouts empty
            return;
        }
        if (txns !== null){
            payouts = txns;
        }
    });
    async.each(coins, function(coin, cb){
        client.hget(coin + ':balances', address, function(error, result){
            if (error){
                cb('There was an error getting balances');
                return;
            }
            if (result === null) {
                result = 0;
            }
            balances.push({
                coin: coin,
                balance: result
            });
            cb();
        });
    }, function(err){
        _this.stats.balances = balances;
        _this.stats.address = address;
        cback();
    });
};
this.getCoinTotals = function(coin, balance, cback){
    var client = redisClients[0].client,
        coinData = poolConfigs[coin].coin;
    logger.error(logSystem, 'TEMP', 'var is ' + JSON.stringify(poolConfigs[coin].coin));
    //logger.error(logSystem, 'TEMP', 'coinData.ID variable is:' + coinData.ID);
async.waterfall([
    // Get all balances from redis if no balance was provided already
    function(callback){
        if (balance) {
            callback(null, balance);
            return;
        }
        client.hgetall(coin + ':balances', function(error, results){
            if (error){
                callback('There was an error getting balances');
                return;
            }
            callback(null, results);
        });
    },
THIS NEXT PART OF THE FILE YOU NEED TO MAKE SOME CHANGES TO - this is a continuation of the file above though.

    // make a call to Mintpal (or Cryptsy) to get the targetcoin exchange rate
    function(balances_results, callback){
        var options = {
            // url: 'https://api.mintpal.com/market/stats/<TICKER SYMBOL OF YOUR TARGET COIN HERE>/BTC',
            url: 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=<CRYPTSY MARKET ID OF YOUR TARGET COIN>',
            json: true
        }
        request(options, function (error, response, body) {
            if (!error && response.statusCode == 200) {
                // var targetcoin_price = parseFloat(body[0].last_price);
                var targetcoin_price = body['return'].markets['<YOUR POS TARGET COIN TICKER SYMBOL HERE>'].lasttradeprice;
                callback(null, targetcoin_price, balances_results);
            } else {
                callback('There was an error getting mintpal targetcoin exchange rate');
            }
        });
    },
The rest of the stats.js is below - just paste all three of these into the same file, remembering to fill in your info in the second part.

    // make a call to get this coin's exchange rate
    function(targetcoin_price, balances_results, callback){
// logger.error(logSystem, 'TEMP', '#1 ---- coinData.ID variable is:' + coinData.ID);
        if (coinData.ID === 'mintpal') {
            var optionsB = {
                url: 'https://api.mintpal.com/market/stats/' + coinData.symbol + '/BTC',
                json: true
            }
            request(optionsB, function (error, responseB, bodyB) {
                if (!error && responseB.statusCode == 200) {
                    var coinB_price = parseFloat(bodyB[0].last_price);
                    logger.error(logSystem, 'TEMP', 'coinB_price variable is:' + coinB_price);
                    callback(null, targetcoin_price, coinB_price, balances_results);
                } else {
                    callback('There was an error getting mintpal exchange rate');
                }
            });
        } else if (coinData.ID) {
            var options = {
                url: 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=' + coinData.ID,
                json: true
            }
            request(options, function (error, response, body) {
                if (!error && response.statusCode == 200) {
                    var coin_price = body['return'].markets[coinData.symbol].lasttradeprice;
                    /*
                    if (coin_price.toString().indexOf('-') === -1) {
                        // Good, it doesn't have a dash - no need to convert it to a fixed number
                    } else {
                        var decimal_places = coin_price.toString().split('-')[1];
                        coin_price = coin_price.toFixed(parseInt(decimal_places));
                    }
                    */
                    callback(null, targetcoin_price, coin_price, balances_results);
                } else {
                    callback('There was an error getting the cryptsy exchange rate');
                }
            });
        } else {
            callback(null, targetcoin_price, coinData.rate, balances_results);
        }
    },
    // Calculate the amount of targetcoin earned from the worker's balance
    function(targetcoin_price, coin_price, balances_results, callback){
        if (typeof balances_results !== 'object') {
            var total_coins = balances_results;
            var bitcoins = parseFloat(total_coins) * coin_price;
            var balance = (bitcoins / targetcoin_price);
            callback(null, balance);
            return;
        }
        var balances = [];
        for (var worker in balances_results){
            var total_coins = parseFloat(balances_results[worker]) / 1;
            var bitcoins = total_coins.toFixed() * coin_price;
            var balance = (bitcoins / targetcoin_price);
            balances.push({ worker: worker, balance: balance.toFixed(8) });
        }
        callback(null, balances);
    }
], function(err, balances){
if(balance) { cback(balances); return; }
_this.stats.balances = balances; _this.stats.payout = payouts; //logger.error(logSystem, 'TEMP', '_this.stats right before CB variable is:' + JSON.stringify(_this.stats));
cback(); });
};
this.getGlobalStats = function(callback){
var statGatherTime = Date.now() / 1000 | 0;
var allCoinStats = {};
    async.each(redisClients, function(client, callback){
        var windowTime = (((Date.now() / 1000) - portalConfig.website.stats.hashrateWindow) | 0).toString();
        var redisCommands = [];
        var redisCommandTemplates = [
            ['zremrangebyscore', ':hashrate', '-inf', '(' + windowTime],
            ['zrangebyscore', ':hashrate', windowTime, '+inf'],
            ['hgetall', ':stats'],
            ['scard', ':blocksPending'],
            ['scard', ':blocksConfirmed'],
            ['scard', ':blocksOrphaned']
        ];
        var commandsPerCoin = redisCommandTemplates.length;
        client.coins.map(function(coin){
            redisCommandTemplates.map(function(t){
                var clonedTemplates = t.slice(0);
                clonedTemplates[1] = coin + clonedTemplates[1];
                redisCommands.push(clonedTemplates);
            });
        });
        client.client.multi(redisCommands).exec(function(err, replies){
            if (err){
                logger.error(logSystem, 'Global', 'error with getting global stats ' + JSON.stringify(err));
                callback(err);
            }
            else {
                for (var i = 0; i < replies.length; i += commandsPerCoin){
                    var coinName = client.coins[i / commandsPerCoin | 0];
                    var coinStats = {
                        name: coinName,
                        symbol: poolConfigs[coinName].coin.symbol.toUpperCase(),
                        algorithm: poolConfigs[coinName].coin.algorithm,
                        hashrates: replies[i + 1],
                        poolStats: {
                            validShares: replies[i + 2] ? (replies[i + 2].validShares || 0) : 0,
                            validBlocks: replies[i + 2] ? (replies[i + 2].validBlocks || 0) : 0,
                            invalidShares: replies[i + 2] ? (replies[i + 2].invalidShares || 0) : 0,
                            totalPaid: replies[i + 2] ? (replies[i + 2].totalPaid || 0) : 0
                        },
                        blocks: {
                            pending: replies[i + 3],
                            confirmed: replies[i + 4],
                            orphaned: replies[i + 5]
                        }
                    };
                    allCoinStats[coinStats.name] = (coinStats);
                }
                callback();
            }
        });
    }, function(err){
        if (err){
            logger.error(logSystem, 'Global', 'error getting all stats' + JSON.stringify(err));
            callback();
            return;
        }
var portalStats = { time: statGatherTime, global:{ workers: 0, hashrate: 0 }, algos: {}, pools: allCoinStats };
        Object.keys(allCoinStats).forEach(function(coin){
            var coinStats = allCoinStats[coin];
            coinStats.workers = {};
            coinStats.shares = 0;
            coinStats.hashrates.forEach(function(ins){
                var parts = ins.split(':');
                var workerShares = parseFloat(parts[0]);
                var worker = parts[1];
                if (workerShares > 0) {
                    coinStats.shares += workerShares;
                    if (worker in coinStats.workers)
                        coinStats.workers[worker].shares += workerShares;
                    else
                        coinStats.workers[worker] = {
                            shares: workerShares,
                            invalidshares: 0,
                            hashrateString: null
                        };
                }
                else {
                    if (worker in coinStats.workers)
                        coinStats.workers[worker].invalidshares -= workerShares; // workerShares is a negative number!
                    else
                        coinStats.workers[worker] = {
                            shares: 0,
                            invalidshares: -workerShares,
                            hashrateString: null
                        };
                }
            });
var shareMultiplier = Math.pow(2, 32) / algos[coinStats.algorithm].multiplier; coinStats.hashrate = shareMultiplier * coinStats.shares / portalConfig.website.stats.hashrateWindow;
coinStats.workerCount = Object.keys(coinStats.workers).length; portalStats.global.workers += coinStats.workerCount;
            /* algorithm specific global stats */
            var algo = coinStats.algorithm;
            if (!portalStats.algos.hasOwnProperty(algo)){
                portalStats.algos[algo] = {
                    workers: 0,
                    hashrate: 0,
                    hashrateString: null
                };
            }
            portalStats.algos[algo].hashrate += coinStats.hashrate;
            portalStats.algos[algo].workers += Object.keys(coinStats.workers).length;
            for (var worker in coinStats.workers) {
                coinStats.workers[worker].hashrateString = _this.getReadableHashRateString(shareMultiplier * coinStats.workers[worker].shares / portalConfig.website.stats.hashrateWindow);
            }
            delete coinStats.hashrates;
            delete coinStats.shares;
            coinStats.hashrateString = _this.getReadableHashRateString(coinStats.hashrate);
        });
        Object.keys(portalStats.algos).forEach(function(algo){
            var algoStats = portalStats.algos[algo];
            algoStats.hashrateString = _this.getReadableHashRateString(algoStats.hashrate);
        });
_this.stats = portalStats; _this.statsString = JSON.stringify(portalStats);
_this.statHistory.push(portalStats); addStatPoolHistory(portalStats);
var retentionTime = (((Date.now() / 1000) - portalConfig.website.stats.historicalRetention) | 0);
        for (var i = 0; i < _this.statHistory.length; i++){
            if (retentionTime < _this.statHistory[i].time){
                if (i > 0) {
                    _this.statHistory = _this.statHistory.slice(i);
                    _this.statPoolHistory = _this.statPoolHistory.slice(i);
                }
                break;
            }
        }
        redisStats.multi([
            ['zadd', 'statHistory', statGatherTime, _this.statsString],
            ['zremrangebyscore', 'statHistory', '-inf', '(' + retentionTime]
        ]).exec(function(err, replies){
            if (err)
                logger.error(logSystem, 'Historics', 'Error adding stats to historics ' + JSON.stringify(err));
        });
        callback();
    });
};
this.getReadableHashRateString = function(hashrate){
    var i = -1;
    var byteUnits = [' KH', ' MH', ' GH', ' TH', ' PH'];
    do {
        hashrate = hashrate / 1024;
        i++;
    } while (hashrate > 1024);
    return hashrate.toFixed(2) + byteUnits[i];
};
};
[/code]
Delete the stock website.js file as well, and make a new one:
[code]
var fs = require('fs'); var path = require('path');
var async = require('async'); var watch = require('node-watch'); var redis = require('redis');
var dot = require('dot'); var express = require('express'); var bodyParser = require('body-parser'); var compress = require('compression');
var Stratum = require('stratum-pool'); var util = require('stratum-pool/lib/util.js');
var api = require('./api.js');
module.exports = function(logger){
dot.templateSettings.strip = false;
var portalConfig = JSON.parse(process.env.portalConfig); var poolConfigs = JSON.parse(process.env.pools);
var websiteConfig = portalConfig.website;
var portalApi = new api(logger, portalConfig, poolConfigs); var portalStats = portalApi.stats;
var logSystem = 'Website';
var pageFiles = {
    'index.html': 'index',
    'home.html': '',
    'tbs.html': 'tbs',
    'workers.html': 'workers',
    'api.html': 'api',
    'admin.html': 'admin',
    'mining_key.html': 'mining_key',
    'miner.html': 'miner',
    'miner_stats.html': 'miner_stats',
    'user_shares.html': 'user_shares',
    'getting_started.html': 'getting_started'
};
var pageTemplates = {};
var pageProcessed = {}; var indexesProcessed = {};
var keyScriptTemplate = ''; var keyScriptProcessed = '';
var processTemplates = function(){
    for (var pageName in pageTemplates){
        if (pageName === 'index') continue;
        pageProcessed[pageName] = pageTemplates[pageName]({
            poolsConfigs: poolConfigs,
            stats: portalStats.stats,
            portalConfig: portalConfig
        });
        indexesProcessed[pageName] = pageTemplates.index({
            page: pageProcessed[pageName],
            selected: pageName,
            stats: portalStats.stats,
            poolConfigs: poolConfigs,
            portalConfig: portalConfig
        });
    }
//logger.debug(logSystem, 'Stats', 'Website updated to latest stats'); };
var readPageFiles = function(files){
    async.each(files, function(fileName, callback){
        var filePath = 'website/' + (fileName === 'index.html' ? '' : 'pages/') + fileName;
        fs.readFile(filePath, 'utf8', function(err, data){
            var pTemp = dot.template(data);
            pageTemplates[pageFiles[fileName]] = pTemp;
            callback();
        });
    }, function(err){
        if (err){
            console.log('error reading files for creating dot templates: ' + JSON.stringify(err));
            return;
        }
        processTemplates();
    });
};
//If an html file was changed reload it
watch('website', function(filename){
    var basename = path.basename(filename);
    if (basename in pageFiles){
        console.log(filename);
        readPageFiles([basename]);
        logger.debug(logSystem, 'Server', 'Reloaded file ' + basename);
    }
});
portalStats.getGlobalStats(function(){ readPageFiles(Object.keys(pageFiles)); });
var buildUpdatedWebsite = function(){
    portalStats.getGlobalStats(function(){
        processTemplates();
        var statData = 'data: ' + JSON.stringify(portalStats.stats) + '\n\n';
        for (var uid in portalApi.liveStatConnections){
            var res = portalApi.liveStatConnections[uid];
            res.write(statData);
        }
    });
};
setInterval(buildUpdatedWebsite, websiteConfig.stats.updateInterval * 1000);
var buildKeyScriptPage = function(){
    async.waterfall([
        function(callback){
            var client = redis.createClient(portalConfig.redis.port, portalConfig.redis.host);
            client.hgetall('coinVersionBytes', function(err, coinBytes){
                if (err){
                    client.quit();
                    return callback('Failed grabbing coin version bytes from redis ' + JSON.stringify(err));
                }
                callback(null, client, coinBytes || {});
            });
        },
        function(client, coinBytes, callback){
            var enabledCoins = Object.keys(poolConfigs).map(function(c){ return c.toLowerCase() });
            var missingCoins = [];
            enabledCoins.forEach(function(c){
                if (!(c in coinBytes))
                    missingCoins.push(c);
            });
            callback(null, client, coinBytes, missingCoins);
        },
        function(client, coinBytes, missingCoins, callback){
            var coinsForRedis = {};
            async.each(missingCoins, function(c, cback){
                var coinInfo = (function(){
                    for (var pName in poolConfigs){
                        if (pName.toLowerCase() === c)
                            return {
                                daemon: poolConfigs[pName].paymentProcessing.daemon,
                                address: poolConfigs[pName].address
                            }
                    }
                })();
                var daemon = new Stratum.daemon.interface([coinInfo.daemon], function(severity, message){
                    logger[severity](logSystem, c, message);
                });
                daemon.cmd('dumpprivkey', [coinInfo.address], function(result){
                    if (result[0].error){
                        logger.error(logSystem, c, 'Could not dumpprivkey for ' + c + ' ' + JSON.stringify(result[0].error));
                        cback();
                        return;
                    }
                    var vBytePub = util.getVersionByte(coinInfo.address)[0];
                    var vBytePriv = util.getVersionByte(result[0].response)[0];
                    coinBytes[c] = vBytePub.toString() + ',' + vBytePriv.toString();
                    coinsForRedis[c] = coinBytes[c];
                    cback();
                });
            }, function(err){
                callback(null, client, coinBytes, coinsForRedis);
            });
        },
        function(client, coinBytes, coinsForRedis, callback){
            if (Object.keys(coinsForRedis).length > 0){
                client.hmset('coinVersionBytes', coinsForRedis, function(err){
                    if (err)
                        logger.error(logSystem, 'Init', 'Failed inserting coin byte version into redis ' + JSON.stringify(err));
                    client.quit();
                });
            }
            else {
                client.quit();
            }
            callback(null, coinBytes);
        }
    ], function(err, coinBytes){
        if (err){
            logger.error(logSystem, 'Init', err);
            return;
        }
        try {
            keyScriptTemplate = dot.template(fs.readFileSync('website/key.html', {encoding: 'utf8'}));
            keyScriptProcessed = keyScriptTemplate({coins: coinBytes});
        }
        catch(e){
            logger.error(logSystem, 'Init', 'Failed to read key.html file');
        }
    });
};
buildKeyScriptPage();
var getPage = function(pageId){ if (pageId in pageProcessed){ var requestedPage = pageProcessed[pageId]; return requestedPage; } };
var minerpage = function(req, res, next){
    var address = req.params.address || null;
    if (address != null){
        portalStats.getBalanceByAddress(address, function(){
            processTemplates();
            res.end(indexesProcessed['miner_stats']);
        });
    }
    else next();
};
var payout = function(req, res, next){
    var address = req.params.address || null;
    if (address != null){
        portalStats.getPayout(address, function(data){
            res.write(data.toString());
            res.end();
        });
    }
    else next();
};
var shares = function(req, res, next){
    portalStats.getCoins(function(){
        processTemplates();
        res.end(indexesProcessed['user_shares']);
    });
};
var usershares = function(req, res, next){
    var coin = req.params.coin || null;
    if (coin != null){
        portalStats.getCoinTotals(coin, null, function(){
            processTemplates();
            res.end(indexesProcessed['user_shares']);
        });
    }
    else next();
};
var route = function(req, res, next){
    var pageId = req.params.page || '';
    if (pageId in indexesProcessed){
        res.header('Content-Type', 'text/html');
        res.end(indexesProcessed[pageId]);
    }
    else next();
};
var app = express();
app.use(bodyParser.json());
app.get('/get_page', function(req, res, next){
    var requestedPage = getPage(req.query.id);
    if (requestedPage){
        res.end(requestedPage);
        return;
    }
    next();
});
app.get('/key.html', function(req, res, next){ res.end(keyScriptProcessed); });
app.get('/stats/shares/:coin', usershares);
app.get('/stats/shares', shares);
app.get('/miner/:address', minerpage);
app.get('/payout/:address', payout);
app.get('/:page', route);
app.get('/', route);
app.get('/api/:method', function(req, res, next){ portalApi.handleApiRequest(req, res, next); });
app.post('/api/admin/:method', function(req, res, next){
    if (portalConfig.website && portalConfig.website.adminCenter && portalConfig.website.adminCenter.enabled){
        if (portalConfig.website.adminCenter.password === req.body.password)
            portalApi.handleAdminApiRequest(req, res, next);
        else
            res.send(401, JSON.stringify({error: 'Incorrect Password'}));
    }
    else next();
});
app.use(compress());
app.use('/static', express.static('website/static'));
app.use(function(err, req, res, next){
    console.error(err.stack);
    res.send(500, 'Something broke!');
});
try {
    app.listen(portalConfig.website.port, portalConfig.website.host, function () {
        logger.debug(logSystem, 'Server', 'Website started on ' + portalConfig.website.host + ':' + portalConfig.website.port);
    });
} catch(e){
    logger.error(logSystem, 'Server', 'Could not start website on ' + portalConfig.website.host + ':' + portalConfig.website.port + ' - it is either in use or you do not have permission');
}
};
Then inside the <nomp install>/website/pages folder create two new files... Three, actually. First, while in <nomp install>/website/pages, type 'touch user_shares.html' just to create the file so NOMP won't puke for now. Then open a new file named miner.html:

<div class="row">
    <div class="col-md-2"> </div>
    <div class="col-md-4">
        <p class="lead">Enter Your <YOUR COIN> Wallet address</p>
        <div class="input-group">
            <input type="text" class="form-control input-lg">
            <span class="input-group-btn">
                <button class="btn btn-default btn-lg" type="button">Go!</button>
            </span>
        </div>
    </div>
    <div class="col-md-4"></div>
</div> <!-- end row -->
<script type="text/javascript">
$(document).ready(function(){
    $('.btn-lg').click(function(){
        window.location = "miner/" + $('.input-lg').val();
    });
});
</script>
[/code]
paradigmflux (OP)
July 19, 2014, 03:18:03 PM Last edit: July 19, 2014, 04:18:48 PM by paradigmflux
As well as a miner_stats.html:

<div class="row">
<div class="col-md-6">
{{? it.stats.balances }}
<div class="row"> </div>
<div class="row">
<div class="panel panel-default">
<div class="panel-heading"> <h1><strong><span id="address">{{=it.stats.address}}</span></strong> </h1></div>
<div class="panel-body">
<p class="lead">So far this shift you have earned: <strong><span class="payout"></span></strong> <INSERT YOUR TARGET SYMBOL HERE> (estimate).</p>
<p class="lead">Your previous balance with the pool is currently: <strong><span class="owed"></span></strong> <INSERT YOUR TARGET SYMBOL HERE> </p>
</div>
</div>
</div>
</div> <!-- end row -->
<div class="row"> </div>
</div>
</div>
<div class="row"> </div>
<center>
<div class="panel panel-default">
<div class="panel-body">
<div class="stat-panel row">
<div>
<!-- content -->
<div class="row">
<!-- main col left -->
<div class="col-sm-12">
<div class="panel panel-default">
<div class="panel-body">
<div class="stat-panel row">
{{ for(var balance in it.stats.balances) { }}
<div class="panel panel-default col-md-2 col-xs-4">
<div class="panel-heading" style="background-color:#3d3d3d;color:white;"><center><span class="text-xs"> {{=it.stats.balances[balance].coin}}</span></center></div>
<div class="panel-body"><span class="text-bg"><strong>{{=it.stats.balances[balance].balance}}</strong></span>&nbsp;</div>
</div>
{{ } }}
</div>
</div>
</div>
<!-- Debug -->
<div class="debug" style="display:none;">
{{=JSON.stringify(it.stats)}}
</div>
</div>
{{?}}
<script type="text/javascript">
$(document).ready(function(){
    var addr = $('#address').text();
    $.get('http://ururl:7379/hget/Pool_Stats:CurrentShift:WorkerTgtCoin/' + addr + '.txt', function(payoutdata){
        $('.payout').text(payoutdata);
    });
    /*
    $.each($('.blockAmount'), function(i, v){
        if ($(v).html() === "undefined") {
            $(v).html(' --- ');
        }
    });
    $.each($('.blockShares'), function(i, v){
        if ($(v).html() === "undefined") {
            $(v).html(' --- ');
        }
    });
    */
});
$(function() {
    var addr = $('#address').text();
    $.get("http://ururl:7379/ZSCORE/Pool_Stats:Balances/" + addr, function( payoutdata ) {
        $('.owed').append(payoutdata.ZSCORE);
    });
});
</script>
Next, install Webdis from https://github.com/nicolasff/webdis and launch it with a .json config file like this:

root@blackcoinpool:/FORFREEDOM/# cat ~/webdis-home/webdis.json
{
    "redis_host": "127.0.0.1",
    "redis_port": 6379,
    "redis_auth": null,
    "http_host": "0.0.0.0",
    "http_port": 7379,
    "threads": 5,
    "pool_size": 20,
    "daemonize": true,
    "websockets": true,
    "default_root": "/GET/index.html",
    "database": 0,
    "acl": [
        { "disabled": ["*"] },
        { "enabled": ["GET", "HGET", "ZRANGE", "ZCARD", "ZRANGEBYSCORE", "LRANGE", "HGETALL", "HKEYS", "HVALS", "SMEMBERS", "ZSCORE"] }
    ],
    "verbosity": 6,
    "logfile": "/root/webdis-home/webdis.log"
}
Next, you should be able to restart your NOMP install, browse to <url>/miner, input your worker address, and see all the coins that worker has mined this shift. The balance and outstanding amount will be blank for now. You can manually browse to <url>/payout/<worker> to see an estimated next payout. Next up, let's do a deep dive into redis and how NOMP uses it (as well as rewrite the entire payment processor).
albertdros
July 19, 2014, 03:18:13 PM
that's great! Looking forward to your guide.
paradigmflux (OP)
July 19, 2014, 03:18:17 PM Last edit: July 19, 2014, 05:16:50 PM by paradigmflux
How NOMP uses redis, and how we can use it to get what we want (and improve it). Every coin is given a redis key named <coin>, which breaks down into subkeys: :blocksCompleted, :blocksPending, :blocksOrphaned, :blocksKicked. It also has a :balances key that keeps track of each worker's current earnings, and a :payouts key (which keeps track of the number of coins paid out to each user).
If we set up the coin configs right, we will never see a :payouts key in our install. We are also going to create a new subkey, :blocksPaid, that we will move every paid-for round into each time we run payouts.
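As a sketch of that bookkeeping (the helper name is made up, and I'm assuming the round entries are the raw set members NOMP stores in <coin>:blocksConfirmed):

```javascript
// Build the redis commands that retire paid rounds into <coin>:blocksPaid.
// paidRoundsCommands is a hypothetical helper; entries are the raw set
// members taken from <coin>:blocksConfirmed.
function paidRoundsCommands(coin, paidEntries){
    return paidEntries.map(function(entry){
        return ['smove', coin + ':blocksConfirmed', coin + ':blocksPaid', entry];
    });
}

// These arrays can be fed straight into a node redis client, e.g.:
//   client.multi(paidRoundsCommands('feathercoin', entries)).exec(...);
```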
Now, the stats that NOMP keeps natively are brutally minimal.
We are going to implement a whole new level of logging into redis.
A master key, Pool_Stats, contains everything. We also have a key named Coin_Names that contains (lowercase) a list of every coin we support, Coin_Algos that lists every algo we support (lowercase), and Coin_Names_<algo> that lists every coin of that algo. These are all stored as hashes, by the way, with values of 1 (although the value doesn't matter - the names are the hash fields, so they are read back with the redis command HKEYS).
ie: Coin_Names consists of two fields:
feathercoin 1
terracoin 1

Coin_Algos consists of two fields:
scrypt 1
sha256 1

Coin_Names_scrypt consists of one field:
feathercoin 1

Coin_Names_sha256 consists of one field:
terracoin 1
Set these up like this:
root@blackcoinpool:/: redis-cli hset Coin_Names feathercoin 1
root@blackcoinpool:/: redis-cli hset Coin_Algos scrypt 1
root@blackcoinpool:/: redis-cli hset Coin_Names_scrypt feathercoin 1
and so on for all of your coins/algos....
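If you want to read those lists back from node instead of redis-cli, remember the coin names are the hash FIELDS (HVALS would just return the 1s). A sketch - algoCoinsKey and listCoinsForAlgo are made-up helper names, and the client is assumed to be a node_redis-style object with an hkeys method:

```javascript
// Key-name helper for the per-algo coin list (name invented for this example).
function algoCoinsKey(algo){
    return 'Coin_Names_' + algo.toLowerCase();
}

// Coin names are the hash fields (values are all 1), so HKEYS returns them.
function listCoinsForAlgo(client, algo, cb){
    client.hkeys(algoCoinsKey(algo), cb);
}

// e.g. with node_redis:
//   var client = require('redis').createClient(6379, '127.0.0.1');
//   listCoinsForAlgo(client, 'scrypt', function(err, coins){ console.log(coins); });
```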
We are going to keep track of every user's historical hashrate in a key named Pool_Stats:WorkerHRS:<Algo>:<Worker> as a redis SORTED SET. The format for the sorted set is epoch time as the score, with the value set to <user's current hashrate>:<epoch time> - the epoch time is required to allow redis to store duplicate hashrates (as it makes them all unique). Pool_Stats will have separate subkeys for every shift, but all of that will get created automatically through the next series of scripts I will provide.
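That epoch-suffixed member format can be generated like this (hashrateKey and hashrateEntry are hypothetical helper names, and the worker address is a placeholder):

```javascript
// Key helper for the per-worker hashrate sorted set.
function hashrateKey(algo, worker){
    return 'Pool_Stats:WorkerHRS:' + algo + ':' + worker;
}

// Score is the epoch time; the member embeds the epoch too, so two
// identical hashrates logged at different times stay distinct in the set.
function hashrateEntry(hashrate, epoch){
    return hashrate + ':' + epoch;
}

// With a node redis client this becomes a ZADD, e.g.:
//   client.zadd(hashrateKey('scrypt', worker), epoch, hashrateEntry(rate, epoch));
var epoch = 1405781850;
var cmd = ['zadd', hashrateKey('scrypt', 'workerAddress'), epoch, hashrateEntry(1536000, epoch)];
```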
We will keep current-shift data in Pool_Stats:CurrentShift and current-shift profitability data in Pool_Stats:CurrentShift:Profitability. Inside CurrentShift we will have:
Pool_Stats:CurrentShift:AlgosBTC - hash listing each algo's total value mined so far this shift, in BTC.
Pool_Stats:CurrentShift:AlgosTgtCoin - hash listing each algo's total value mined so far this shift, in target coin.
Pool_Stats:CurrentShift:CoinsBTC - hash listing each coin's total BTC earned so far this shift.
Pool_Stats:CurrentShift:CoinsTgtCoin - hash listing each coin's total target coins earned so far this shift.
Pool_Stats:CurrentShift:WorkersBTC - hash listing each worker's total BTC earned so far this shift.
Pool_Stats:CurrentShift:WorkersTgtCoin - hash listing each worker's total target coin earned so far this shift.
We will keep track of historical stats in:

Pool_Stats:Profitability_<algo> - hash of profitabilities, using shift number as field and profitability as value.
Pool_Stats:WorkerHRs:<algo>:<worker> - sorted set of hashrates, per worker, per algo. Epoch time as score; value is hashrate:epochtime to ensure uniqueness.
Pool_Stats:Balances - outstanding balances in the target coin.
Pool_Stats:DetailedPayouts:<worker> - sorted set of payouts, using epoch time as score and target-coin txn ids as values.
Pool_Stats:DetailedPayouts:<worker>:<txn> - hash with a Date field (epoch time), an Amount field (txn amount), and a URL field (full URL for the txn in the target coin's block explorer).
Pool_Stats:TotalPaid - hash listing every worker ID as field name and the total amount of target coins they have ever been paid as the value.
These values are all automatically calculated by the payment processor, which is coming up in a few posts. The reason for the duplicate storage (in both BTC and TgtCoin) is that it gives the payment processor a level of sanity checking before it decides whether to pay out the shift automatically.
Pool_Stats:CurrentShift will always have a starttime set to the epoch time that shift started at. We will have a counter named This_Shift in Pool_Stats (a hash field) that is incremented by one whenever a shift ends. Whenever a shift ends, we rename all of the Pool_Stats:CurrentShift keys to Pool_Stats:<value of This_Shift> and then start a fresh Pool_Stats:CurrentShift set of keys.
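As a sketch of that rollover (echo stands in for redis-cli here so the sketch runs without a live server, and the shift number 41 is made up):

```shell
ThisShift=41   # hypothetical value of: redis-cli hget Pool_Stats This_Shift
for key in AlgosBTC AlgosTgtCoin CoinsBTC CoinsTgtCoin WorkersBTC WorkersTgtCoin
do
    # in production this would be: redis-cli rename Pool_Stats:CurrentShift:$key ...
    echo "rename Pool_Stats:CurrentShift:$key Pool_Stats:$ThisShift:$key"
done
NextShift=$((ThisShift + 1))   # in production: redis-cli hincrby Pool_Stats This_Shift 1
echo "new shift number: $NextShift"
```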
Let's start off with calculating and storing the users' hashrates in the redis db.
|
|
|
|
paradigmflux (OP)
|
|
July 19, 2014, 03:19:04 PM |
|
lol you guy are dinks for stealting the first posts
|
|
|
|
|
bathrobehero
Legendary
Offline
Activity: 2002
Merit: 1051
ICO? Not even once.
|
|
July 19, 2014, 03:50:17 PM |
|
lol you guy are dinks for stealting the first posts
There's an Edit button for a reason.
|
Not your keys, not your coins!
|
|
|
MV120
Newbie
Offline
Activity: 6
Merit: 0
|
|
July 19, 2014, 04:07:26 PM |
|
These are probably noob questions, but here goes anyways.
How much machine would this require (RAM, storage)?
What is the minimum hashrate you would recommend for before setting up your own Multipool?
|
|
|
|
paradigmflux (OP)
|
|
July 19, 2014, 04:12:23 PM |
|
These are probably noob questions, but here goes anyways.
How much machine would this require (RAM, storage)?
What is the minimum hashrate you would recommend for before setting up your own Multipool?
SSD backing is important, but you can run a MP easily on a machine with 4 GB of ram and 40 GB of disk space.
|
|
|
|
paradigmflux (OP)
|
|
July 19, 2014, 04:34:23 PM |
|
Set up a cronjob to run every 10 minutes. Have it run a bash file, and this is the bash file:

#!/bin/bash
workercounter=0
arraycounter=0
now="`date +%s`"
TenMins=$((now - 600))
interval=600
modifier=65536
SHAmodifier=4294967296
redis-cli del tmpkey

while read Algo
do
    TotalWorkers=0
    TotalHash=0
    WorkerTotals=0
    unset arrWorkerTotals
    unset arrWorkerCounts
    unset arrWorkerNames
    typeset -A arrWorkerTotals
    typeset -A arrWorkerCounts
    typeset -A arrWorkerNames
    AlgoCounter=0
    workercounter=0
    redis-cli del tmpkey
    while read CoinType
    do
        echo "$CoinType"
        counter=0
        CoinKeyName=$CoinType":hashrate"
        totalhashes=`redis-cli zcard $CoinKeyName`
        if [ -z "$totalhashes" ]
        then
            echo "no hashes" >/dev/null
        else
            while read LineItem
            do
                counter=$(($counter + 1))
                AlgoCounter=$(($AlgoCounter + 1))
                IN=$LineItem
                arrIN=(${IN//:/ })
                preworker=(${arrIN[1]})
                # strip HTML tags out to ensure safe displaying later
                workername=`echo "$preworker," | tr -d '<>,'`
                echo "$workername"
                if [[ $workername == "" ]]
                then
                    echo "ignore worker"
                else
                    share=(${arrIN[0]})
                    arrWorkerCounts[$workername]=$((${arrWorkerCounts[$workername]} + 1))
                    if [[ ${arrWorkerCounts[$workername]} -eq 1 ]]
                    then
                        # must have been this worker's first share, so this is a new worker
                        TotalWorkers=$(($TotalWorkers + 1))
                        workercounter=$(($workercounter + 1))
                        arrWorkerNames[$workercounter]=$workername
                        echo "TotalWorkers - $TotalWorkers ~~~ workercounter - $workercounter ~~~ arrWorkerNames - ${arrWorkerNames[$workercounter]}"
                    else
                        # this was a duplicate worker, do nothing
                        echo " " >/dev/null
                    fi
                    if [ -z "${arrWorkerTotals[$workername]}" ]
                    then
                        tempvar=0
                    else
                        tempvar=${arrWorkerTotals[$workername]}
                    fi
                    arrWorkerTotals[$workername]=`echo "scale=6;$tempvar + $share" | bc -l`
                    echo "${arrWorkerNames[$workercounter]}"
                fi
            done< <(redis-cli zrangebyscore $CoinKeyName $TenMins $now)
            TotalHash=`echo "$TotalHash + $share" | bc -l`
        fi
    done< <(redis-cli hkeys Coin_Names_$Algo)

    # break it down - sha is stored as GH, everything else is stored as MH
    if [ $Algo = "sha256" ]
    then
        modifier=4294967296
        divisor=1073741824
    elif [ $Algo = "keccak" ]
    then
        modifier=16777216
        divisor=1048576
    elif [ $Algo = "x11" ]
    then
        modifier=4294967296
        divisor=1048576
    else
        modifier=65536
        divisor=1048576
    fi

    TotalHR=`echo "scale=3;$TotalHash * $modifier / $interval / $divisor" | bc`
    redis-cli zadd Pool_Stats:AvgHRs:$Algo $now $TotalHR":"$now

    # go over the array of WorkerNames and calculate each worker's HR
    counterB=0
    while [[ $counterB -lt $workercounter ]]
    do
        counterB=$(($counterB + 1))
        workerName=${arrWorkerNames[$counterB]}
        echo $workerName
        temporary=`echo "scale=6;${arrWorkerTotals[$workerName]} * $modifier" | bc -l`
        arrWorkerHashRates[$counterB]=`echo "$temporary / $interval / $divisor" | bc -l`
        rate=${arrWorkerHashRates[$counterB]}
        echo $workerName $rate
        string=$rate":"$now
        redis-cli zadd Pool_Stats:WorkerHRs:$Algo:$workerName $now $string
        echo "$Algo - $workerName - ${arrWorkerHashRates[$counterB]}"
    done
done< <(redis-cli hkeys Coin_Algos)
Go ahead and run it a few times at the command line and see how it works - it's a bit spammy, but notice how the redis keys are now being populated with all of the users' hashrate information. The 600 near the top can be adjusted to a lower number if you would rather users not have to wait 10 minutes to see an accurate hash rate. Make sure you have your Coin_Algos and Coin_Names_<algo> redis values filled in appropriately for this to work.
|
|
|
|
paradigmflux (OP)
|
|
July 19, 2014, 04:44:11 PM |
|
Now let's set up a cronjob and start grabbing a local copy of all of the coin exchange rates. This is a simplified version - I have a better version that aggregates prices from multiple exchanges, but we all have to keep some secrets, right? I never said I'd give away how to make the most profitable pool, but I did say I'd show you how to set up A multipool.

#!/bin/bash
BC2BTC=`curl -G 'https://api.mintpal.com/v1/market/stats/BC/BTC/' | jq -r .[].last_price`
FTC2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=5' | jq -r .return.markets.FTC.lasttradeprice`
MZC2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=164' | jq -r .return.markets.MZC.lasttradeprice`
NET2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=134' | jq -r .return.markets.NET.lasttradeprice`
DRK2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=155' | jq -r .return.markets.DRK.lasttradeprice`
WDC2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=14' | jq -r .return.markets.WDC.lasttradeprice`
STR2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=83' | jq -r .return.markets.STR.lasttradeprice`
KDC2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=178' | jq -r .return.markets.KDC.lasttradeprice`
NYAN2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=184' | jq -r .return.markets.NYAN.lasttradeprice`
MNC2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=7' | jq -r .return.markets.MNC.lasttradeprice`
POT2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=173' | jq -r .return.markets.POT.lasttradeprice`
GDC2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=82' | jq -r .return.markets.GDC.lasttradeprice`
GLC2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=76' | jq -r .return.markets.GLC.lasttradeprice`
BTE2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=49' | jq -r .return.markets.BTE.lasttradeprice`
UNO2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=133' | jq -r .return.markets.UNO.lasttradeprice`
USDE2BTC=`curl -G 'https://poloniex.com/public?command=returnTicker' | jq -r .BTC_USDE.last`
HIRO2BTC=`curl -G 'https://poloniex.com/public?command=returnTicker' | jq -r .BTC_HIRO.last`
GDN2BTC=`curl -G 'https://poloniex.com/public?command=returnTicker' | jq -r .BTC_GDN.last`
cinni2btc=`curl -G 'https://poloniex.com/public?command=returnTicker' | jq -r .BTC_CINNI.last`
NOBL2BTC=`curl -G 'https://poloniex.com/public?command=returnTicker' | jq -r .BTC_NOBL.last`
REDD2BTC=`curl -G 'https://poloniex.com/public?command=returnTicker' | jq -r .BTC_REDD.last`
LGC2BTC=`curl -G 'https://poloniex.com/public?command=returnTicker' | jq -r .BTC_LGC.last`
EAC2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=139' | jq -r .return.markets.EAC.lasttradeprice`
CAP2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=53' | jq -r .return.markets.CAP.lasttradeprice`
RBY2BTC=`curl -G 'https://bittrex.com/api/v1/public/getticker?market=BTC-RBY' | jq -r .result.Last`
tips2ltc=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=147' | jq -r .return.markets.TIPS.lasttradeprice`
ltc2btc=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=3' | jq -r .return.markets.LTC.lasttradeprice`
nxt2btc=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=159' | jq -r .return.markets.NXT.lasttradeprice`
tips2btc=`echo "$tips2ltc * $ltc2btc" | bc -l`
TRC2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=27' | jq -r .return.markets.TRC.lasttradeprice`
LOT2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=137' | jq -r .return.markets.LOT.lasttradeprice`
GLD2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=30' | jq -r .return.markets.GLD.lasttradeprice`
MEC2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=45' | jq -r .return.markets.MEC.lasttradeprice`
MYR2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=200' | jq -r .return.markets.MYR.lasttradeprice`
MEOW2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=149' | jq -r .return.markets.MEOW.lasttradeprice`
EXE2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=183' | jq -r .return.markets.EXE.lasttradeprice`
VTC2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=151' | jq -r .return.markets.VTC.lasttradeprice`
SAT2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=168' | jq -r .return.markets.SAT.lasttradeprice`
NXT2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=159' | jq -r .return.markets.NXT.lasttradeprice`
MAX2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=152' | jq -r .return.markets.MAX.lasttradeprice`
THREE2BTC=`curl -G 'https://api.mintpal.com/v1/market/stats/365/BTC/' | jq -r .[].last_price`
ZET2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=85' | jq -r .return.markets.ZET.lasttradeprice`
XC2BTC=`curl -G 'http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=210' | jq -r .return.markets.XC.lasttradeprice`

redis-cli hset Exchange_Rates blackcoin $BC2BTC
redis-cli hset Exchange_Rates globaldenomination $GDN2BTC
redis-cli hset Exchange_Rates zetacoin $ZET2BTC
redis-cli hset Exchange_Rates logicoin $LGC2BTC
redis-cli hset Exchange_Rates cinnicoin $cinni2btc
redis-cli hset Exchange_Rates 365coin $THREE2BTC
redis-cli hset Exchange_Rates NXT $NXT2BTC
redis-cli hset Exchange_Rates execoin $EXE2BTC
redis-cli hset Exchange_Rates vertcoin $VTC2BTC
redis-cli hset Exchange_Rates kittehcoin $MEOW2BTC
redis-cli hset Exchange_Rates megacoin $MEC2BTC
redis-cli hset Exchange_Rates netcoin $NET2BTC
redis-cli hset Exchange_Rates globalcoin $GLC2BTC
redis-cli hset Exchange_Rates grandcoin $GDC2BTC
redis-cli hset Exchange_Rates goldcoin $GLD2BTC
redis-cli hset Exchange_Rates fedoracoin $tips2btc
redis-cli hset Exchange_Rates litecoin $ltc2btc
redis-cli hset Exchange_Rates nxt $NXT2BTC
redis-cli hset Exchange_Rates terracoin $TRC2BTC
redis-cli hset Exchange_Rates feathercoin $FTC2BTC
redis-cli hset Exchange_Rates reddcoin $REDD2BTC
redis-cli hset Exchange_Rates earthcoin $EAC2BTC
redis-cli hset Exchange_Rates bottlecaps $CAP2BTC
redis-cli hset Exchange_Rates rubycoin $RBY2BTC
redis-cli hset Exchange_Rates noblecoin $NOBL2BTC
redis-cli hset Exchange_Rates mincoin $MNC2BTC
redis-cli hset Exchange_Rates klondikecoin $KDC2BTC
redis-cli hset Exchange_Rates darkcoin $DRK2BTC
redis-cli hset Exchange_Rates mazacoin $MZC2BTC
redis-cli hset Exchange_Rates unobtanium $UNO2BTC
redis-cli hset Exchange_Rates hirocoin $HIRO2BTC
redis-cli hset Exchange_Rates usde $USDE2BTC
redis-cli hset Exchange_Rates lottocoin $LOT2BTC
redis-cli hset Exchange_Rates nyancoin $NYAN2BTC
redis-cli hset Exchange_Rates worldcoin $WDC2BTC
redis-cli hset Exchange_Rates potcoin $POT2BTC
redis-cli hset Exchange_Rates myriadcoin $MYR2BTC
redis-cli hset Exchange_Rates myriad-scrypt $MYR2BTC
redis-cli hset Exchange_Rates myriadsha $MYR2BTC
redis-cli hset Exchange_Rates saturncoin $SAT2BTC
redis-cli hset Exchange_Rates maxcoin $MAX2BTC
redis-cli hset Exchange_Rates xccoin $XC2BTC
Get this cronjob to run every 5 minutes; this will keep track of the most current exchange prices. (The final step will be syncing all the cronjobs into a single script, but we'll get to that in a few posts.) Next up is re-writing the payment processor to properly handle the start and end of 'shifts', as well as moving the coins to the exchange.
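For reference, the two schedules so far would look something like this in the crontab (the script paths are my own placeholders - point them wherever you saved the files):

```
*/10 * * * * /bin/bash /Scripts/worker_hashrates.sh >/dev/null 2>&1
*/5 * * * * /bin/bash /Scripts/exchange_rates.sh >/dev/null 2>&1
```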
|
|
|
|
paradigmflux (OP)
|
|
July 19, 2014, 04:55:32 PM |
|
Let me know if anyone is even reading this far.
|
|
|
|
paradigmflux (OP)
|
|
July 19, 2014, 05:17:43 PM |
|
Naw, feel free to post - I'm not going to waste my time putting all this up if nobody is interested. I will be linking to a .zip with all of the files I am referring to as well, once I get all these written.
|
|
|
|
Kergekoin
|
|
July 19, 2014, 05:19:52 PM |
|
I'm probably going to set one back-end up for my miners. Thanks for your effort! At the moment I'm switching manually on my NOMP pool back-end.
|
|
|
|
paradigmflux (OP)
|
|
July 19, 2014, 05:23:08 PM |
|
STILL TO COME:
--> Custom payment processor
--> Front end javascript reporting
--> Exchange integration
--> Cryptonote integration
--> Sanity checking
--> Future plans
|
|
|
|
billotronic
Legendary
Offline
Activity: 1610
Merit: 1000
Crackpot Idealist
|
|
July 19, 2014, 05:54:00 PM |
|
Naw, feel free to post - I'm not going to waste my time putting all this up if nobody is interested. I will be linking to a .zip with all of the files I am referring to as well, once I get all these written.
No sir, please share! I've been very curious of how MP's work and this guide so far has been great!
|
|
|
|
paradigmflux (OP)
|
|
July 19, 2014, 07:28:12 PM |
|
BTW, I invite anyone interested to check out http://www.btcdpool.com Send some hash its way and see how it works and if you like the feel of it. I will be posting all of its source in time - this thread is inspired by it.
|
|
|
|
paradigmflux (OP)
|
|
July 19, 2014, 07:39:30 PM |
|
Calculating Profitability

Calculating profitability can be done fairly easily like this! This also calculates each worker's current earnings for the round, and updates each of the respective keys under Pool_Stats:CurrentShift for the coin/algo earning totals. Set it up via cronjob; it runs in only a few seconds even on a pool getting multiple GH of scrypt/x11 traffic. You can run it manually to verify the output (it's a bit spammy, but it prints what it's doing to the screen as it goes; run via cronjob, this would all just go to null).

#!/bin/bash
declare -A ProArr
declare -A NameArr
AlgoCounter=0
now="`date +%s`"
ShiftNumber=`redis-cli hget Pool_Stats This_Shift`
echo "Test"
startstring="Pool_Stats:"$ShiftNumber
starttime=`redis-cli hget $startstring starttime`
endtime=$now
length=$(($endtime - $starttime))
redis-cli hset Pool_Stats CurLength $length
dayslength=`echo "scale=3;$length / 86400" | bc -l`
TgtCoinPrice=`redis-cli hget Exchange_Rates <TARGET COIN NAME HERE>`
TotalEarned=0
TotalEarnedTgtCoin=0
redis-cli hset Pool_Stats CurDaysLength $dayslength
redis-cli del Pool_Stats:CurrentShift:WorkerBTC
redis-cli del Pool_Stats:CurrentShift:WorkerTgtCoin

# START CALCULATING COIN PROFIT FOR CURRENT ROUND - THIS ALSO CALCULATES WORKER EARNINGS MID SHIFT.
# PLEASE NOTE ALL COIN NAMES IN THE Coin_Algos REDIS KEY MUST MATCH FIELD NAMES IN THE Exchange_Rates KEY CASE-WISE
while read line
do
    AlgoTotal=0
    AlgoTotalTgtCoin=0
    logkey2="Pool_Stats:CurrentShift:Algos"
    logkey2TgtCoin="Pool_Stats:CurrentShift:AlgosTgtCoin"
    echo "LOGKEY2: $logkey2"
    # loop through each coin for that algo
    while read CoinName
    do
        coinTotal=0
        coinTotalTgtCoin=0
        thiskey=$CoinName":balances"
        logkey="Pool_Stats:CurrentShift:Coins"
        logkeyTgtCoin="Pool_Stats:CurrentShift:CoinsTgtCoin"
        # determine price for coin
        coin2btc=`redis-cli hget Exchange_Rates $CoinName`
        workersPerCoin=`redis-cli hlen $thiskey`
        if [ $workersPerCoin = 0 ]
        then
            echo "do nothing" > /dev/null
        else
            while read WorkerName
            do
                thisBalance=`redis-cli hget $thiskey $WorkerName`
                thisEarned=`echo "scale=6;$thisBalance * $coin2btc" | bc -l`
                coinTotal=`echo "scale=6;$coinTotal + $thisEarned" | bc -l`
                AlgoTotal=`echo "scale=6;$AlgoTotal + $thisEarned" | bc -l`
                TgtCoinEarned=`echo "scale=6;$thisEarned / $TgtCoinPrice" | bc -l`
                coinTotalTgtCoin=`echo "scale=6;$coinTotalTgtCoin + $TgtCoinEarned" | bc -l`
                AlgoTotalTgtCoin=`echo "scale=6;$AlgoTotalTgtCoin + $TgtCoinEarned" | bc -l`
                # echo "$WorkerName earned $TgtCoinEarned from $CoinName"
                redis-cli hincrbyfloat Pool_Stats:CurrentShift:WorkerTgtCoin $WorkerName $TgtCoinEarned
                redis-cli hincrbyfloat Pool_Stats:CurrentShift:WorkerBTC $WorkerName $thisEarned
            done< <(redis-cli hkeys $CoinName:balances)
            redis-cli hset $logkey $CoinName $coinTotal
            redis-cli hset $logkeyTgtCoin $CoinName $coinTotalTgtCoin
            echo "$CoinName: $coinTotal"
        fi
    done< <(redis-cli hkeys Coin_Names_$line)
    redis-cli hset $logkey2 $line $AlgoTotal
    redis-cli hset $logkey2TgtCoin $line $AlgoTotalTgtCoin
    TotalEarned=`echo "scale=6;$TotalEarned + $AlgoTotal" | bc -l`
    TotalEarnedTgtCoin=`echo "scale=6;$TotalEarnedTgtCoin + $AlgoTotalTgtCoin" | bc -l`
done< <(redis-cli hkeys Coin_Algos)

# END CALCULATING COIN PROFITS FOR CURRENT SHIFT

# START CALCULATING AVERAGE HASHRATES SO FAR THIS SHIFT
echo "Start: $starttime End: $endtime"
AlgoCounter=0
while read Algo
do
    AlgoCounter=$(($AlgoCounter + 1))
    AlgoHRTotal=0
    counter=0
    loopstring="Pool_Stats:AvgHRs:"$Algo
    while read HR
    do
        IN=$HR
        arrIN=(${IN//:/ })
        amt=${arrIN[0]}
        counter=`echo "$counter + 1" | bc`
        AlgoHRTotal=`echo "$AlgoHRTotal + $amt" | bc -l`
    done< <(redis-cli zrangebyscore $loopstring $starttime $endtime)

    if [ $Algo = "sha" ]
    then
        Algo="sha256"
    fi
    thisalgoAVG=`echo "scale=8;$AlgoHRTotal / $counter" | bc -l`
    string="average_"$Algo
    redis-cli hset Pool_Stats:CurrentShift $string $thisalgoAVG
    string3="Pool_Stats:CurrentShift:Algos"
    thisalgoEarned=`redis-cli hget $string3 $Algo`
    thisalgoP=`echo "scale=8;$thisalgoEarned / $thisalgoAVG / $dayslength" | bc -l`
    string2="Profitability_$Algo"
    redis-cli hset Pool_Stats:CurrentShift $string2 $thisalgoP
    if [ $Algo = "keccak" ]
    then
        thisalgoP=`echo "scale=8;$thisalgoP * 500" | bc -l`
    elif [ $Algo = "sha256" ]
    then
        thisalgoP=`echo "scale=8;$thisalgoP * 100" | bc -l`
    elif [ $Algo = "x11" ]
    then
        thisalgoP=`echo "scale=8;$thisalgoP * 4" | bc -l`
    else
        echo "done" >/dev/null
    fi
    if [ -z "$thisalgoP" ]
    then
        thisalgoP=0
    fi

    ProArr[$AlgoCounter]=$thisalgoP
    NameArr[$AlgoCounter]=$Algo
    redis-cli hset Pool_Stats:CurrentShift $string2 $thisalgoP

    echo "For Current Shift Algo $Algo had an average of $thisalgoAVG - profitability was $thisalgoP"
done< <(redis-cli hkeys Coin_Algos)

profitstring=${ProArr[1]}":"${ProArr[2]}":"${ProArr[3]}":"${ProArr[4]}":"${ProArr[5]}
stringnames=${NameArr[1]}":"${NameArr[2]}":"${NameArr[3]}":"${NameArr[4]}":"${NameArr[5]}
redis-cli hset Pool_Stats:CurrentShift:Profitability $now $profitstring
redis-cli hset Pool_Stats:CurrentShift NameString $stringnames
Ugly, but it works. It even gives you all the data you require to pull pretty charts of current-shift profitability. I will try to clean up the commenting and repost soon.
|
|
|
|
paradigmflux (OP)
|
|
July 19, 2014, 08:00:59 PM |
|
I'll post the next bit about the pool once the hashrate on btcdpool.com goes up a bit.
|
|
|
|
PereguineBerty
Member
Offline
Activity: 109
Merit: 35
|
|
July 19, 2014, 08:45:54 PM |
|
Haven't tried it yet but if your info is good as it looks, you'll be a hero around here.
|
|
|
|
paradigmflux (OP)
|
|
July 19, 2014, 08:47:50 PM |
|
Haven't tried it yet but if your info is good as it looks, you'll be a hero around here.
This is all live directly off of the BTCDPool.com site. I will be uploading a snapshot of all of the config files from that pool later tonight.
|
|
|
|
goodluckpool
Newbie
Offline
Activity: 42
Merit: 0
|
|
July 19, 2014, 09:45:28 PM |
|
thank you for your sharing, waiting to read more.
|
|
|
|
zikomoto
|
|
July 19, 2014, 09:52:57 PM |
|
Thank you for sharing , was already working on one and this is helping me a lot
|
|
|
|
paradigmflux (OP)
|
|
July 19, 2014, 10:25:48 PM |
|
Here's a cronjob equivalent of the payment processor. This way it is entirely independent of the node.js application - it still logs to the redis instance and dumps any failed txns to a text file. I will try to clean this up some. Coming up next are the exchange interactions as well as the front end reporting.

#!/bin/bash
cp /Scripts/payouts /Scripts/old_payouts
rm /Scripts/payouts
counter=0
redis-cli del Pool_Stats:CurrentRoundBTCD
redis-cli del Pool_Stats:CurrentRoundBTC
now="`date +%s`"
thisShift=`redis-cli hget Pool_Stats This_Shift`
ShiftStart=`redis-cli hget Pool_Stats:$thisShift starttime`
BTCDPrice=`redis-cli hget Exchange_Rates btcdcoin`
TotalEarned=0
TotalEarnedBTCD=0

# loop through algos
while read line
do
    AlgoTotal=0
    AlgoTotalBTCD=0
    logkey2="Pool_Stats:"$thisShift":Algos"
    logkey2BTCD="Pool_Stats:"$thisShift":AlgosBTCD"
    echo "LOGKEY2: $logkey2"
    # loop through each coin for that algo
    while read CoinName
    do
        coinTotal=0
        coinTotalBTCD=0
        thiskey=$CoinName":balances"
        logkey="Pool_Stats:"$thisShift":Coins"
        logkeyBTCD="Pool_Stats:"$thisShift":CoinsBTCD"
        # determine price for coin
        coin2btc=`redis-cli hget Exchange_Rates $CoinName`
        workersPerCoin=`redis-cli hlen $thiskey`
        if [ $workersPerCoin = 0 ]
        then
            echo "do nothing" > /dev/null
        else
            while read WorkerName
            do
                thisBalance=`redis-cli hget $thiskey $WorkerName`
                thisEarned=`echo "scale=4;$thisBalance * $coin2btc" | bc -l`
                coinTotal=`echo "scale=4;$coinTotal + $thisEarned" | bc -l`
                AlgoTotal=`echo "scale=4;$AlgoTotal + $thisEarned" | bc -l`
                BTCDEarned=`echo "scale=4;$thisEarned / $BTCDPrice" | bc -l`
                coinTotalBTCD=`echo "scale=4;$coinTotalBTCD + $BTCDEarned" | bc -l`
                AlgoTotalBTCD=`echo "scale=4;$AlgoTotalBTCD + $BTCDEarned" | bc -l`
                # echo "$WorkerName earned $BTCDEarned from $CoinName"
                redis-cli hincrbyfloat Pool_Stats:CurrentRoundBTCD $WorkerName $BTCDEarned
                redis-cli hincrbyfloat Pool_Stats:CurrentRoundBTC $WorkerName $thisEarned
            done< <(redis-cli hkeys $CoinName:balances)
            redis-cli hset $logkey $CoinName $coinTotal
            redis-cli hset $logkeyBTCD $CoinName $coinTotalBTCD
            echo "$CoinName: $coinTotal"
        fi
    done< <(redis-cli hkeys Coin_Names_$line)
    redis-cli hset $logkey2 $line $AlgoTotal
    redis-cli hset $logkey2BTCD $line $AlgoTotalBTCD
    TotalEarned=`echo "scale=4;$TotalEarned + $AlgoTotal" | bc -l`
    TotalEarnedBTCD=`echo "scale=4;$TotalEarnedBTCD + $AlgoTotalBTCD" | bc -l`
done< <(redis-cli hkeys Coin_Algos)
redis-cli hset Pool_Stats:$thisShift Earned_BTC $TotalEarned
redis-cli hset Pool_Stats:$thisShift Earned_BTCD $TotalEarnedBTCD

echo "Total Earned: $TotalEarned"

redis-cli hset Pool_Stats:$thisShift endtime $now
nextShift=$(($thisShift + 1))
redis-cli hincrby Pool_Stats This_Shift 1
echo "$thisShift" >>/Scripts/Shifts
redis-cli hset Pool_Stats:$nextShift starttime $now
echo "Printing Earnings report" >> /Scripts/ShiftChangeLog.txt
echo "Shift change switching from $thisShift to $nextShift at $now" >>/Scripts/ShiftChangeErrorCheckerReport
while read WorkerName
do
    PrevBalance=`redis-cli zscore Pool_Stats:Balances $WorkerName`
    if [[ $PrevBalance == "" ]]
    then
        PrevBalance=0
    fi
    thisBalance=`redis-cli hget Pool_Stats:CurrentRoundBTCD $WorkerName`
    TotalBalance=`echo "scale=4;$PrevBalance + $thisBalance" | bc -l`
    echo $WorkerName $TotalBalance
    echo "$WorkerName $TotalBalance - was $PrevBalance plus today's $thisBalance" >> /Scripts/ShiftChangeErrorCheckerReport

    redis-cli zadd Pool_Stats:Balances $TotalBalance $WorkerName
    redis-cli hset Worker_Stats:Earnings:$WorkerName $thisShift $thisBalance
    redis-cli hincrbyfloat Worker_Stats:TotalEarned $WorkerName $thisBalance
    # Log each earning in the redis instance in HTML format.
    # But first, mask the worker address.
    str=$WorkerName
    n=7
    hiddenWorkerName="***************"${str:${#str} - $n}

    redis-cli hincrby Pool_Stats Earning_Log_Entries 1
    EarningNumber=`redis-cli hget Pool_Stats Earning_Log_Entries`
    redis-cli hset Pool_Stats:EarningsLog $EarningNumber "<tr><td><center>$thisShift</center></td><td><b><center>Shift Earning</center></b></td><td><b>$hiddenWorkerName</td><td><center>$thisBalance</center></td><td><center>$PrevBalance</center></td><td><center>$TotalBalance</center></td><td><center>$now</center></td></tr>"
done< <(redis-cli hkeys Pool_Stats:CurrentRoundBTCD)
echo "Done adding coins, clearing balances now." >> /Scripts/ShiftChangeLog.log

# Save the total BTC/BTCD earned for each shift into a historical key for auditing purposes.
redis-cli rename Pool_Stats:CurrentRoundBTCD Pool_Stats:$thisShift:ShiftBTCD
redis-cli rename Pool_Stats:CurrentRoundBTC Pool_Stats:$thisShift:ShiftBTC

# for every coin on the pool....
while read Coin_Names2
do
    # Save the old balances key for every coin into a historical key.
    redis-cli rename $Coin_Names2:balances Prev:$thisShift:$Coin_Names2:balances

    # This loop moves every block from the blocksConfirmed keys into the blocksPaid keys.
    # This means only blocksConfirmed are unpaid.
    while read PaidLine
    do
        redis-cli sadd $Coin_Names2:"blocksPaid" $PaidLine
        redis-cli srem $Coin_Names2:"blocksConfirmed" $PaidLine
    done< <(redis-cli smembers $Coin_Names2:"blocksConfirmed")
done< <(redis-cli hkeys Coin_Names)

echo "Done script at $now" >> /Scripts/ShiftChangeLog.log

# Calculate workers owed in excess of 5 BTCD and generate a report of them.
while read PayoutLine
do
    amount=`redis-cli zscore Pool_Stats:Balances $PayoutLine`
    roundedamount=`echo "scale=4;$amount - 1" | bc -l`
    echo "$PayoutLine $roundedamount"
    echo "$PayoutLine $amount" >> /Scripts/payouts
    # send the payment using the coin daemon
    txn=`bitcoindarkd sendtoaddress $PayoutLine $amount`
    if [ -z "$txn" ]
    then
        # log failed payout to txt file.
        echo "payment failed! $PayoutLine" >>/Scripts/AlertLog
    else
        urlstring="http://btcd.explorer.ssdpool.com:9050/tx/$txn"
        newtotal=`echo "scale=4;$amount - $roundedamount" | bc -l`
        redis-cli hincrby Pool_Stats Earning_Log_Entries 1
        redis-cli hset Worker_Stats:Payouts:$PayoutLine $thisShift $amount
        redis-cli hincrbyfloat Worker_Stats:TotalPaid $PayoutLine $amount
        EarningNumber=`redis-cli hget Pool_Stats Earning_Log_Entries`
        str=$PayoutLine
        n=7
        hiddenPayoutLine="***************"${str:${#str} - $n}
        # log in redis for displaying on website
        redis-cli hset Pool_Stats:EarningsLog $EarningNumber "<tr><td>$thisShift</td><td><b>Payout</b></td><td>$hiddenPayoutLine</td><td>$amount</td><td>$amount</td><td><a href=\"$urlstring\">here</a></td><td>$now</td></tr>"
        redis-cli zadd Pool_Stats:Balances 0 $PayoutLine
    fi
done< <(redis-cli zrangebyscore Pool_Stats:Balances 5 inf)

# move old profitability stats
redis-cli rename Pool_Stats:CurrentShift:Profitability Pool_Stats:$thisShift:Profitability
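A quick aside on the address masking used in the earnings log: it keeps only the last 7 characters of the worker address and pads the front with asterisks. Isolated (with an example string standing in for a real address):

```shell
str="1BoatSLRHtKNngkdXEeobR76b53LETtpyT"  # example address, not a payout target
n=7
# bash substring expansion: everything from (length - 7) to the end
hiddenWorkerName="***************"${str:${#str} - $n}
echo "$hiddenWorkerName"
```

This way the public earnings log never exposes a full payout address.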
|
|
|
|
YarkoL
Legendary
Offline
Activity: 996
Merit: 1013
|
|
July 19, 2014, 10:42:15 PM Last edit: July 20, 2014, 04:55:08 PM by YarkoL |
|
Been looking for a guide like this for some time already..
|
“God does not play dice"
|
|
|
paradigmflux (OP)
|
|
July 20, 2014, 04:35:51 AM |
|
Anybody going through and setting one of these bad boys up?
|
|
|
|
paradigmflux (OP)
|
|
July 20, 2014, 04:38:18 AM |
|
Hashrate pie charts - this would go onto your home.html, for example.

<!--- Start Pie --->
<script>
var data_arr3=[];
init.push(function () {
    var item = { "label": "Scrypt", "value": {{=it.stats.algos['scrypt'].hashrate/1024/1024}} }
    data_arr3.push(item);
    var item = { "label": "SHA", "value": {{=it.stats.algos['sha256'].hashrate/1024/1024/1024/100}} }
    data_arr3.push(item);
    var item = { "label": "X11", "value": {{=it.stats.algos['x11'].hashrate/1024/1024/4}} }
    data_arr3.push(item);
    var item = { "label": "Keccak", "value": {{=it.stats.algos['keccak'].hashrate/1024/1024/500}} }
    data_arr3.push(item);
    var item = { "label": "ScryptN", "value": {{=it.stats.algos['scrypt-n'].hashrate/1024/1024}} }
    data_arr3.push(item);
    Morris.Donut({
        element: 'hero-donut',
        data: data_arr3,
        colors: PixelAdmin.settings.consts.COLORS,
        resize: true,
        labelColor: '#888',
        formatter: function (y, data) {
            if(data.label == "Keccak")
                return(Math.round(y * 500,2) + ' MH');
            else if(data.label == "Scrypt")
                return(Math.round(y,2) + ' MH');
            else if(data.label == "SHA")
                return(Math.round(y * 100,2) + ' GH');
            else if(data.label == "X11")
                return(Math.round(y * 4,2) + ' MH');
            else
                return(y + ' MH');
        }
    });
});
</script>
<!-- / Javascript -->

<div class="row">
    <div class="col-md-5">
        <div class="graph-container">
            <div id="hero-donut" class="graph"></div>
        </div>
    </div>
<!--- END PIE -->

Worker distribution pie charts:

<!--- start worker pie chart --->
<script>
var data_arr4=[];
init.push(function () {
    var item = { "label": "Scrypt", "value": {{=it.stats.algos['scrypt'].workers}} }
    data_arr4.push(item);
    var item = { "label": "SHA", "value": {{=it.stats.algos['sha256'].workers}} }
    data_arr4.push(item);
    var item = { "label": "X11", "value": {{=it.stats.algos['x11'].workers}} }
    data_arr4.push(item);
    var item = { "label": "Keccak", "value": {{=it.stats.algos['keccak'].workers}} }
    data_arr4.push(item);
    var item = { "label": "ScryptN", "value": {{=it.stats.algos['scrypt-n'].workers}} }
    data_arr4.push(item);
    Morris.Donut({
        element: 'hero-donut2',
        data: data_arr4,
        colors: PixelAdmin.settings.consts.COLORS,
        resize: true,
        labelColor: '#888',
        formatter: function (y, data) {
            return(y + ' Workers');
        }
    });
});
</script>
<!-- / Javascript -->

<div class="graph-container">
    <div id="hero-donut2" class="graph"></div>
</div>
<!-- end worker pie chart --->
|
|
|
|
OmarGsPools
|
|
July 20, 2014, 05:03:27 AM |
|
Anybody going through and setting one of these bad boys up?
I'll be attempting to set one up later tomorrow. I like the open-source route and would like to help in the future development.
|
|
|
|
mnporter2001
Sr. Member
Offline
Activity: 602
Merit: 250
HEX: Longer pays better
|
|
July 20, 2014, 09:19:57 AM |
|
Nice share!
However I can't get it to work. This is my fault, not yours, I must say; I just don't understand it all...
So how much would it cost me for you/ anyone else reading this to do it for me ?
Thanks Mark
|
|
|
|
|
rcloss
Newbie
Offline
Activity: 7
Merit: 0
|
|
July 20, 2014, 02:33:20 PM |
|
Do you have any suggestions for hosting providers that can run this kind of multipool install? Also, how much bandwidth does one of these installs typically use? I'm guessing I can't run this on my old DSL line.
|
|
|
|
goodluckpool
Newbie
Offline
Activity: 42
Merit: 0
|
|
July 21, 2014, 07:41:23 AM |
|
great info, looking forward to that automatic coin exchange part.
|
|
|
|
billotronic
Legendary
Offline
Activity: 1610
Merit: 1000
Crackpot Idealist
|
|
July 22, 2014, 04:25:53 AM |
|
Do you have any suggestions for hosting providers that can run this kind a MultiPool install? If not, how much bandwidth does one of these installs typically use? I'm guessing I can't run this on my old DSL line
ooo good question for the OP. I too would be very interested to know what kind of hardware you need for this.
|
|
|
|
DreamSpace
|
|
July 22, 2014, 07:49:38 AM |
|
nice guide thank you
|
|
|
|
Biomech
Legendary
Offline
Activity: 1372
Merit: 1022
Anarchy is not chaos.
|
|
July 22, 2014, 03:15:02 PM |
|
Do you have any suggestions for hosting providers that can run this kind a MultiPool install? If not, how much bandwidth does one of these installs typically use? I'm guessing I can't run this on my old DSL line
ooo good question for the OP. I too would be very interested to know what kind of hardware you need for this.
I'm not involved, just another spectator, but I know a bit about NOMP, which is what he's basing this off of. I ran it on a test rig with a single core running at 2.8 GHz and 1 GB of RAM with no issues at all, and rented rigs to bombard it. On a basic cable connection with a lot of other shit running. Barely slowed it down. NOMP is not terribly resource intensive.
|
|
|
|
paradigmflux (OP)
|
|
July 22, 2014, 05:19:06 PM |
|
You can easily run one of these off of a 4GB Digital Ocean VM (which costs $40 a month). Just make sure that you use a free cloudflare.com account (to help reduce DDOS attacks). Use the cloudflare.com account to sign up for a free Dome9 account (and use Dome9 to firewall all HTTP access to only accept connections from Cloudflare) and voila! PS, I will be posting some more in this thread a bit later today with some more info. Thanks for all of the nice PMs so far.
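To make the Cloudflare-only firewalling idea concrete, here's a rough dry-run sketch in shell. The two CIDR ranges are placeholders only - Cloudflare publishes the current list at cloudflare.com/ips, and Dome9 can manage equivalent rules for you - so review the printed rules before piping anything into a root shell.

```shell
#!/bin/sh
# Sketch only: restrict ports 80/443 to Cloudflare's published ranges.
# The CIDRs below are PLACEHOLDERS - fetch the real, current list from
# https://www.cloudflare.com/ips/ before using anything like this.
print_cf_rules() {
    for range in 103.21.244.0/22 103.22.200.0/22; do
        echo "iptables -A INPUT -p tcp -s $range --dport 80 -j ACCEPT"
        echo "iptables -A INPUT -p tcp -s $range --dport 443 -j ACCEPT"
    done
    # Everything that didn't match a Cloudflare range gets dropped.
    echo "iptables -A INPUT -p tcp --dport 80 -j DROP"
    echo "iptables -A INPUT -p tcp --dport 443 -j DROP"
}
# Dry run: inspect the rules first, then apply with: print_cf_rules | sudo sh
print_cf_rules
```

This only prints the rules rather than applying them, so you can sanity-check the output before it touches a live firewall.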
|
|
|
|
s1kx
Newbie
Offline
Activity: 1
Merit: 0
|
|
July 22, 2014, 08:09:23 PM |
|
Awesome, great code quality and definitely an improvement in the scene. Proper redis usage etc will definitely help a lot of pools scale.
|
|
|
|
paradigmflux (OP)
|
|
July 22, 2014, 08:29:49 PM |
|
Awesome, great code quality and definitely an improvement in the scene. Proper redis usage etc will definitely help a lot of pools scale.
It could be lots better. It could all be in node.js, but then again, I don't really feel that the whole pool operation should be in a single node.js application. And using stuff like bash scripts - as long as they're well written (and run entirely from memory) - is pretty much just as efficient. I'm just wrapping up extending the NOMP api quite a bit (to reveal stuff like user hashrates, profitabilities, payouts etc). I will post the updated stats.js and website.js once I am done.
|
|
|
|
goodluckpool
Newbie
Offline
Activity: 42
Merit: 0
|
|
July 23, 2014, 11:00:17 PM |
|
updates?
|
|
|
|
paradigmflux (OP)
|
|
July 24, 2014, 04:37:43 AM |
|
updates?
i will update this thread a bit later tonight for a sample one of these pools check out BTCDPool.com every 10 minutes it liquidate all the earning coins and continually applies buy pressure on bitcoindark at cryptsy and poloniex. it's even mining every possible profitable SHA altcoin (including PPC, which up until today I never figured out how to get working with NOMP) i will post a link to a zip of the entire pool tonight
|
|
|
|
appooler
Newbie
Offline
Activity: 28
Merit: 0
|
|
July 25, 2014, 12:46:02 AM |
|
Satoshi released the bitcoin client opensource, quit charging people a bitcoin to set them up a multipool and quit trying to screw everyone over.
That's what I'm talking about, well said.
|
|
|
|
goodluckpool
Newbie
Offline
Activity: 42
Merit: 0
|
|
July 25, 2014, 03:48:50 AM |
|
updates?
i will update this thread a bit later tonight for a sample one of these pools check out BTCDPool.com every 10 minutes it liquidate all the earning coins and continually applies buy pressure on bitcoindark at cryptsy and poloniex. it's even mining every possible profitable SHA altcoin (including PPC, which up until today I never figured out how to get working with NOMP) i will post a link to a zip of the entire pool tonight plz update your article. It is good.
|
|
|
|
GoldBit89
|
|
July 25, 2014, 05:19:55 AM |
|
Satoshi released the bitcoin client opensource, quit charging people a bitcoin to set them up a multipool and quit trying to screw everyone over.
That's what I'm talking about, well said. +1 to both. This is so totally awesome. Thank you for posting this information and being brave enough to do it. It's time the dirty multipool mining operators are exposed.
|
|
|
|
Min€r
|
|
July 25, 2014, 07:21:33 AM |
|
Really thanks for that enhancements - will try to include pie-charts & worker feature in my new pool.
|
|
|
|
omgbossis21
|
|
July 25, 2014, 09:41:31 AM |
|
Great thread! Subscribed!
|
|
|
|
jeezy
Legendary
Offline
Activity: 1237
Merit: 1010
|
|
July 25, 2014, 03:25:30 PM |
|
One important thing is to let the miner chose his own difficulty via "d=XXX" as pwd. Is this already build into NOMP?
|
|
|
|
hammo
|
|
July 31, 2014, 05:44:31 AM Last edit: July 31, 2014, 05:56:51 AM by hammo |
|
Anybody going through and setting one of these bad boys up?
Yep, I was actually going to play with this on the weekend. Thanks heaps for sharing. My thoughts on multipools have changed in recent weeks: if you can't beat 'em then join 'em, or try to outpace 'em. NB: This is important work, because having a distributed coin relies on having a distributed network. The more pools and/or multipools, the better for the network.
|
|
|
|
jk9694
|
|
August 03, 2014, 09:59:56 PM |
|
So any idea what is causing this error?
TypeError: Cannot call method 'toString' of undefined
    at resolveDefs (/home/james/nomp/node_modules/dot/doT.js:52:55)
    at Object.doT.template (/home/james/nomp/node_modules/dot/doT.js:90:33)
    at /home/james/nomp/libs/website.js:84:33
    at fs.js:207:20
    at Object.oncomplete (fs.js:107:15)
I did notice that in the section where you have the code for stats.js, it appears you want us to create a new website.js ("Delete the stock website.js file as well, make a new one"). I did that and removed the code from stats.js.
|
|
|
|
jk9694
|
|
August 03, 2014, 10:43:37 PM |
|
Note to self, do not incorrectly name the html files.
|
|
|
|
jk9694
|
|
August 03, 2014, 11:21:45 PM |
|
So a couple of things I am not seeing here, maybe I am missing it.
1. How should the miners connect? Do they use their payout address for the target coin when mining all coins? (I am assuming yes)
2. I have not seen any code that gets the exchange to send the coins back to your target wallet. Is this still part of what's to come, or is that a manual operation for the pool operator?
3. If a shift is sent to an exchange, I am assuming that the payouts just fail until there are enough funds in the wallet of the target coin to pay out what is owed?
Really liking what you are doing so far.
J
|
|
|
|
paradigmflux (OP)
|
|
August 03, 2014, 11:25:00 PM Last edit: August 04, 2014, 12:00:12 AM by paradigmflux |
|
WHOOPS, i let OVH suspend BTCDPool.com accidentally due to an outstanding balance. i will pay it today and once it is reactivated i will make an archive of the entire pool's files and upload it somewhere. in return for this, i will ask each of you to go to an exchange and buy some bitcoindark and hold onto it. and if you find it useful, consider making a donation! I won't try and blackmail coins out of anyone for any of this info though. donation addresses: Bitcoindark (BTCD) - RRuXHtWdGFE95BSDjfPZNEfRhn9XKfHqky Bitcoin (BTC) - 1ALLcKCwUrQCQ9XPkxwv2pKstyau9XogpV Litecoin (LTC) - LTWwzpPhdHD9KzDcGh4SfkfiKPbFiSoAVV check out what the latest "my miner" reports look like: http://hashrate.org/miner/8896310590202454791 PS thanks for the JQ info - I'm happy somebody is actually looking at this and deriving some value from it.
|
|
|
|
jk9694
|
|
August 03, 2014, 11:53:54 PM |
|
No worries, just lending a hand here.
Also, you may want to note the dependency upon "jq" for the shell scripts that get the coin info from the exchanges. For an updated ubuntu 12.04 it is just apt-get install jq
No need to change or add repos like the jq site says.
J
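For anyone who hasn't used jq before, here's the flavor of one-liner these exchange scripts rely on. The JSON below just imitates Cryptsy's old marketdata response, so treat the exact field names as an assumption and check the live API output yourself.

```shell
#!/bin/sh
# Pull a single price out of an exchange-style JSON blob with jq.
# The response is canned here so the example runs offline; a real
# script would curl the exchange's marketdata endpoint instead.
response='{"success":"1","return":{"markets":{"FTC":{"lasttradeprice":"0.00002150"}}}}'

# -r emits the raw string instead of a quoted JSON value.
price=$(echo "$response" | jq -r '.return.markets.FTC.lasttradeprice')
echo "FTC last trade: $price BTC"
```

From there the price drops straight into whatever profitability arithmetic the pool scripts do.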
|
|
|
|
paradigmflux (OP)
|
|
August 04, 2014, 12:04:22 AM |
|
So a couple of things I am not seeing here, maybe I am missing it.
1. How should the miners connect? Do they use their payout address for the target coin when mining all coins? (I am assuming yes)
yes, that is exactly what they do.
2. I have not seen any code that gets the exchange to send the coins back to your target wallet. Is this still part of what's to come, or is that a manual operation for the pool operator?
i haven't included any of the exchange-side code yet. it supports withdrawals from cryptsy, at least; bter, mintpal and poloniex require manual withdrawals by the pool operator. having a healthy reserve of the target coin sorta mitigates this as a big issue, though. it does support automated withdrawals as long as the pool's wallet address is added as a trusted withdrawal address within cryptsy.
3. If a shift is sent to an exchange, I am assuming that the payouts just fail until there are enough funds in the wallet of the target coin to pay out what is owed?
the payout script is actually triggered by another script i have, which first estimates the amount of the payout, then checks the local daemon and verifies that enough is present to cover the payout. If there is not enough it will try to withdraw from cryptsy; if there is still not enough it will log an error in a log file. I still mean to set it up so that it can notify the op via SMTP at that point.
Really liking what you are doing so far.
J
thanks very much man! i plan on eventually trying to document all of this stuff. the best way to understand lots of it is to just fire up a basic nomp pool, connect with redis-cli and run the command 'monitor', and just watch how NOMP interacts with the various redis values. there really needs to be tons of documentation created around that sort of stuff, and then around how my modifications alter all that. i will try and get it done eventually, but I'm also in the process of setting up a big (well, relatively - ~100TH) sha community-owned mine that will pay out solely in NXT.
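The payout flow described in the answer to #3 (estimate, check the local daemon, fall back to an exchange withdrawal, otherwise log an error) can be sketched as a tiny decision function. The balance-fetching and withdrawal calls are stubbed out here - assume your coin daemon's getbalance RPC and your exchange's withdraw API in their place.

```shell
#!/bin/sh
# Rough sketch of the payout decision logic described above. Amounts
# are integers (e.g. satoshis) to keep shell arithmetic simple.
decide_payout() {
    owed=$1       # total currently owed to miners
    wallet=$2     # balance in the local coin daemon
    exchange=$3   # balance withdrawable from the exchange
    if [ "$wallet" -ge "$owed" ]; then
        echo "pay"
    elif [ "$((wallet + exchange))" -ge "$owed" ]; then
        echo "withdraw-then-pay"
    else
        # Here is where you'd also notify the operator via SMTP.
        echo "log-error"
    fi
}

decide_payout 100 150 0    # prints: pay
decide_payout 100 40 80    # prints: withdraw-then-pay
decide_payout 100 40 10    # prints: log-error
```

The real script would run this on a cron interval, which matches how the rest of the pool's glue scripts are wired together.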
|
|
|
|
jk9694
|
|
August 04, 2014, 12:14:28 AM |
|
Cool and thanks for the information here. I will keep plugging along with this initial setup and look at some of these items. Personally my group of miners and I have decided to only deal with cryptsy. Do you have anything to share yet around that exchange and pulling funds back out to the wallet?
I may look at doing a payment process as a complete replacement for what is in NOMP now.
If you like I can add info here on the basic setup as well from what I am going through. Don't mind contributing at all.
Looks like we have a common goal here, im doing the same for TEK...
|
|
|
|
Lima99
|
|
August 04, 2014, 12:46:46 AM |
|
Thank you for this guide, this is exactly what I was looking for so I can set up a multipool once CapitalCoin becomes PoS-only.
|
|
|
|
paradigmflux (OP)
|
|
August 04, 2014, 12:48:08 AM |
|
By setting the payout limit to all 9s, it effectively prevents the NOMP payouts from ever triggering. I pretty much have the whole thing rewritten; I just use the NOMP code to calculate the per-miner coin earnings for the various alts. Ps, I mean it guys, please go buy some bitcoindark. If each of you did, we would all be rich.
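For reference, the "all 9s" trick is just the stock NOMP pool_configs payout section with an unreachably high minimum; a sketch (daemon credentials are placeholders):

```json
"paymentProcessing": {
    "enabled": true,
    "paymentInterval": 600,
    "minimumPayment": 999999999,
    "daemon": { "host": "127.0.0.1", "port": 17772, "user": "bogususer", "password": "boguspass" }
}
```

Since no miner ever reaches that minimum, NOMP's own payment processor never sends anything, and the custom payout script can take over while still reusing NOMP's per-miner accounting.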
|
|
|
|
iampingu
|
|
August 04, 2014, 12:50:45 AM |
|
Thanks for this post bud. Much appreciated
|
|
|
|
jk9694
|
|
August 04, 2014, 02:07:43 AM |
|
Stats are looking good man, trying to work through all this myself to do the same for TEK.
|
|
|
|
jk9694
|
|
August 04, 2014, 03:26:25 AM |
|
WHOOPS i let OVH suspend BTCDPool.com accidently due to an outstanding balance. [...]
Heck, you get me through setting this up with an auto send and an auto withdrawal for TEK just on cryptsy alone and i will send you .333 BTC.....
|
|
|
|
PondSea
Legendary
Offline
Activity: 1428
Merit: 1000
|
|
August 04, 2014, 04:42:12 AM |
|
Watching with interest
|
|
|
|
jk9694
|
|
August 05, 2014, 06:12:54 PM |
|
Hmmm no updates yet? Also I see the site is still down..
|
|
|
|
KimmyF
|
|
August 06, 2014, 08:39:30 PM |
|
Thanks for your work, really like how you act in this world full of scams: Respect!
|
|
|
|
tuanvie
|
|
August 06, 2014, 11:24:25 PM |
|
Does running this require a server with large specifications?
|
|
|
|
TrangLee
Full Member
Offline
Activity: 210
Merit: 100
Living the dream
|
|
August 06, 2014, 11:25:06 PM |
|
Watching with interest Same here. I'd like to make one of my own.
|
|
|
|
utahjohn
|
|
August 07, 2014, 12:23:42 PM Last edit: August 07, 2014, 12:47:09 PM by utahjohn |
|
I'm having trouble setting up NOMP. Here is output from node init.js:
2014-08-07 05:40:57 [POSIX] [Connection Limit] (Safe to ignore) POSIX module not installed and resource (connection) limit was not raised
2014-08-07 05:40:57 [Master] [CLI] CLI listening on port 17117
2014-08-07 05:40:58 [Master] [PoolSpawner] Spawned 1 pool(s) on 2 thread(s)
2014-08-07 05:41:00 [Payments] [diamondcoin] Payment processing setup to run every 600 second(s) with daemon (bogus@127.0.0.1:17772) and redis (127.0.0.1:6379)
2014-08-07 05:41:00 [Payments] [diamondcoin] Finished interval - time spent: 17ms total, 5ms redis, 10ms daemon RPC
2014-08-07 05:41:00 [Switching] [Setup] (Thread 1) Loading last proxy state from redis
2014-08-07 05:41:00 [Pool] [diamondcoin] (Thread 1) Share processing setup with redis (127.0.0.1:6379)
2014-08-07 05:41:00 [Switching] [Setup] (Thread 1) Switching "switch1" listening for groestl on port 3333 into diamondcoin
2014-08-07 05:41:00 [Website] [Server] Website started on 127.0.0.1:8080
2014-08-07 05:41:00 [Stats] [Global] error with getting global stats [{},{}]
2014-08-07 05:41:00 [Stats] [Global] error getting all stats[{},{}]
2014-08-07 05:41:00 [Pool] [diamondcoin] (Thread 1) Stratum Pool Server Started for diamondcoin [DMD] {groestl}
2014-08-07 05:41:00 [Switching] [Setup] (groestl) Setting proxy difficulties after pool start
2014-08-07 05:41:00 [Switching] [Setup] (Thread 2) Loading last proxy state from redis
2014-08-07 05:41:00 [Pool] [diamondcoin] (Thread 2) Share processing setup with redis (127.0.0.1:6379)
2014-08-07 05:41:00 [Switching] [Setup] (Thread 2) Switching "switch1" listening for groestl on port 3333 into diamondcoin
2014-08-07 05:41:00 [Pool] [diamondcoin] (Thread 2) Stratum Pool Server Started for diamondcoin [DMD] {groestl}
2014-08-07 05:41:00 [Switching] [Setup] (groestl) Setting proxy difficulties after pool start
2014-08-07 05:41:30 [Pool] [diamondcoin] (Thread 1) Authorized dE1xgAEwfA3BsxEpXSBTrqiYbrEUyZwwKg:x [10.42.0.99]
2014-08-07 05:41:55 [Pool] [diamondcoin] (Thread 1) No new blocks for 55 seconds - updating transactions & rebroadcasting work
2014-08-07 05:41:55 [Pool] [diamondcoin] (Thread 2) No new blocks for 55 seconds - updating transactions & rebroadcasting work
2014-08-07 05:41:58 [Stats] [Global] error with getting global stats [{},{}]
2014-08-07 05:41:58 [Stats] [Global] error getting all stats[{},{}]
2014-08-07 05:42:05 [Pool] [diamondcoin] (Thread 2) Block notification via RPC polling
2014-08-07 05:42:05 [Pool] [diamondcoin] (Thread 1) Block notification via RPC polling
2014-08-07 05:42:50 [Pool] [diamondcoin] (Thread 1) Share accepted at diff 0.00390625/2.42240273 by dE1xgAEwfA3BsxEpXSBTrqiYbrEUyZwwKg [10.42.0.99]
2014-08-07 05:42:50 [Pool] [diamondcoin] (Thread 1) Error with share processor multi [{},{},{},{}]
2014-08-07 05:42:55 [Pool] [diamondcoin] (Thread 1) Share accepted at diff 0.00390625/2.59897940 by dE1xgAEwfA3BsxEpXSBTrqiYbrEUyZwwKg [10.42.0.99]
2014-08-07 05:42:55 [Pool] [diamondcoin] (Thread 1) Error with share processor multi [{},{},{},{}]
2014-08-07 05:42:55 [Pool] [diamondcoin] (Thread 1) Share accepted at diff 0.00390625/20.79394799 by
Here is config.json:
{
    "logLevel": "debug",
    "logColors": true,
    "cliPort": 17117,
    "clustering": { "enabled": true, "forks": "auto" },
    "defaultPoolConfigs": {
        "blockRefreshInterval": 1000,
        "jobRebroadcastTimeout": 55,
        "connectionTimeout": 600,
        "emitInvalidBlockHashes": false,
        "validateWorkerUsername": true,
        "tcpProxyProtocol": false,
        "banning": { "enabled": true, "time": 600, "invalidPercent": 50, "checkThreshold": 500, "purgeInterval": 300 },
        "redis": { "host": "127.0.0.1", "port": 6379 }
    },
    "website": {
        "enabled": true,
        "host": "127.0.0.1",
        "port": 8080,
        "stratumHost": "bogus.ddns.net",
        "stats": { "updateInterval": 60, "historicalRetention": 43200, "hashrateWindow": 300 },
        "adminCenter": { "enabled": true, "password": "password1bogus" }
    },
    "redis": { "host": "127.0.0.1", "port": 6379 },
    "switching": {
        "switch1": {
            "enabled": true,
            "algorithm": "groestl",
            "ports": { "3333": { "diff": 0.00390625, "varDiff": { "minDiff": 0.00390625, "maxDiff": 4, "targetTime": 15, "retargetTime": 90, "variancePercent": 30 } } }
        },
        "switch2": {
            "enabled": false,
            "algorithm": "scrypt",
            "ports": { "4444": { "diff": 10, "varDiff": { "minDiff": 16, "maxDiff": 512, "targetTime": 15, "retargetTime": 90, "variancePercent": 30 } } }
        },
        "switch3": {
            "enabled": false,
            "algorithm": "x11",
            "ports": { "5555": { "diff": 0.001, "varDiff": { "minDiff": 0.001, "maxDiff": 1, "targetTime": 15, "retargetTime": 60, "variancePercent": 30 } } }
        }
    },
    "profitSwitch": { "enabled": false, "updateInterval": 600, "depth": 0.90, "usePoloniex": true, "useCryptsy": true, "useMintpal": true, "useBittrex": true }
}
Here is coins/diamondcoin.json:
{
    "name": "Diamondcoin",
    "symbol": "DMD",
    "algorithm": "groestl",
    "txMessages": false,
    "normalHashing": true,
    "peerMagic": "e4e8dbfd",
    "peerMagicTestnet": "edf2c0ef"
}
Here is pool_configs/diamond.json:
{
    "enabled": true,
    "coin": "diamondcoin.json",
    "address": "dQKbYwJFpq9MDojuwE99D3MHihEkEW6aqk",
    "rewardRecipients": {
        "dZi9hpA5nBC6tSAbPSsiMjb6HeQTprcWHz": 4.76190477,
        "dEdj7aH7Pgt3oVAeEFmA46sXCDcZYeiQjP": 0.95238095
    },
    "paymentProcessing": {
        "enabled": true,
        "paymentInterval": 600,
        "minimumPayment": 1,
        "daemon": { "host": "127.0.0.1", "port": 17772, "user": "bogususer", "password": "boguspass" }
    },
    "ports": { "3333": { "diff": 0.008, "varDiff": { "minDiff": 0.00390625, "maxDiff": 8, "targetTime": 15, "retargetTime": 90, "variancePercent": 30 } } },
    "daemons": [ { "host": "127.0.0.1", "port": 17772, "user": "bogususer", "password": "boguspass" } ],
    "p2p": { "enabled": false, "host": "127.0.0.1", "port": 17772, "disableTransactions": true },
    "mposMode": { "enabled": false, "host": "127.0.0.1", "port": 3306, "user": "dmdpool", "password": "boguspass", "database": "dmd", "checkPassword": true, "autoCreateWorker": false }
}
|
|
|
|
utahjohn
|
|
August 07, 2014, 12:34:41 PM Last edit: August 07, 2014, 05:47:33 PM by utahjohn |
|
I'm extremely interested and have read through the whole thread. Please examine my basic config in the previous post and tell me what I need to fix or configure so I can get it up and running; then I want to mod it like you have been describing.
OK, I solved the "(Thread 1) Error with share processor multi [{},{},{},{}]" error - it had to do with redis. The [STATS] error also went away. Now to find a block and see if payouts are working...
At the sgminer end of things I'm occasionally getting "Accepted untracked share" - what could this be?
Also eagerly awaiting your zip files.
|
|
|
|
utahjohn
|
|
August 07, 2014, 09:02:38 PM |
|
Haven't tried it yet, but if your info is as good as it looks, you'll be a hero around here.
This is all live directly off of the BTCDPool.com site. I will be uploading a snapshot of all of the config files from that pool later tonight. And when
|
|
|
|
utahjohn
|
|
August 08, 2014, 12:04:41 AM |
|
Sweet, I found a block on my NOMP pool. Now I hope it gets to maturity: http://diamond.danbo.bg:2750/t/8MjqxZ9bCK
Looks like very little (if any) tuning is needed on my "rewardRecipients":
0 Not yet redeemed 0.990001 dNsxRjmtmUrEXQkKESGAJzm5tv3MPzqLrS 33:03b5...35d6 CHECKSIG
1 Not yet redeemed 0.05 dZi9hpA5nBC6tSAbPSsiMjb6HeQTprcWHz DUP HASH160 20:de92...42d0 EQUALVERIFY CHECKSIG
2 Not yet redeemed 0.009999 dE1xgAEwfA3BsxEpXSBTrqiYbrEUyZwwKg DUP HASH160 20:068e...1829 EQUALVERIFY CHECKSIG
2nd block found: http://diamond.danbo.bg:2750/t/47CvEcUhwk
0 Not yet redeemed 0.99 dNsxRjmtmUrEXQkKESGAJzm5tv3MPzqLrS 33:03b5...35d6 CHECKSIG
1 Not yet redeemed 0.05 dZi9hpA5nBC6tSAbPSsiMjb6HeQTprcWHz DUP HASH160 20:de92...42d0 EQUALVERIFY CHECKSIG
2 Not yet redeemed 0.01 dE1xgAEwfA3BsxEpXSBTrqiYbrEUyZwwKg DUP HASH160 20:068e...1829 EQUALVERIFY CHECKSIG
Looks like I got "rewardRecipients" perfect now. Still waiting on "valid" on the first one... The 1st block is valid now in the pool wallet balance, and the transaction to the donation wallet was added to its balance. Now waiting on the 2nd block to become valid; then the pool should pay out to the miner wallet when the threshold is reached... Now to work on the incorrect stats page hashrate values (1/256 of the miner rate)... If the OP would come back and finish his great work, I'd just replace all the funky NOMP stats pages with his work...
|
|
|
|
neite99
|
|
August 08, 2014, 09:43:27 PM |
|
Let me know if anyone is even reading this far. Reading it and thanking you for every second you spent on it. I love you right now
|
|
|
|
YarkoL
Legendary
Offline
Activity: 996
Merit: 1013
|
|
August 08, 2014, 09:59:18 PM |
|
Let me know if anyone is even reading this far. Reading it and thanking you for every second you spent on it. I love you right now And I want to have your baby
|
“God does not play dice"
|
|
|
neite99
|
|
August 08, 2014, 10:03:20 PM |
|
Let me know if anyone is even reading this far. Reading it and thanking you for every second you spent on it. I love you right now And I want to have your baby Nice
|
|
|
|
utahjohn
|
|
August 10, 2014, 01:10:01 AM |
|
groestl algo testing started, will add more coins ... Once I am happy with this pool, I will start a X11 NOMP multipool for DMD, BTC and JUDGE payouts
|
|
|
|
incognitoworker
|
|
August 11, 2014, 06:30:53 PM |
|
groestl algo testing started, will add more coins ... Once I am happy with this pool, I will start a X11 NOMP multipool for DMD, BTC and JUDGE payouts
if anyone can set up and host one with payments in TEK, HBN, CAP, PHS, GRW and DMD... I would love this option, but I lack the knowledge of setting it up and how to run it...
|
|
|
|
utahjohn
|
|
August 11, 2014, 06:41:31 PM Last edit: August 11, 2014, 07:43:04 PM by utahjohn |
|
groestl algo testing started, will add more coins ... Once I am happy with this pool, I will start a X11 NOMP multipool for DMD, BTC and JUDGE payouts if anyone can setup and host one with payments in tek,hbn,cap,phs,grw and dmd... i would love this option, but miss the knowledge of seting it up and how to run it... My pool offline atm (wallet locked bug). But I would be happy to add coins once I get multipool working properly. I'm still waiting on OP dev to come back and share the rest ...
|
|
|
|
|
|
|
utahjohn
|
|
August 12, 2014, 10:52:49 PM |
|
BTW I have two old Dell PowerEdge 2650 Xeon servers for sale; each has 6G RAM. They are quite heavy and shipping would be $75+ (the last time I bought a server it was over $75 shipping). I have modded the board to provide a 4-pin molex for power to SATA drives if you throw in a PCI-X 64 SATA controller card; PCI32 also works in them, 3 slots. I will ship each with an 18G SCSI drive with Windows Server 2008 Enterprise installed. More than adequate for a pool server using stratum+tcp, though the screaming fans are annoying, LOL. Let me know if anyone is interested and I'll fire one up, delete all personal stuff and get them up to date with Windows Update. They're just cluttering up my closet atm. Auto login is enabled as Admin so you can change Administrator.
|
|
|
|
utahjohn
|
|
August 14, 2014, 08:18:35 PM |
|
Installing Ubuntu 14.04 LTS on one of servers in previous post to get my pool hosted in a datacenter
|
|
|
|
jk9694
|
|
August 20, 2014, 05:58:07 PM |
|
I am starting to wonder if this project is dead at this point.
|
|
|
|
edric
|
|
August 20, 2014, 10:36:40 PM |
|
I'm working on something similar using LiveCode, possibly I can fill in the blanks once I get my head round it all.
|
|
|
|
jk9694
|
|
August 21, 2014, 03:07:53 AM |
|
Yes, now that I am getting much more familiar with node.js, I have started working on my own pool software completely from scratch that will have this as a standard feature. But I am taking out redis, as many people are not familiar with it, and replacing it with MySQL, so people are back to a database they understand.
|
|
|
|
OmarGsPools
|
|
August 22, 2014, 01:43:01 PM |
|
Has anyone managed to get a multipool working? With at least proper coin switching, and coins being converted into the target coin using the exchange rates?
|
|
|
|
jk9694
|
|
August 24, 2014, 02:25:14 AM |
|
There is also a multipool located at easyp2pool.us
Yes, I have not changed the domain name to something more relevant. The switching is custom and all in node.js.
However, I am working on new pool software now that will not use nomp as I am not seeing much support out there for it.
|
|
|
|
optiplex
Newbie
Offline
Activity: 17
Merit: 0
|
|
August 24, 2014, 06:07:23 AM Last edit: August 24, 2014, 07:08:02 AM by optiplex |
|
Glad to see this thread is not totally dead; eagerly awaiting OP to return and continue the postings. I have a normal NOMP up and running but want to improve it: http://optiplexpool.ddns.net:8081
I'm building another pool for testing with the modifications listed so far, but need the rest of the code.
|
|
|
|
MCDev
|
|
August 25, 2014, 10:25:56 PM |
|
I agree with this being a great post to start. I'm also hoping the OP continues our journey into the NOMP world. I have 3 standard installs working right now with 1 to 10 coins in each, and am in the process of going through this guide.
Fingers crossed for more to come.
|
|
|
|
paradigmflux (OP)
|
|
September 08, 2014, 03:53:07 AM |
|
Hey guys, sorry, BTCD Pool flatlined. I am currently in the middle of setting up BURSTMULTIPOOL.com and will finish off updating this thread as I get the various pieces going.
A bit more background information:
you will require a few things to be installed:
jq and webdis, both are fairly easy to google and figure out how to get installed. Check out burstmultipool.com to see how the new pool is coming along.
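For anyone unfamiliar with webdis: it exposes redis commands over HTTP (default port 7379) and wraps every reply in JSON keyed by the command name, which is why it pairs so nicely with jq. A minimal sketch, using a canned response so it runs offline - the key name is made up, so inspect your own redis (redis-cli keys '*') for the real ones:

```shell
#!/bin/sh
# A live call would look like:
#   curl -s http://127.0.0.1:7379/GET/mypool:somekey
# and webdis would answer with something like {"GET":"1234.56"}.
# Parse that shape with jq:
response='{"GET":"1234.56"}'
value=$(echo "$response" | jq -r '.GET')
echo "value: $value"
```

This is the bridge that lets the front-end stats pages read NOMP's redis data without talking to redis directly.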
|
|
|
|
utahjohn
|
|
September 08, 2014, 04:28:25 AM Last edit: September 08, 2014, 04:41:56 AM by utahjohn |
|
Hey thx for coming back I want to build a multipool for Diamond (DMD), have basic NOMP running and a selection of coins ready to go (X11). Will you be starting a github for this?
|
|
|
|
paradigmflux (OP)
|
|
September 13, 2014, 03:14:12 AM |
|
Hey thx for coming back I want to build a multipool for Diamond (DMD), have basic NOMP running and a selection of coins ready to go (X11). Will you be starting a github for this? lol if someone wants to be so kind as to post in this thread a nice and simple guide (or a link to a guide) for how to actually use git that isn't some stupid verbose 500 page man article i will most certainly use github
|
|
|
|
paradigmflux (OP)
|
|
September 14, 2014, 04:20:19 PM |
|
Hey thx for coming back I want to build a multipool for Diamond (DMD), have basic NOMP running and a selection of coins ready to go (X11). Will you be starting a github for this? lol if someone wants to be so kind as to post in this thread a nice and simple guide (or a link to a guide) for how to actually use git that isn't some stupid verbose 500 page man article i will most certainly use github Come on folks I've googled it a bit unsuccessfully haha I wanted to get the whole new burstmultipool.com source onto a git today if possible.
|
|
|
|
MCDev
|
|
September 14, 2014, 04:27:37 PM |
|
Someone PLEASE help! I would If I knew how
|
|
|
|
utahjohn
|
|
September 14, 2014, 04:46:38 PM |
|
all I know about git is how to git clone the source from it, but I think that would be the best place for open source, then others can add issues, requests, and create sub forks and submit pull requests to main tree for your consideration ...
|
|
|
|
MCDev
|
|
September 14, 2014, 04:58:06 PM |
|
I'm in the same boat. git is easy, I got lost in the dox for putting it up there. If I weren't trying to catch up on 2 weeks work for my day job, I'd spend today trying to figure it out. Hopefully someone will come through. It will benefit all of us!!
Thanks for coming back!!!!
|
|
|
|
utahjohn
|
|
September 14, 2014, 05:31:49 PM |
|
BTW I have successfully installed redis and node.js on a Windows Server 2008 R2 box. Have not tried running NOMP on Windows yet... I already have a running NOMP pool on a Linux box.
|
|
|
|
YarkoL
Legendary
Offline
Activity: 996
Merit: 1013
|
|
September 14, 2014, 06:03:03 PM |
|
Come on folks I've googled it a bit unsuccessfully haha I wanted to get the whole new burstmultipool.com source onto a git today if possible.
okay.. This is for ubuntu linux; the same principles apply elsewhere.
First go to the github site and get yourself a new account there. You'll then have your own place there, say https://github.com/paradigmflux
Next open a terminal and get git onto your local machine:
sudo apt-get install git
Go to your github page, go to your repositories, and click New (the large green button). From then on do as instructed: Github will ask for the repo name and then give you the commands to be executed in your terminal.
Alternatively, once you've got your github page and git installed, you can also push the repo entirely from the terminal (without using the browser) with these commands. (Let's say you want to make a repository called "multipool".) Change to the directory where your code is, then generate the data that git needs:
git init
add all the files to be pushed to the repo:
git add .
commit your changes and insert a comment:
git commit -m "first commit"
tell git where the repo is - where to upload:
git remote add origin https://github.com/paradigmflux/multipool.git
upload the stuff:
git push -u origin master
you'll then be asked for the username and password that you gave when you signed up at Github, and then your code will be moved to your new repo. Hope this helps.
|
“God does not play dice"
|
|
|
paradigmflux (OP)
|
|
September 14, 2014, 09:17:23 PM |
|
Come on folks I've googled it a bit unsuccessfully haha I wanted to get the whole new burstmultipool.com source onto a git today if possible.
okay..This is for ubuntu linux, the same principles apply elsewhere. [...] you'll then be asked for username and password that you gave when you signed up at Github, and then your code will be moved to your new repo. Hope this helps.
Thanks for the crash course - I think I have this almost figured out. I will update this thread once I have some of it committed.
|
|
|
|
utahjohn
|
|
September 14, 2014, 09:31:43 PM |
|
Once I get a clone of your pool modified, I will be working on an API to the UseCryptos exchange for buying DMD; we just got listed there recently (a few weeks ago) and need to build some volume there. We have been using Cryptsy for ages, but they are way slow to respond to support tickets and to keep wallet daemons up to date, LOL - sometimes weeks with no deposit/withdraw available...
|
|
|
|
paradigmflux (OP)
|
|
September 14, 2014, 09:38:48 PM |
|
Time for some extremely long posts again. Let's go into some background first on how NOMP works. I strongly suggest each of you set up redis-commander and use it to look at your database. Be careful to use a good password, as someone could maliciously log into your instance of redis-commander and wipe your entire redis db. https://www.npmjs.org/package/redis-commander

First, what is redis? Redis is an in-memory key-value store, which means that it exists solely in RAM. This is the reason that NOMP is so much better than MPOS, which uses an on-disk SQL database. Redis can easily scale up to millions of transactions a second; SQL cannot.

In NOMP, each coin that you have configured is going to have its own separate redis key. Inside that key, you will see several other keys:

blocksConfirmed - previously solved blocks.

blocksKicked - for all intents and purposes, these are the same as blocksOrphaned.

blocksOrphaned - solved blocks that wound up being orphans.

shares - this will usually consist of only a single key, roundCurrent, which contains numerous fields. Each of your workers will be the name of a field, and its value will be the number of shares that worker has successfully submitted for the current round.

hashrate - a live log of the shares being submitted by each worker. This is where the hashrate calculations are done from. It is a sorted set where the scores are epoch timestamps, and the values are strings made up of the share difficulty, the worker name and the full javascript epoch time (down to milliseconds).

balances - this is where NOMP keeps track of who has earned what. As blocks are solved, the block reward is divided up proportionately amongst the miners: the roundCurrent key is checked and all of the share values are added together. Each worker is then credited with their proportion of the shares (their value in roundCurrent divided by the total of all values in roundCurrent) multiplied by the block reward. This number is added to any existing balance in the balances key, so a worker's rewards for multiple blocks in a row add up. When a payout happens, this key has to be cleared in order to prevent duplicate payouts. In my pools, on every payout each coin has its balances key renamed to the format Prev:<shiftnumber>:<coinname>:balances - this leaves a good audit trail, as you can always look back at any previous round and see exactly how many coins each miner earned.
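As a sketch of that proportional split in Python (the worker names and numbers below are made-up examples, and this simplification ignores pool fees and coin-precision rounding that a real payout processor would handle):

```python
def split_block_reward(round_shares, block_reward):
    """Divide a block reward among workers in proportion to the shares
    each submitted this round - mirroring how the roundCurrent fields
    are turned into credits on the balances key."""
    total = sum(round_shares.values())
    if total == 0:
        return {}
    return {worker: block_reward * shares / total
            for worker, shares in round_shares.items()}

# Hypothetical roundCurrent contents: worker name -> shares submitted
round_current = {"worker_a": 600, "worker_b": 300, "worker_c": 100}
rewards = split_block_reward(round_current, block_reward=50.0)
# worker_a is credited 30.0, worker_b 15.0, worker_c 5.0
```

The key property is that the credited amounts always sum to the full block reward, whatever the share distribution looks like.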
|
|
|
|
MCDev
|
|
September 15, 2014, 12:48:36 AM |
|
All I can say is FANTASTIC JOB!!! Got me waiting for the next one!! Thanks!!
|
|
|
|
utahjohn
|
|
September 15, 2014, 04:04:49 AM |
|
I like the tutorial format please continue and get git up
|
|
|
|
MCDev
|
|
September 20, 2014, 04:17:42 PM |
|
WOW!!! Based on your post, I took the time to get Redis-Commander working and it's fantastic. I had it on a test VM on Digital Ocean and couldn't get to it; then I figured out it was my TMG 2010 outbound rules. Fixed that and it worked. Repeated the process on what will become my public host (which I'm using privately at the moment with 10+ coins), and WOW!! So much info! I'm a lot more familiar with SQL Server and would love to work out dumping the data to that and running background jobs on 1 to 5 minute intervals, but I'm going to spend some time with Redis. Thanks for the PUSH
|
|
|
|
stoner19
|
|
October 02, 2014, 05:08:19 AM |
|
to have a github repo for this would be awesome!
|
|
|
|
utahjohn
|
|
October 02, 2014, 05:51:30 PM |
|
Bump ... Where is our lead dev from OP
|
|
|
|
paradigmflux (OP)
|
|
October 12, 2014, 02:17:31 AM |
|
Sorry for not updating in forever. Thanks to the people who have been messaging me with thanks. To the others asking to hire me to set them up a pool - eventually I will get this all into the public domain, and then I would appreciate any donations. I will see about getting a github repo with everything checked in before I shut down the http://burstmultipool.com site. My other, main pool is http://hashrate.org (it was my first and is pretty much my only long-term pool - it's for NXT).
|
|
|
|
utahjohn
|
|
October 13, 2014, 04:53:23 AM |
|
OK, will do some testing on one of my boxes. I'm really interested in algo switching - has this all been ironed out? I'd like to quota mine X11 multicoin + Diamond on my boxes; I currently quota mine on all DMD pools. Just finishing up my US mirror of the miningfield.com pools, and all my DMD hashrate is going to my mirror right now: http://utahjohn.ddns.net - so far still beta, see https://bitcointalk.org/index.php?topic=580725.msg9178778#msg9178778
The only pool I have finished on my US mirror is DMD right now, and I'm just beta testing it. That was in regards to sgminer-dev v5 with stock NOMP+MPOS; I want a multipool profit-switching NOMP, LOL, with coin->BTC->coin-of-choice payout (DMD). I'll add a port at my end, on another VM, for a DMD multipool. Dunno if miningfield is interested, but I am.
|
|
|
|
infernoman
Legendary
Offline
Activity: 964
Merit: 1000
|
|
October 26, 2014, 03:33:50 AM |
|
To anyone who is considering trying to do this on a "home built" PC: think again, unless you have a LOT of money invested in your rig. I'm running an i7 3970x at 4.5GHz with 28 GB of RAM, 21 of it for the VM. The VM still maxes out its memory and ends up maxing out my entire PC's memory. I have seen loads on my CPU upwards of 70%, and have finally gotten them lower after some adjustments to the configs. ANOTHER THING: if you want your database to be saved for when you restart your redis server, add this to the bottom of EVERY single one of those scripts that paradigmflux posted, and the database will save every 5 or 10 minutes, however often you have them set to run:
redis-cli save
|
|
|
|
paradigmflux (OP)
|
|
November 07, 2014, 09:35:54 PM |
|
to anyone who is considering trying to do this on a "home built" pc think again unless you have a LOT of money invested into your rig. [...] add this to the bottom of EVERY single one of those scripts that paradigmflux posted and the database will save every 5 or 10 minutes how ever often you set it.
redis-cli save
It's a lot easier to just configure redis to save every X number of writes. Inside your redis.conf file, make sure that you have this section exactly like this:

# Save the DB on disk:
#
#   save <seconds> <changes>
#
#   Will save the DB if both the given number of seconds and the given
#   number of write operations against the DB occurred.
#
#   In the example below the behaviour will be to save:
#   after 900 sec (15 min) if at least 1 key changed
#   after 300 sec (5 min) if at least 10 keys changed
#   after 60 sec if at least 10000 keys changed
#
#   Note: you can disable saving at all commenting all the "save" lines.
#
#   It is also possible to remove all the previously configured save
#   points by adding a save directive with a single empty string argument
#   like in the following example:
#
#   save ""

save 900 1
save 300 10
save 60 10000
Also make sure you bind your local redis instance to 127.0.0.1 with this line in that same config file:

bind 127.0.0.1

Then, as long as you have it set up like this, it should automatically reload all of the redis data as soon as you restart redis-server with the same redis.conf file:

# The filename where to dump the DB
dbfilename dump.rdb

# The working directory.
#
# The DB will be written inside this directory, with the filename specified
# above using the 'dbfilename' configuration directive.
#
# The Append Only File will also be created inside this directory.
#
# Note that you must specify a directory here, not a file name.
dir /var/lib/redis/
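For intuition, the semantics of those save points can be sketched as a tiny check. This is only an illustration of the rule "snapshot once N seconds have passed AND at least M keys changed" - it is not redis source code:

```python
# Each tuple mirrors one "save <seconds> <changes>" line from redis.conf.
SAVE_POINTS = [(900, 1), (300, 10), (60, 10000)]

def should_snapshot(elapsed_seconds, changed_keys, save_points=SAVE_POINTS):
    """Return True if any save point is satisfied: that many seconds have
    elapsed since the last snapshot AND at least that many writes occurred."""
    return any(elapsed_seconds >= secs and changed_keys >= changes
               for secs, changes in save_points)
```

Any one matching line is enough to trigger a snapshot, which is why the three lines together give frequent saves under heavy write load and infrequent saves when the server is nearly idle.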
|
|
|
|
elitemobb
|
|
November 22, 2014, 08:15:30 AM |
|
So any idea what is causing this error?
TypeError: Cannot call method 'toString' of undefined
    at resolveDefs (/home/james/nomp/node_modules/dot/doT.js:52:55)
    at Object.doT.template (/home/james/nomp/node_modules/dot/doT.js:90:33)
    at /home/james/nomp/libs/website.js:84:33
    at fs.js:207:20
    at Object.oncomplete (fs.js:107:15)
I did notice that in the section where you have the code for stats.js, it appears that you want us to create a new website.js. I did that and removed the code from stats.js.
Delete the stock website.js file as well, make a new one:
Were you ever able to resolve this error?
|
|
|
|
hhfbrg
Newbie
Offline
Activity: 1
Merit: 0
|
|
November 25, 2014, 02:22:18 PM |
|
Very interesting subject, I'd like to hear more!
|
|
|
|
MCDev
|
|
November 30, 2014, 06:44:06 PM |
|
Any news or updates? Still watching and destroying my NOMP installs
|
|
|
|
fragar10
Newbie
Offline
Activity: 22
Merit: 0
|
|
December 05, 2014, 07:28:44 AM |
|
I am getting this error. Any help would be appreciated.
/home/frank/nomp/libs/stats.js:560
var portalApi = new api(logger, portalConfig, poolConfigs);
                ^
TypeError: object is not a function
    at new module.exports (/home/frank/nomp/libs/stats.js:560:21)
    at new module.exports (/home/frank/nomp/libs/api.js:11:36)
    at new module.exports (/home/frank/nomp/libs/website.js:29:21)
    at Object.<anonymous> (/home/frank/nomp/init.js:80:13)
    at Module._compile (module.js:456:26)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Function.Module.runMain (module.js:497:10)
    at startup (node.js:119:16)
2014-12-05 02:21:29 [Master] [Website] Website process died, spawning replacement...
|
|
|
|
tencentcoin
|
|
December 05, 2014, 08:01:45 AM |
|
This tutorial is very useful for miners and devs.
Thanks for sharing - I will try this approach to build my own.
|
|
|
|
utahjohn
|
|
January 26, 2015, 11:28:28 PM |
|
bump ...paradigmflux where r u?
|
|
|
|
pjcltd
Legendary
Offline
Activity: 1778
Merit: 1003
NodeMasters
|
|
January 27, 2015, 07:51:37 AM |
|
I am getting this error. Any help would be appreciated.
/home/frank/nomp/libs/stats.js:560
var portalApi = new api(logger, portalConfig, poolConfigs);
                ^
TypeError: object is not a function
    at new module.exports (/home/frank/nomp/libs/stats.js:560:21)
    at new module.exports (/home/frank/nomp/libs/api.js:11:36)
    at new module.exports (/home/frank/nomp/libs/website.js:29:21)
    at Object.<anonymous> (/home/frank/nomp/init.js:80:13)
    at Module._compile (module.js:456:26)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Function.Module.runMain (module.js:497:10)
    at startup (node.js:119:16)
2014-12-05 02:21:29 [Master] [Website] Website process died, spawning replacement...
Hi, what command are you using to start NOMP?
|
|
|
|
|
YarkoL
Legendary
Offline
Activity: 996
Merit: 1013
|
|
April 21, 2015, 11:38:10 AM |
|
Ok, so I'm finally trying out this guide. I've got a NOMP with two scrypt coins, Monacoin and Wildbeastbitcoin. When I execute the shell script in post #11 I see this output:

(integer) 0
(integer) 0
monacoin
ignore worker
(standard_in) 1: syntax error
wildbeastbitcoin
ignore worker
(standard_in) 1: syntax error
(standard_in) 1: syntax error
(integer) 1

I'm not at all familiar with bash scripts, so I don't know where the syntax errors come from... This is what redis gives me after running the script:

127.0.0.1:6379> keys *
 1) "Pool_Stats:avgHR:"
 2) "coinVersionBytes"
 3) "monacoin:shares:roundCurrent"
 4) "Pool_Stats:avgHR:scrypt"
 5) "statHistory"
 6) "Coin_Names_scrypt"
 7) "monacoin:stats"
 8) "Coin_Algos"
 9) "Coin_Names"
10) "wildbeastbitcoin:stats"
11) "wildbeastbitcoin:shares:roundCurrent"
Does it look like it should..?
|
“God does not play dice"
|
|
|
YarkoL
Legendary
Offline
Activity: 996
Merit: 1013
|
|
April 22, 2015, 10:13:06 AM |
|
Silly me - I just needed to run the script while having worker(s) mining. Still getting syntax errors, but the script works, since I now have the worker hashrates being written into Redis.
On to the next step. Fun stuff!
|
“God does not play dice"
|
|
|
silversurfer1958
|
|
April 22, 2015, 10:36:37 PM |
|
Interesting, but way above my abilities at the mo. It would be great to build more pools, though, maybe to help with the 51% problem. If a number of pools were set up and encouraged pool hopping, that might draw people away from the big pools. Great job putting it all together.
|
|
|
|
hdmediaservices
|
|
May 19, 2015, 06:08:35 PM |
|
Do you all address the duplicate shares exploit?
|
|
|
|
l8nit3
Legendary
Offline
Activity: 1007
Merit: 1000
|
|
July 17, 2015, 06:37:58 AM |
|
First off, great tutorial! My question is: does this method work with the updated UNOMP as well? And if so, is there a way to send the auxpow coins for auto-exchange as well, so they can be paid out to the user? (UNOMP has the auxpow merged mining ability; it just doesn't pay them out yet.)
|
|
|
|
silversurfer1958
|
|
July 14, 2016, 10:17:16 AM |
|
I like this idea, but I don't have the skills to put it together well enough that I'd want people to risk their money. How about we set up a 'company'? Invest some money to get it set up by one or two trustworthy people on here, and charge a 1% initial management/maintenance fee (per annum, or per transaction in or out), maybe up to a maximum of $5 (or whatever) - something quite reasonable - and operate it as a long-term savings and investment plan for people. Possibly with some decent POW coins thrown in too if they look decent, e.g. Monero, Ethereum etc., to diversify people's portfolios.
Done properly, by trusted people, we wouldn't even need a return on our initial investment, because we'd be getting that from the POS coin interest. I couldn't afford much, but if it was set up by trusted individuals and audited before launch, it would be worth the risk.
|
|
|
|
0icu8
Newbie
Offline
Activity: 14
Merit: 0
|
|
July 20, 2016, 01:47:14 AM |
|
Total noob question... I just finished my setup for litecoind using NOMP and am waiting for the chain to update. Going through config.json, I came across this:

"website": {
    "enabled": true,
    "host": "VPSIP",
    "siteTitle": "YOUR POOL",
    "port": 80,
    "stratumHost": "pool.unomp.org",
    "stats": {

For "stratumHost", what do I put in place of "pool.unomp.org"? I'm running on a VPS with Ubuntu. I can see my stratum page and what coins there are to mine. I'm not sure if I need to change stratumHost or not; if so, how do I get one, where do I find one, or what do I need to change it to? Thanks in advance!
|
|
|
|
pjcltd
Legendary
Offline
Activity: 1778
Merit: 1003
NodeMasters
|
|
July 20, 2016, 08:12:07 AM |
|
Total noob question.... I just finished my Setup for the Litecoind using NOMP. Waiting for chain to update. When going through the Config.json, i come across this:
"website": { "enabled": true, "host": "VPSIP", "siteTitle": "YOUR POOL", "port": 80, "stratumHost": "pool.unomp.org", "stats": {
For "stratumHost" What do i put in replace of "pool.unomp.org"? Im running on a VPS with Ubuntu. I can see my stratum page and see what coins and what not are there to mine. Im not sure if i need to change the StratumHost or not? if so, how do i get one, where do i find one or what do i need to change it to? Thanks in Advanced!
HI ok in "host": "VPSIP", make that "host": "IP-address or domain name ", like "host": "99.44.22.44", or "host": " www.mypooo.com", The "port": 80, part this is the port the your pool wist will be on best to leave it on 80 "stratumHost": "pool.unomp.org", this is where the pool is and if you look in getting_started page it will display this as part of the connection info so again it can be hte IP address or the domain name hope that helps you out thanks Paul
|
|
|
|
0icu8
Newbie
Offline
Activity: 14
Merit: 0
|
|
July 22, 2016, 05:57:41 PM |
|
Hey Paul, thanks for the reply. I had guessed that, but you know, guessing and computer work don't go hand in hand. Thanks for the help! Have a good day!
LTC POOL NOW OPEN: http://45.32.243.180/getting_started
Payouts of .01 LTC every 60 secs. 2% fee
|
|
|
|
pjcltd
Legendary
Offline
Activity: 1778
Merit: 1003
NodeMasters
|
|
July 22, 2016, 06:22:25 PM |
|
Hey Paul, Thanks for the reply. [...] LTC POOL NOW OPEN http://45.32.243.180/getting_started - Payouts every .01LTC every 60 Secs. 2%fee
No problem. Just looked at the pool - you might want to increase the diff; start it at 8192 on port 3008.
|
|
|
|
jcreyesb
|
|
September 23, 2017, 04:31:12 AM Last edit: September 29, 2017, 12:51:04 PM by jcreyesb |
|
Configured in algoProperties.js to deploy a Magi (XMG) pool:

m7mhash: {
    hash: function(){
        return function(){
            return multiHashing.m7mhash.apply(this, arguments);
        }
    }
}
The system starts OK:

[2017-09-23 12:51:14.705] [INFO] [default] - New Relic
[2017-09-23 12:51:14.707] [DEBUG] [default] - NewRelic Monitor New Relic initiated
[2017-09-23 12:51:14.708] [INFO] [default] - POSIX Not Installed
[2017-09-23 12:51:14.708] [DEBUG] [default] - POSIX Connection Limit (Safe to ignore) POSIX module not installed and resource (connection) limit was not raised
[2017-09-23 12:51:14.708] [INFO] [default] - Run Workers
[2017-09-23 12:51:14.966] [DEBUG] [default] - Master PoolSpawner Spawned 1 pool(s) on 1 thread(s)
[2017-09-23 12:51:15.398] [INFO] [default] - New Relic
[2017-09-23 12:51:15.401] [INFO] [default] - POSIX Not Installed
[2017-09-23 12:51:15.402] [INFO] [default] - Run Workers
[2017-09-23 12:51:15.410] [INFO] [default] - Switching Setup Thread 1 Loading last proxy state from redis
[2017-09-23 12:51:15.417] [DEBUG] [default] - Pool magicoin Thread 1 Share processing setup with redis (127.0.0.1:6379)
[2017-09-23 12:51:15.448] [DEBUG] [default] - Pool magicoin Thread 1 started for magicoin [XMG] {m7mhash}
    Network Connected: Mainnet
    Detected Reward Type: POW
    Current Block Height: 1489428
    Current Connect Peers: 2
    Current Block Diff: 2.305667004
    Network Difficulty: NaN
    Network Hash Rate: 30.37 MH
    Stratum Port(s): 3008, 3032, 3256
    Pool Fee Percent: 1%
    Block polling every: 1000 ms
[2017-09-23 12:51:15.449] [DEBUG] [default] - Switching Setup m7mhash Setting proxy difficulties after pool start
[2017-09-23 12:51:24.719] [DEBUG] [default] - Master CLI CLI listening on port 17117
[2017-09-23 12:51:25.598] [INFO] [default] - New Relic
[2017-09-23 12:51:25.597] [INFO] [default] - New Relic
[2017-09-23 12:51:25.604] [INFO] [default] - POSIX Not Installed
[2017-09-23 12:51:25.605] [INFO] [default] - Run Workers
[2017-09-23 12:51:25.604] [INFO] [default] - POSIX Not Installed
[2017-09-23 12:51:25.605] [INFO] [default] - Run Workers
[2017-09-23 12:51:25.633] [DEBUG] [default] - Payments magicoin Payment processing setup to run every 600 second(s) with daemon (magi@127.0.0.1:8232) and redis (127.0.0.1:6379)
[2017-09-23 12:51:25.676] [DEBUG] [default] - Website Server Website started on xxxxx:80
The worker connects:

[2017-09-23 12:51:30.848] [DEBUG] [default] - Pool magicoin Thread 1 getting block notification via RPC polling
[2017-09-23 12:51:31.600] [DEBUG] [default] - Pool magicoin Thread 1 Authorized 9QodNwAEKeetejjtNZiX6N4xxxxxx:x [190.38.xx]
Now, when the worker submits a share:

[2017-09-23 08:59:56] Stratum requested work restart
[2017-09-23 08:59:56] thread 2: 261273 hashes, 5.11 khash/s
[2017-09-23 08:59:56] thread 4: 530541 hashes, 10.37 khash/s
[2017-09-23 08:59:56] thread 3: 194764 hashes, 8.83 khash/s
[2017-09-23 08:59:56] thread 5: 333126 hashes, 10.01 khash/s
[2017-09-23 08:59:56] thread 0: 495504 hashes, 9.69 khash/s
[2017-09-23 08:59:56] thread 1: 263041 hashes, 5.14 khash/s
[2017-09-23 09:00:07] thread 3: 124013 hashes, 11.22 khash/s
[2017-09-23 09:00:07] stratum_recv_line failed
[2017-09-23 09:00:07] Stratum connection interrupted
The pool crashes:

/home/ubuntu/unomp/node_modules/merged-pooler/lib/algoProperties.js:227
    return multiHashing.m7mhash.apply(this, arguments);
                                ^
TypeError: Cannot call method 'apply' of undefined
    at /home/ubuntu/unomp/node_modules/merged-pooler/lib/algoProperties.js:227:45
    at JobManager.processShare (/home/ubuntu/unomp/node_modules/merged-pooler/lib/jobManager.js:263:26)
    at null.<anonymous> (/home/ubuntu/unomp/node_modules/merged-pooler/lib/pool.js:630:46)
    at EventEmitter.emit (events.js:117:20)
    at handleSubmit (/home/ubuntu/unomp/node_modules/merged-pooler/lib/stratum.js:175:15)
    at handleMessage (/home/ubuntu/unomp/node_modules/merged-pooler/lib/stratum.js:88:17)
    at /home/ubuntu/unomp/node_modules/merged-pooler/lib/stratum.js:248:25
    at Array.forEach (native)
    at Socket.<anonymous> (/home/ubuntu/unomp/node_modules/merged-pooler/lib/stratum.js:234:26)
    at Socket.EventEmitter.emit (events.js:95:17)
|
|
|
|
Bitcoinera
Newbie
Offline
Activity: 50
Merit: 0
|
|
February 26, 2018, 03:57:51 PM |
|
Total noob question.... I just finished my Setup for the Litecoind using NOMP. Waiting for chain to update. When going through the Config.json, i come across this:
"website": { "enabled": true, "host": "VPSIP", "siteTitle": "YOUR POOL", "port": 80, "stratumHost": "pool.unomp.org", "stats": {
For "stratumHost" What do i put in replace of "pool.unomp.org"? Im running on a VPS with Ubuntu. I can see my stratum page and see what coins and what not are there to mine. Im not sure if i need to change the StratumHost or not? if so, how do i get one, where do i find one or what do i need to change it to? Thanks in Advanced!
HI ok in "host": "VPSIP", make that "host": "IP-address or domain name ", like "host": "99.44.22.44", or "host": " www.mypooo.com", The "port": 80, part this is the port the your pool wist will be on best to leave it on 80 "stratumHost": "pool.unomp.org", this is where the pool is and if you look in getting_started page it will display this as part of the connection info so again it can be hte IP address or the domain name hope that helps you out thanks Paul Well, this link is quite old, but it might help people like me still trying to figure this whole out. I am following these steps with the config.json file to connect to my VPS IP and getting every time this error: '[ERROR] [default] - Master Website Website process died, spawning replacement...', which means the uNOMP pool didn't get set up on my VPS.. but it works fine on my localhost. Can it be the redis server? Perhaps it should be also hosted on the VPS IP? Thank you if you can help me out on this one.
|
|
|
|
CryptoMona
Newbie
Offline
Activity: 10
Merit: 0
|
|
July 26, 2018, 08:28:52 AM |
|
How do I set up multiple remote stratum servers on the same pool? I mean, server1 hosts the pool, server2 is in Asia hosting a stratum server, and server3 is in the USA hosting another, but all are on the same pool as server1. Is that how it works?
|
|
|
|
|