|
rossen
|
|
February 13, 2016, 03:31:37 PM Last edit: February 25, 2016, 11:02:39 AM by rossen |
|
Hi all, how can I repair the chain table? At http://zetool.cointech.net/chains Aurumcoin now has missing columns, and FireFlycoin sometimes crashes. --rescan does not help. Best regards, Rossen. Edit: http://mobiblocks.cointech.net has the same problem with other coins.
|
|
|
|
rik8119
Full Member
Offline
Activity: 217
Merit: 100
CEO WINC e. V.
|
|
February 25, 2016, 08:36:24 AM |
|
Hi everybody, I tried to fit Abe for Worldleadcurrency, which is a merged-mining clone of Freicoin. Freicoin has the additional int32_t variable nRefHeight (to calculate demurrage) inside the transaction. So I added

    d['nRefHeight'] = vds.read_int64()

(I don't know if this is the correct way, but I found no better solution right now), which changed the error from:

    block_tx 25 25
    Skipped 4 bytes at block end
    Exception at 72056511857166274
    Failed to catch up {'blkfile_offset': 0, 'blkfile_number': 100000, 'chain_id': 18, 'loader': None, 'conf': None, 'dirname': u'/root/.worldleadcurrency', 'id': 39L}
    Traceback (most recent call last):
      File "Abe/DataStore.py", line 2536, in catch_up
        store.catch_up_dir(dircfg)
      File "Abe/DataStore.py", line 2837, in catch_up_dir
        store.import_blkdat(dircfg, ds, blkfile['name'])
      File "Abe/DataStore.py", line 2959, in import_blkdat
        b = chain.ds_parse_block(ds)
      File "Abe/Chain/__init__.py", line 82, in ds_parse_block
        d['transactions'].append(chain.ds_parse_transaction(ds))
      File "Abe/Chain/__init__.py", line 75, in ds_parse_transaction
        return deserialize.parse_Transaction(ds)
      File "Abe/deserialize.py", line 91, in parse_Transaction
        d['txIn'].append(parse_TxIn(vds))
      File "Abe/deserialize.py", line 46, in parse_TxIn
        d['sequence'] = vds.read_uint32()
      File "Abe/BCDataStream.py", line 71, in read_uint32
        def read_uint32 (self): return self._read_num('<I')
      File "Abe/BCDataStream.py", line 110, in _read_num
        (i,) = struct.unpack_from(format, self.input, self.read_cursor)
    error: unpack_from requires a buffer of at least 4 bytes

to:

    block_tx 26 26
    Skipped 1283 bytes at block end
    Exception at 580166053
    Failed to catch up {'blkfile_offset': 0, 'blkfile_number': 100000, 'chain_id': 18, 'loader': None, 'conf': None, 'dirname': u'/root/.worldleadcurrency', 'id': 40L}
    (followed by the identical traceback ending in: error: unpack_from requires a buffer of at least 4 bytes)

Do I just have to adjust the reading of nRefHeight, or is there something else to consider? Thanks in advance, Rik
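For what it's worth, the off-by-four symptom matches the field width: nRefHeight is an int32_t, so reading it with read_int64 consumes 4 extra bytes and leaves every later read in the stream misaligned. A minimal standalone sketch of the difference (plain struct code, not Abe's actual BCDataStream class):

```python
import struct

def read_int32(buf, cursor):
    """Read a little-endian int32 and advance the cursor by 4 bytes."""
    (value,) = struct.unpack_from('<i', buf, cursor)
    return value, cursor + 4

def read_int64(buf, cursor):
    """Read a little-endian int64 and advance the cursor by 8 bytes."""
    (value,) = struct.unpack_from('<q', buf, cursor)
    return value, cursor + 8

# A hypothetical transaction tail: nLockTime (uint32) followed by
# nRefHeight (int32_t), 8 bytes in total.
tail = struct.pack('<Ii', 0, 1200)

# Correct: reading nRefHeight as int32 leaves the cursor exactly at the
# end of the buffer.
_, cur = read_int32(tail, 0)            # skip past nLockTime
ref_height, cur = read_int32(tail, cur)
assert ref_height == 1200 and cur == len(tail)

# Wrong: read_int64 would need 4 bytes that belong to the next
# transaction, which is how the parser ends up misaligned and later
# fails with "unpack_from requires a buffer of at least 4 bytes".
```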
|
Demurrage - the easiest way to a human society.
|
|
|
rossen
|
|
February 25, 2016, 11:04:55 AM |
|
Hi everybody, I tried to fit Abe for Worldleadcurrency, which is a merged-mining clone of Freicoin. [...] Do I just have to adjust the reading of nRefHeight, or is there something else to consider? Thanks in advance, Rik

When I see similar errors, I just re-run Abe; it then moves some blocks further before exiting again... I do not know why, I am not familiar with Python.
|
|
|
|
rik8119
Full Member
Offline
Activity: 217
Merit: 100
CEO WINC e. V.
|
|
February 25, 2016, 12:32:48 PM |
|
When I see similar errors, I just re-run Abe; it then moves some blocks further before exiting again... I do not know why, I am not familiar with Python.

Hi, thank you for your answer. The problem is that it's the genesis block ;-) that is not deserialized correctly. What you describe sounds to me like a memory problem, however.. ;-)
|
Demurrage - the easiest way to a human society.
|
|
|
rik8119
Full Member
Offline
Activity: 217
Merit: 100
CEO WINC e. V.
|
|
March 07, 2016, 08:51:53 PM Last edit: March 07, 2016, 09:07:29 PM by rik8119 |
|
Hey, thanks again. I had put it in the wrong line; it has to be added at line 96 of main.h (the additional nRefHeight):

    class CTransaction
    {
    public:
        static mpq nMinTxFee;
        static mpq nMinRelayTxFee;
        static const int CURRENT_VERSION=2;
        int nVersion;
        std::vector<CTxIn> vin;
        std::vector<CTxOut> vout;
        unsigned int nLockTime;
        int32_t nRefHeight;

I added nRefHeight in deserialize.py:

    def parse_Transaction(vds, has_nTime=False):
        d = {}
        start_pos = vds.read_cursor
        d['version'] = vds.read_int32()
        if has_nTime:
            d['nTime'] = vds.read_uint32()
        n_vin = vds.read_compact_size()
        d['txIn'] = []
        for i in xrange(n_vin):
            d['txIn'].append(parse_TxIn(vds))
        n_vout = vds.read_compact_size()
        d['txOut'] = []
        for i in xrange(n_vout):
            d['txOut'].append(parse_TxOut(vds))
        d['lockTime'] = vds.read_uint32()
        d['nRefHeight'] = vds.read_int32()  # Added nRefHeight
        d['__data__'] = vds.input[start_pos:vds.read_cursor]
        return d

But now it stops at block 12352:

    block 12352 already in chain 18
    commit
    Exception at 10500443171604481732
    Failed to catch up {'blkfile_offset': 12849643, 'blkfile_number': 100000, 'chain_id': 18, 'loader': None, 'conf': None, 'dirname': '/root/.worldleadcurrency', 'id': Decimal('45')}
    Traceback (most recent call last):
      File "Abe/DataStore.py", line 2536, in catch_up
        store.catch_up_dir(dircfg)
      File "Abe/DataStore.py", line 2837, in catch_up_dir
        store.import_blkdat(dircfg, ds, blkfile['name'])
      File "Abe/DataStore.py", line 2959, in import_blkdat
        b = chain.ds_parse_block(ds)
      File "Abe/Chain/__init__.py", line 82, in ds_parse_block
        d['transactions'].append(chain.ds_parse_transaction(ds))
      File "Abe/Chain/__init__.py", line 75, in ds_parse_transaction
        return deserialize.parse_Transaction(ds)
      File "Abe/deserialize.py", line 90, in parse_Transaction
        d['txIn'].append(parse_TxIn(vds))
      File "Abe/deserialize.py", line 46, in parse_TxIn
        d['sequence'] = vds.read_uint32()
      File "Abe/BCDataStream.py", line 71, in read_uint32
        def read_uint32 (self): return self._read_num('<I')
      File "Abe/BCDataStream.py", line 110, in _read_num
        (i,) = struct.unpack_from(format, self.input, self.read_cursor)
    OverflowError: Python int too large to convert to C long

I used the --rescan option, but it doesn't continue.. I will write again when I find something new. rossen - how do you "move some blocks before exit again...."? Thanks ;-) Rik
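The two failure modes in these logs are consistent with stream misalignment rather than corrupt data: once the cursor drifts, arbitrary script bytes get decoded as a CompactSize count or offset, which produces absurd numbers like the "Exception at 10500443171604481732" above. A standalone sketch of Bitcoin's CompactSize decoding (plain bytes, not Abe's BCDataStream):

```python
import struct

def read_varint(buf, cursor):
    """Decode Bitcoin's CompactSize integer (1, 3, 5, or 9 bytes)."""
    first = buf[cursor]
    if first < 0xfd:
        return first, cursor + 1
    if first == 0xfd:
        return struct.unpack_from('<H', buf, cursor + 1)[0], cursor + 3
    if first == 0xfe:
        return struct.unpack_from('<I', buf, cursor + 1)[0], cursor + 5
    return struct.unpack_from('<Q', buf, cursor + 1)[0], cursor + 9

# If the parser is misaligned, a stray 0xff byte pulls in the next
# 8 bytes as a 64-bit count -- easily a number like the huge
# "Exception at ..." offsets in the log above.
bogus = bytes([0xff]) + struct.pack('<Q', 10500443171604481732)
count, _ = read_varint(bogus, 0)
assert count == 10500443171604481732
```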
|
Demurrage - the easiest way to a human society.
|
|
|
rossen
|
|
March 07, 2016, 09:14:56 PM |
|
I think the reason I have to restart from time to time is low memory or something similar.... I just re-run the shell/bat command again and again....
|
|
|
|
rik8119
Full Member
Offline
Activity: 217
Merit: 100
CEO WINC e. V.
|
|
March 12, 2016, 07:23:38 AM Last edit: March 12, 2016, 10:30:47 AM by rik8119 |
|
Ok, that's bad. It's definitely a RAM problem. On my desktop PC with 8 GB of RAM it continues to read blocks even past block 12322, but I have to use a small bash script because it stops after 20 blocks. Bash:

    #!/bin/bash
    for (( ; ; ))
    do
        python -m Abe.abe --config wlc.conf --commit-bytes 2000000 --no-serve
    done

Now it loads 10 tx in a row, but not more:

    failed to load /home/rik/.worldleadcurrency/bitcoin.conf: [Errno 2] No such file or directory: '/home/rik/.worldleadcurrency/bitcoin.conf'
    catch_up_rpc: abort
    Opened /home/rik/.worldleadcurrency/blocks/blk00000.dat
    block_tx 68382 68382
    block_tx 68383 68383
    block_tx 68384 68384
    block_tx 68385 68385
    block_tx 68386 68386
    block_tx 68387 68387
    block_tx 68388 68388
    block_tx 68389 68389
    block_tx 68390 68390
    block_tx 68391 68391
    block_tx 68392 68392
    block_tx 68393 68393
    block_tx 68394 68394
    Exception at 2536894596
    Failed to catch up {'blkfile_offset': 3298433, 'blkfile_number': 100000, 'chain_id': 18, 'loader': None, 'conf': None, 'dirname': '/home/rik/.worldleadcurrency', 'id': Decimal('1')}
    Traceback (most recent call last):
      File "Abe/DataStore.py", line 2535, in catch_up
        store.catch_up_dir(dircfg)
      File "Abe/DataStore.py", line 2836, in catch_up_dir
        store.import_blkdat(dircfg, ds, blkfile['name'])
      File "Abe/DataStore.py", line 2958, in import_blkdat
        b = chain.ds_parse_block(ds)
      File "Abe/Chain/__init__.py", line 82, in ds_parse_block
        d['transactions'].append(chain.ds_parse_transaction(ds))
      File "Abe/Chain/__init__.py", line 75, in ds_parse_transaction
        return deserialize.parse_Transaction(ds)
      File "Abe/deserialize.py", line 90, in parse_Transaction
        d['txIn'].append(parse_TxIn(vds))
      File "Abe/deserialize.py", line 46, in parse_TxIn
        d['sequence'] = vds.read_uint32()
      File "Abe/BCDataStream.py", line 71, in read_uint32
        def read_uint32 (self): return self._read_num('<I')
      File "Abe/BCDataStream.py", line 110, in _read_num
        (i,) = struct.unpack_from(format, self.input, self.read_cursor)
    error: unpack_from requires a buffer of at least 4 bytes

So I do not know what to do except strip the db and copy it to my server, which is a really dirty solution and not guaranteed to work.. If anybody has a better idea I would be very interested. Many greetings and thanks, Rik
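As a stopgap, the restart loop can also live in Python, where it is easier to add a pause and a retry cap. This is only a sketch of the same idea as the bash loop above; the command line mirrors the one in the post, and run_until_clean and max_tries are made-up names:

```python
import subprocess
import sys
import time

# The loader command from the post; adjust config path to taste.
CMD = [sys.executable, '-m', 'Abe.abe', '--config', 'wlc.conf',
       '--commit-bytes', '2000000', '--no-serve']

def run_until_clean(cmd, max_tries=100, pause=1.0):
    """Re-run `cmd` until it exits with status 0; return the attempt count."""
    for attempt in range(1, max_tries + 1):
        rc = subprocess.call(cmd)
        if rc == 0:
            return attempt        # loader caught up and exited cleanly
        time.sleep(pause)         # give the OS a moment to reclaim memory
    raise RuntimeError('loader never exited cleanly after %d tries' % max_tries)
```

A run would then be `run_until_clean(CMD)`; unlike the infinite bash loop, this stops on success or after a bounded number of failures.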
|
Demurrage - the easiest way to a human society.
|
|
|
rossen
|
|
March 12, 2016, 08:32:12 PM |
|
Ok, that's bad. It's definitely a RAM problem. [...] If anybody has a better idea I would be very interested. Many greetings and thanks, Rik

Try decreasing this value: --commit-bytes 2000000. It looks like a lot.
|
|
|
|
dnp
|
|
April 17, 2016, 06:52:37 PM |
|
I'm trying to get bitcoin-abe 0.8pre to work with MetalMusicCoin3 (a PoS/PoW coin). I tried the following configuration but get an abort. Does this version of Abe handle more recent PoS coins?

    datadir = [{
        "dirname": "/home/ndl/.noodlyappendagecoin",
        "chain": "NoodlyAppendageCoin",
        "conf": "noodlyappendagecoin.conf",
        "loader": "blkfile",
        "policy": "Sha256Chain",   # not the coin's algorithm, but the chain format used
        "code3": "NDL",
        "address_version": "\u0035",  # hex value of base58.h PUBKEY_ADDRESS
        "magic": "\u00fb\u00c0\u00b6\u00db"  # main.cpp uchar pchMessageStart[4]
    }]
    datadir += [{
        "dirname": "/home/metalcoin/.MetalMusicCoin3",
        "chain": "MetalMusicCoin3",
        "conf": "MetalMusicCoin3.conf",
        "loader": "blkfile",
        "policy": "PpcPosChain",
        "code3": "MTLMC",
        "address_version": "\u0033",
        "magic": "\u00ce\u00fb\u00fa\u00db"  # {0xce, 0xfb, 0xfa, 0xdb}
    }]

    ndl@carpdiem:~$ ./runabe.sh --init
    Opened /home/ndl/.noodlyappendagecoin/blocks/blk00001.dat
    block_tx 551775 646045
    block_tx 551776 646046
    commit
    Opened /home/metalcoin/.MetalMusicCoin3/blk0001.dat
    Exception at 72056513165787274
    Failed to catch up {'blkfile_offset': 0, 'blkfile_number': 1, 'chain_id': 2, 'loader': u'blkfile', 'conf': u'MetalMusicCoin3.conf', 'dirname': u'/home/metalcoin/.MetalMusicCoin3', 'id': 14L}
    Traceback (most recent call last):
      File "/usr/lib64/python2.7/site-packages/Abe/DataStore.py", line 2529, in catch_up
        store.catch_up_dir(dircfg)
      File "/usr/lib64/python2.7/site-packages/Abe/DataStore.py", line 2836, in catch_up_dir
        store.import_blkdat(dircfg, ds, blkfile['name'])
      File "/usr/lib64/python2.7/site-packages/Abe/DataStore.py", line 2958, in import_blkdat
        b = chain.ds_parse_block(ds)
      File "/usr/lib64/python2.7/site-packages/Abe/Chain/__init__.py", line 82, in ds_parse_block
        d['transactions'].append(chain.ds_parse_transaction(ds))
      File "/usr/lib64/python2.7/site-packages/Abe/Chain/__init__.py", line 75, in ds_parse_transaction
        return deserialize.parse_Transaction(ds)
      File "/usr/lib64/python2.7/site-packages/Abe/deserialize.py", line 90, in parse_Transaction
        d['txIn'].append(parse_TxIn(vds))
      File "/usr/lib64/python2.7/site-packages/Abe/deserialize.py", line 46, in parse_TxIn
        d['sequence'] = vds.read_uint32()
      File "/usr/lib64/python2.7/site-packages/Abe/BCDataStream.py", line 71, in read_uint32
        def read_uint32 (self): return self._read_num('<I')
      File "/usr/lib64/python2.7/site-packages/Abe/BCDataStream.py", line 110, in _read_num
        (i,) = struct.unpack_from(format, self.input, self.read_cursor)
    error: unpack_from requires a buffer of at least 4 bytes
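One cheap sanity check before blaming the chain policy: confirm that the configured magic matches the first four bytes of the coin's block file, since a magic mismatch makes import_blkdat misparse from the very first record. A hypothetical standalone helper (the path and expected bytes below are placeholders, not verified values for MetalMusicCoin3):

```python
def magic_matches(blockfile_path, expected_magic):
    """Compare the first four bytes of a blk*.dat file against the
    network magic configured in Abe's datadir entry."""
    with open(blockfile_path, 'rb') as f:
        return f.read(4) == expected_magic

# Example usage (placeholder path and bytes):
# magic_matches('/home/metalcoin/.MetalMusicCoin3/blk0001.dat',
#               b'\xce\xfb\xfa\xdb')
```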
|
Explorer and full node hosting at explorer.dognose.net
|
|
|
rik8119
Full Member
Offline
Activity: 217
Merit: 100
CEO WINC e. V.
|
|
April 20, 2016, 07:29:46 PM |
|
Can you please also post your MetalMusicCoin3.py? Thanks.
|
Demurrage - the easiest way to a human society.
|
|
|
dnp
|
|
April 26, 2016, 10:21:05 PM |
|
Can you please also post your MetalMusicCoin3.py? Thanks.

I'm using the PpcPosChain policy; the .py file is as follows:

    # Copyright(C) 2014 by Abe developers.
    #
    # This program is free software: you can redistribute it and/or modify
    # it under the terms of the GNU Affero General Public License as
    # published by the Free Software Foundation, either version 3 of the
    # License, or (at your option) any later version.
    #
    # This program is distributed in the hope that it will be useful, but
    # WITHOUT ANY WARRANTY; without even the implied warranty of
    # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
    # Affero General Public License for more details.
    #
    # You should have received a copy of the GNU Affero General Public
    # License along with this program.  If not, see
    # <http://www.gnu.org/licenses/agpl.html>.

    from . import BaseChain
    from .. import deserialize

    class PpcPosChain(BaseChain):
        """
        A blockchain with proof-of-stake as in Peercoin.
        """

        def ds_parse_transaction(chain, ds):
            return deserialize.parse_Transaction(ds, has_nTime=True)

        def ds_parse_block(chain, ds):
            d = BaseChain.ds_parse_block(chain, ds)
            d['block_sig'] = ds.read_bytes(ds.read_compact_size())
            return d

And this is what the .MetalMusicCoin3 directory structure looks like:

    .
    ./database
    ./database/log.0000000138
    ./database/log.0000000140
    ./database/log.0000000143
    ./database/log.0000000141
    ./database/log.0000000144
    ./database/log.0000000145
    ./database/log.0000000139
    ./database/log.0000000142
    ./MetalMusicCoin3.conf
    ./blk0001.dat
    ./peers.dat
    ./.lock
    ./wallet.dat
    ./MetalMusicCoin3d.pid
    ./blkindex.dat
    ./db.log
    ./debug.log
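For a Peercoin-descended chain, the crucial difference is the extra uint32 nTime between nVersion and the input list; if the datadir entry uses a policy that does not pass has_nTime=True, every field after nVersion is read 4 bytes early. A standalone sketch of just the fixed-size transaction head (plain struct code, not Abe's parser):

```python
import struct

def parse_tx_header(buf, has_nTime=False):
    """Return (version, nTime, cursor) for the fixed-size head of a
    serialized transaction. Peercoin-style PoS chains insert a uint32
    nTime right after the int32 nVersion."""
    cursor = 0
    (version,) = struct.unpack_from('<i', buf, cursor)
    cursor += 4
    n_time = None
    if has_nTime:
        (n_time,) = struct.unpack_from('<I', buf, cursor)
        cursor += 4
    return version, n_time, cursor

pos_head = struct.pack('<iI', 1, 1461700000)  # version, nTime
assert parse_tx_header(pos_head, has_nTime=True) == (1, 1461700000, 8)
# Parsed without has_nTime, the nTime bytes would be misread as the
# start of the input list, which is one way the "buffer of at least
# 4 bytes" errors arise.
```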
|
Explorer and full node hosting at explorer.dognose.net
|
|
|
rossen
|
|
April 27, 2016, 04:02:42 AM |
|
Try re-running the coin daemon with -reindex -indexes. I had a similar problem yesterday with two daemons, and it fixed both.
|
|
|
|
dnp
|
|
May 09, 2016, 01:10:41 AM |
|
Try re-running the coin daemon with -reindex -indexes. I had a similar problem yesterday with two daemons, and it fixed both.

Alas, it made no difference.
|
Explorer and full node hosting at explorer.dognose.net
|
|
|
rossen
|
|
July 28, 2016, 07:15:34 AM |
|
Hello all,
can anybody share an SQL query to generate a rich list?
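The usual approach against Abe's schema is to sum the outputs that no input ever spends, grouped by address. The query below follows Abe's table names (txout, txin, pubkey) as I understand them; treat the exact column names as assumptions to check against your database. The tiny in-memory sqlite3 data set exists only to exercise the query:

```python
import sqlite3

# Rich list: unspent output value per pubkey hash (Abe-style schema).
RICHLIST_SQL = """
SELECT pubkey.pubkey_hash, SUM(txout.txout_value) AS balance
  FROM txout
  JOIN pubkey ON (txout.pubkey_id = pubkey.pubkey_id)
  LEFT JOIN txin ON (txin.txout_id = txout.txout_id)
 WHERE txin.txin_id IS NULL
 GROUP BY pubkey.pubkey_hash
 ORDER BY balance DESC
 LIMIT 100
"""

db = sqlite3.connect(':memory:')
db.executescript("""
CREATE TABLE pubkey (pubkey_id INTEGER PRIMARY KEY, pubkey_hash TEXT);
CREATE TABLE txout  (txout_id INTEGER PRIMARY KEY, pubkey_id INTEGER,
                     txout_value INTEGER);
CREATE TABLE txin   (txin_id INTEGER PRIMARY KEY, txout_id INTEGER);
INSERT INTO pubkey VALUES (1, 'aa'), (2, 'bb');
INSERT INTO txout  VALUES (1, 1, 50), (2, 2, 30), (3, 1, 20);
INSERT INTO txin   VALUES (1, 1);   -- output 1 is spent
""")
rows = db.execute(RICHLIST_SQL).fetchall()
assert rows == [('bb', 30), ('aa', 20)]
```

Against a real Abe MySQL/PostgreSQL database the same SELECT should work, modulo column-name differences between Abe versions.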
|
|
|
|
rossen
|
|
January 09, 2017, 01:35:08 PM |
|
Hi all, on some block/hash the sync fails with:

    IntegrityError: (1062, "Duplicate entry '\\x84\\xC0\\xD4\\x00;m\\x82\\xDEki\\\\xCC\\xEC9\\xFF\\xB0\\x94\\xEA\\xE1\\x15\\x' for key 'tx_hash'")

Any solution? I removed the old chain and re-ran it from scratch, but it failed again with the same error.
|
|
|
|
Green Lantern
Newbie
Offline
Activity: 322
Merit: 0
|
|
January 27, 2017, 06:42:26 PM |
|
Hi. I am looking into the source code of genesis_tx.py. There are lines 28-30:

    # Main Bitcoin chain:
    if tx_hash_hex == "4a5e1e4baab89f3a32518a88c31bc87f618f76673e2cc77ab2127b7afdeda33b":
        return "01000000010000000000000000000000000000000000000000000000000000000000000000ffffffff4d04ffff001d0104455468652054696d65732030332f4a616e2f32303039204368616e63656c6c6f72206f6e206272696e6b206f66207365636f6e64206261696c6f757420666f722062616e6b73ffffffff0100f2052a01000000434104678afdb0fe5548271967f1a67130b7105cd6a828e03909a67962e0ea1f61deb649f6bc3f4cef38c4f35504e51ec112de5c384df7ba0b8d578a4c702b6bf11d5fac00000000"

I am trying to figure out: what is the return value? Where does this constant come from?
|
|
|
|
bitspill
Legendary
Offline
Activity: 2058
Merit: 1015
|
|
January 28, 2017, 05:25:53 AM |
|
Hi. I am looking into the source code of genesis_tx.py. [...] I am trying to figure out: what is the return value? Where does this constant come from?

It's the raw hex of the genesis transaction, since there is a bug in the RPC where you can't get info about the genesis transaction without looking at the block data files yourself. There's a note about it at the bottom of the file:

    # Extract your chain's genesis transaction data from the first
    # block file and add it here, or better yet, patch your coin's
    # getrawtransaction to return it on request:
    #if tx_hash_hex == "<HASH>"
    #    return "<DATA>"
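You can check where the constant comes from yourself: double SHA-256 of that raw transaction, displayed byte-reversed, is exactly the tx hash it is keyed under. A short verification sketch:

```python
import hashlib
from binascii import unhexlify

# Raw hex of Bitcoin's genesis coinbase transaction, as quoted above,
# split here along its serialization fields.
RAW_GENESIS_TX = (
    "01000000"  # nVersion
    "01"        # input count
    "0000000000000000000000000000000000000000000000000000000000000000"  # null prevout
    "ffffffff"  # prevout index
    "4d"        # coinbase script length (77 bytes)
    "04ffff001d0104455468652054696d65732030332f4a616e2f323030392043"
    "68616e63656c6c6f72206f6e206272696e6b206f66207365636f6e64206261"
    "696c6f757420666f722062616e6b73"   # "The Times 03/Jan/2009 ..."
    "ffffffff"  # sequence
    "01"        # output count
    "00f2052a01000000"  # 50 BTC
    "43"        # output script length (67 bytes)
    "4104678afdb0fe5548271967f1a67130b7105cd6a828e03909a67962e0ea1f"
    "61deb649f6bc3f4cef38c4f35504e51ec112de5c384df7ba0b8d578a4c702b"
    "6bf11d5fac"        # push pubkey, OP_CHECKSIG
    "00000000"  # nLockTime
)

def txid(raw_hex):
    """Double SHA-256 of the serialized transaction, byte-reversed for display."""
    h = hashlib.sha256(hashlib.sha256(unhexlify(raw_hex)).digest()).digest()
    return h[::-1].hex()

assert txid(RAW_GENESIS_TX) == (
    "4a5e1e4baab89f3a32518a88c31bc87f618f76673e2cc77ab2127b7afdeda33b")
```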
|
|
|
|
Green Lantern
Newbie
Offline
Activity: 322
Merit: 0
|
|
January 29, 2017, 06:49:35 PM Last edit: January 30, 2017, 12:56:35 AM by Green Lantern |
|
It's the raw hex of the genesis transaction, since there is a bug in the RPC where you can't get info about the genesis transaction without looking at the block data files yourself. [...]

So when you try

    getrawtransaction [merkleroot of gen_block]

the result should be:

    01000000010000000000000000000000000000000000000000000000000000000000000000ffffffff4d04ffff001d0104455468652054696d65732030332f4a616e2f32303039204368616e63656c6c6f72206f6e206272696e6b206f66207365636f6e64206261696c6f757420666f722062616e6b73ffffffff0100f2052a01000000434104678afdb0fe5548271967f1a67130b7105cd6a828e03909a67962e0ea1f61deb649f6bc3f4cef38c4f35504e51ec112de5c384df7ba0b8d578a4c702b6bf11d5fac00000000

Where can I see this raw hex? "Extract your chain's genesis transaction data from the first block file" - I see this advice, but what exactly do I have to do?
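To answer "what exactly do I have to do": the genesis transaction sits at a fixed position in the first block file, right after the network magic, the block length, the 80-byte header, and the transaction count. A hedged sketch (the layout assumed here is the common Bitcoin-style blk file format; the path is a placeholder):

```python
import binascii

def read_genesis_tx(path):
    """Return the genesis block's raw transaction bytes from a blk file.
    Layout assumed: 4-byte network magic, 4-byte little-endian block
    length, then the block itself (80-byte header, a one-byte CompactSize
    tx count of 0x01, then the raw transaction)."""
    with open(path, 'rb') as f:
        magic = f.read(4)
        length = int.from_bytes(f.read(4), 'little')
        block = f.read(length)
    tx_count = block[80]
    assert tx_count == 1, 'genesis block should hold exactly one transaction'
    return block[81:]   # raw genesis transaction bytes

# binascii.hexlify(read_genesis_tx('blk0001.dat')) would then print the
# hex string to paste into genesis_tx.py.
```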
|
|
|
|
Green Lantern
Newbie
Offline
Activity: 322
Merit: 0
|
|
February 06, 2017, 06:11:40 PM |
|
Do I need to set any other network config in the *.conf file besides these to get Abe running?

    # Specify port and/or host to serve HTTP instead of FastCGI:
    port 2750
    host localhost

It does not work after:

    python -m Abe.abe --config abe-my.conf

But this command works as expected:

    python -m Abe.abe --config abe-my.conf --commit-bytes 100000 --no-serve

I am using Ubuntu 14.04. Any suggestions?
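One thing worth ruling out when the built-in HTTP server will not come up is the listening socket itself: if another process already holds port 2750, Abe cannot serve there. A small standalone check (not part of Abe; host and port are the values from the config above):

```python
import socket

def can_bind(host, port):
    """Return True if (host, port) can be bound, i.e. the port is free."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind((host, port))
        return True
    except OSError:
        return False
    finally:
        s.close()

# can_bind('localhost', 2750) returning False would mean something else
# is already listening where Abe is configured to serve.
```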
|
|
|
|
|