John Tobey (OP)
|
|
February 10, 2012, 08:53:12 PM |
|
Scanning the block chain... but not starting the web server until it thinks it's done... There's a huge problem with that: on a lot of systems, Abe takes a LONG time to scan the block chains of various whateverCoins, and during that time it refuses to start the web server until it's done. It would be better if it started the web server FIRST, so that users of an Abe block explorer don't have to wait HOURS for anything at all to work when the admin decides to add a new block chain...
Have you tried scanning with one process and serving with another?

python -m Abe.abe --config db.conf --config datadirs.conf --no-serve &
python -m Abe.abe --config db.conf --config server.conf --datadir ~/emptydir

Not as elegant as an option to do it all in one, I'll grant.
|
|
|
|
andrehorta
Legendary
Offline
Activity: 1261
Merit: 1000
|
|
February 11, 2012, 07:18:46 PM Last edit: February 11, 2012, 10:13:12 PM by andrehorta |
|
I'm not using the web server or the Python code; I'm querying the database directly.
Ok, resolved. But I have another question:
How can I get the destination address of a transaction? Only the txout table has a foreign key to the pubkey table; the txin table has no indication of the destination address.
|
|
|
|
eurekafag
|
|
February 12, 2012, 09:13:16 AM |
|
I'm stuck on block 2261:

2012-02-12 13:08:28,659 [4554:MainThread] DataStore INFO - block_tx 2259 2289
2012-02-12 13:08:28,660 [4554:MainThread] DataStore INFO - block_tx 2260 2290
2012-02-12 13:08:28,661 [4554:MainThread] DataStore INFO - block_tx 2261 2291
2012-02-12 13:08:28,661 [4554:MainThread] DataStore ERROR - Failed to catch up {'blkfile_number': 1, 'dirname': '/tmp/ramdisk', 'chain_id': None, 'id': 1, 'blkfile_offset': 515807}
Traceback (most recent call last):
  File "/home/eurekafag/svn-soft/bitcoin-abe/Abe/DataStore.py", line 2141, in catch_up
    store.catch_up_dir(dircfg)
  File "/home/eurekafag/svn-soft/bitcoin-abe/Abe/DataStore.py", line 2162, in catch_up_dir
    store.import_blkdat(dircfg, ds)
  File "/home/eurekafag/svn-soft/bitcoin-abe/Abe/DataStore.py", line 2277, in import_blkdat
    store.import_block(b, chain_ids = chain_ids)
  File "/home/eurekafag/svn-soft/bitcoin-abe/Abe/DataStore.py", line 1609, in import_block
    block_id))
  File "/home/eurekafag/svn-soft/bitcoin-abe/Abe/DataStore.py", line 403, in sql
    store.cursor.execute(cached, params)
OverflowError: long too big to convert
2012-02-12 13:08:28,662 [4554:MainThread] __main__ INFO - Abe initialized.
2012-02-12 13:08:28,663 [4554:MainThread] __main__ WARNING - Listening on http://localhost:2750
My config:

dbtype = sqlite3
connect-args = abe.sqlite
port 2750
host localhost
datadir = /tmp/ramdisk
template = "
    <!DOCTYPE html>
    <html lang=\"en\">
    <head>
        <link rel=\"stylesheet\" type=\"text/css\" href=\"%(dotdot)s%(STATIC_PATH)sabe.css\" />
        <link rel=\"shortcut icon\" href=\"%(dotdot)s%(STATIC_PATH)sfavicon.ico\" />
        <title>%(title)s</title>
    </head>
    <body>
        <h1><a href=\"%(dotdot)schains\"><img src=\"%(dotdot)s%(STATIC_PATH)slogo32.png\" alt=\"Abe logo\" /></a> %(h1)s</h1>
        %(body)s
        <p style=\"font-size: smaller\">
            <span style=\"font-style: italic\">Powered by <a href=\"%(ABE_URL)s\">%(APPNAME)s</a></span>
            Tips appreciated!
            <a href=\"%(dotdot)saddress/%(DONATIONS_BTC)s\">BTC</a>
            <a href=\"%(dotdot)saddress/%(DONATIONS_NMC)s\">NMC</a>
        </p>
    </body>
    </html>
    "
commit-bytes = 10000
logging = {
    "version": 1,
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
            "formatter": "full",
            "level": "DEBUG"}},
    "formatters": {
        "full": {
            "format": "%(asctime)s [%(process)d:%(threadName)s] %(name)s %(levelname)s - %(message)s"}},
    "root": {
        "handlers": ["console"],
        "level": "DEBUG"}}
|
|
|
|
John Tobey (OP)
|
|
February 12, 2012, 08:28:26 PM |
|
How can I get the destination address of a transaction? Only the txout table has a foreign key to the pubkey table; the txin table has no indication of the destination address.
txout.pubkey_id receives coins in transaction txout.tx_id. txin.txout_id points to the txout whose pubkey_id sends coins in txin.tx_id. Is this clear?
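To make the relationship concrete, here is a minimal sketch using Python's sqlite3 with a toy subset of Abe's schema (the table and column names come from this thread; the sample rows and "hash_of_A"/"hash_of_B" values are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Toy subset of Abe's schema: pubkey, txout, txin.
cur.executescript("""
CREATE TABLE pubkey (pubkey_id INTEGER PRIMARY KEY, pubkey_hash TEXT);
CREATE TABLE txout  (txout_id INTEGER PRIMARY KEY, tx_id INTEGER,
                     pubkey_id INTEGER, txout_value INTEGER);
CREATE TABLE txin   (txin_id INTEGER PRIMARY KEY, tx_id INTEGER,
                     txout_id INTEGER);
""")

# Address A receives 50 BTC (in satoshis) in tx 1; tx 2 spends that
# output and pays address B.
cur.execute("INSERT INTO pubkey VALUES (1, 'hash_of_A')")
cur.execute("INSERT INTO pubkey VALUES (2, 'hash_of_B')")
cur.execute("INSERT INTO txout VALUES (10, 1, 1, 5000000000)")  # tx 1 pays A
cur.execute("INSERT INTO txin  VALUES (20, 2, 10)")             # tx 2 spends txout 10
cur.execute("INSERT INTO txout VALUES (11, 2, 2, 5000000000)")  # tx 2 pays B

# The "sending address" of tx 2 is found by following txin.txout_id
# back to the previous txout and from there to its pubkey.
row = cur.execute("""
    SELECT addr.pubkey_hash
      FROM txin
      JOIN txout prevout ON (txin.txout_id = prevout.txout_id)
      JOIN pubkey addr   ON (prevout.pubkey_id = addr.pubkey_id)
     WHERE txin.tx_id = 2
""").fetchone()
print(row[0])  # -> hash_of_A
```

The key point: txin never stores an address directly; it only references the earlier txout being spent, and that txout carries the pubkey_id.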
|
|
|
|
someguy123
|
|
February 12, 2012, 10:14:06 PM |
|
John, I hope you realize your template system is terrible; my Python errors out if I enable it. Last time I managed to get it to work, I sat for at least an hour or two trying to figure out how. Since then I've never managed it again. Is there any chance you're going to make the template system USABLE? It's a PITA to even add my donation addresses.
|
|
|
|
John Tobey (OP)
|
|
February 12, 2012, 11:27:46 PM |
|
someguy123,
I'm afraid I have little time to devote to new features--or much of anything beyond explaining how things work. Patches are welcome. I suggest starting a thread about the design of such a system--or integration into existing Python template systems, about which I know nothing.
-John
|
|
|
|
andrehorta
Legendary
Offline
Activity: 1261
Merit: 1000
|
|
February 13, 2012, 12:04:14 AM |
|
Sorry John, but for me it's not clear. I'm trying to write a query to find out how much one address sent to another, but I don't understand the database. For example, I would like to know how much the address 12cbQLTFMXRnSzktFkuoG3eHoMeFtpTu3S sent to 1Q2TWHE3GMdB6BZKafqwxXtWAWgFt5Jvm3. How can I get the destination address of a transaction? Only the txout table has a foreign key to the pubkey table; the txin table has no indication of the destination address.
txout.pubkey_id receives coins in transaction txout.tx_id. txin.txout_id points to the txout whose pubkey_id sends coins in txin.tx_id. Is this clear?
|
|
|
|
John Tobey (OP)
|
|
February 13, 2012, 01:45:33 PM |
|
Sorry John,
But for me it's not clear. I'm trying to write a query to find out how much one address sent to another, but I don't understand the database. For example, I would like to know how much the address 12cbQLTFMXRnSzktFkuoG3eHoMeFtpTu3S sent to 1Q2TWHE3GMdB6BZKafqwxXtWAWgFt5Jvm3.
I think this will do it. Note that the question is a little ambiguous, since a transaction can have more than one sending address. I'll count every transaction where the given address is among the senders. This will unfortunately double-count transactions where it appears in two inputs. But at least it should illustrate the table relationships.

Note the hashes of the two addresses:

sender   = 12cbQLTFMXRnSzktFkuoG3eHoMeFtpTu3S = 11B366EDFC0A8B66FEEBAE5C2E25A7B6A5D1CF31
receiver = 1Q2TWHE3GMdB6BZKafqwxXtWAWgFt5Jvm3 = FC916F213A3D7F1369313D5FA30F6168F9446A2D

Note that Abe stores hashes as lowercase hex strings when it does not figure out how to use a binary type. Only SQLite has a binary type that Abe can use, so for SQLite, instead of LOWER('hex string') you would put X'hex string', IIRC. (To force use of hex, pass --binary-type=hex when first creating the tables.)

SELECT SUM(txout.txout_value) / 100000000 AS btc
  FROM txout
  JOIN txin ON (txout.tx_id = txin.tx_id)
  JOIN txout prevout ON (txin.txout_id = prevout.txout_id)
  JOIN pubkey addr_out ON (addr_out.pubkey_id = txout.pubkey_id)
  JOIN pubkey addr_in ON (addr_in.pubkey_id = prevout.pubkey_id)
 WHERE addr_in.pubkey_hash = LOWER('11B366EDFC0A8B66FEEBAE5C2E25A7B6A5D1CF31')
   AND addr_out.pubkey_hash = LOWER('FC916F213A3D7F1369313D5FA30F6168F9446A2D');
|
|
|
|
eurekafag
|
|
February 13, 2012, 02:25:00 PM |
|
John Tobey, what about my case? Why am I getting this exception and what can I do to break through that block?
|
|
|
|
John Tobey (OP)
|
|
February 13, 2012, 02:54:34 PM |
|
John Tobey, what about my case? Why am I getting this exception and what can I do to break through that block?
This error takes more thought. I got this error with SQLite and thought I'd worked around it in the latest code. Please list your versions of Abe, pysqlite, and SQLite. I could not reproduce the error using Abe latest, pysqlite-2.6.0, and libsqlite3 3.7.4-2ubuntu5. I tried this command, and it succeeds past block_tx 2261 2291:

python -m Abe.abe --dbtype sqlite3 --commit-bytes 10000 --connect-args test.sqlite

Note that SQLite will slow down horribly if you make it to block 70000+. Another database engine such as PostgreSQL or MySQL will not have the same error, which is specific to SQLite.
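For the curious, the underlying limitation is easy to reproduce: SQLite's INTEGER storage class is a signed 64-bit value, so Python's sqlite3 module raises OverflowError for any integer outside that range. A minimal sketch; the store-as-string workaround mirrors the idea behind Abe's int_type = str setting, not Abe's actual code:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE t (big_value)")

too_big = 2 ** 64  # exceeds SQLite's signed 64-bit INTEGER range

# Binding the raw integer fails with OverflowError.
try:
    cur.execute("INSERT INTO t VALUES (?)", (too_big,))
    overflowed = False
except OverflowError:
    overflowed = True

# Workaround: store the value as a string and convert on the way out.
cur.execute("INSERT INTO t VALUES (?)", (str(too_big),))
restored = int(cur.execute("SELECT big_value FROM t").fetchone()[0])

print(overflowed, restored == too_big)  # -> True True
```

Storing as text trades native integer arithmetic in SQL for the ability to hold arbitrarily large values, which is why Abe only enables it when needed.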
|
|
|
|
eurekafag
|
|
February 13, 2012, 03:09:57 PM |
|
Please list your versions of Abe, pysqlite, and SQLite. I could not produce the error using Abe latest, pysqlite-2.6.0, and libsqlite3 3.7.4-2ubuntu5.

I'm using Debian wheezy, and here are the versions of my libs:

Abe (from git): commit dd07123fb1b5cb6c633c08d90366b879b3cb8ce9
pysqlite: ?
libsqlite3-0: 3.7.9-2

I didn't find pysqlite, and python-sqlite isn't installed either (and it shouldn't be, because it's for sqlite2). But it works anyway because of the file /usr/lib/python2.6/lib-dynload/_sqlite3.so, which belongs to the python2.6 package. I've tried your command, but it resulted in the same error. Installing python-pysqlite2 or python-pysqlite1.1 doesn't help either. The version of my python bindings is 1.1.8:

Python 2.7.2+ (default, Nov 30 2011, 19:22:03)
[GCC 4.6.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import sqlite
>>> sqlite.version
'1.1.8'
>>>
|
|
|
|
tiker
|
|
February 13, 2012, 03:26:12 PM |
|
Does Abe re-organize blocks when it re-scans a block chain? I had a problem importing the LiquidCoin block chain because the Linux liquidcoind couldn't download the chain properly. It would get to block 1000, then complain about invalid blocks. Abe was showing 1500 at this point, though. I grabbed the entire data directory from a Windows download which was up to date, moved it over to the Linux box, and restarted the Linux client using the Windows data. After I cleared the offset value in Abe's DB and relaunched the import, it filled in the missing blocks. Here's an example:

commit block 462660 already in chain 12
commit block 462661 already in chain 12
commit block 462662 already in chain 12
commit block 462663 already in chain 12
commit block 462664 already in chain 12
commit block_tx 1464778 4137316
commit block 462665 already in chain 12
commit block 462666 already in chain 12
Will adding blocks in the middle of a chain cause any problems with the block order for Abe? The site for this install is http://blockexplorer.funkymonkey.org if you want to look. (It is slow but that's due to older hardware.)
|
|
|
|
John Tobey (OP)
|
|
February 13, 2012, 03:38:49 PM |
|
eurekafag: Yes, you are right about not needing sqlite2. Please run "sqlite3 abe.sqlite" from the command line and issue:

select configvar_value from configvar where configvar_name = 'int_type';

The output should be "str", indicating that the workaround for this SQLite limitation is in effect. (A result of integer arithmetic is too big for 64 bits. The workaround forces the use of floating point.) If int_type is not "str", please update via git, where I've added support for --int-type=str. Delete abe.sqlite and abe.sqlite-journal and retry with int-type=str in your config.
|
|
|
|
eurekafag
|
|
February 13, 2012, 03:45:48 PM |
|
John Tobey, well, the type was "int". I did as you said, and now it passes that block. Thanks! I left a comment on the config line; it seems a typo occurred.
|
|
|
|
John Tobey (OP)
|
|
February 13, 2012, 03:56:41 PM |
|
Does Abe re-organize blocks when it re-scans a block chain?
[...]
Will adding blocks in the middle of a chain cause any problems with the block order for Abe?
I know of one, possibly two bugs affecting this situation.

First, Abe sometimes crashes by exceeding Python's recursion limit when rescanning. This does not produce wrong data, and I work around it by deleting rows from chain_candidate, updating chain.chain_last_block_id, and retrying.

Second, I have noticed slight deviations in statistical values like Coin Days Destroyed from one Abe instance to another. (The values on the homepage depend on server system time, but on a given block's page they should be identical.) I suspect rescanning may create statistical discrepancies. The ones I've seen were slight (under 1%).

I do not know of any bugs affecting address balances, transaction links, etc. But I can offer no guarantee.
|
|
|
|
molecular
Donator
Legendary
Offline
Activity: 2772
Merit: 1019
|
|
February 14, 2012, 09:46:36 PM |
|
Dear all,
How can I get the date and time of a transaction?
Thank you
does this help?

select to_timestamp(block_ntime)
  from block b
  inner join block_tx btx on btx.block_id = b.block_id
  inner join tx on tx.tx_id = btx.tx_id
 where tx.tx_hash = '97ddfbbae6be97fd6cdf3e7ca13232a3afff2353e29badfab7f73011edd4ced9';
this will get you the time the block the tx got in was mined
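(to_timestamp is a PostgreSQL function; with SQLite, or on the application side, you can convert the raw block_ntime yourself, since it is a plain Unix timestamp from the block header. A quick sketch in Python; the timestamp value below is an example I made up, not taken from a real block:)

```python
from datetime import datetime, timezone

# block_ntime is the block header's timestamp: seconds since the Unix
# epoch. This particular value is illustrative only.
block_ntime = 1329256425

mined_at = datetime.fromtimestamp(block_ntime, tz=timezone.utc)
print(mined_at.strftime("%Y-%m-%d %H:%M:%S UTC"))  # -> 2012-02-14 21:53:45 UTC
```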
|
PGP key molecular F9B70769 fingerprint 9CDD C0D3 20F8 279F 6BE0 3F39 FC49 2362 F9B7 0769
|
|
|
andrehorta
Legendary
Offline
Activity: 1261
Merit: 1000
|
|
February 15, 2012, 12:44:30 PM |
|
yes, thank you!
|
|
|
|
eurekafag
|
|
February 15, 2012, 03:12:01 PM Last edit: February 15, 2012, 06:56:40 PM by eurekafag |
|
I switched to MySQL as an experiment. During the scanning I had to stop Abe and restart it later. It was OK 2-3 times, but now it shows me this:

Failed to catch up {'blkfile_number': 1, 'dirname': '/tmp/ramdisk', 'chain_id': None, 'id': Decimal('1'), 'blkfile_offset': 446201276}
Traceback (most recent call last):
  File "/home/eurekafag/svn-soft/bitcoin-abe/Abe/DataStore.py", line 2145, in catch_up
    store.catch_up_dir(dircfg)
  File "/home/eurekafag/svn-soft/bitcoin-abe/Abe/DataStore.py", line 2166, in catch_up_dir
    store.import_blkdat(dircfg, ds)
  File "/home/eurekafag/svn-soft/bitcoin-abe/Abe/DataStore.py", line 2281, in import_blkdat
    store.import_block(b, chain_ids = chain_ids)
  File "/home/eurekafag/svn-soft/bitcoin-abe/Abe/DataStore.py", line 1485, in import_block
    tx['tx_id'] = store.import_tx(tx, pos == 0)
  File "/home/eurekafag/svn-soft/bitcoin-abe/Abe/DataStore.py", line 1802, in import_tx
    store.intin(tx['lockTime']), len(tx['tx'])))
  File "/home/eurekafag/svn-soft/bitcoin-abe/Abe/DataStore.py", line 404, in sql
    store.cursor.execute(cached, params)
  File "/usr/lib/python2.7/dist-packages/MySQLdb/cursors.py", line 174, in execute
    self.errorhandler(self, exc, value)
  File "/usr/lib/python2.7/dist-packages/MySQLdb/connections.py", line 36, in defaulterrorhandler
    raise errorclass, errorvalue
IntegrityError: (1062, "Duplicate entry '2' for key 'PRIMARY'")
Abe initialized.
Listening on http://localhost:2750

...and it refuses to add the remaining transactions. I experienced this before while configuring MySQL, but then I just dropped the DB and created it again. Now it contains too many transactions, and I don't want to spend another 2 days rescanning the chain. I don't understand how a transactional DB can be so inconsistent, but it looks like it really has 2 equal keys in two different records. I can't figure out which tables to look at.
|
|
|
|
|
tiker
|
|
February 15, 2012, 05:20:24 PM |
|
I've seen somewhere an Abe install with the SolidCoin2 chain, but I can't seem to get it going on my install. I wasn't able to compile SC2 from source, so I'm using the pre-compiled binary from the SC2 site. Here's what I'm getting when trying to start the import:

Failed to catch up {'blkfile_number': 1, 'dirname': u'/home/ed/.solidcoin2', 'chain_id': Decimal('19'), 'id': 77L, 'blkfile_offset': 0}
Traceback (most recent call last):
  File "Abe/DataStore.py", line 2141, in catch_up
    store.catch_up_dir(dircfg)
  File "Abe/DataStore.py", line 2162, in catch_up_dir
    store.import_blkdat(dircfg, ds)
  File "Abe/DataStore.py", line 2277, in import_blkdat
    store.import_block(b, chain_ids = chain_ids)
  File "Abe/DataStore.py", line 1493, in import_block
    raise MerkleRootMismatch(b['hash'], tx_hash_array)
MerkleRootMismatch: Block header Merkle root does not match its transactions.
block hash=725ca2009ba1b949a820d3a48511dc21c7f8d76d06d8afb6abb5988f5a911a8b
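For context, the check that fails here recomputes the Merkle root from the block's transaction hashes and compares it against the block header. Here is a sketch of the general Bitcoin-style Merkle root algorithm (double SHA-256, duplicating the last hash at odd levels); this illustrates the concept, not Abe's actual implementation, and the "example tx" inputs are invented:

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    """Bitcoin's hash function: SHA-256 applied twice."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(tx_hashes: list) -> bytes:
    """Pair up hashes level by level, duplicating the last hash when a
    level has an odd count, until a single root hash remains."""
    level = list(tx_hashes)
    while len(level) > 1:
        if len(level) % 2:                  # odd count: duplicate last hash
            level.append(level[-1])
        level = [double_sha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

# With a single transaction, the Merkle root is the txid itself.
txid = double_sha256(b"example tx")
assert merkle_root([txid]) == txid

# With two transactions, it is the double-SHA256 of their concatenation.
a, b = double_sha256(b"tx a"), double_sha256(b"tx b")
assert merkle_root([a, b]) == double_sha256(a + b)
print("ok")
```

A MerkleRootMismatch therefore means the block data on disk does not hash to what its own header claims, e.g. because the chain uses a different hashing scheme or the block file is corrupt or misparsed.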
|
|
|
|
|