Bitcoin Forum
141  Bitcoin / Development & Technical Discussion / Re: Experimental pre-0.8 builds for testing on: January 16, 2013, 04:17:34 PM
I did (already yesterday) a "cmp $HOME/.bitcoin.0.7.99/blocks/blk00000.dat $HOME/.bitcoin/blk0001.dat", which reports the first difference at byte 134214364. So they cannot simply be concatenated to get the block chain back, right? But maybe I have discovered a bug in the data structures/files.

The old files are limited to (almost) 2 GB. The new files are limited to 128 MiB. Apart from that, they are exactly the same format; they just store fewer blocks per file. If you concatenate either set (in the right order), you get a usable bootstrap.dat.
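For illustration, the concatenation step can be sketched in a few lines of Python. The function name and paths here are assumptions for the sketch, not part of any actual tool:

```python
# Sketch: build a bootstrap.dat by concatenating block files in order.
# Paths are illustrative; point block_files at your actual blk*.dat files.
import shutil

def make_bootstrap(block_files, out_path="bootstrap.dat"):
    """Concatenate the given block files, in the given order, into one file."""
    with open(out_path, "wb") as out:
        for path in block_files:
            with open(path, "rb") as f:
                shutil.copyfileobj(f, out)
```

The order matters: the files must be passed in sequence (blk0001.dat, blk0002.dat, ...) so that blocks appear in the same order as in the original files.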

Quote
Well, "everything except the block files isn't intended to be used by third party code", together with "It's being replaced to improve performance": both of these sound like old Microsoft Windows argumentation. :-/ "Bitcoin needs exclusive access to the database"? Huh? Only while it is running; but this line of argumentation matches the previous ones.

Obviously you can use it for whatever you like while Bitcoin isn't running. I just mean that compatibility with external applications was not a design priority (except for the block files - they could have been made smaller as well, but we chose not to, in order not to break external applications).

Quote
BTW: Why do I sound so negative? Because I keep thinking of my horribly long initialization of the block chain, which took far more than 23 h of real time last December, and you tell me that it is no bottleneck. Sorry to be frank.

You're sort of hijacking this thread, ranting about things that are not priorities now.

In 0.7.x, block verification WAS the bottleneck (except when you were on SSD or a tmpfs), and that is the reason for all changes that will be in 0.8, and what this thread exists for. I'm just saying that after those, it isn't. Block downloading and signature verification are the bottleneck. And yes, using bootstrap.dat for initialization avoids the problem with slow block downloading, but that's not what you want to be telling new users. The priority should be making automatic syncup usable again.
142  Bitcoin / Development & Technical Discussion / Re: Experimental pre-0.8 builds for testing on: January 15, 2013, 09:28:37 PM
Sorry, but I think I must disagree strongly! This might be an (or rather the!) important bottleneck for every newbie who wants to start using bitcoin-qt.
You are right, it is no bottleneck for you, and probably not for >95% of all forum posters here, who have already done the initial download of the blockchain data. "Crappy block download and signature validation" becomes the next bottleneck AFTER you have your personal block chain in a bitcoin-client-conformant format!

I'm not sure what you mean. The current downloading mechanism often takes hours just downloading (not including any verification or storage), a lot of which is waiting and not doing anything. Compare that to (order of magnitude) an hour of verification to reach block 210000 (without signature checks). After block 210000, signature checking becomes the bottleneck. So if we want to improve the experience for new users, the first step should be speeding up the download itself, and the signature verification. After that, we can have a look at further improvements for the verification itself.

Quote
I also dislike seeing it (the currently undocumented "blkindex.dat" format) replaced by new formats in rev*.dat, blk*.dat, the coins directory (and the blktree directory?), if these stay undocumented or become over-complex, again making it impossible to use this raw block chain data directly.

It's being replaced to improve performance. The file format of the block files (blk000?.dat before, blocks/blk000??.dat now) hasn't changed: it's a binary concatenation of blocks in network format. The coins directory is a LevelDB database in a custom compact format that represents the set of unspent transaction outputs. See the source code (specifically: class CCoins in main.h) for the exact specification. The rev* files contain data needed to disconnect blocks afterwards in case of a reorganisation. Bitcoin needs exclusive access to the database anyway, so everything except the block files isn't intended to be used by third-party code. If you need access to the raw blockchain/UTXO data, use the getblock, getrawtransaction and gettxout RPC commands.
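Since the block files are a binary concatenation of blocks in network format, a minimal reader can be sketched as below. This sketch assumes the usual on-disk framing of a 4-byte network magic followed by a little-endian block size before each block; it is an illustration, not the reference implementation:

```python
# Sketch: iterate over raw blocks in a blk*.dat-style file, assuming each
# block is framed as [4-byte network magic][4-byte LE size][block data].
import struct

MAINNET_MAGIC = bytes.fromhex("f9beb4d9")  # mainnet message-start bytes

def iter_blocks(path, magic=MAINNET_MAGIC):
    """Yield each raw serialized block from the file at `path`."""
    with open(path, "rb") as f:
        while True:
            header = f.read(8)
            if len(header) < 8 or header[:4] != magic:
                break  # end of file, zero padding, or corruption
            (size,) = struct.unpack("<I", header[4:])
            yield f.read(size)
```

Each yielded value is the block exactly as it would appear in a network "block" message, which is why concatenating these files yields a usable bootstrap.dat.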

Quote
I think I have shown with my block parser that conversion of this data into a bitcoin-qt-usable structure can be done in only a few minutes on typical hardware, and not in the more than 20 times real time that bitcoin-turbo needs, while checking everything inside the main block chain except the script verification! Yes, before, in 0.7.?, it was more than 200 times real time, so your new experimental 0.7.99 version is a great achievement.

My priority is correctness/completeness of the verification, consistency of the database on disk, and minimizing the working set size. That in itself leads to improved performance, but there are certainly a lot of things one could do to improve it further. As said, not a priority now. And yes, if you do everything in RAM using a hash map, I'm sure it will be faster (it should be!).

Quote
Please remember: I only started activity in this Bitcoin Forum because the VERY SLOW initialization with the main block chain by the (most widespread) bitcoin-qt client might put off increasingly many would-be newcomers to the bitcoin world.

I'm very aware of the slowness, but changing things in the reference code takes time (to implement, to review and to test). This slows development down, but I'm sure the community wouldn't want even a blazingly fast release with subtle bugs in it that lead to a split of the network after a large part of the community adopts it. We do what we can, and most of us (including me) do this in our free time. The real bottleneck is testing, so thanks for helping out here (though I really can't do much about a segfault without more details...).

Oh and by the way, use `qmake RELEASE=1` to build optimized binaries.
143  Bitcoin / Development & Technical Discussion / Re: Standard Check Numbers (checksums for addresses) on: January 15, 2013, 08:13:53 PM
I think it would be quicker, for a human being, to make a cursory inspection of an address followed by a detailed inspection of a check number, as opposed to a detailed inspection of the full address.

That is exactly the same thing as: making a cursory inspection of an address followed by a detailed inspection of the last few characters of the address (as an address can be considered to be the pubkeyhash with its own checksum appended).
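In other words, an address is Base58Check(version byte || pubkeyhash): the payload with the first four bytes of its double-SHA256 appended as a checksum. A minimal sketch of that encoding (illustrative, not the reference implementation; the function name is my own):

```python
# Sketch: Base58Check-encode a payload, as used for Bitcoin addresses.
import hashlib

B58_ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def base58check(version, payload):
    """Encode version byte + payload + 4-byte double-SHA256 checksum."""
    data = bytes([version]) + payload
    checksum = hashlib.sha256(hashlib.sha256(data).digest()).digest()[:4]
    full = data + checksum
    n = int.from_bytes(full, "big")
    out = ""
    while n:
        n, rem = divmod(n, 58)
        out = B58_ALPHABET[rem] + out
    # each leading zero byte is encoded as a leading '1'
    for byte in full:
        if byte:
            break
        out = "1" + out
    return out
```

So the last few characters of an address are already a checksum over everything before them, which is why a detailed inspection of them is equivalent to the proposed check-number scheme.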
144  Bitcoin / Development & Technical Discussion / Re: ANN: Announcing code availability of the bitsofproof supernode on: January 14, 2013, 08:36:29 PM
IIRC, bitcoind has soft (bitcoind's) and hard (Bitcoin protocol) limits for that. Meaning that it will neither spend a coinbase less than 120 blocks old nor build a block that spends one, but it will accept a block which spends it a bit earlier (at 100 blocks, I believe; not sure).

Hard limit: 101 confirmations, i.e. created 100 blocks before (network rule)
Soft limit: 120 confirmations, i.e. created 119 blocks before (client policy, which is probably overly conservative)
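The two limits can be written down as a small sketch (the constant and function names are illustrative, not from the client source):

```python
# Sketch of the two coinbase-maturity limits above. "Confirmations" counts
# the block containing the coinbase itself, so 101 confirmations means the
# spending block comes 100 blocks later.
NETWORK_MIN_DEPTH = 100  # network rule: 101 confirmations
POLICY_MIN_DEPTH = 119   # client policy: 120 confirmations

def network_allows_spend(coinbase_height, spend_height):
    """Would a block at spend_height spending this coinbase be valid?"""
    return spend_height - coinbase_height >= NETWORK_MIN_DEPTH

def client_will_spend(coinbase_height, spend_height):
    """Would the reference client itself create such a spend?"""
    return spend_height - coinbase_height >= POLICY_MIN_DEPTH
```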
145  Bitcoin / Development & Technical Discussion / Re: Speeding up signature verification on: January 14, 2013, 08:34:40 PM
0.7.x indeed has a bug in that it needs the ability to seek to positions in block files being imported. This limits it to 2 GB on some systems.

The current git head code doesn't need this, and can import blocks from arbitrary-sized files.
146  Bitcoin / Development & Technical Discussion / Re: Experimental pre-0.8 builds for testing on: January 14, 2013, 08:31:53 PM
I think there is still a lot of room for speed optimization in loading the initial bootstrap.dat.

I have never claimed otherwise. That is not the bottleneck now, however (crappy block download and signature validation are).

Quote
Moreover, I see no effect of this "-benchmark" option (maybe this ".....Fix application." message?)

It should report block connect/validation speeds in debug.log.
147  Bitcoin / Development & Technical Discussion / Re: Experimental pre-0.8 builds for testing on: January 13, 2013, 11:51:33 PM
A last question for today (it got very late and I must get up early tomorrow): What is the X11 resource name of the color behind the "This is a pre-release ...." warning in the wallet window, and of the one that controls the red of "(not synchron)", so that I can change these specific colors?

I have no clue at all about GUI stuff. Wladimir wrote that.
148  Bitcoin / Development & Technical Discussion / Re: Experimental pre-0.8 builds for testing on: January 13, 2013, 11:09:52 PM
Running a full node means major consumption of upload bandwidth. With my 32 kB/s total upload, after 30+ nodes connect to me I can hardly surf the Internet, even very simple websites. A few people I know have an even bigger problem because they actually pay per GB transferred in either direction, so they stopped running full nodes.

In such a setup I'd advise you to disable listening, as that will significantly reduce the number of nodes trying to fetch block data from you.

149  Bitcoin / Development & Technical Discussion / Re: Experimental pre-0.8 builds for testing on: January 13, 2013, 11:08:05 PM
Oh, that's not a change I made, though I should have caught it.

The -static was intended to only be added for Windows.


150  Bitcoin / Development & Technical Discussion / Re: Experimental pre-0.8 builds for testing on: January 13, 2013, 08:26:41 PM
The error is harmless. It uses git to determine the current version, but it has a fallback in case that fails.

The db_cxx error means you don't have libdb4.8++-dev or libdb5.1++-dev installed. See doc/build-unix.txt for instructions.
151  Bitcoin / Development & Technical Discussion / Re: Speeding up signature verification on: January 13, 2013, 08:24:49 PM
All this I knew -- I thought that I had misunderstood that this 193000-block limit was hardcoded into the client. :-/

Yes, it is hardcoded. 0.7.x has its latest checkpoint at 193000. The current development code has an extra one at 210000.
152  Bitcoin / Development & Technical Discussion / Re: Speeding up signature verification on: January 13, 2013, 06:53:55 PM
I should simply concatenate my current block chain (3 files) into 1 huge file (almost 4.9 GB), name it "bootstrap.dat", and 0.7.* will load it into the bitcoin client without signature checking when I start it (and of course build a new blkindex.dat)?

It will load it, but only the first 193000 blocks will be loaded without signature checking.

Also, I think 0.7.x has a bug where no more than 2 GB of bootstrap.dat is loaded anyway. You can use -loadblock=file1 -loadblock=file2, ... though.
153  Bitcoin / Development & Technical Discussion / Re: Speeding up signature verification on: January 13, 2013, 05:52:01 PM
You can't build an UTXO set (needed for validation) without doing most of the verification anyway (everything except signature checking, basically).
"most of the verification anyway" ... I would like to weigh this "most" by real time, not by code size or "logical" verification classes, and then it turns out that the signature checking needs 99% or far more of it (depending on your disk speed).

Sure, I fully agree. In terms of CPU usage, signature validation outweighs everything. But before the last checkpoint (= exactly what is stored in bootstrap.dat), signature checking is disabled anyway (and this hasn't changed in 0.8).

Also, I assume that importing bootstrap.dat on your hardware will take more than 5 minutes. Done completely in RAM it's probably possible, but the client also maintains a database on disk, and does synchronous writes at times to make sure block data and index/coins remain consistent with each other.
154  Bitcoin / Development & Technical Discussion / Re: Speeding up signature verification on: January 13, 2013, 04:57:10 PM
At least 0.8.0 will have an option to not verify the bootstrap.dat, I think.

You can't build a UTXO set (needed for validation) without doing most of the verification anyway (everything except signature checking, basically). If you mean trusting someone to build a pre-indexed blockchain for you, you're far better off running an SPV node: it's faster, needs less disk, needs less bandwidth to maintain, and doesn't require trust in a centralized source for your block data.

The current git head (pre-0.8) code still has some problems that cause initial syncup to be slow (in particular, the block fetching algorithm is crappy, and gets very easily confused to the point where it stops doing anything for several minutes). From reports, it also seems significantly slower at verifying on Windows systems. Still, if the parallel+optimized signature verification code (which may or may not make it into the 0.8 release) is combined with a fixed block fetching system, I hope we can get close to a sync time of 1 hour on reasonably recent systems.
155  Bitcoin / Development & Technical Discussion / Re: Speeding up signature verification on: January 13, 2013, 04:50:07 PM
But what I dislike is the idea of expecting, in the future, not only a CPU/processor for a bitcoin client but also a (high-end?) GPU for the customer to get reasonable real-time behaviour from his bitcoin client. This I call a misconception of the future.

In the future, at some point (near or far), it will not be reasonable to expect every end user to run a fully verifying bitcoin node. If they use a desktop client at all (I assume much more will happen on mobile devices, for example), it will probably be an SPV client or something even lighter. At that point in the future, "reasonable real-time behaviour of the bitcoin client" will only apply to those who have to run a full node (miners, payment processors, merchants) or volunteer to. I expect such people to run a node 24/7, so the time to do a sync will only matter once.

That doesn't mean we should tell everyone now to move away from fully validating clients (even though many people are, unfortunately but understandably). While the economy grows, we need a network to sustain it. In this respect you are right, and we should make every effort to keep performance as good as possible (and I believe I do what I can for that). But as retep says, correctness is of much higher importance: if there is a problem with the code, it can be a disaster (lost coins, forked network, ...).

In particular, I have reasonable confidence in the 25% speedup optimization for secp256k1, after having spent time on it myself and tested it by comparing the output of the optimized operations with the non-optimized OpenSSL ones. Still, I'd like a professional cryptographer to look at the code.

GPU code to do ECDSA verification would in the first place be an experiment, in my opinion. If it indeed helps in significantly faster verification, some may consider it. But as things stand now, I don't think there's an actual need.
156  Bitcoin / Development & Technical Discussion / Re: Experimental pre-0.8 builds for testing on: January 13, 2013, 04:27:30 PM
That's why I added that builds in gitian show the same problem, so I can confirm there is a problem here. I just haven't spent the time investigating why.

There was a syntax error in the bitcoin-qt.pro file (which was fixed before, but it seems the fix got lost). Newer versions of lrelease didn't break on it, but older ones did. It should be fixed now.
157  Bitcoin / Development & Technical Discussion / Re: Speeding up signature verification on: January 13, 2013, 04:15:27 PM
It's not a matter of parallel vs. not parallel, it has to do with the fact that GPUs are terrible at point multiplication, the difficult part of signature verification. A regular CPU far outclasses them at it, and for much less energy.

I'm pretty sure that a good OpenCL implementation of ECDSA verification, running on high-end ATI GPUs, will be significantly faster than what OpenSSL (or any implementation) can do on a CPU. People have already written EC addition in OpenCL, reaching speeds of tens of millions of operations per second. EC multiplication is much slower and more complex to implement efficiently, but I expect it would reach at least tens of thousands of operations per second.

Someone just needs to write it...
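For a sense of the cost gap: one point addition is a field inversion plus a few multiplications, while a scalar multiplication needs on the order of 256 doublings plus additions. A textbook double-and-add over secp256k1 in affine coordinates (purely illustrative: no constant-time hardening, nothing like an optimized OpenCL kernel):

```python
# Textbook double-and-add scalar multiplication over secp256k1.
# Illustrative only: affine coordinates, no side-channel hardening.
P = 2**256 - 2**32 - 977  # secp256k1 field prime
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def ec_add(a, b):
    """Add two points (None = point at infinity) on y^2 = x^3 + 7 mod P."""
    if a is None: return b
    if b is None: return a
    if a[0] == b[0] and (a[1] + b[1]) % P == 0:
        return None  # inverse points sum to infinity
    if a == b:  # doubling: slope of the tangent line
        lam = 3 * a[0] * a[0] * pow(2 * a[1], -1, P) % P
    else:       # addition: slope of the chord through a and b
        lam = (b[1] - a[1]) * pow(b[0] - a[0], -1, P) % P
    x = (lam * lam - a[0] - b[0]) % P
    return (x, (lam * (a[0] - x) - a[1]) % P)

def ec_mul(k, point=G):
    """Compute k * point by scanning the bits of k (double-and-add)."""
    result, addend = None, point
    while k:
        if k & 1:
            result = ec_add(result, addend)
        addend = ec_add(addend, addend)
        k >>= 1
    return result
```

A verifier effectively does two such multiplications per signature, which is why batching and endomorphism tricks matter so much for throughput.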
158  Bitcoin / Development & Technical Discussion / Re: Speeding up signature verification on: January 13, 2013, 04:02:30 PM
Risk vs reward. No offense to Hal, but 25% isn't a very big improvement, and signature verification is something that absolutely must work correctly. Get it wrong and people can lose a lot of money very quickly. Why risk it?

The secp256k1-specific optimization has been implemented, and there's a pull request to integrate it in the reference client. It does indeed achieve some 20% improvement.

What smtp asked about was the parallel batch verification, with potentially much higher speedups (Hal mentioned 4x). The problem is that this is a much more invasive change, and the risk of getting crypto stuff wrong is extremely high. It's also not certain what degree of speedup is possible, as we need to recover R before batch verification is possible (and changing this basically requires a hard fork, as the signatures are part of the scripting rules).

That said, if someone is interested in investigating, and potentially supplying a patch that is well testable... by all means, do so.
159  Bitcoin / Development & Technical Discussion / Re: Experimental pre-0.8 builds for testing on: January 13, 2013, 03:36:32 PM

Building works fine on my laptop, but I believe I hit the same problem when trying to build release binaries for Linux in gitian. It may require the package 'qt4-linguist-tools'.
You are generalizing from your environment to other people's environments. :)

That's why I added that builds in gitian show the same problem, so I can confirm there is a problem here. I just haven't spent the time investigating why.
160  Bitcoin / Development & Technical Discussion / Re: Experimental pre-0.8 builds for testing on: January 13, 2013, 03:02:29 PM
I just downloaded the current source version (1869678 bytes), ran "tar -xf bitcoin-turbo.tar.gz", and called "qmake" in the newly created directory bitcoin-turbo. But I get lots of error messages about being unable to find all these locale language files:

Project MESSAGE: Building with UPNP support
Project MESSAGE: Building with UPNP support
lrelease warning: Met no 'TRANSLATIONS' entry in project file '/home/achim/Downloads/bitcoin-turbo/bitcoin-qt.pro'
RCC: Error in 'src/qt/bitcoin.qrc': Cannot find file 'locale/bitcoin_bg.qm'
....

So, what are the exact preconditions/dependencies for compiling this turbo-bitcoin source successfully? (Are there Linux binaries available?)

Building works fine on my laptop, but I believe I hit the same problem when trying to build release binaries for Linux in gitian. It may require the package 'qt4-linguist-tools'.

Quote
Thus I wonder: a new, say 3 GHz, CPU with 6 cores should be able to accomplish 10000 script verifications/sec or more today ... not to mention optimized secp256k1 code.

I've certainly seen more than that: 6000 tx/s is over 12000 txins/s.