ok to draw this together
It seems a dialogue that most parties could agree on may be:
[1] a 2MB block :: No.
[2] a quadratic-hashing solution of some type [segwit and flextrans seem to have some options] :: Yes.
[3] maybe compression [segwit and flextrans seem to have some options] :: No, Segwit does not change space efficiency, and FlexTrans is not even a serious proposal.
Could this form a backbone to the discussion going forward?
Stripping away the emotive rhetoric (which may be justified, but is deleterious), we could reach a compromise and consensus, where cooler heads prevail.
The miners, the devs, the userbase, the exchanges, etc.: all these parties can now, if they wish, show the way forward.
Sure, there will be testing, and things at the margins that need to fall one way or the other due to unforeseen issues or technical requirements.
Can we have this accord? No. You're not listening, and you're misrepresenting the case for improving space usage (which is the only form of scaling possible) so badly that it's not easy to take your assessment seriously as a compromise proposal. You're also advocating a 2MB base blocksize hardfork; it's been rejected twice already (XT & Classic), and would involve an 8MB total blocksize when one factors in the corresponding witness blocksize. No.

Remember: Segwit is already a significant compromise between big blocks and 1MB forever. I'm somewhere in between big blocks and perma-1MB, and I am willing to accept Segwit, despite its 4MB total blocksize being a concern to me. What's happened is that the big-blocks proponents just will not accept the 4MB Segwit blocksize increase, and keep pushing and pushing and pushing for as big a blocksize as they can, and never compromise anywhere (and also never hardfork the blockchain as they have constantly threatened to for 2+ years). Your "compromise" just falls into the same category of rhetoric, and it's unpalatable for well-grounded and well-argued technical reasons.
|
|
|
Negative incentives are not required to sell the Bitcoin idea. And I've said this before regarding hoping for worldwide calamities (major currency and/or bond market crashes) to help the BTC exchange rate, and I will say it again here: be careful what you wish for
Side note on wishes: I really wish the UN would shut up and fuck off. How do they propose to enforce their vaunted no-fly zone, shooting down planes? Or getting these mass-murdering warlords to do pinky-swears, like they usually do?
|
|
|
Anyone rationalizing using hashing power to attack a functioning network and its user base is toxic to the community and should be called out for suggesting such appalling behavior.
Logical arguments against big blocks == trolling. I wouldn't waste keyboard strokes. When it comes down to it, we know how to prepare for, and to mitigate against, the various aggressive threats made by the BU proponents, and I'm happy with forking away and using a totally different name and brand. If that happened, when that coin wins due to technical merit (yet again), these thinly disguised wreckers will be forced to follow the "Bitcoin Reborn" coin with their meddling (yet again). Rinse and repeat, I would imagine. As long as we're determined to support the most valuable tech, there's not much they can really do, except continue to be increasingly less disruptive.
|
|
|
Try using a different external disk. And/or be extra careful not to knock the USB cable connecting your disk to your machine. The issue you're having might be disk failure, or cable failure. Certainly, Armory isn't happy with what it's reading from the Bitcoin blockchain when it scans your wallet transactions.
|
|
|
It was very amusing, all sorts of weird and wonderful cryptocoins appeared next to basically everyone's avatar in every post on the forum. I don't know who screwed up or why, but it was fun, for about 4 hours
|
|
|
Block size increase is great for a temporary increase in transaction capacity. But when you're talking about long-term upgrades, so that the protocol can become sufficiently scalable and even more immutable, you have to consider that you can't just keep shoving up the block size. You have to find ways to upgrade the protocol and the actual system properly, otherwise you'll end up with the scenario of a few mining monopolies running full nodes and no one else.
ok i see this may be a considerable issue... what would help alleviate this, code-wise, apart from block size? :: Increase the efficiency with which the transactions themselves are stored in the blocks. Don't make the blocks bigger, make the transactions smaller. I can't believe I'm the only person regularly arguing in favour of this incredibly common-sense and simple concept, which actually fulfills the definition of what the word "scaling" means.
|
|
|
I had to re-download the 120 GB blockchain only recently, due (I think) to not allowing Bitcoin Core to shut down properly, with some part of the database becoming corrupted.
If the blockchain had been growing at 8x the rate (or, with Segwit, 20x or 40x if the base block was increased to 4 MB or 8 MB), I would have had to completely change all my plans around that; it would have taken so much more than the 2+ days it took (using a Sandy Bridge laptop with an external 1TB Samsung 850 SSD). An 8MB base block with a corresponding 32 MB witness block (i.e. 40x) would have been far too much to contemplate.
This illustrates that even 100 GB IBD and validation is taxing in real world conditions, today.
Why are on-chain advocates never interested in the alternative hard-fork proposals, such as improving the tx encoding efficiency or using more space-efficient signatures? By always insisting on plain blocksize increases and nothing else, on-chain big-blockers ignore the threat of Mike Hearn's infamous "only 8 Google datacenters will run the Bitcoin network" future scenario. Jonald Fyookball talks constantly about this, and yet recently admitted he doesn't even run a Bitcoin node; he thinks someone else should do it for him.
If we increase even to 4MB, there is a risk that node numbers will go down, because the computer and network resources needed to keep up with that rate of blockchain growth will overwhelm too many of the people who run nodes today. And I'm willing to risk the 4MB (that is, 1 MB base + 3 MB witness blocks) that Segwit increases the blocksize to. 8MB? Forget it for at least 2 years; it's bound to hurt the node count even more than 4MB.
|
|
|
What do we want? To prove to our opponents that our point of view is the only right one? Or to finally scale?
Those that can see that 'the Emperor wears no clothes' can tell you that Core clearly does not want to scale. Otherwise they wouldn't be stalling for years, offering convoluted solutions to simple problems, breaking agreements and refusing to compromise.

This is all entirely in contradiction to actual reality. Core's Segwit solution is the only genuine proposal that involves changing the scale of the Bitcoin blockchain's resource usage. Every other proposal that advertises itself as "scaling" maintains the scale of resource usage as it exists with the current paradigm, including the (3? 4?) various non-scaling solutions you've endorsed in the past few years, jonald. Interesting how you never shut up about scaling, and yet always (and relentlessly) suggest false scaling proposals.

The charge that Segwit is a "convoluted solution to a simple problem" is also pretty dubious; not only is Bitcoin itself a fairly complicated concept already, but Segwit is actually a pretty simple change to the complicated original, i.e. the perfect opposite of what you're saying. And lastly, I think we can all agree that increasing from 1MB to 4MB is a pretty big compromise. I don't want 4MB, and yet I support Segwit. Because I'm willing to compromise.
|
|
|
Because it is currently very hard to get your currency (USD for example) out of Bitfinex, due to some issues with their banking relationship. This leads to increased pressure to BUY bitcoins in order to move them off of the Bitfinex platform. Since there is extra pressure to buy, the price goes up.
Is it still safe to put money into Bitfinex, Burt? (as you were endorsing a few days ago) Or have you changed that position now that Bitfinex is looking even more questionable than when you suggested it was still safe?
|
|
|
Bitcoin has been outcompeting SWIFT since 2011 when Gox first opened, even with today's so-called "expensive" tx fees, Bitcoin is much much cheaper (and faster to clear) than SWIFT. And more flexible. And uncensorable. And un-confiscable. tl;dr: SWIFT can never compete, headline fail
|
|
|
Segwit requires 95% support :: From the miners' perspective, all forks are hard, especially Segwit, which will put non-Segwit miners at an economic disadvantage (they can earn more fees with Segwit blocks). The higher the signalling goes over 51%, the more concerned they'll be that their blocks won't get relayed, so it could activate quickly (or get very close) soon after 51% signalling hits. Without the hash power of the miners not signalling Segwit making blocks, the 95% threshold becomes a lower number in raw hashrate terms anyway, so if F2Pool maintain their signalling, we might see >50% signalling soon, and activation soon after. This has happened with most other soft forks; miners don't like to get left running old network rules creating old block versions, even when they're still compatible.
|
|
|
Your assessment is not altogether correct either, Qartada.

- Fiat currencies are localised, as they are the legal tender of a country. Bitcoin would only be larger than them if the amount being held in that area was higher than the amount of fiat currency being held. The M1 money supply (the only monetary component Bitcoin can represent; M2, M3, M4 etc. aren't possible in the Bitcoin system, and that's a good thing) is higher for Bitcoin than for many small fiat currencies serving small economies (although I'm not too sure how Norway or Finland M1 got cited; I've not checked recently, but I would guess those currencies are still larger than BTC's M1).

- Bitcoin's market cap is not the same as fiat's market cap, because most of it is not in circulation and Bitcoin has many different features to these conventional currencies. Much of the price isn't from people who spend it regularly but from investors, which also distorts the market cap in relation to fiat.
OK, M1 money supply equivalence (or "market cap", as the article erroneously describes it) and velocity of money are not the same metrics at all, and you conflate them with one another in your post. But you are right in one way: Bitcoin's velocity is much lower than that of some of the fiat currencies whose M1 supply it exceeds. But also, remember how fiat currencies attain their market value in the first place: currency market trading, which is simply speculation by another name. The general market doesn't determine the value of currencies on the currency markets; it's the currency markets that determine the value of currencies in the general market. The value of fiat currencies is set by a centralised system, not by a decentralised one (even the BTC price is set this way; not just anyone can open a Bitcoin exchange).
|
|
|
Also, running dpkg -i will almost certainly give you more detail on any errors in the process; it will help goatpig if you post what it says.
And it might also simply work without errors using dpkg, in which case we know the problem lies with the Software Center.
|
|
|
These people are dumb; the real blockchain market (where actual dollars buy and sell actual blockchain assets today) is already worth many times more than $5.4 billion.
|
|
|
Just tried again after a fresh install of Ubuntu. I am just now downloading Ubuntu again and burning a new disk. Maybe also of interest is how the package looks in the Ubuntu SW center. Also, if I open the package again, SW center does not recognize it's installed and asks me again if I want to install. If I do, it takes way shorter than the first time. Uninstall just hung up out of SW center, which previously worked with the package manager.
Don't use the Software Center, that would be my advice. Software Center uses dpkg or apt in the background anyway, so just open a Terminal and use dpkg -i armory.deb (whatever the real filename for the package is) directly. dpkg and apt-get are far more rigorously bug-tested on Debian (and derivatives like Ubuntu); in general you'll find that with all Linux software. The software ecosystem is too diverse for the (mostly unpaid) developers to test absolutely every combination of everything, and so expecting the GUI to work oftentimes ends with you solving the problem one way or another in a terminal shell anyway. Just start in the shell, and you'll have fewer problems, and more skill, in the end.
|
|
|
Should be fixed.
Does that mean there will be a new .deb package soon, or was that another issue? :: It's probably separate from your issue; it's a build-specific issue, and it appears goatpig has used the same version of Ubuntu that you are using for at least some of the Debian Linux install packages (Ubuntu is a fork of Debian). In other news, building works for me now, thanks goatpig.
|
|
|
4.9
The .deb package for 0.95.99.1 complained to me about gcc 4.9, telling me I needed 5.something.
|
|
|
Build fails with current testing (a4e681c), latest Debian 8.7 based Whonix:

TransactionBatch.cpp: In member function 'void TransactionBatch::unserialize_recipients(const std::vector<std::basic_string<char> >&, std::pair<unsigned int, unsigned int>&)':
TransactionBatch.cpp:195:13: error: use of deleted function 'std::basic_stringstream<char>& std::basic_stringstream<char>::operator=(const std::basic_stringstream<char>&)'
     ss = stringstream(valStr_ss);
             ^
In file included from log.h:58:0,
                 from BinaryData.h:51,
                 from TransactionBatch.h:133,
                 from TransactionBatch.cpp:9:
/usr/include/c++/4.9/sstream:502:11: note: 'std::basic_stringstream<char>& std::basic_stringstream<char>::operator=(const std::basic_stringstream<char>&)' is implicitly deleted because the default definition would be ill-formed:
   class basic_stringstream : public basic_iostream<_CharT, _Traits>
         ^
/usr/include/c++/4.9/sstream:502:11: error: use of deleted function 'std::basic_iostream<char>& std::basic_iostream<char>::operator=(const std::basic_iostream<char>&)'
In file included from /usr/include/c++/4.9/iostream:40:0,
                 from BinaryData.h:45,
                 from TransactionBatch.h:133,
                 from TransactionBatch.cpp:9:
/usr/include/c++/4.9/istream:795:11: note: 'std::basic_iostream<char>& std::basic_iostream<char>::operator=(const std::basic_iostream<char>&)' is implicitly deleted because the default definition would be ill-formed:
   class basic_iostream
         ^
/usr/include/c++/4.9/istream:795:11: error: use of deleted function 'std::basic_istream<char>& std::basic_istream<char>::operator=(const std::basic_istream<char>&)'
/usr/include/c++/4.9/istream:58:11: note: 'std::basic_istream<char>& std::basic_istream<char>::operator=(const std::basic_istream<char>&)' is implicitly deleted because the default definition would be ill-formed:
   class basic_istream : virtual public basic_ios<_CharT, _Traits>
         ^
/usr/include/c++/4.9/istream:58:11: error: use of deleted function 'std::basic_ios<char>& std::basic_ios<char>::operator=(const std::basic_ios<char>&)'
In file included from /usr/include/c++/4.9/ios:44:0,
                 from /usr/include/c++/4.9/ostream:38,
                 from /usr/include/c++/4.9/iostream:39,
                 from BinaryData.h:45,
                 from TransactionBatch.h:133,
                 from TransactionBatch.cpp:9:
/usr/include/c++/4.9/bits/basic_ios.h:66:11: note: 'std::basic_ios<char>& std::basic_ios<char>::operator=(const std::basic_ios<char>&)' is implicitly deleted because the default definition would be ill-formed:
   class basic_ios : public ios_base
         ^
In file included from /usr/include/c++/4.9/ios:42:0,
                 from /usr/include/c++/4.9/ostream:38,
                 from /usr/include/c++/4.9/iostream:39,
                 from BinaryData.h:45,
                 from TransactionBatch.h:133,
                 from TransactionBatch.cpp:9:
/usr/include/c++/4.9/bits/ios_base.h:789:5: error: 'std::ios_base& std::ios_base::operator=(const std::ios_base&)' is private
     operator=(const ios_base&);
     ^
In file included from /usr/include/c++/4.9/ios:44:0,
                 from /usr/include/c++/4.9/ostream:38,
                 from /usr/include/c++/4.9/iostream:39,
                 from BinaryData.h:45,
                 from TransactionBatch.h:133,
                 from TransactionBatch.cpp:9:
/usr/include/c++/4.9/bits/basic_ios.h:66:11: error: within this context
   class basic_ios : public ios_base
         ^
In file included from /usr/include/c++/4.9/iostream:40:0,
                 from BinaryData.h:45,
                 from TransactionBatch.h:133,
                 from TransactionBatch.cpp:9:
/usr/include/c++/4.9/istream:795:11: error: use of deleted function 'std::basic_ostream<char>& std::basic_ostream<char>::operator=(const std::basic_ostream<char>&)'
   class basic_iostream
         ^
In file included from /usr/include/c++/4.9/iostream:39:0,
                 from BinaryData.h:45,
                 from TransactionBatch.h:133,
                 from TransactionBatch.cpp:9:
/usr/include/c++/4.9/ostream:58:11: note: 'std::basic_ostream<char>& std::basic_ostream<char>::operator=(const std::basic_ostream<char>&)' is implicitly deleted because the default definition would be ill-formed:
   class basic_ostream : virtual public basic_ios<_CharT, _Traits>
         ^
/usr/include/c++/4.9/ostream:58:11: error: use of deleted function 'std::basic_ios<char>& std::basic_ios<char>::operator=(const std::basic_ios<char>&)'
In file included from log.h:58:0,
                 from BinaryData.h:51,
                 from TransactionBatch.h:133,
                 from TransactionBatch.cpp:9:
/usr/include/c++/4.9/sstream:502:11: error: use of deleted function 'std::basic_stringbuf<char>& std::basic_stringbuf<char>::operator=(const std::basic_stringbuf<char>&)'
   class basic_stringstream : public basic_iostream<_CharT, _Traits>
         ^
/usr/include/c++/4.9/sstream:64:11: note: 'std::basic_stringbuf<char>& std::basic_stringbuf<char>::operator=(const std::basic_stringbuf<char>&)' is implicitly deleted because the default definition would be ill-formed:
   class basic_stringbuf : public basic_streambuf<_CharT, _Traits>
         ^
In file included from /usr/include/c++/4.9/ios:43:0,
                 from /usr/include/c++/4.9/ostream:38,
                 from /usr/include/c++/4.9/iostream:39,
                 from BinaryData.h:45,
                 from TransactionBatch.h:133,
                 from TransactionBatch.cpp:9:
/usr/include/c++/4.9/streambuf:810:7: error: 'std::basic_streambuf<_CharT, _Traits>& std::basic_streambuf<_CharT, _Traits>::operator=(const std::basic_streambuf<_CharT, _Traits>&) [with _CharT = char; _Traits = std::char_traits<char>]' is private
     operator=(const basic_streambuf&) { return *this; };
     ^
In file included from log.h:58:0,
                 from BinaryData.h:51,
                 from TransactionBatch.h:133,
                 from TransactionBatch.cpp:9:
/usr/include/c++/4.9/sstream:64:11: error: within this context
   class basic_stringbuf : public basic_streambuf<_CharT, _Traits>
         ^
TransactionBatch.cpp: In member function 'void TransactionBatch::unserialize_spenders(const std::vector<std::basic_string<char> >&, std::pair<unsigned int, unsigned int>&)':
TransactionBatch.cpp:252:13: error: use of deleted function 'std::basic_stringstream<char>& std::basic_stringstream<char>::operator=(const std::basic_stringstream<char>&)'
     ss = stringstream(idstr_ss);
             ^
Makefile:853: recipe for target 'libCppBlockUtils_la-TransactionBatch.lo' failed
make[3]: *** [libCppBlockUtils_la-TransactionBatch.lo] Error 1
|
|
|
When the Russian mafia government "finally" announces that Bitcoin is officially sanctioned, hands up who wants to go to Russia to try it out?
Yeah, that's what I was expecting. No-one is going to take any stance by Russia's professional tyrants even 1% seriously, and so their protection racket will only ever have the BTC native to Russia to play with for a long, long time. Which might not be so bad, I'm sure plenty of impoverished Russian people have been buying or mining BTC since 2010. I just hope they have the sense to hodl tight and not give it up to the government thieves if they initiate some kind of "mandatory purchase order" (aka official theft)
|
|
|
F2Pool are very confusing. I don't think they'll keep actually signalling SegWit and following the users' opinions, I think they might be messing with everyone like they have been for weeks. We'll see if they keep going though.
Whether Wang's tweets are moving the market price or not is not easy to tell, and of course, it's having very little effect even if it could be proved. But I think Wang is possibly trying to move the market price, it makes more sense than the coercion argument (if he's being coerced, one side would strong-arm the other to stop Wang switching sides so often, it doesn't add up that he's being threatened by either side). Yet another reason why decentralisation of mining is so important now the mining market is so consolidated (aka cartelised). It's good to see Bitfury are finally selling chips to external equipment manufacturers, but I still believe a change to a CPU or GPU specific proof of work algorithm is the real answer in the medium term. Long term, 3D printing of simple ASIC hashers will be the most viable route to decentralised mining (ASIC chips are close to the memory class of silicon chips when it comes to simplicity of the chip die, so it's possible that cryptocurrency mining might even drive the incentives for development in the 3D chip printing field to some extent)
|
|
|
|