You know, I can't remember the actual point at which I switched to Google from AltaVista, despite having convinced myself that I never would. Funny, no matter how entrenched we think we are, we are liable to change in a heartbeat. Bitcoin Core could learn that lesson.
AltaVista started going down that whole "portal" route (as were others). An object lesson in having to watch your step when it comes to monetizing things. I probably would have stuck with AltaVista if they hadn't gone that way (though they'd have had to step up their search algorithm efforts too).
|
|
|
^I thought Adam Back was on record with some kind of 2-4-8 plan?
I think Luke Jr. is against it because Jesus. Let me go check...
You misheard. That's two regular transactions for the price of eight segwit ones. Clearly they have confidence in its acceptance if they're offering that steep a discount.
|
|
|
Let me know if I've missed anything.
Their lies about supporting p2pool... Yeah, I was actually excited about that. I'd used p2pool before when I was GPU mining, so when I ordered an S5 and it arrived, I went to their website and saw all the stuff about p2pool and how they ran one... Oh well, it's best run decentralized anyway.
|
|
|
Can miners fork to 2MB and still stay with Core?
Or are they back on Classic?
Miners as individuals can do what they want. The issue comes when they mine that first 1.01MB block and it gets rejected by the network (or not). Actually, they will just be on their own fork: it will get rejected by other nodes running Core, but it will be valid on their chain, and they will keep mining larger blocks. It will be the longest chain with the most work, and as Satoshi implied, Bitcoin will be whatever the longest chain with the most work decides it to be. True, but the relevance of the fork matters: a good chunk of the network and many of the auxiliary services would need to be on board for it to be successful.
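The "longest chain with the most work" rule discussed above is really a most-cumulative-work rule: nodes compare total proof-of-work, not block count. A toy sketch (the per-block work numbers and helper names are hypothetical, just to illustrate the comparison):

```python
# Toy fork choice: nodes follow the chain with the most cumulative
# proof-of-work, not simply the one with the most blocks.
# The per-block "work" values below are hypothetical stand-ins.

def chain_work(chain):
    """Total work over a chain, given as a list of per-block work values."""
    return sum(chain)

def best_chain(chains):
    """Pick the chain with the most cumulative work."""
    return max(chains, key=chain_work)

# A shorter chain of harder blocks beats a longer chain of easier ones:
core_chain = [200, 200]           # 2 blocks, total work 400
fork_chain = [90, 90, 90, 90]     # 4 blocks, total work 360

assert best_chain([core_chain, fork_chain]) is core_chain
```

This is why a minority fork keeping the same rules can't simply "out-length" the majority: what counts is accumulated work under each chain's own validity rules.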
|
|
|
I pay like 0.06 cents per kWh in Quebec, Canada. Big farms like Bitfury/Antpool probably pay like 0.02-0.03 cents per kWh (speculating here, but they pay less for sure).
I think you've slipped a couple of decimal places there. But yeah, he'd be crazy to mine at the price he's paying. It was 9.168c here last month and I don't think I'm actually breaking even.
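For context on those break-even numbers, daily electricity cost is just power draw converted to kWh times the rate. A quick sketch (the 600 W unit wattage is a made-up placeholder, not a real hardware spec; the rates are the ones quoted above, reading "0.06 cents" as the intended $0.06):

```python
# Rough daily electricity cost for a miner: watts -> kWh/day -> dollars.
def daily_power_cost(watts, usd_per_kwh):
    kwh_per_day = watts * 24 / 1000.0
    return kwh_per_day * usd_per_kwh

# At $0.06/kWh a hypothetical 600 W unit costs about $0.86/day:
cost_quebec = daily_power_cost(600, 0.06)      # 0.864
# At 9.168 c/kWh the same unit costs about $1.32/day:
cost_other = daily_power_cost(600, 0.09168)    # ~1.32
```

Whether either rate breaks even then depends entirely on the unit's hashrate and the coin price, which this sketch deliberately leaves out.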
|
|
|
With 1MB blocks Danny Hamilton estimates fees could reach as high as $100 a transaction.
To be fair, there is a mechanism to keep those fees down as others have suggested. The problem is, that mechanism is disinterest in using Bitcoin.
|
|
|
Long term, I mean when the block reward is zero, yes. Nevertheless, we should already be aware that fees are not something "evil" but overall necessary. It's a balance between allowing 0-fee transactions to process while still having enough pressure that some fees get paid.
I have seen *no one* argue for miners to be forced to process zero-fee transactions. Nor could you, without massive effort and changes to the protocol.
|
|
|
Long term, Bitcoin has to gather enough fees to survive; without fees, no miners.
Or in your business language: We can give away our product for free, said no successful business ever.
edit: in the long term, fees translate 1:1 into Bitcoin's security; the higher the gathered fees, the higher the security of the network.
Without fees, no miners. So people will pay fees to keep their transactions getting on the blockchain and miners will select transactions which help pay for their business. This isn't rocket science. The problem comes when someone comes along and interferes in this negotiation between customer and service provider and distorts the value proposition.
|
|
|
It is the [use cost] that needs to be increased!
Said no successful business in a competitive market anywhere.
|
|
|
~brinnng!~ -Hello? -Do you have Prince Albert in a can? Then let him out! Hahhaha!!1 -Darn you Google kids!
(I actually opted into Google tracking my searches etc., have gmail, google voice etc. accounts linked. Limit questionable stuff to TOR. Actually improved my "browsing experience" (or whatever that's called))
A lot of what Google does is pretty good, but they do tend to make things mandatory which really are not technically needed but do help Google monetize their customers. Now, that's all well and good, free market, blah blah blah, until you find yourself in the situation, as Luke-jr did, where you are required to use their services to participate in something. If you are trying to organize that something, particularly in this sphere where participation is voluntary, you'll find there are those that won't accede to those requirements (see RMS). Better to go with open solutions if you can.

New job? I just left the old one. I have some projects I want to work on. Hopefully I can turn at least one of them into a steady income, but if not, my skills are in demand around this area.
|
|
|
Streaming to >50 people is an engineering problem too. https://www.reddit.com/r/Bitcoin/comments/42bdwm/consensus_round_table_meeting_in_miami_going_on/cz962q7 (open the collapsed comment for extra lel)

luke-jr (Luke Dashjr - Bitcoin Expert): It won't let me without giving Google my phone number...
Other guy: Is there something stopping you from registering a Google Voice number and then giving it to them?
luke-jr: Dunno, sounds like a lot of steps/time that would distract me from the actual topic.

TL;DR: Expert fails, but not before trying to break the TOS and h4xx0r all the things. I am a little sympathetic to not wanting to give Google your phone # (though they already have mine).
|
|
|
If today's daily candle stays green, it will be the first time this year that we have 2 consecutive up days.
Blocks get full, price goes down. Blocks get less full, price goes up. Trend is to fuller blocks. Or maybe just Bitcoin doing its thing.
|
|
|
Blockstream's reasons to want that kludge, ignoring all objections, are obscure.
The simple answer is that they seek complexity because complexity takes time to implement, whereas simply increasing the block size limit could have been done without much further ado. You can argue motivation, but the stonewalling of Gavin and others (people who should be regarded as influential and at least worthy of some attention), then the stalling of the so-called scaling conferences, and now this "We really have a way to fix this, just give us some more time" indicates to me that Core just really does not want to scale Bitcoin. Jorge, as a professor, I'm sure you see this kind of behavior in your students all the time, typically those who have a bit of maturing to do.
|
|
|
From my memory of the ASIC race it would take six months at least, probably closer to a year, but that's not important. What I'm considering is that it would be necessary to change PoW in the case of a hard fork simply to avoid the risk of attack from the existing SHA-256 mining hardware.
On the less popular chain, blocks would grind to a halt under the existing difficulty. If we assume 10 percent of hashrate remains, then you would expect one block every 100 minutes, at least until the next difficulty adjustment, which would take a very long time to reach because difficulty only adjusts every 2016 blocks. If you reduce the difficulty artificially to allow blocks to be mined and transactions to happen in a timely fashion, then it becomes trivial for any mining pool to quickly switch over, mine a bunch of your blocks, and then dump the coins...
I think, in the situation where one would want to cling to a minority chain after a hard fork, an algorithm change is essential.
Also, there is no incentive to design or build asics to run the new algorithm unless it is insanely profitable, as was the case when the first bitcoin asics came on the scene.
The problem with your memory of the ASIC race is that it's a race that's already been run. Planning has been done, companies built up, infrastructure and channels put in place, experts brought in-house and manufacturing capacity assigned. If we were starting from scratch, you'd be absolutely correct but just bringing a new algorithm online? I'd imagine that back-of-napkin designs are already widespread after Luke-jr's little stunt. I won't argue that staying on SHA2 wouldn't be problematic, I'm just saying that changing is too, if not moreso. It's lose-lose.
|
|
|
Any fork with a low hashrate would be subject to what Luke-jr did to Coiledcoin.
Hi Richy, what if the lesser fork adopts a new proof-of-work algorithm that obsoletes the existing mining hardware? I believe when Luke-jr did what he did, we were not on ASICs yet. Starting from scratch is rolling the dice. There are a lot of powered-down GPUs (or ones being used on altcoins) out there. Who controls them? ASICs will be produced for the new PoW in a couple of weeks to months tops in any case. Or would be, if it wasn't going to be effectively dead.
|
|
|
The more contentious alternatives the better. The HF only triggers at 75%; that's 75% for Classic. That means that Core must now share the remaining 25% with all the other implementations. They will attack each other into oblivion.
Check out the link, comrade. There is no competing over the remaining 25% of ASICs with what is being discussed. It is both fascinating and encouraging, and we shouldn't worry regardless of the outcome. I am at peace with a potential HF and Classic. Either way, the future is great. P.S. 75% of hashing does not equal an economic majority or a majority of users. There appears to be more support for Core, but who knows; it's such a difficult thing to measure. GPU-only mining would bring in many new participants who left for other alts long ago, and casual gamers with good GPUs as well. Any fork with a low hashrate would be subject to what Luke-jr did to Coiledcoin.
|
|
|
P2Pool does not require the wallet functionality as long as you use -a when starting to set a payout address.
This is how I roll.
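For reference, the `-a` flag mentioned above is just passed on the P2Pool command line at launch; a minimal sketch of an invocation (the payout address is a placeholder, and other options vary by P2Pool version and setup):

```shell
# Run P2Pool with an explicit payout address via -a, so the local
# bitcoind wallet is never needed for payouts (address is a placeholder).
python run_p2pool.py -a 1YourPayoutAddressHere
```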
|
|
|
I hope Slush stays at the 1MB limit. Otherwise it is time for me to move to a pool that does and respects non-hardforked rules. (It does not matter if it is called "Core" or otherwise.) But of course people who prefer the >1MB branch (and believe that it won't lose its utility and consequently value) are free to commit their hashpower to a pool that creates big blocks. The problem is that pool operators have not stated their policies clearly (and they are more and more influenced by mass pressure). And even if they did, there is no guarantee they will keep to it. This has nothing to do with the 1MB block size limit, but with the 750,000-byte default soft limit for mining.
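To make the distinction concrete: the 750,000-byte figure was the default value of Bitcoin Core's per-node mining policy setting, which each miner can raise up to the consensus limit; a sketch of the relevant bitcoin.conf lines (assuming a Core version from that era, roughly 0.11/0.12):

```
# Soft limit on blocks this node will *create* (local mining policy):
blockmaxsize=750000
# The 1,000,000-byte consensus limit on blocks it will *accept*
# is hard-coded and cannot be changed in the config file.
```

So a pool producing ~750KB blocks is just using the default policy, not enforcing (or breaking) any consensus rule.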
|
|
|
|