Bitcoin Forum
Author Topic: Do you think "iamnotback" really has the "Bitcoin killer"?  (Read 79918 times)
7thKingdom
Member
**
Offline Offline

Activity: 107
Merit: 10


View Profile
March 22, 2017, 02:41:29 PM
 #521

I agree with Shelby that music may not be the best place to start. There is too much entrenched establishment thinking in that domain by both consumers and especially content creators (although it has gotten much better in recent years). Artists are still locked into old ways and I don't believe it will be the easiest market to "attack" first.

Yes, the indie market does offer some inroads, and in time I believe that will be the place to dig in and make our mark, but I don't believe it is the best place to start in the grand scheme of things. I would tend to think a market to focus on first would be one that already has its roots in upending the traditional media/content distribution status quo. The primary ones that come to mind are podcasting/vlogging (and to a lesser extent blogging). These industries are built on the idea of creators getting their content directly into the hands of users with minimal middleman interaction. And the content creators are always looking for new and better ways to monetize their offerings.

Unlike the music industry, where there are a plethora of preconceived ideas and biases holding people back, the podcast/vlog sphere has very little of that. They want innovative distribution ideas; that is why they came into existence in the first place. They want easier/better ways to spread their "art".

I believe the largest hurdle to overcome in this market is the idea that consumers have always gotten these things "for free" and there may be some resistance to now paying for them. But the whole idea of a micropayment social media platform is that the consumers wouldn't even really feel the brunt of paying anyway since the transactions would be so small. So I don't think this will be as difficult to overcome as it initially appears.
7thKingdom
Member
**
Offline Offline

Activity: 107
Merit: 10


View Profile
March 22, 2017, 03:30:44 PM
 #522


That sort of startled me, because it is catchy and it is in the vein of a "byteball" type of geekcool phonetics.

Amorphous was on my original brainstorming list (and I had thought of nebulous in the process of thinking of amorphous and mentioned nebulous as a negative 3 times on the page), but I think you are correct that Nebula is better than Amorphous or Nebulus.

But is that the meaning we want for a programming language? The language is targeted to programmers. Nebula does have a nice sound to it. Are you thinking the programming language is a feature marketed to the speculators also and thus the reason for the geekcool name?

While reading your ideas I couldn't help but think of the Bruce Lee quote: "Empty your mind, be formless, shapeless, like water. If you put water in a cup, it becomes the cup, and water can flow or it can crash."

This also hits back on your Zen ideas. Water, like your language, is fluid. It has many degrees of freedom and goes where it is directed. It does not resist, it just flows. Of course, fluidity is not restricted to just liquids. It can refer to anything that is readily changeable. Anything that is not fixed and rigid.

So two names that immediately come to mind are... Flow and Fluid


jbreher
Legendary
*
Offline Offline

Activity: 3038
Merit: 1660


lose: unfind ... loose: untight


View Profile
March 22, 2017, 03:49:12 PM
 #523

If you wanted to upend Core, then you should have more competent people who would have advised you that unbounded block size doesn't have an equilibrium.

If you have successfully demonstrated  that unbounded block size cannot reach an equilibrium outside a majority-collusion environment, I have missed it.

Yes you missed it. ...
Once we do model differing orphan rates for different miners, then the optimal strategies for mining come into play. And if you work out the game theory of that, you realize that collusion and centralization are the only possible outcome.

So you seem to be acknowledging that I am correct above...

Quote
You are making a similar error as those two others did upthread. A 51% (or even 33% selfish mining) attack is not a change in protocol. In other words, in BTC the miners can't make huge blocks, because it violates the protocol limit of 1MB.

Collusion is collusion, irrespective of the protocol. Nakamoto consensus is only possible when a majority of participants are 'honest' as per the whitepaper terminology. Unbounded blocks does nothing to change this.

Quote
And as a practical matter, Bitcoin operated just fine for multiple halvings with no practical bound on blocksize.

There was a minimum advised fee, and there were pools doing anti-spam; for example, I think I've read that Luke Jr's pool rejected dust transactions.

Yes, minimum advised fee. 'Advised', as not encoded within the protocol. The fact that this worked up to the point that the production quota was finally persistently hit forms an existence proof that the system can work. The fact that it did work may or may not have something to do with all players having beneficial intent, but there it is. Indeed a populist sentiment includes the notion that it is against the best interests of all participants to do anything that kills the system. Which probably explains why our past known-majority miner (Discus Fish?) turned back from their position of mining majority without ever forming an attack from their assuredly-successful posture.

Anyone with a campaign ad in their signature -- for an organization with which they are not otherwise affiliated -- is automatically deducted credibility points.

I've been convicted of heresy. Convicted by a mere known extortionist. Read my Trust for details.
IadixDev
Full Member
***
Offline Offline

Activity: 322
Merit: 151


They're tactical


View Profile WWW
March 22, 2017, 03:53:22 PM
Last edit: March 22, 2017, 04:21:27 PM by IadixDev
 #524

Maybe I can appear strong for people to think I'm weak  Roll Eyes

Some have a less subtle approach Cheesy

https://youtu.be/URybdpu_NhI

Not sure who is winning, everyone is cheating anyway :p

Let's do it! I just think maybe "we" will choose other apps as a priority instead of the music app. But I am also okay with doing a music app first too. I love music and would surely use the app myself! We'll brainstorm about it soon...

I will be very enthused about apps I will myself use, because I will have many ideas of how to innovate them. My million user successes in commercial software were the ones I created because I wanted to use them (and saw the market was lacking the specific features/capabilities that I wanted). For a music app, I want it to be my music player and also keep track of all my music so I never have to hassle with backing up my music, transcoding formats, etc... And I don't want it to tie me into any walled gardens, no adware, no bullshit, etc..

Yeah we'll be "cheating" also but copying for example the way others are already cheating:

http://www.listentoyoutube.com/ (get my idea yet?)

Have you ever noticed you can overlay and obscure the SoundCloud HTML player buttons? Instant library of songs without the 10,000 plays per day app limit. I have some (clever or out-of-the-box thinking?) ideas we can discuss.

Remember our apps need a social component. There needs to be sharing (likes), commenting, etc..

But the more important point is how you the app creator will get paid. And how much money you will be able to earn creating apps.

All you need to do is make apps that people will like to tell their Facebook friends to use.

I think from there, you can start to deduce what I have in mind, but I hold off on the details while I try to finish up getting the preliminaries of the scalable, decentralized blockchain for OpenShare into code. Then we will start to talk in earnest about collaborating, launching, and making a lot of money while shocking this community with our STICKY, VERIFIED adoption rate (into the millions I expect).

I am excited because now I see that real capable app developers are contacting me. So I only need to convince you that my blockchain technology and onboarding strategy is viable, then we can go change the world.

I am on the 2-drug treatment now. Let's hope my energy is ready to roll now. I need to go finish up the proposed changes for the (optional proposed app) programming language and see if it is realistic for me to write the transpiler in a matter of weeks. I am going to try to wrap that up after doing my forum communication. Are you interested in helping on the transpiler? Should I start a Github project for that? Which name is best: Async, Copute, Lucid, or Next (Nxt)?

All app programmers please keep in touch. I want to make you wealthy. Let's have fun also.

I'll make an official thread and Slack once I get through the preliminaries and am confident that my production (in code!) is back up to normal.

Yeah, I think the music business needs to make the Uber-style switch to p2p, with all the thousands of starving artists who will never get signed anywhere. And even the ones who are signed, I'm not sure they are very happy with it; they are locked in and can't go to the toilet without asking their producer. For the ones who are produced, the people to convince are more their producers, and the economy has not really been blooming in this sector for a while Smiley I can say with a fair amount of confidence that many are looking for new solutions, markets and income. And the music industry has not been good with the internet since the beginning. And no producer? For an artist that's the dream Smiley

We also need to think about a good back office with stats & meta info, and a good way to remunerate participants: node owners, artists, developers.

For the language, I have mostly the low-level part on the git: the module system, the dynamic tree with reference pointers, the http/rpc server, most of the C runtime, and the vector lib for the raytracing. The module methods can then be called from a js application via http/json/rpc.

For the high-level part to make applications, I don't have much theory Smiley I would go for something very basic that can call module functions and handle events with json-like data types, in a spirit akin to BASIC, with simple one-line statements that are easy to evaluate. What I'm really looking for is good OOP encapsulation in modules with an interface.

Js, as far as I know, is weak for defining safe object typing and interfaces, but if I understand well, your idea is to have a source language with safe typing and interface definitions, transpiling to js to get equivalent code.

But for me, I would still rather stick to a kernel in pure C that can expose module interfaces to js/html5 apps, and have the node/server side in C rather than node.js. Much better for performance, memory use, and portability. The only thing it is really missing now is a good html templating system to generate html5/js pages based on input data. For this I'm not sure whether the best is browser-side generated html like angular js, something that can generate preformatted html from the node, or xslt, which can be done by both server & browser. Or something entirely different to define the UI, event handling and rpc calls in html5/js.

That would mean part of the application is in C modules with the framework, and part is in js/html5 and can call those modules. But having another source language to transpile this part of the app, with the UI and module interfaces? Why not.

But doing all the binary data, crypto & transcoding parts in js  Cry Cry Cry

jbreher
Legendary
*
Offline Offline

Activity: 3038
Merit: 1660


lose: unfind ... loose: untight


View Profile
March 22, 2017, 04:20:23 PM
 #525

BUcoin crashed last night to 200 nodes after the new bug was discovered,

Yes, a new bug was discovered by someone desirous of performing a DoS attack upon BU. This bug was exploited by such attackers to cause a number of nodes to crash. While a temporary inconvenience, we welcome this assistance in hardening the BU system before flag day.

Quote
and then the developers had the great idea of releasing a closed source patch.

Well, not exactly. I mean, if your definition of 'closed source' is delayed release of the source, I guess so. While I was not part of the decision process, it seemed to be predicated on the fact that immediate release would reveal the precise nature of the vulnerability to additional attackers. It was done to create a window for the patch to propagate.

Whether or not that was the proper course of action is something that can be debated. Indeed, it is still being debated within the BU community. But your characterization of 'they've gone closed source' is beyond the pale.

Anyone with a campaign ad in their signature -- for an organization with which they are not otherwise affiliated -- is automatically deducted credibility points.

I've been convicted of heresy. Convicted by a mere known extortionist. Read my Trust for details.
iamnotback
Sr. Member
****
Offline Offline

Activity: 336
Merit: 265



View Profile
March 22, 2017, 04:21:30 PM
 #526

If you wanted to upend Core, then you should have more competent people who would have advised you that unbounded block size doesn't have an equilibrium.

If you have successfully demonstrated  that unbounded block size cannot reach an equilibrium outside a majority-collusion environment, I have missed it.

Yes you missed it. ...
Once we do model differing orphan rates for different miners, then the optimal strategies for mining come into play. And if you work out the game theory of that, you realize that collusion and centralization are the only possible outcome.

So you seem to be acknowledging that I am correct above...

There is no outcome that is outside a majority-collusion environment. I explained that unbounded block size cannot reach an equilibrium outside a majority-collusion environment. You said you must have missed it, and I explained that you did miss it.

You are conflating the fact that the equilibrium is reached inside majority-collusion with your thought that I haven't stated what occurs outside of majority-collusion. But I have stated what happens: it always devolves to majority-collusion.

You are making a similar error as those two others did upthread. A 51% (or even 33% selfish mining) attack is not a change in protocol. In other words, in BTC the miners can't make huge blocks, because it violates the protocol limit of 1MB.

Collusion is collusion, irrespective of the protocol. Nakamoto consensus is only possible when a majority of participants are 'honest' as per the whitepaper terminology. Unbounded blocks does nothing to change this.

Conflation is conflation. (Meaning you apparently entirely missed the relevance of the point)

I'm trying to be respectful, but please don't waste my time. You see I have too many messages to reply to.

And as a practical matter, Bitcoin operated just fine for multiple halvings with no practical bound on blocksize.

There was a minimum advised fee, and there were pools doing anti-spam; for example, I think I've read that Luke Jr's pool rejected dust transactions.

Yes, minimum advised fee. 'Advised', as not encoded within the protocol. The fact that this worked up to the point that the production quota was finally persistently hit forms an existence proof that the system can work. The fact that it did work may or may not have something to do with all players having beneficial intent, but there it is. Indeed a populist sentiment includes the notion that it is against the best interests of all participants to do anything that kills the system. Which probably explains why our past known-majority miner (Discus Fish?) turned back from their position of mining majority without ever forming an attack from their assuredly-successful posture.

Everyone was incentivized by the fact that once the 1MB limit was reached, the destruction of Bitcoin would ensue, as is currently happening with the battle between the miner and codester cartels.

It was the 1MB protocol limit that provided the barrier that everyone had to try to swim far from. Also, miners had an incentive to get a minimum level of fees, and they didn't yet have enough centralization to extract higher fees. And the decentralized miners at that time before ASICs also had an incentive to keep spam low, since, as I explained to @dinofelis today, it wasn't a fully connected mesh, so propagation time was a bigger deal than he realized. Also the decentralization at that time, when people were still mining on GPUs, meant there was more of an altruistically driven Nash equilibrium than now.

That's not at all like the cut-throat, big-money economics situation now. As @dinofelis pointed out, the only altruism (and internal discord) from miners now is probably all faked to make us think there isn't a cartel.
jbreher
Legendary
*
Offline Offline

Activity: 3038
Merit: 1660


lose: unfind ... loose: untight


View Profile
March 22, 2017, 04:33:31 PM
 #527

You are conflating the fact that the equilibrium is reached inside majority-collusion with your thought that I haven't stated what occurs outside of majority-collusion. But I have stated what happens: it always devolves to majority-collusion.

You have stated that, yes. That was my assertion. My point is that this outcome is not affected by the cap on maxblocksize.

Quote
Everyone was incentivized by the fact that once the 1MB limit was reached ... It was the 1MB protocol limit that provided the barrier that everyone had to try to swim far from

Justification of this assertion would require explaining away the almost linear annual doubling of the average block size, up until the saturation point.


Anyone with a campaign ad in their signature -- for an organization with which they are not otherwise affiliated -- is automatically deducted credibility points.

I've been convicted of heresy. Convicted by a mere known extortionist. Read my Trust for details.
iamnotback
Sr. Member
****
Offline Offline

Activity: 336
Merit: 265



View Profile
March 22, 2017, 05:08:54 PM
 #528

Readers please click this quote to see if he replied over there. No need to put that discussion in this thread:

But I have stated what happens: it always devolves to majority-collusion.

My point is that this outcome is not affected by the cap on maxblocksize.

Might be true. I've been pondering that today. If BU can successfully attack the token holders, then that should be proof that even small 1MB blocks don't stop mining cartelization/centralization.

In which case, larger blocks along with LN would provide more choice to users. As I wrote upthread, Core has masterfully fooled some people into believing they are rational with no ulterior motive.

But then I don't understand how Core could have ever expected to succeed, since the miners would naturally fork the protocol and increase the block size so they can get more revenue. Core's apparent goal of forcing users to use LN appears to be impossible to enforce (mining will always be a cartel and the cartel will not agree to be stripped of revenue).

But on the flip side, the mining cartel ostensibly doesn't want to allow off chain LN scaling (which is why they won't fix malleability) because that would compete with the miners for on chain fees.

Some have argued that enabling LN would increase overall usership and thus increase onchain transaction fee revenue.

So if BU was sincere, they could demonstrate it by including the necessary fixes to enable LN in their planned HF. Because otherwise we can possibly look forward to a monopoly on block size, and thus miners squeezing the market for maximum fees, inhibiting the scaling of Bitcoin.

(I'm duplicating this to two other places, because readers can't go clicking links off to so many other places to find the key points, but I am linking it here, so you can decide to reply just here if you want. Your decision obviously. I'd like to finish this discussion asap if possible.)
alkan
Full Member
***
Offline Offline

Activity: 149
Merit: 103


View Profile
March 22, 2017, 06:17:11 PM
Last edit: March 22, 2017, 06:36:20 PM by alkan
 #529

Yes, this is with block rewards constant.  Tail emission.  But in the case of rewards proportional to block length (fees), you have to multiply A's revenues with the fact that his blocks bring in more money.  He has a lower percentage of blocks on the chain, but these blocks bring him more rewards as they are bigger.

So if his big blocks bring him 20% more income per block, this is neutral.

Well, I'm not sure if we are on the same page.

Of course, you can include more transactions and collect more fees by building bigger blocks, but that doesn't solve the fundamental problem that A's fee revenues must be multiplied by his blockchain production (fraction of the chain built by A), rather than by his rate of successful blocks (ratio of non-orphaned blocks).

Let me come back to my example of the three miners A, B and C, all with a hashrate of 1/3 and an orphan rate of 0.01.

Now, assume that A and B stick to a block size of 1MB, while C tries to find the block size that maximizes his profits.
C can do so by gradually increasing the block size as long as the higher orphan rate (resulting in a lower production share) is outweighed by the higher fees. As the orphan rate follows an exponential distribution and the marginal fee income tends to decrease, there will be an equilibrium where marginal revenue = marginal cost. Let's assume that C's profits are maximized with an orphan rate of 0.2, so that his blockchain production share will be 0.288, while that of A and B will be 0.356 each.

The fundamental problem arises once A and B also start using a variable block size to maximize their profits. By doing so (i.e. by increasing their own block sizes) they will not only decrease their own blockchain production shares due to their higher orphan rates, but at the same time C's blockchain production share will grow and thus destroy his individual market equilibrium. To reach equilibrium again, C will now have to increase his block size once more to collect the same fees as before. So his optimal orphan rate will be more than 0.2. This, in turn, would place A and B in disequilibrium, and they might then increase their block sizes even more, etc. It will all end up in a doom loop.

It seems that the increasing total block space supply combined with the (probably) finite demand for transactions could make the loop converge at some upper limit. However, this equilibrium would be unstable. When a miner suddenly decreases his block size, all the others would follow suit to reach their individual market equilibrium again. The miners might even end up at an unstable lower equilibrium point.

As far as I can see, no stable market equilibrium can be reached by all miners at the same time. For the mining market has the peculiarity that whenever a miner increases his own supply, the supply of all the others will decrease. In contrast to regular markets, where the players only compete to meet the demand, Bitcoin miners also compete to increase their own supply at the cost of their competitors, since total block production remains capped even with unlimited block size.
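The production-share numbers above can be checked with a quick sketch (illustrative code, not from this thread): each miner's share of the final chain is his hashrate weighted by his non-orphaned fraction, normalized over all miners.

```python
# Illustrative sketch of the model above: miner i has hashrate h_i and
# orphan rate o_i; his fraction of the final chain is
# h_i * (1 - o_i), normalized over all miners.

def production_shares(hashrates, orphan_rates):
    """Fraction of the non-orphaned chain each miner contributes."""
    effective = [h * (1 - o) for h, o in zip(hashrates, orphan_rates)]
    total = sum(effective)
    return [e / total for e in effective]

# A, B, C each with hashrate 1/3; A and B orphan 1% of their blocks,
# big-block miner C orphans 20%.
shares = production_shares([1/3, 1/3, 1/3], [0.01, 0.01, 0.2])
print([round(s, 3) for s in shares])  # [0.356, 0.356, 0.288]
```

This reproduces the 0.288 share for C and 0.356 each for A and B quoted in the example.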
iamnotback
Sr. Member
****
Offline Offline

Activity: 336
Merit: 265



View Profile
March 22, 2017, 07:35:25 PM
 #530

@alkan & @dinofelis:

So why is BU preaching equilibrium with unlimited blocks as the most prominent item in its FAQ?

Because they want to sustain the illusion of decentralization. Their supporting white paper is meaningless and doesn't model anything that can exist in reality (it models a perfectly uniform distribution of propagation and hashrate where all miners experience the same orphan rate, but if that were the case then no miner would make any profit because in that impossible scenario marginal cost = lowest cost).
dinofelis
Hero Member
*****
Offline Offline

Activity: 770
Merit: 629


View Profile
March 22, 2017, 09:15:56 PM
 #531

Yes, this is with block rewards constant.  Tail emission.  But in the case of rewards proportional to block length (fees), you have to multiply A's revenues with the fact that his blocks bring in more money.  He has a lower percentage of blocks on the chain, but these blocks bring him more rewards as they are bigger.

So if his big blocks bring him 20% more income per block, this is neutral.

Well, I'm not sure if we are on the same page.

Of course, you can include more transactions and collect more fees by building bigger blocks, but that doesn't solve the fundamental problem that A's fee revenues must be multiplied by his blockchain production (fraction of the chain built by A), rather than by his rate of successful blocks (ratio of non-orphaned blocks).

Let me come back to my example of the three miners A, B and C, all with a hashrate of 1/3 and an orphan rate of 0.01.

Now, assume that A and B stick to a block size of 1MB, while C tries to find the block size that maximizes his profits.
C can do so by gradually increasing the block size as long as the higher orphan rate (resulting in a lower production share) is outweighed by the higher fees. As the orphan rate follows an exponential distribution and the marginal fee income tends to decrease, there will be an equilibrium where marginal revenue = marginal cost. Let's assume that C's profits are maximized with an orphan rate of 0.2, so that his blockchain production share will be 0.288, while that of A and B will be 0.356 each.


I think that's the point where you need to stop, and why I think that all of this doesn't make much sense.

I start from the idea that miners have incentives to be on a good backbone network, directly between them, and do not wait for the P2P network to bring a block to them.  In other words, the 10 or 20 big mining pools are on a rather fully meshed, high speed backbone.

I already explained why, because that diminishes their orphan rate, and they are mutually inspired to improve their network links.

If you accept that as a given, then it is impossible to start considering orphan rates that become important due to network and block size problems.  There is of course always a given orphan rate, but that orphan rate must be small.

If your (relative) orphan rate is, say, 1%, it means that your income is multiplied by 0.99.  If your block size doubles, and your relative orphan rate doubles because of that, you multiply it with 0.98.

By the time that "doubling your blocks" starts to be OFFSET by the diminishing of your income because of orphaning, you see that the orphaning rate must be HUGE.  Not 2 or 4%, but 50% or so.  

Well, that is impossible.  Because if you orphan 50% of your blocks on the chain, it means that you even orphan more than 50% of your successful blocks, which means that you don't even reach your other miners over the back bone with the blocks.

If that is true, nobody else can download the block chain.  It is being produced at a rate that is almost saturating a back bone.  So no one with a lesser link can ever download the block chain and keep up to date.  

So by the time that this problem of orphaning blocks because of their size starts influencing the income of miners, the block chain is growing so fast that NOBODY CAN DOWNLOAD IT.

This story is different if miners are random nodes in a P2P network. But they aren't.  They have all interest to invest in strong network links to other miners, exactly because of this orphaning problem.

So in other words, all this theoretical BS over how the orphaning rate offsets the desire for bigger blocks and imposes a natural equilibrium is meaningless, because if ever such an equilibrium would theoretically exist, it occurs for such big blocks that nobody can download the block chain apart from the miners themselves.
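The arithmetic in this argument can be sketched as follows (a back-of-envelope illustration with invented numbers, not code from the thread): expected income is fees times the non-orphaned fraction, so doubling block size doubles fees but barely moves the orphan multiplier.

```python
# Sketch of the argument above: expected income = fees * (1 - orphan).
# Doubling block size doubles fees but only nudges the multiplier,
# so orphaning must reach roughly 50% before bigger blocks stop paying.

def expected_income(fees, orphan_rate):
    return fees * (1 - orphan_rate)

small = expected_income(1.0, 0.01)   # baseline block, 1% orphan rate
double = expected_income(2.0, 0.02)  # doubled size, doubled orphan rate
print(small, double)                 # doubling still wins easily

# Orphan rate at which the doubled block merely breaks even:
# 2 * (1 - o) = small  =>  o = 1 - small / 2
print(1 - small / 2.0)  # ~0.505, i.e. about 50% as stated above
```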

Yes, you can say that an "optimum" is reached when the network stops downloading the chain, and nothing works any more.  True, in a certain way Smiley

EDIT: I hadn't understood something in your post, but now I see what you are getting at. 

If we consider *really small* fees, then for an extra included transaction, that extra delay on the network will mean an extra probability of the block being orphaned, putting in jeopardy the whole income.  This even happens with small blocks.

Yes, this will simply result in cutting off the very lowest fees of the fee distribution, which will remain forever in the mempool.

I don't think that this has much to do with "optimal size" ; it only means that one doesn't include the cheapest transactions below a given fee threshold, because their extra transmission time penalises the whole income while not contributing enough to it.
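The fee threshold described here can be sketched as a simple inclusion rule (all numbers invented for illustration, not from the thread): an extra transaction adds a small extra orphan probability eps, risking the block's whole revenue R, so it is only worth including if its fee exceeds eps * R.

```python
# Hedged sketch of the fee-threshold idea above: including one more
# transaction adds extra orphan probability eps, risking the block's
# total revenue R, so include the tx only if fee > eps * R.

def worth_including(fee, eps, block_revenue):
    """Include a tx iff its fee beats the extra expected orphan loss."""
    return fee > eps * block_revenue

R = 13.0     # hypothetical total block revenue (subsidy + fees)
eps = 1e-5   # hypothetical extra orphan probability per marginal tx

print(worth_including(0.0005, eps, R))  # True: 0.0005 > 0.00013
print(worth_including(0.0001, eps, R))  # False: below the threshold
```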


That said, a market doesn't need to come to "equilibrium".  An erratically chaotic market dynamics can be fun too Smiley
iamnotback
Sr. Member
****
Offline Offline

Activity: 336
Merit: 265



View Profile
March 23, 2017, 02:38:40 PM
Last edit: March 23, 2017, 09:33:45 PM by iamnotback
 #532

I guess we are going to need to move our discussion off of BCT since the mod apparently can't comprehend the importance of the following post which he deleted without my permission!

The mod can't seem to understand that by highlighting the parts I wanted to emphasize, I can make my reply more concise. Programmers understand that I am communicating a lot of information with the +1 and the yellow highlighting.

The mod needs to stop deleting that which he doesn't understand.

Perhaps it is because of that @mindtrust user upthread who complained that my posts are walls of text and too many posts. Well then don't read the damn thread, since the thread was starting to talk about my project, dufus. And I have a lot to say because of that fact.

Quote from: Bitcoin Forum
A reply of yours, quoted below, was deleted by a Bitcoin Forum moderator. Posts are most frequently deleted because they are off-topic, though they can also be deleted for other reasons. In the future, please avoid posting things that need to be deleted.

Quote
The good thing about the bitcoin client is that it's very well tested, mature, etc., but it's still not that easy to quickly add new features or encapsulate it into a bigger application.

And I'm perfectly in tune with iamnotback on the need for a programming paradigm switch, and I've already been on this for 2-3 years Smiley

So going back to multithreaded boost C++ like bitcore, with the super high coupling everywhere and all the hard-coded config? I would rather not Cheesy

+1
IadixDev (Full Member; Activity: 322, Merit: 151)
March 23, 2017, 02:56:12 PM
#533

We have a slack channel https://iadix.herokuapp.com/ http://iadix.slack.com

iamnotback (Sr. Member; Activity: 336, Merit: 265)
March 23, 2017, 03:29:21 PM
#534

In case anybody was wondering how I plan to improve decentralized consensus, here is a helicopter view summary:

We actually have to fix the technology of decentralized consensus. Satoshi's work is not yet complete.

If only that could be done.  I have many ideas to *improve* but I'm almost sure that all of it corrupts eventually. This is why no single tech should be dominant for a long time, no uniformity.  Nothing should have a long life time, nothing should have a broad importance and acceptance.

Open source. Open source. The Inverse Commons. I keep telling how to do it.

Miners can do what they are doing because it is not illegal in the protocol. Make it illegal, then open-source the enforcement of the objective law that makes it so: Byzantine fault detection instead of Byzantine agreement.

That is my solution. Now figure out the details.  Tongue

However, I also can't conclude with 100% certainty that there isn't some mode by which centralized control is still attainable.
iamnotback (Sr. Member; Activity: 336, Merit: 265)
March 23, 2017, 07:42:20 PM
#535

So two names that immediately come to mind are... Flow and Fluid

Flow is already taken:

https://flowtype.org/

Fluid is somewhat interesting.
iamnotback (Sr. Member; Activity: 336, Merit: 265)
March 23, 2017, 08:22:55 PM
Last edit: March 23, 2017, 09:31:52 PM by iamnotback
#536

I agree with Shelby that music may not be the best place to start. There is too much entrenched establishment thinking in that domain by both consumers and especially content creators (although it has gotten much better in recent years). Artists are still locked into old ways and I don't believe it will be the easiest market to "attack" first.

Yes, the indie market does offer some inroads, and in time I believe that will be the place to dig in and make our mark, but I don't believe it is the best place to start in the grand scheme of things. I would tend to think a market to focus on first would be one that already has its roots in upending the traditional media/content distribution status quo. The primary ones that come to mind are podcasting/vlogging (and to a lesser extent blogging). These industries are built on the idea of creators getting their content directly into the hands of users with minimal middleman interaction. And the content creators are always looking for new and better ways to monetize their offerings.

Unlike the music industry, where there are a plethora of preconceived ideas and biases holding people back, the podcast/vlog sphere has very little of that. They want innovative distribution ideas, that is why they came into existence in the first place. They want easier/better ways to spread their "art".

Interesting idea.

With my plan, app creators are in charge of their own destiny. So they decide which marketing sectors they want to develop for. But we can brainstorm now so hopefully we have some well thought out apps to launch with, so that the synergy is good for all of us.

If we contrast with Ethereum's model, Ethereum is ostensibly making it very confusing for speculators, because now they have to choose between dozens (or soon 100s) of ICO tokens. Which ones do they invest in? The cognitive load is too high. It ends up diluting speculator focus, and then as you see Byteball gets all the votes (even though I've already explained that Byteball has some serious flaws).

Although Ethereum is getting a lot of experimentation now and we will be behind initially, possibly we can overtake them because, for example, the MobileGo smart contract will be taking 10% fees on all game transactions (e.g. game purchases). In OpenShare, the game app would keep all of its revenues on sales, plus it would gain additional massive revenues from my planned onboarding monetization model (something similar to Steem's but not based on voting). Tangentially, there are also other crucial differences, such as Ethereum's consensus algorithm not appearing to scale decentralized (and I don't think Casper will either, nor do I believe it is sound, nor do I think Raiden will be the complete solution in all cases; the devil is in the details on all of that).

All debasement of OpenShare is going towards onboarding, not to an ICO.

You app developers are in effect being issued the money supply in exchange for your onboarding performance, in addition to any other monetization you want to add to your app (e.g. ad funded, in-game upgrades, etc). The OpenShare token will have sub-second confirmations, and you can also post data to the blockchain with sub-second confirmations and practically no limit on TPS or bandwidth.

But I want to emphasize that focusing on the power users and evangelist users first is the most effective. They earn more and bring more users into the onboarding process. The other users want to emulate them and be as great and prosperous as the power users are.

Every transaction will require burning a small transaction fee, so your users get some tokens up front so they can use the system (and then they earn some tokens from using the system, so they can continue using it). The money supply is perpetually deflationary, with unbounded divisibility (although it is initially inflationary due to all the onboarding, but this onboarding is extremely valuable and will pay us back in terms of a huge marketcap).
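The inflation-then-deflation dynamic described above can be sketched in a few lines. This is a toy model with made-up numbers, not the actual OpenShare parameters (which were never specified):

```typescript
// Toy sketch of the mechanic described in the post: onboarding rewards
// inflate the supply early, while per-transaction burns make it
// deflationary once onboarding tapers off. All figures are invented
// for illustration only.

let supply = 1_000_000;          // hypothetical starting supply
const BURN_PER_TX = 0.001;       // hypothetical tokens burned per transaction

// Advance one period: mint onboarding rewards, burn transaction fees.
function simulateYear(onboardingReward: number, txPerYear: number): number {
  supply += onboardingReward;
  supply -= txPerYear * BURN_PER_TX;
  return supply;
}

// Early on, heavy onboarding outpaces the burns (net inflation):
console.log(simulateYear(50_000, 10_000_000));  // mints 50k, burns 10k
// Later, onboarding tapers and burns dominate (net deflation):
console.log(simulateYear(5_000, 40_000_000));   // mints 5k, burns 40k
```

Whether the crossover ever happens depends entirely on transaction volume growth versus the onboarding schedule, which the post leaves open.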

Tokens that are rewarded will be non-fungibly locked up for a period of time in order to encourage users to use their tokens and not dump them to whales. This will also keep the float small, so that the price is always going up.

Speculators are going to have to fight each other to get into this early. The longer they wait, the higher the price goes. Well at least that is the plan, given the small float relative to the marketcap. But we don't want the Zcash experience where float is too small at the start and then price continually declines as the float increases. But this can be dealt with by selling tokens to raise more funds for development as needed. Maybe even provide advances (against future revenues) to some app devs who need funding upfront.

Note we will not have the crazy dilution of the original Steem design. This will be a very small rate of dilution compared to the impact on the marketcap and price that it should cause. Well that is the theory any way. Remember blockchains and their tokens are just a computer game right.  Wink

I'm not particularly interested in liquidating, it's a computer game after all.

Note I have been purposely just a little bit vague and elusive, because I am not ready to have copycat projects of my design just yet.


I believe the largest hurdle to overcome in this market is the idea that consumers have always gotten these things "for free" and there may be some resistance to now paying for them. But the whole idea of a micropayment social media platform is that the consumers wouldn't even really feel the brunt of paying anyway since the transactions would be so small. So I don't think this will be as difficult to overcome as it initially appears.

Consumers are going to be paying with tokens they received for free. And you will also monetize your app from onboarding, having nothing to do with whether the consumer spends or not. So you have two ways to monetize. As for teeny-tiny microtransactions, I don't know if that is a good model. I doubt it. The system design will be able to do it, but I think you'll be better off with the onboarding monetization, perhaps coupled with some monetary transactions for upgraded features or higher-quality versions of content downloads (e.g. 256 kbps MP3 song upgrades from 128 kbps freeware) or what have you. One advantage is the instant transactions: if the user wants an in-game upgrade (i.e. to buy a more powerful gun), they can get it instantly and not have to wait. I suggest readers study up on gamification monetization strategies if they aren't already aware of the Farmville model and what has come since.

Now I hope you start to see why you should be working on an app for OpenShare so you are one of the lucky apps at launch time. That is the best way to get tokens when they are low priced.

Speculators: find some devs and hire them to make apps. That is the way to invest in this early. Otherwise you can buy coins on the open market as they trickle out from the onboarding.

What do you think, guys? Interesting?
alkan (Full Member; Activity: 149, Merit: 103)
March 23, 2017, 09:27:46 PM
#537

If we consider *really small* fees, then for an extra included transaction, that extra delay on the network will mean an extra probability of the block being orphaned, putting in jeopardy the whole income.  This even happens with small blocks.

Yes, that's probably the way Peter R's model was intended to work (though it doesn't even work out in theory).
Furthermore, as long as you have a block reward that is significantly higher than the total fees per block, even a small increase of the orphaning risk might deter a miner from including every transaction.

Yes, this will simply result in cutting off the very lowest fees of the fee distribution, which will remain forever in the mempool.

I don't think that this has much to do with "optimal size" ; it only means that one doesn't include the cheapest transactions below a given fee threshold, because their extra transmission time penalises the whole income while not contributing enough to it.

I think that's the whole point of it. If the model worked as intended, the market would agree on a minimal fee for a transaction to get included in a block. It's not about finding the "optimal size" but the optimal price.
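The marginal-inclusion argument can be made concrete with a toy calculation. This is an illustrative sketch of the reasoning in this exchange, not Peter R's actual model; the block reward, delays, and the exponential orphan-risk assumption are all chosen for illustration:

```typescript
// Toy model: including one more transaction adds propagation delay,
// which raises the orphaning risk; a rational miner includes it only
// if its fee compensates for the extra risk to the whole block income.

const BLOCK_REWARD = 12.5;    // BTC (assumed subsidy for illustration)
const BLOCK_INTERVAL = 600;   // seconds, average time between blocks

// Probability the block is orphaned given `delay` seconds of extra
// propagation time (competing block arrivals modeled as exponential).
function orphanProb(delay: number): number {
  return 1 - Math.exp(-delay / BLOCK_INTERVAL);
}

// Expected revenue of a block carrying `fees` BTC with a given delay.
function expectedRevenue(fees: number, delay: number): number {
  return (BLOCK_REWARD + fees) * (1 - orphanProb(delay));
}

// Minimal fee for a transaction adding `extraDelay` seconds: the fee
// at which expected revenue is unchanged by including it.
function minFee(currentFees: number, currentDelay: number, extraDelay: number): number {
  const before = expectedRevenue(currentFees, currentDelay);
  // Solve (R + F + f) * (1 - p(T + dT)) = (R + F) * (1 - p(T)) for f.
  return before / (1 - orphanProb(currentDelay + extraDelay)) - (BLOCK_REWARD + currentFees);
}

// With 1 BTC of fees in the block and 2 s of current delay, a
// transaction adding 10 ms of propagation needs roughly:
console.log(minFee(1, 2, 0.01).toFixed(6)); // ~0.000225 BTC
```

Note the threshold scales with the total block income, so while the subsidy dominates fees, the cutoff fee is essentially set by the reward, which is alkan's point above about the reward deterring inclusion.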

That said, a market doesn't need to come to "equilibrium".  An erratically chaotic market dynamics can be fun too Smiley

It would be indeed fun to see the majority of nodes switching to Bitcoin Unlimited...
iamnotback (Sr. Member; Activity: 336, Merit: 265)
March 23, 2017, 09:56:08 PM
#538

For the language, I mostly have the low-level part on the git, with the module system, the dynamic tree with reference pointers, the http/rpc server, most of the C runtime, and the vector lib for the raytracing. The module methods can then be called from a js application via http/json/rpc.

For the high-level part to make applications, I don't have much theory Smiley I would go for something very basic that can call module functions and handle events with json-like data types, in a spirit akin to BASIC with simple one-line statements that are easy to evaluate; charm is really the kind of thing I'm looking at to get good OOP encapsulation in modules with an interface.

Js, as far as I know, is weak for defining good object subtyping and interfaces, but if I understand well, your idea is to have a source language with good subtyping and interface definition, transpiling to js to get equivalent code.

But for me, I would still rather stick to a kernel in pure C that can expose module interfaces to js/html5 apps, and have the node/server side in C rather than node.js. Much better for performance, memory use, and portability. The only thing it really misses now is a good html templating system to generate html5/js pages based on input data. For this I'm not sure whether the best is browser-side generated html like AngularJS, something that can generate preformatted html from the node, or XSLT, which can be done by both server and browser. Or something entirely different to define UI, event handling, and rpc calls in html5/js.

That would arrange things so that part of the application is in C modules with the framework, and part is in js/html5 and can call those modules. But having another source language to transpile that part of the app with the UI and module interfaces, why not.

But all the part with binary data, crypto & transcoding in js  Cry Cry Cry

We will have to dig into the details in an interactive slack soon. I am thinking apps should be mostly client-side code and they should run both in the browser and as an installed app on mobile. App devs could use any languages and frameworks they want. I would like to try over time to standardize a platform so the app dev only has to maintain one body of code and it runs everywhere. We'll even eventually have our own superapp which acts as an app installer and an app store (all decentralized, no censorship/fees possible).

But I need to work with some app devs to see what works well and how we can stage it. Might have to take it in incremental steps and our first goal is to get to launch asap.

Any way, that is why a strongly typed language that transpiles to JavaScript is something I am considering. C can also be compiled to JavaScript with Emscripten (as I think you've alluded to above).
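To illustrate what a typed transpile-to-JS language buys over plain JavaScript here (the complaint above about weak subtyping and interfaces), a small TypeScript sketch; the module/RPC names are hypothetical, invented for the example:

```typescript
// Interfaces and structural subtyping in a typed language that
// compiles to plain JavaScript. The types are checked at compile
// time and erased in the emitted JS.

interface RpcRequest {
  method: string;
  params: Record<string, unknown>;
}

// Structurally a subtype of RpcRequest: it has all the same fields
// plus one more, so it is accepted wherever RpcRequest is expected.
interface SignedRpcRequest extends RpcRequest {
  signature: string;
}

// A dispatcher that only depends on the RpcRequest interface, as a
// C-module RPC bridge might.
function dispatch(req: RpcRequest): string {
  return JSON.stringify({ jsonrpc: "2.0", method: req.method, params: req.params });
}

const signed: SignedRpcRequest = {
  method: "node.getStatus",     // hypothetical module method name
  params: { verbose: true },
  signature: "deadbeef",
};

console.log(dispatch(signed)); // subtype accepted where the supertype is expected
```

The same program in untyped JS would run identically; the point is that mismatched shapes (a missing `method`, a misspelled param) fail at compile time instead of at the RPC boundary.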

Server-side we probably want C/C++ code, but perhaps Node.js is okay for the proof-of-concept stage. Remember we are trying to get to market asap. Then there will be funds to throw into development to refine everything.

Let's talk this out on a slack soon.
greentea (Legendary; Activity: 1418, Merit: 1002)
March 23, 2017, 10:00:05 PM
#539

Byteball could be the killer of BTC. The end of BTC's dominant position.

4. Afaics, his distribution model totally ruined any chance for a funding model to onboard the app developers, content providers, and users. This is really the killer mistake.


Expand on this. I wasn't a big fan of his distribution model either; having large ICOs hold the lion's share makes no sense. Do these other coins plan on building on top of BB?

But he still has 85%+ yet to be distributed ...

iamnotback (Sr. Member; Activity: 336, Merit: 265)
March 23, 2017, 10:04:20 PM
Last edit: March 24, 2017, 12:03:41 AM by iamnotback
#540

Expand on this. I wasn't a big fan of his distribution model either; having large ICOs hold the lion's share makes no sense.

See my prior reply to 7thKingdom. They can't use the distribution to fund app devs, content producers, and power users in order to drive the ecosystem.

But he still has 85%+ yet to be distributed ...

Okay that is a wildcard. I've heard this mentioned but never verified it. What is his commitment on that 85%?

When I was in Byteball's official thread last year to discuss, I thought he had committed to keep only a small premine and distribute all the tokens to BTC holders. What changed?

Edit: it is right there on the website:

Fair initial distribution

98% of all bytes and blackbytes will be distributed to current Bitcoin holders who bother to prove their Bitcoin balances during at least one of distribution rounds. No investment is required, you need just to link your Bitcoin and Byteball addresses by making a small BTC payment or by signing a message with your Bitcoin address. Then the number of bytes and blackbytes you receive in each round will be proportional to the balance of your Bitcoin address in the snapshot block of that round.

In the first round, over 70,000 BTC was linked, and we distributed 10% of all bytes and blackbytes according to linked Bitcoin balances in the first Bitcoin block timestamped Dec 25, 2016 (Christmas block). In the second and third rounds, we distributed another 3.7%, over 120,000 BTC was linked.

In the 4th round, which is scheduled for the full moon of April (April 11, 2017 at 06:08 UTC), we'll distribute 62.5 MB for each 1 BTC of linked balance and 0.1 new byte for each 1 byte you already hold. To participate, install the wallet and chat with the Transition Bot that will help you link your Bitcoin and Byteball addresses. Track linking progress at transition.byteball.org.

So there is another dilution of Byteball coming next month. But the current issued supply of 137,000 GB (out of 1,000,000 GB total supply that can be issued) is equivalent (at 4th round offer) to 2 million linked BTC that registered previously (in prior rounds at more favorable terms). So there would need to be serious interest from a large portion of BTC hodlers but they are only awarding $4 of Byteball (at current exchange price) for each 1 BTC linked. So I doubt many BTC hodlers will even bother to waste their time. So apparently the remainder of the supply is never going to get issued unless Tony breaks his commitment on how it should be issued.

The first round of Byteball was awarding 1 GB for each 0.7 BTC. The 4th round is only awarding 0.0625 GB for each 1 BTC. So apparently the distribution is nearly completed.

So the developer/creator of Byteball (named Tony) was very sneaky. He said his premine for himself was only 2%, but in effect it will be more than 10% (currently 14.6%) because less than 20% of the total planned supply will ever be issued.


Edit: "1%" (which is really 6 - 7%) will be given to up to 100m user wallet installs, at 100KB each, so roughly $0.0067 per wallet install at current exchange rate:

https://bitcointalk.org/index.php?topic=1839699.msg18309534#msg18309534

What is not clear is whether the 1% referred to is only of issue supply or of maximum planned supply, as I explained in more detail at the above link.

Note I don't think $0.0067 per wallet install (even if the price increases by 10X or even 100X) is going to incentivize any users to install wallets.
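The figures in the last two posts are internally consistent, which is easy to check. The round rates, the 137,000 GB issued figure, the 2% premine claim, and the "$4 per linked BTC" estimate are all taken from the thread; only the arithmetic is mine:

```typescript
// Sanity-checking the Byteball distribution arithmetic quoted above.

const TOTAL_SUPPLY_GB = 1_000_000;   // total bytes that can ever be issued, in GB
const ISSUED_GB = 137_000;           // issued so far, per the post
const ROUND4_GB_PER_BTC = 0.0625;    // 4th round: 62.5 MB per 1 BTC linked

// The issued supply expressed in 4th-round terms ("equivalent to
// 2 million linked BTC"):
const equivalentBtc = ISSUED_GB / ROUND4_GB_PER_BTC;
console.log(equivalentBtc);          // 2192000, i.e. ~2.2 million BTC

// A premine of 2% of total planned supply, as a share of what has
// actually been issued ("currently 14.6%"):
const premineShare = (0.02 * TOTAL_SUPPLY_GB) / ISSUED_GB;
console.log((premineShare * 100).toFixed(1)); // "14.6"

// Implied value of the 100 KB wallet-install reward, at $4 of bytes
// per 1 BTC linked in round 4:
const usdPerGb = 4 / ROUND4_GB_PER_BTC;            // $64 per GB
const usdPerInstall = usdPerGb * 100_000 / 1e9;    // 100 KB = 1e-4 GB
console.log(usdPerInstall.toFixed(4));             // "0.0064", close to the $0.0067 quoted
```

The small gap between $0.0064 and the quoted $0.0067 is just the exchange rate moving; the structure of the claims checks out.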