Bitcoin Forum
June 28, 2024, 07:06:01 AM *
News: Latest Bitcoin Core release: 27.0 [Torrent]
 
Author Topic: Do you think "iamnotback" really has the" Bitcoin killer"?  (Read 79922 times)
IadixDev
Full Member
***
Offline Offline

Activity: 322
Merit: 151


They're tactical


View Profile WWW
April 04, 2017, 11:11:30 AM
 #701

Bitcoin is total 666  Grin Grin

Fatoshi
Sr. Member
****
Offline Offline

Activity: 672
Merit: 251



View Profile
April 04, 2017, 01:08:36 PM
Last edit: April 04, 2017, 01:20:04 PM by Fatoshi
 #702

If you can outdo bitcoin and reddit then holders of such a token must be rich.

Yup. That's the plan.

I'm not joking. I know their weakness.

My significant limitation right now is health, more so than anything else. My liver pain and delirium return when I slack off my exercise, because I was overworking nonstop the past week or two (lost track of the days, and of night and day). Just finished a hard run and sprinting. My liver was really bad the past two days. I am still on these toxic meds for another 14 weeks (but 2 drugs instead of 4 kinds, i.e. half the dose of before).

The main problem with this health condition is I am often just feeling like shit which makes it difficult to get into the focus of mind to code. I guess I am going to try to restart the barbell workouts and see if that might raise my level of health. Very frustrating to have 14 more weeks of meds to go. I need my excellent health immediately.

I am a voracious coder if healthy.


Have you heard of the Wim Hof (The Iceman) method? Supposedly it is amazing at building the inner physical/mental endurance and strength long lost to modern living. It could be just the thing, as it's been proven to improve the immune system as well. I'm gonna try out the basic breathing and cold-shower methods. Good luck finding cold water in the Philippines though. lol


https://www.youtube.com/watch?v=q6XKcsm3dKs


http://www.livingflow.net/wim-hof-method/




iamnotback
Sr. Member
****
Offline Offline

Activity: 336
Merit: 265



View Profile
April 04, 2017, 08:56:36 PM
 #703

For those who need some further explanation on the math I presented which shows why Bitcoin is a winner-take-all evil plan:

I notice you give a formula which doesn't allow for the possibility of a power broker paying 100% of the fees.

Okay, you just proved you are incompetent. When x = 100, that entity is the only miner and transactor, which is the ultimate outcome of the math. Thus an infinite relative profit (x / (100 - x)%) is correct, because there is no other entity at that point.

The math is stating that the entity with the largest x will be more profitable than all others, and thus his x always grows faster than any other entity's.

As for the term "power broker", use any term you prefer such as 'whales' or any term which describes those who issue the transactions which pay the highest fees and thus can fit into the small blocks (because small blocks limit transactions to those with the highest fees).
IadixDev
Full Member
***
Offline Offline

Activity: 322
Merit: 151


They're tactical


View Profile WWW
April 05, 2017, 07:08:28 AM
 #704

Segwit still seems less 666 in some respects, because it's more deterministic and removes some burden from the reward / PoW / emission scheme. But it's maybe 666 in other areas, along the lines of the fractional-reserve issue: the illusion of duplicated BTC off-chain being used interchangeably, while still not always being able to put everything back on-chain at any time. It's not totally equivalent, and I'm not sure they are very honest about all the intended side effects on the network.

iamnotback
Sr. Member
****
Offline Offline

Activity: 336
Merit: 265



View Profile
April 05, 2017, 08:55:57 AM
Last edit: April 05, 2017, 12:02:32 PM by iamnotback
 #705

With a reference counter, all the objects are manipulated via references; it's just like GC.

Refcounting has serious drawbacks.

What ? Smiley

It can have similar unpredictable surges in memory-collection overhead as GC. It is unpredictable when you'll get a domino cascade of refcounts going to 0.

And refcounts can't handle circular references, yet GC can.

Refcounting does consume less memory than GC for the same performance. However, refcounting pays a performance penalty on every reference assignment, which GC amortizes into mark-and-sweep operations, by which time many of the references may already have been destroyed. Generational GC improves the efficiency of GC further.

For maximum convenience of the programmer then GC is superior, especially if extra DRAM is abundant. For apps. Perhaps not good for the blockchain nodes, which should be highly tuned.
iamnotback
Sr. Member
****
Offline Offline

Activity: 336
Merit: 265



View Profile
April 05, 2017, 08:56:27 AM
Last edit: April 06, 2017, 01:16:16 PM by iamnotback
 #706

I hope this is the end of this diversion of trying to explain why the concept of a blockchain for trading knowledge is going to be more valuable than Bitcoin which is only a payment system for fungible money:

Metcalfe’s law does not state in any way that if you add users to a network then the value, and therefore the price, increases. This would be an absurd claim and the analogy that comes to mind is an increasing population full of infants using fax machines.

Metcalfe rather speaks to POTENTIAL value, if we are going to think about the price and market cap of a network. That is to say, there is room for matchmaking connections for N² users, but there is nothing in such a law to suggest that any number of users can efficiently use a given network (later in this writing we will read Szabo alluding to such redundancies).

In other words it cannot be said that the addition of each user adds the same amount of value which is then multiplied across the network. In most cases it is probably easier to show such a claim cannot be true (how much value would be added from the last person in the world to have a fax network?).

Here we get Szabo’s extension of Metcalfe’s law in regard to emerging economics (through Adam Smith):

Quote from: Szabo
Metcalfe’s Law states that a value of a network is proportional to the square of the number of its nodes. In an area where good soils, mines, and forests are randomly distributed, the number of nodes valuable to an industrial economy is proportional to the area encompassed. The number of such nodes that can be economically accessed is an inverse square of the cost per mile of transportation. Combine this with Metcalfe’s Law and we reach a dramatic but solid mathematical conclusion: the potential value of a land transportation network is the inverse fourth power of the cost of that transportation.

Notice Szabo’s use of the word “potential”.

I had already linked to that Szabo quote before you mentioned it:

Szabo also wrote about that (@anonymous there is myself).

Szabo is incorrect. He fails to consider that we live in a relativistic universe. The maximum potential of the network is not the inverse fourth power of the transport cost. Szabo is computing the potential as if the maximum is where every node communicates/trades with every other node and the cost of transport is the limiting resource or cost. But as I showed in my other comment on the blog which I linked you to, that value is meaningless unless it is considered relative to all other opportunities, i.e. opportunity cost is the limiting resource. Value is always relative, not absolute.

Szabo is ascribing an absolute cost of communication and assuming that is the dominant opportunity cost from the individualized perspective of every node. But I showed mathematically that it can instead be the grouping compatibilities that are the limiting opportunity cost, which can invert the assumption of greater relative value for the larger network. Networks increase the degrees-of-freedom of the node participants, and thus the potential energy. To the extent that transport cost is a significant opportunity cost of the nodes, Szabo's point applies; but as the cost and latency of communication decrease, there is a proliferation of opportunities which are significantly more valuable than those transport costs.

As Lima pointed out, the Inverse Commons was one of those huge value opportunities enabled by the Internet. The value from exchanging knowledge in the Inverse Commons over the Internet far exceeds any communication costs. So as you now see, money is not the only agent that can increase degrees-of-freedom in trade and increase surplus production. Communication networks can increase the access degrees-of-freedom for non-fungible knowledge, which becomes fungible collaboratively within the Inverse Commons. Thus fungible money becomes only a small component of the value creation, such as paying for the communication infrastructure costs.
This is why fungible money is diminishing in utility in the knowledge age. Fungible money is applicable to increasing the degrees-of-freedom for solving the coordination issues around physical resources. Yes atoms are heavy, but relative to knowledge production value, the atoms are asymptotically (an inexorable trend to) massless. So into the knowledge age we go, and fungible money will diminish in importance and our insatiable quest will shift from power to knowledge.

Note this is the second time I have caught Szabo in a fundamental error. Szabo does raise interesting historical examples, and his anthropological research is sometimes interesting.

@traincarswreck, your four-color theorem theory of tripartite essential resources for human life is the most basic example of the fact that fungible money increases degrees-of-freedom over barter. The increase in efficiency of fungible money w.r.t. barter, increases as the diversity of physical resources (tangible goods) increases. With three bartered resources and no fungible money, one has a 1/3 chance of holding the resource that another may want to trade for, i.e. being bordered on one of the other two colors in your four color theory. The probability fraction decreases and complexity of risk mitigation increases as the diversity of goods to be bartered increases.



But I brought it forth to show Szabo's use of the word POTENTIAL (which is bolded in my original writing, I believe). Szabo didn't miss this; search for the word "redundancies". He is not stating the formulated value of a network, he is simply saying that Metcalfe's law is extendable. It makes perfect sense that a reduction in transport cost has a dramatic effect on the value of the network. But neither Metcalfe nor Szabo imply that such efficiency is perfectly transcribable to the market cap of a network. That's not either of their mistake; it's others' mistake.

Disagree.

Szabo is referring to redundancies that reduce the need for transport between every pair of nodes in the network, thus reducing the exponent to lower than four, but he is still relying on transport costs being the value limitation of the network. He is not recognizing that value is relativistic and that transport costs may not be a significant factor at all in some cases, as I pointed out.

You say fungible money. What is un-fungible money? Money becomes money only because of our definition, which is somewhat related to "the most fungible good". Otherwise it's just barter, right?

I don't disagree that money relates to fungible transfer of utility. The point I am making is that fungible transfer of utility can occur by transferring a fungible intermediate good (i.e. money) or by adding knowledge to an Inverse Commons. The fungible transfer of utility within the Inverse Commons is a shared rise in utility for all participants of the Inverse Commons; i.e. although each unit of knowledge added to the Inverse Commons might be of a different kind (thus non-fungible if it were applied to a barter trade between parties), the units of knowledge don't have to be matched to another same-kinded unit of knowledge in order to transfer the utility within an Inverse Commons.

Given my explanation of how the Internet has raised the degrees-of-freedom of (the nodes which are the individuals in) humanity's access to a plurality of Inverse Commons, there is a magnificent increase in the (knowledge) value (and thus utility) that can be fungibly transferred without money (i.e. without a fungible traded intermediate good). Nobody owns the Inverse Commons, because it is open for everyone to see. It gains value the more people contribute to it, and the marginal utility of adding more participants doesn't decline precipitously as in other forms of knowledge exchange, because, due to the decentralized access control (i.e. everyone can see and fork it without any top-down permission required), there is not the top-down management coordination problem known as the Mythical Man-Month. We see that the more degrees-of-freedom an Inverse Commons has for the participants to harmonize their contributions yet not be bound to coordination gridlock, the faster the value grows and the more slowly the marginal utility declines.

Inverse Commons are usually specialized, so that only those interested or expert in the subject matter of that particular Inverse Commons gain value from contributing to it, because only they can understand or utilize the increase in shared knowledge within it. Because of this maximum division-of-labor, Inverse Commons can't be entirely captured by finance. Because participants can transact directly instead of via Firms (thus bypassing the Theory of the Firm), it becomes less plausible for finance to gain the economies-of-scale to consolidate everything in corporations, and the winner-take-all weakness of fungible finance that I explained upthread is (I posit) ameliorated.

@traincarswreck, your four-color theorem theory of tripartite essential resources for human life is the most basic example of the fact that fungible money increases degrees-of-freedom over barter. The increase in efficiency of fungible money w.r.t. barter, increases as the diversity of physical resources (tangible goods) increases. With three bartered resources and no fungible money, one has a 1/3 chance of holding the resource that another may want to trade for, i.e. being bordered on one of the other two colors in your four color theory. The probability fraction decreases and complexity of risk mitigation increases as the diversity of goods to be bartered increases.

Yes, it's a generalization. The reason I think it is worth something is that you present it without the four color theorem, and we weren't able to prove the theorem without computers. It is a flat plane, and changes to the topology would increase the complexity and possibly the outcomes for equilibriums. As does the addition of money, as you say.

But we can also think earlier, for example of a transition from water to land: underwater examples of such relationships between entities wouldn't have gravity, so to speak (or nearly as much of an effect of it), so the equilibriums form in a different way.

Yet it's conceivable that there is some transition to different orders (i.e. water to land, land to internet). In this sense money serves as the lubricant for transition, and naturally arises as such.

What I want you to see is that we don't have to actually individually (selfishly) own the medium where the transfer of fungible value takes place, which is the case with money. This is why the Inverse Commons is a new form of fungible utility transfer which ameliorates the monopoly formerly held by money. But the fungibility is not in the form of a barter trade between two owners of the good. If knowledge units were to be traded as barter, they would be non-fungible and thus couldn't be modeled (nor financed) with the concept of money.

Now I want to suggest that objective truth, as knowledge, is fungible, and I want to understand what you say to that.

There is no such thing as a grand objective truth. We live in a relativistic universe. There is unbounded diversity of perspectives that can be formed. Humans will tend to cluster around specialties. Within those Inverse Commons of specialized interests and expertise, there will exist fungible value transfer, but there is no overall fungibility. And even within an Inverse Commons, the value that each participant gains from portions of the Inverse Commons will vary.

The idea that there could be one fungible money that accurately reflected all that complexity of value transfer is preposterous. I think this is why Nash was speaking of asymptotic approximation where there would be many stable currencies, but again I have shown upthread that is impossible. I know you will reply and claim I have not, but I simply won't reply any more. I have more important work to do than argue with you about the preposterous notion that there could exist any stable fungible good or goods which could be an accurate value or utility transfer system for all the complexity in nature. Fungible money was something man used which was the best fit for the technology available at the time. We have the Internet now. We are leaving the fungible money era.
iamnotback
Sr. Member
****
Offline Offline

Activity: 336
Merit: 265



View Profile
April 05, 2017, 09:29:58 AM
Last edit: April 05, 2017, 09:50:35 AM by iamnotback
 #707

I am answering some of these in public for those portions which I feel aren't too tangential nor require privacy:

The thing is, the way I see it, you have good expertise in at least two areas:

High-level scripting languages.

Blockchain economics.


And I can very easily see how the two connect, i.e. having the blockchain protocol and node written in a high-level language, allowing for easy interfacing with JavaScript or RPC/JSON.




The big downside I see in making it with node.js is that, first, it doesn't look all that stable, and I'm not sure how it can be easily packaged into a single file that's easy to run on any computer (again, the UI side can be made as a separate app in a JS browser). And if you really want to do a multi-layered design, it's hell. I can show you many ways in which JavaScript code is very hard to debug: if you start to play with multi-layered OO hierarchies and async callback handling, it can quickly become a big mess, since there is no way to check or enforce typing anywhere, and it's impossible to have any global analysis or checking of the code at all.

I can see where you want to go with TypeScript, but if it's to make a JS-like language with async event handling, JSON objects, a garbage collector, and object-list processing, I think I really have all the basic bricks to do this quickly.

[---8<--- snipped]

Because the way i see where our path can cross is this :

You want to make a blockchain node / protocol in a high-level language which supports async I/O, green threading, JSON objects, JSON-RPC, and friendliness with JavaScript and web apps.



I want to make a high-level language to easily program nodes for different blockchains, with the HTTP/RPC/JSON interface, and be able to kick-start a node based on a custom blockchain protocol. Ideally this involves a high-level script language which can also be used either as middleware or full stack to develop distributed graphic applications, potentially using OpenGL, C++ UI, or HTML5/JS/WebGL for the front end.



If you really want to work on a high-level language with the features you want, I can also update my framework if you see things that are not really OK and there are good technical/economic reasons for it; I have no problem with that. To me it already does what it has to do, but I can start developing the basics of a JS-like script engine quickly, and it would be very friendly for making JS apps out of this, with a standalone executable including a full node, wallet, block explorer, and script engine.

@iadix, the first step for me is getting the unique algorithms for my blockchain completed, and deciding whether I will complete a transpiler to TypeScript in order to make app GUI and mobile programming easier. C would not be the ideal choice for these high-level activities.

The C code comes later when optimizing and making bullet-proof the key components.

You're down in very low-level coding and we can't yet see where your design choices fit optimally into the plan.

The best way to proceed is to complete the high-level rough draft of the coding first. Then from there we can see where, if anywhere, your design ideas would fit in.

We are doing too much handwaving and theoretical talk. In software, it is actual code and use cases that matter.

I think as soon as possible I need to get the APIs decided and documented, so that you can then show some examples of why your design is superior. I can't really relate to your code as it is now, because it seems to be based around a blockchain assumption, but my consensus design is not a blockchain in some sense. However, there is overlap of many concepts with a blockchain. So I think I need some APIs so we aren't talking in such abstract terms. We need concrete APIs so we can compare examples of coding paradigms employing those APIs.

I can certainly read all your code, but that doesn't mean I understand the holistic design of the code (your code has no comments and no documentation in the source code files!). As is typical for most programmers, you write mostly write-only source code, meaning other programmers can't understand what your holistic design points were, because there is no comprehensible documentation. I am more focused on trying to write high-level code with good documentation and specifications, before moving to optimization and hardening of the code base with low-level improvements.

For example, I have no idea why you are serializing to JSON textual format in your C code and not just passing (little- or big-endian) binary data that is compatible with C structs. And please don't write another long explanation; I can't really understand your descriptions well. You try to say too many things at the same time, and it ends up not being that comprehensible.

I am not saying there is nothing valuable in what you've done. I am saying that I can't relate your existing code base to what I want to do. There is too much to assimilate, and I don't want to expend my scarce time trying to figure out other people's code. It is possible that once we start communicating about specific APIs of my project, I will start to understand the benefits and tradeoffs of your approach better.

Part of the issue may be that I am not that sharp-minded right now. So perhaps I can't assimilate as much code, or figure all of that out, as well as I could if I weren't so lacking in cognitive energy. Or maybe it is because I have my mind in the economics lately and not the coding. Maybe what you are doing will become more familiar to me if we interact on examples I am more familiar with. I think the issue is also that you are trying to do everything in C, and I just would never have even started from that perspective, so I don't see the point of figuring out what you are doing with that code base. Perhaps through examples, and in time, I will come to better appreciate your design decisions, and then I can give more specific feedback and make decisions.

Yes, I am a high-level thinker. But hey, I can write low-level assembly and C code also; it is just not the priority to start at the low level. The priority is to focus first on the high-level concepts, especially the economics and key algorithms.

The most important thing is to get forward movement. I got sidetracked by the economic discussions and relapsing health problems. Today I ran twice, and I am trying to do everything I can. I have commitments from BTC whales to help me get better medical care if my current standard TB treatment fails. I need to follow up with a chest x-ray and a liver enzymes test next week. Hopefully this is just the normal side effects of these toxic TB antibiotics, which I have to take for another 14 weeks. Very frustrating. Trying to do what I can. I have had diarrhea, liver pain, and chronic cognitive fatigue the past few days, but maybe it was due to drinking fruit juices. I am trying to rectify it. If I feel enough energy tomorrow, I am going to try to lift barbells and force my body into better/stronger condition. I am pissed off that this TB is interfering with my project. So I will try to get pissed off and use that as motivation in the gym. Let's see how my body responds. My body might get stronger from that aggressive action, or it might get overly exhausted. I can't predict. I will just try.
IadixDev
Full Member
***
Offline Offline

Activity: 322
Merit: 151


They're tactical


View Profile WWW
April 05, 2017, 10:24:47 AM
Last edit: April 05, 2017, 10:37:03 AM by IadixDev
 #708

Quote
For example, I have no idea why you are serializing to JSON textual format in your C code and not just passing (little- or big-endian) binary data that is compatible with C structs. And please don't write another long explanation; I can't really understand your descriptions well. You try to say too many things at the same time, and it ends up not being that comprehensible.

There are two formats for the "json data".

There is the "text form", coming from JSON-RPC (from the JavaScript app).

And there is the "binary form": a tree of referenced objects, constructed from text JSON, built manually with the API, or eventually produced from a script.

The idea is to completely abstract function calling and parameter passing using a format that can be converted to/from JSON transparently and automatically, with object reference counting, stored internally as binary data aligned on 128 bits for fast SIMD operations.

There is reference counting, thread safety for array manipulation, etc. Internally it uses native C types to store/read the data, key-name hashes, the tree structure itself (lists of named/typed child references, etc.), direct memory pointers, and so on.

The memory holding the native C type, and the direct pointer to it, is never accessed from the application directly (or it can be, explicitly, if you want a raw pointer to the binary data, the equivalent of Rust unsafe code).

When a C function makes a call:

the caller creates the tree manually in the binary form, from the binary data, instead of using a structure (it can be transformed to/from textual JSON with a generic function for the JSON-RPC interface);

it passes a C pointer to the reference of this object, stored in binary form, to the callee function (and another C pointer to an object reference for the output result, if needed).

The function can access the whole tree: add or remove elements, add an array, add elements to a child array, copy object references into a local permanent internal list, etc.

On return, whether the function was just supposed to modify the content of the passed object or to add/remove elements to/from it, everything can be tested and accessed via the tree interface from the caller (and it just passed a C pointer, with total binary compatibility, which is useful also for logging and many other things, instead of varargs/stdio).

If the function returns something to the caller (e.g. from an HTTP/JSON-RPC request), then this result can be converted to textual JSON automatically and sent back to the JavaScript caller from the HTTP server.

The same function prototype in C can be used to declare different functions with different input parameters.

It is like a generic function type which can be implemented with different data formats for the arguments, and which defines the JSON-RPC API from the C side.

And at the same time it is also the C internal API, used by C programs like a regular DLL API, but using the JSON binary-form representation of the arguments instead of C-style argument passing (C-style argument passing is only used for simple native C types and pointers).

It can be confusing from your perspective, because it is made in such a way that high-level concepts are manipulated and implemented from the C language: you can use all of the C language on local C variables with the native C types (int, pointers, etc.) and, at the same time, manipulate high-level objects.

iamnotback
Sr. Member
****
Offline Offline

Activity: 336
Merit: 265



View Profile
April 05, 2017, 10:29:18 AM
 #709

I asked you to please:

For example, I have no idea why you are serializing to JSON textual format in your C code and not just passing (little- or big-endian) binary data that is compatible with C structs. And please don't write another long explanation; I can't really understand your descriptions well. You try to say too many things at the same time, and it ends up not being that comprehensible.

Yet you still wrote another wall of text. Can't you make your point more concisely? Shouldn't you think more carefully about what you want to write?

You don't put care and attention into writing English. Communication is a very important skill. You may be the most talented coder in the universe, but being able to communicate efficiently is very important.

Quote
For example, I have no idea why you are serializing to JSON textual format in your C code and not just passing (little or big endian) binary data that is compatible with C structs. And please don't write another long explanation that I can't really understand your descriptions well. You try to say too many things at the same time, and it is ends up not being that comprehensible.

There is two format for the "json data".

There is the "text form" , coming from json/RPC (from javascript app)

And there is the "binary form", a tree of referenced objects, constructed from a text json, or manually with the API. ( or eventually from a script)

The idea is to allow to completely abstract function calling and parameters passing using a format that can be converted to/from json transparently, automatically, with object reference counting, and stored internally as binary data aligned on 128 bits for fast SIMD operations.

There is reference counting, thread safety for arrays manipulation, etc, Internally , it use native C type to store/read the data, keyname hash, and the tree structure itself (list of named/typed children references etc) , direct memory pointers, etc.

The memory holding the native C type, and direct pointer to it is never accessed from the application directly. ( or it can be explicitly if you want raw pointer to the binary data, equivalent of rust unsafe code).


When a C function make the call

the caller create the tree manually in the binary form, from the binary data, instead of using a structure . ( it can be transformed to/from textual json with a generic function for the json/rpc interface ).

it pass a C pointer to the reference of this object stored in binary form to the callee function. (And another C pointer to an object reference for the output result if needed.)

The function can access all the tree, add or remove element to it, add an array to it, add element in a child array, copy object reference in local permanent internal list, etc

On return either if the function is just supposed to modify content of the passed object, or add/remove element to/from it, all can be tested and accessed via the tree interface from the caller ( and it just passed a C pointer, with total binary compatibility, useful also for logging or many things instead of vargs / stdio).

If the function return something to the caller ( eg from http/json/rpc request), then this result can be converted to textual json automatically and sent back to the javascript caller from the http server.


The same C function prototype can be used to declare different functions with different input parameters.

It is like a generic function type which can be implemented with different data formats for its arguments, and which defines the JSON/RPC API from the C side.

At the same time, it is also the internal C API, used by C programs like a regular DLL API, but with the JSON binary-form representation of the arguments instead of C-style argument passing (C-style argument passing is only used for simple native C types and pointers).
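One consequence of a single shared prototype is that the JSON/RPC API defined from the C side reduces to a name-to-function table. A minimal sketch, with all identifiers assumed rather than taken from the framework:

```c
#include <stddef.h>
#include <string.h>

typedef struct { int dummy; } obj_ref;  /* opaque tree reference (toy) */

/* The one generic prototype shared by every exported function. */
typedef int (*rpc_func)(const obj_ref *params, obj_ref *result);

static int ping(const obj_ref *p, obj_ref *r) { (void)p; (void)r; return 1; }
static int echo(const obj_ref *p, obj_ref *r) { (void)p; (void)r; return 1; }

/* The JSON/RPC API from the C side: just a method table. */
static const struct { const char *method; rpc_func fn; } rpc_table[] = {
    { "ping", ping },
    { "echo", echo },
};

/* An HTTP/JSON-RPC front end would parse the request, look up the
   method here, and convert the result tree back to textual JSON. */
static rpc_func rpc_find(const char *method) {
    for (size_t i = 0; i < sizeof rpc_table / sizeof rpc_table[0]; i++)
        if (strcmp(rpc_table[i].method, method) == 0)
            return rpc_table[i].fn;
    return NULL;
}
```

The same table doubles as the internal C API: a C caller invokes `rpc_find("ping")(&params, &result)` directly, with no JSON text involved.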


It can be confusing from your perspective, because it is designed so that high-level concepts are manipulated and implemented from the C language: you can use all of C on local variables with native C types (int, pointer, etc.) and, at the same time, manipulate high-level objects.

But I already have a full API, and I did not make this framework specifically for blockchain. Even if, for applications, yes, I have the blockchain ecosystem in mind, the framework can handle all kinds of things (ray tracing, manipulation of graphic objects, manipulation of blockchain objects, private keys, signatures, in-browser staking, etc.).

All the API is documented in the white paper, and in other places. All the source code is on the git. There are working examples running.

You have all the low level code and explanation in the PMs.

Now, if you can't understand my grammar, don't have time to read my code, and your idea is to start programming a blockchain and an API for distributed applications alone, from scratch, including high-level OO interfacing, good luck with that Smiley I'll see where you get lol

If you are not interested in collaborating, again OK. I have ideas for how to handle most of the issues you raise in the git discussion: local stack frames, circular references, multi-threading, no memory leaks, a simple expression evaluator, and a cross-language interface and object definitions compatible with JSON/RPC and JavaScript, etc.

I don't really know what you have in mind either Smiley

But I already have an API, interface, documentation, code, working examples, a testnet, and a demo app in HTML5 for the ray tracer in WebGL. Etc.

I already have this.

If you are not interested in looking at it, reading the code, or reading my answers, I don't know what to say to you. I have read most of the things in the git discussion, and I think I have solved those problems, and posted all the code and explanations to you in private messages.

If you want to go it alone, from scratch, developing a high-level language to make blockchain nodes, a cross-language interface, portable code, and a high-level API for web applications, go ahead. I already have this.

Otherwise you're just talking in the wind.
IadixDev
Full Member
***
Offline Offline

Activity: 322
Merit: 151


They're tactical


View Profile WWW
April 05, 2017, 10:31:43 AM
 #710

I asked you to please:

For example, I have no idea why you are serializing to JSON textual format in your C code and not just passing (little- or big-endian) binary data that is compatible with C structs. And please don't write another long explanation; I can't really understand your descriptions well. You try to say too many things at the same time, and it ends up not being that comprehensible.

And you still wrote another wall of text:


There are two formats for the "json data".

There is the "text form", coming from JSON/RPC (from the JavaScript app).

And there is the "binary form": a tree of referenced objects, constructed from textual JSON, or manually with the API (or eventually from a script).

The idea is to completely abstract function calling and parameter passing, using a format that can be converted to/from JSON transparently and automatically, with object reference counting, and stored internally as binary data aligned on 128 bits for fast SIMD operations.






The same C function prototype can be used to declare different functions with different input parameters.

It is like a generic function type which can be implemented with different data formats for its arguments, and which defines the JSON/RPC API from the C side.

At the same time, it is also the internal C API, used by C programs like a regular DLL API, but with the JSON binary-form representation of the arguments instead of C-style argument passing (C-style argument passing is only used for simple native C types and pointers).




IadixDev
Full Member
***
Offline Offline

Activity: 322
Merit: 151


They're tactical


View Profile WWW
April 05, 2017, 10:36:05 AM
 #711

(The rest answers other points.)


I already have a full API, and I did not make this framework specifically for blockchain. Even if, for applications, yes, I have the blockchain ecosystem in mind, the framework can handle all kinds of things (ray tracing, manipulation of graphic objects, manipulation of blockchain objects, private keys, signatures, in-browser staking, etc.).

All the API is documented in the white paper, and in other places. All the source code is on the git. There are working examples running.

You have all the low level code and explanation in the PMs.

Now, if you can't understand my grammar, don't have time to read my code, and your idea is to start programming a blockchain and an API for distributed applications alone, from scratch, including high-level OO interfacing, good luck with that Smiley I'll see where you get lol

If you are not interested in collaborating, again OK. I have ideas for how to handle most of the issues you raise in the git discussion: local stack frames, circular references, multi-threading, no memory leaks, asynchronous events with generic function declarations, compatibility with TypeScript/JSON and JavaScript objects, and list/array processing.

iamnotback
Sr. Member
****
Offline Offline

Activity: 336
Merit: 265



View Profile
April 05, 2017, 10:40:32 AM
 #712

And there is the "binary form": a tree of referenced objects, constructed from textual JSON, or manually with the API (or eventually from a script).

The idea is to completely abstract function calling and parameter passing, using a format that can be converted to/from JSON transparently and automatically, with object reference counting, and stored internally as binary data aligned on 128 bits for fast SIMD operations.

So you want to serialize binary data structures to JSON text and deserialize them back to binary data structures? Why?

The same C function prototype can be used to declare different functions with different input parameters.

It is like a generic function type which can be implemented with different data formats for its arguments, and which defines the JSON/RPC API from the C side.

You mean you are supplementing C with a meta type system residing in the run-time parsing of these JSON tree structures? But then your meta type system is not statically checked at compile-time.
IadixDev
Full Member
***
Offline Offline

Activity: 322
Merit: 151


They're tactical


View Profile WWW
April 05, 2017, 10:48:47 AM
 #713

I asked you to please:

For example, I have no idea why you are serializing to JSON textual format in your C code and not just passing (little- or big-endian) binary data that is compatible with C structs. And please don't write another long explanation; I can't really understand your descriptions well. You try to say too many things at the same time, and it ends up not being that comprehensible.

Yet you still wrote another wall of text. Can't you make your point more concisely? Shouldn't you think more carefully about what you want to write?


In one sentence: the idea is to have a representation of a hierarchy of objects and lists (keyname-ref) (a collection of collections of collections) manipulated in C and used as function arguments, to facilitate cross-compiler compatibility and memory-leak detection, to allow simple high-level operators on objects and variables from C, and to be convertible to/from textual JSON with generic functions.
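The "convertible to/from textual JSON with generic functions" part of that sentence can be sketched as a recursive serializer over such a tree. This is a minimal illustration; the node layout and function names are assumptions, not the framework's real API, and buffer bounds are only loosely handled.

```c
#include <stdio.h>

enum node_type { NT_INT, NT_OBJ };

/* Toy node: a keyed int leaf, or a keyed object with children. */
struct node {
    const char     *key;
    enum node_type  type;
    int             ival;          /* payload when type == NT_INT  */
    struct node    *children[4];   /* children when type == NT_OBJ */
    int             n_children;
};

/* Generic conversion: walks the tree once and emits JSON text.
   Returns the number of characters written (sketch: assumes buf
   is large enough). */
static int tree_to_json(const struct node *n, char *buf, size_t cap) {
    if (n->type == NT_INT)
        return snprintf(buf, cap, "\"%s\":%d", n->key, n->ival);
    int w = snprintf(buf, cap, "\"%s\":{", n->key);
    for (int i = 0; i < n->n_children; i++) {
        if (i) w += snprintf(buf + w, cap - w, ",");
        w += tree_to_json(n->children[i], buf + w, cap - w);
    }
    w += snprintf(buf + w, cap - w, "}");
    return w;
}
```

A matching `tree_from_json` parser would do the reverse walk, so the same tree serves both the binary p2p side and the textual JSON/RPC side.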

IadixDev
Full Member
***
Offline Offline

Activity: 322
Merit: 151


They're tactical


View Profile WWW
April 05, 2017, 10:53:01 AM
 #714

And there is the "binary form": a tree of referenced objects, constructed from textual JSON, or manually with the API (or eventually from a script).

The idea is to completely abstract function calling and parameter passing, using a format that can be converted to/from JSON transparently and automatically, with object reference counting, and stored internally as binary data aligned on 128 bits for fast SIMD operations.

So you want to serialize binary data structures to JSON text and deserialize them back to binary data structures? Why?

The same C function prototype can be used to declare different functions with different input parameters.

It is like a generic function type which can be implemented with different data formats for its arguments, and which defines the JSON/RPC API from the C side.

You mean you are supplementing C with a meta type system residing in the run-time parsing of these JSON tree structures? But then your meta type system is not statically checked at compile-time.

The p2p protocol uses a binary data format; so do OpenGL and the crypto.

Yes, it can't be checked at compile time. But there is a way to have a static definition of types too, in the style of DOM objects: a default structure template associated with the meta type can ensure that all objects of that meta type have a certain enforced structure on instantiation, even at the binary level (so they can be serialized/hashed from the JSON or binary data definition).
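That "default structure template" idea can be sketched like this: a meta type carries a list of required members, and each instantiation is validated against it at run-time, since the C compiler cannot see the constraint. All names here are illustrative assumptions.

```c
#include <string.h>

enum meta_type { M_INT, M_STR };

struct member_def { const char *name; enum meta_type type; };

/* Template for a hypothetical "tx_input" meta type: every instance
   must carry these members before it may be serialized/hashed. */
static const struct member_def tx_input_tpl[] = {
    { "prev_hash", M_STR },
    { "index",     M_INT },
};

struct object {
    struct member_def members[8];  /* members actually present */
    int n;
};

/* Returns 1 if the object carries every member the template demands. */
static int object_matches(const struct object *o,
                          const struct member_def *tpl, int n_tpl) {
    for (int i = 0; i < n_tpl; i++) {
        int found = 0;
        for (int j = 0; j < o->n && !found; j++)
            found = strcmp(o->members[j].name, tpl[i].name) == 0 &&
                    o->members[j].type == tpl[i].type;
        if (!found) return 0;
    }
    return 1;
}
```

The check runs at instantiation time, which is what "static definition, enforced at run-time" means in practice here: the shape is fixed, but the enforcement point is not the compiler.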

iamnotback
Sr. Member
****
Offline Offline

Activity: 336
Merit: 265



View Profile
April 05, 2017, 10:59:23 AM
 #715

All the API is documented in the white papper

...

Now if you can't understand my grammar, don't have time to read my code

...

If you are not interested to work on collaboration

Your white paper is incomprehensible to me. I tried to read it.

Your low-level code is doing strange things, and I am not sure whether they are good design or not.

I don't have time to reverse engineer your high-level concepts, by combing over 1000s of lines of low-level code.

Collaboration is a mutual responsibility. I will definitely collaborate with those who make me more efficient.

I am most interested in new ideas, when the progenitor of the ideas is able to explain their ideas succinctly, coherently, and cogently.

Most important is for me to make APIs and a testnet so that app developers can start coding. You can use whatever code you want to write apps. We don't really need to collaborate. You and I should be independent.

Now if there is something I can use from your framework in my work, then great. But it isn't really a requirement for what we need to do.

I think your concern is that my work won't be done in time. I understand that. That is a legitimate concern. You can surely go your own way, if you see my progress is too slow or if you feel my design decisions are incorrect. But as of this moment, you haven't even seen any APIs or design decisions from me yet. So it is difficult for you to judge.

No offense is intended. I am just being frank/honest. I am not intending to piss you off. But you keep slamming me with explanations which are not very cogent from my perspective. We aren't forced to collaborate on your framework. If your explanations were easier for me to readily grasp, then I could perhaps justify the tangential discussion on your framework. But if your explanations are difficult or cryptic for me to understand, then I reach the point, as I have by now, where I see I am losing a lot of time reading material that doesn't quickly convey your high-level justifications and concepts.

Maybe it's my fault, or yours, or both. But it isn't intended to be offensive. It just is what it is.
IadixDev
Full Member
***
Offline Offline

Activity: 322
Merit: 151


They're tactical


View Profile WWW
April 05, 2017, 11:02:50 AM
 #716

Well, idk what format you want to use to define the API then? (To start somewhere.)

Yes; otherwise, see you in 6 months when you have code and an API to show.

iamnotback
Sr. Member
****
Offline Offline

Activity: 336
Merit: 265



View Profile
April 05, 2017, 11:10:36 AM
 #717

I asked you to please:

For example, I have no idea why you are serializing to JSON textual format in your C code and not just passing (little- or big-endian) binary data that is compatible with C structs. And please don't write another long explanation; I can't really understand your descriptions well. You try to say too many things at the same time, and it ends up not being that comprehensible.

Yet you still wrote another wall of text. Can't you make your point more concisely? Shouldn't you think more carefully about what you want to write?

In one sentence: the idea is to have a representation of a hierarchy of objects and lists (keyname-ref) (a collection of collections of collections) manipulated in C and used as function arguments, to facilitate cross-compiler compatibility and memory-leak detection, to allow simple high-level operators on objects and variables from C, and to be convertible to/from textual JSON with generic functions.

Okay this is slightly better communication. Now you are talking to me in high-level concepts that can be digested and understood.

1. I don't need cross-compiler compatibility if I am using Java or JavaScript that runs everywhere. Performance and up-time hardening are not my first priorities. That will come later. I am one guy trying to get to testnet, not trying to write the perfect C++ implementation on the first draft of the code.

2. I don't need memory-leak detection (i.e. refcounting) if I have GC from Java, JavaScript, or Go.

3. Emulating high-level data structures in C with a library, means the static typing of those data structures is lost. I remember you wrote that you didn't want to use C++ because it is a mess. So apparently you decided to forsake static typing.

4. I would prefer to have a language which can statically type the data structures and which doesn't require the boilerplate of library calls for interfacing with higher-level data structures.

In other words, I see you have made compromises because of priorities which you think are more important. And what are those very important priorities? Performance?
iamnotback
Sr. Member
****
Offline Offline

Activity: 336
Merit: 265



View Profile
April 05, 2017, 11:23:49 AM
 #718

And there is the "binary form": a tree of referenced objects, constructed from textual JSON, or manually with the API (or eventually from a script).

The idea is to completely abstract function calling and parameter passing, using a format that can be converted to/from JSON transparently and automatically, with object reference counting, and stored internally as binary data aligned on 128 bits for fast SIMD operations.

So you want to serialize binary data structures to JSON text and deserialize them back to binary data structures? Why?

The same C function prototype can be used to declare different functions with different input parameters.

It is like a generic function type which can be implemented with different data formats for its arguments, and which defines the JSON/RPC API from the C side.

You mean you are supplementing C with a meta type system residing in the run-time parsing of these JSON tree structures? But then your meta type system is not statically checked at compile-time.

The p2p protocol uses a binary data format; so do OpenGL and the crypto.

I thought you wrote you are serializing these to JSON text? Why are you now saying you are transmitting them in binary format?

Your communication is very difficult for me to understand.

Yes, it can't be checked at compile time, but

Why the "but"? Can it or can't it?

there is a way to have a static definition of types too, in the style of DOM objects: a default structure template associated with the meta type can ensure that all objects of that meta type have a certain enforced structure on instantiation, even at the binary level (so they can be serialized/hashed from the JSON or binary data definition).

What do these words mean?

Do you understand that it is very difficult to get a huge number of programmers to adopt a strange framework written for one person's preferences? Generally, changes in programming have to follow, more or less, what is popular and understood.

If you have a superior coding paradigm, then it should be something that can be articulated fairly simply and programmers will get it easily and say "a ha! that is nice!".

Something that is very convoluted to explain, is probably not going to be popular.
IadixDev
Full Member
***
Offline Offline

Activity: 322
Merit: 151


They're tactical


View Profile WWW
April 05, 2017, 11:30:36 AM
 #719

I asked you to please:

For example, I have no idea why you are serializing to JSON textual format in your C code and not just passing (little- or big-endian) binary data that is compatible with C structs. And please don't write another long explanation; I can't really understand your descriptions well. You try to say too many things at the same time, and it ends up not being that comprehensible.

Yet you still wrote another wall of text. Can't you make your point more concisely? Shouldn't you think more carefully about what you want to write?

In one sentence: the idea is to have a representation of a hierarchy of objects and lists (keyname-ref) (a collection of collections of collections) manipulated in C and used as function arguments, to facilitate cross-compiler compatibility and memory-leak detection, to allow simple high-level operators on objects and variables from C, and to be convertible to/from textual JSON with generic functions.

Okay this is slightly better communication. Now you are talking to me in high-level concepts that can be digested and understood.

1. I don't need cross-compiler compatibility if I am using Java or JavaScript that runs everywhere. Performance and up-time hardening are not my first priorities. That will come later. I am one guy trying to get to testnet, not trying to write the perfect C++ implementation on the first draft of the code.

2. I don't need memory-leak detection (i.e. refcounting) if I have GC from Java, JavaScript, or Go.

3. Emulating high-level data structures in C with a library, means the static typing of those data structures is lost. I remember you wrote that you didn't want to use C++ because it is a mess. So apparently you decided to forsake static typing.

4. I would prefer to have a language which can statically type the data structures and which doesn't require the boilerplate of library calls for interfacing with higher-level data structures.

In other words, I see you have made compromises because of priorities which you think are more important. And what are those very important priorities? Performance?

1. Uptime should already be good Smiley But yes, you can write code in C using this data format for function arguments and call it from JS or Java, even remotely via HTTP/JSON.

2. Yes, normally, but I need to check a certain case sent in PM; for most things, yes.

3. Static typing can be emulated at the meta-typing level at run-time, but hardly by the C compiler; maybe some tricks for compile-time checks could be done with macros or pragmas.

4. There is some form of static typing internally, but it's not visible at the C level. It could be seen by a higher-level script supporting static typing.


Performance is not a short-term priority.



Initially, the real motivation was an operating-system project based on a microkernel, with system-agnostic binary modules which can be compiled from Windows or Linux, and which abstract away most of the need for complex memory allocation and trees of objects at the driver level.

So it can be booted directly from a Pi, or a PC, or in VirtualBox from bare metal. It also has RPC and distributed modules in mind, for doing efficient server-side operations in C, for 3D or data processing, with distributed applications programmed on top of this.

Like an application server, with integrated crypto, vector math, and data lists; somewhat like a small Tomcat for embedded systems, oriented toward JSON and web apps.


The goal originally was this, except I integrated modules to deal with the blockchain protocol, and implemented the low-level functions with the Win32/Linux kernel APIs, to make blockchain nodes with an RPC server.

IadixDev
Full Member
***
Offline Offline

Activity: 322
Merit: 151


They're tactical


View Profile WWW
April 05, 2017, 11:37:05 AM
Last edit: April 05, 2017, 11:52:04 AM by IadixDev
 #720

And there is the "binary form": a tree of referenced objects, constructed from textual JSON, or manually with the API (or eventually from a script).

The idea is to completely abstract function calling and parameter passing, using a format that can be converted to/from JSON transparently and automatically, with object reference counting, and stored internally as binary data aligned on 128 bits for fast SIMD operations.

So you want to serialize binary data structures to JSON text and deserialize them back to binary data structures? Why?

The same C function prototype can be used to declare different functions with different input parameters.

It is like a generic function type which can be implemented with different data formats for its arguments, and which defines the JSON/RPC API from the C side.

You mean you are supplementing C with a meta type system residing in the run-time parsing of these JSON tree structures? But then your meta type system is not statically checked at compile-time.

The p2p protocol uses a binary data format; so do OpenGL and the crypto.

I thought you wrote you are serializing these to JSON text? Why are you now saying you are transmitting them in binary format?

Your communication is very difficult for me to understand.

Yes, it can't be checked at compile time, but

Why the "but"? Can it or can't it?

there is a way to have a static definition of types too, in the style of DOM objects: a default structure template associated with the meta type can ensure that all objects of that meta type have a certain enforced structure on instantiation, even at the binary level (so they can be serialized/hashed from the JSON or binary data definition).

What do these words mean?

Do you understand that it is very difficult to get a huge number of programmers to adopt a strange framework written for one person's preferences? Generally, changes in programming have to follow, more or less, what is popular and understood.

If you have a superior coding paradigm, then it should be something that can be articulated fairly simply and programmers will get it easily and say "a ha! that is nice!".

Something that is very convoluted to explain, is probably not going to be popular.

The tree can be serialized to both binary format and textual JSON.

The node has both the p2p protocol in binary data and the RPC interface in JSON, working on the same objects.

The type definition will always escape the C compiler's comprehension, but you can use typedef aliases on the reference pointers to help.
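The typedef-alias trick can be sketched as follows, with one caveat worth noting: a plain typedef only documents intent (the compiler still treats it as the same type), whereas wrapping the reference in a one-member struct gives real nominal separation. All names here are invented for illustration, not the framework's actual identifiers.

```c
typedef struct { void *zone; } mem_zone_ref;     /* generic reference */

/* A plain alias: interchangeable with mem_zone_ref, no compiler check. */
typedef mem_zone_ref tx_ref_alias;

/* One-member struct wrappers: genuinely distinct types to the compiler. */
typedef struct { mem_zone_ref ref; } tx_ref;
typedef struct { mem_zone_ref ref; } block_ref;

/* Accepts only tx_ref*; passing a block_ref* is a compile-time error. */
static void *tx_zone(tx_ref *t) { return t->ref.zone; }
```

So the alias form helps readers, while the wrapper form also helps the compiler catch mixed-up references, at no run-time cost.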

The articulation at a higher level won't be made in C. It can be distributed in pure binary form for Linux and Windows; modules with new RPC interfaces can be added to the node without recompiling. Nobody has to see any bit of the source to develop a JS app with it.

And if it's to be used at a low level, I'll document the low-level API so it can be used to make C or C++ apps. But that's not the point for the moment.


The relevant part of the API for app programmers is not there, but in the JavaScript code, with the RPC/JSON API.

Only the blockchain protocol implementation, or the host side of the interface for application modules, has to be done in C with the internal API.

JS app developers or node hosts can just get the exe and modules and build their app from the RPC API.

It could be in assembler or Lisp; it wouldn't change a thing.

I can document the internal API and interface, but it's already all in the source code, and there are examples.

To program high-level scripting with it, you need to know the high-level API with the tree.



If there were already a well-adopted solution for this that made all app developers happy, with a safe, secure, efficient distributed-application framework at a high level, I would say OK. But there isn't... so next, what...

