I honestly think it's a bad idea to sell the white paper. I understand your principle, but people simply won't pay into something new until they know exactly what they are getting themselves into. Referring them to the blog is not helpful for many who are deciding whether to get a share or not. Furthermore, it has nothing to do with whether they can afford it or not. Just my 2 cents.
A shorter version will be made available for free. This will be the version that will be translated.
|
|
|
I don't know if this is answered anywhere, but why OCaml?
The founder is French. Then shouldn't he have used Eiffel instead? ;-)
|
|
|
Hello Ohad, what do you think of the BOScoin platform? It is a decentralized and self-evolving blockchain; that is, it could be described in the same terms as TauChain. It has a network of congresses, and any user can join the congress to vote and change the code or rules of the network. It also uses a web ontology language, but unlike TauChain, which will use RDF, BOScoin will use OWL DL, since according to the BOScoin white paper the RDF ontology language does not support "P-time integrity" (although I do not know exactly what that means). BOScoin wants to combine the OWL language with TAL, since the latter models the programming logic. What significant differences and advantages could TauChain bring over BOScoin? https://boscoin.io/en/home/

boscoin, like tezos, is broadly speaking interested in a consensus/governance/amendment model of a currency, part of its underlying mechanics, and smart contracts. tau is about the very general concept of people forming, sharing, and following an idea collaboratively (and eventually doing so to tau itself). therefore, unlike the other two, it doesn't even have a coin. not only do we claim that "there is no one consensus scheme to rule them all", we also claim that "there is no one language to rule them all", let alone one complexity class like P. i can understand the rationale leading to this choice in a coin-ledger setting, but i doubt whether P is enough to express self-amendment. the new tau is about human-machine-human group communication, which also lets its users define languages and even reason over their logical properties (like completeness and consistency). the language doesn't matter; it can speak any language, as long as someone has taught tau to understand it. i'm holding myself back from pointing to more differences in order to keep some technical surprises for later.

I am curious whether you have any literature that discusses what you are talking about here, more specifically this idea of human-machine-human group communication.
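To make the ontology-language discussion concrete, here is a minimal sketch of the kind of rule-based reasoning RDF/OWL reasoners perform: a toy triple store with one inference rule, transitivity of `rdfs:subClassOf`. All names and data are hypothetical illustrations; neither BOScoin nor TauChain is implemented this way, and real systems use full reasoners rather than a hand-rolled fixpoint loop.

```python
# Toy RDF-style triple store with one inference rule:
# (A subClassOf B) and (B subClassOf C) => (A subClassOf C).
# Purely illustrative; a real reasoner supports many more rules.

def transitive_closure(triples, predicate="rdfs:subClassOf"):
    """Apply the transitivity rule repeatedly until no new triple appears."""
    inferred = set(triples)
    changed = True
    while changed:
        changed = False
        for (a, p1, b) in list(inferred):
            if p1 != predicate:
                continue
            for (b2, p2, c) in list(inferred):
                if p2 == predicate and b2 == b:
                    new = (a, predicate, c)
                    if new not in inferred:
                        inferred.add(new)
                        changed = True
    return inferred

triples = {
    ("ex:Dog", "rdfs:subClassOf", "ex:Mammal"),
    ("ex:Mammal", "rdfs:subClassOf", "ex:Animal"),
}
closed = transitive_closure(triples)
assert ("ex:Dog", "rdfs:subClassOf", "ex:Animal") in closed
```

The fixpoint loop terminates because only finitely many triples can be formed from the finite set of terms, which is one reason this fragment of reasoning stays tractable.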
|
|
|
Is the token sale separate from the tokens (free) you get from points collection?
Yes, tokens will be allocated for the bounties.
|
|
|
Did you compare the contents?
The original title of the NEM thread was "NEM :: descendant of NXT". See: https://bitcointalk.org/index.php?topic=578492.msg6348606#msg6348606

So there you go, I claim ownership of the original idea of NEM. Spin it however you want, but the facts are right there for all to see. Anyway, too bad I'm not a multi-millionaire right now. But at least it validates my initial instincts about what will work.
|
|
|
Thanks for the links, saved me some time. So this was basically your "idea" on January 4th, 2014, and now you imply that NEM copied this "idea":

Yes, January 4th is two weeks before January 21st. NEM followed the same idea of creating a better Nxt. The original development and marketing strategy was the same. Here is a discussion on the same subject: https://bitcointalk.org/index.php?topic=578492.msg6348606#msg6348606

That's from January 19th - 3 months ago!! What date did the rule about altcoin giveaways come into force? That thread was from a time when FrictionlessCoin started NEX, a clone of NXT. Why wasn't he banned? A whole bunch of people made 'mock' clone threads to send him up. I think the giveaway rule started AFTER that time. Why did MadCow get banned and not these users? They made mock NEX threads on the same date, January 19th, 2014. Utopianfuture later started a new thread when NEM became 'serious', but at the start the original NEM thread was exactly the same style as the ME.TOO thread.

2Kool4Skool made NEXX - https://bitcointalk.org/index.php?topic=422272.0
Utopianfuture made NEM - https://bitcointalk.org/index.php?topic=422129.0

But props go to NEM for executing very well. You can believe whatever you want, but the evidence is right there for everyone to see.
|
|
|
Why?
Is this a scam?
Just dig through his post history and the Internet. One of his strongest qualities: he hasn't completed a single project yet.

For the uninitiated: 3 years ago we had an idea for NEX, which was a new coin. We got pledges of support, but we decided against creating a new coin, because the legal situation of new cryptocurrencies was murky at the time. NEM copied the original idea and is actually doing very well indeed. They took the risk that, in hindsight, we should have taken as well. Nevertheless, back then we could not see a unique use case for a new cryptocurrency. A majority of the coins out there have no new functionality.

Right now, the legal environment is much clearer than in 2014; furthermore, there is a new opportunity here that was not obvious in early 2014. That is, the emergence of Deep Learning as a disruptive new AI. This of course is not a new currency, but rather a protocol token that exists on the Ethereum blockchain. Intuition Fabric (INT) is one of the few assets being introduced with real functionality. That's the background here.

Anyway, ignore the trolls. They show up when they know a good thing is happening and want to discourage everyone from participating so they get a bigger piece of the pie. Smart folks know how to separate the noise from the true signal.
|
|
|
well, the old tau design (which you see on the website) is nomic-based, but not the new one (which was partly revealed here). to come up with a candidate rule, which is a single turn in nomic, is a whole mission of its own that deserves collaboration, aside from neglecting more human aspects (like the need to vote on possibly so many and such complex rules). nomic therefore cannot really scale, become decentralized, or be useful in large groups. the point is to have a decentralized, ever-growing knowledgebase in a social network setting. this by itself doesn't require a blockchain, but the next step does: to make the system itself self-defining and self-amending, collaboratively yet securely, in a decentralized fashion.
What's the ETA when you get something that describes this in more detail?

unless the buyers stop me due to fear of competition or so, some in a few days, some in a few weeks, it seems (only seems)

Well, looking forward to this. I know writing a white paper can be very challenging, but sometimes you can't be perfect.
|
|
|
Here is an explanation that I posted just recently on Ethereum forum:
To summarize:
1. Deep Learning models are stored on IPFS to be executed by a DL VM.
2. Training data can optionally be stored on IPFS so models can be recreated.
3. Users can use the models on IPFS to run locally or use a cloud provider.
4. Users' data used for inference are therefore private.
5. Trainers contribute new models by deriving from existing models.
6. Trainers are compensated for their models using Tokens.
7. Users access the network using Tokens.
8. Bounties will be available to encourage developers to create the latest SOTA (state-of-the-art) models. Compute resources will also be made available to developers.
9. There are other roles, such as Curators and Auditors, that participate to improve the network.
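The core of the flow above is content addressing: a model's identifier is the hash of its bytes, so anyone can verify what they fetched. Here is a minimal sketch of that idea, with an in-memory dict standing in for IPFS. Everything here (the `MockIPFS` class, the placeholder model bytes) is a hypothetical illustration, not the actual Intuition Fabric implementation.

```python
import hashlib

# Sketch of a content-addressed model registry. A dict stands in for
# IPFS; a real system would use an actual IPFS node and serialized
# model weights.

class MockIPFS:
    def __init__(self):
        self._store = {}

    def add(self, data: bytes) -> str:
        """Store bytes under their SHA-256 digest (the content address)."""
        cid = hashlib.sha256(data).hexdigest()
        self._store[cid] = data
        return cid

    def get(self, cid: str) -> bytes:
        data = self._store[cid]
        # Anyone can verify that the content matches its address.
        assert hashlib.sha256(data).hexdigest() == cid
        return data

ipfs = MockIPFS()
# A trainer publishes a (serialized) model; the resulting CID is what
# gets referenced on-chain and what users spend Tokens to use.
model_cid = ipfs.add(b"serialized-model-weights")
# A user fetches the model and runs inference locally, so the user's
# input data never leaves their machine.
model_bytes = ipfs.get(model_cid)
assert model_bytes == b"serialized-model-weights"
```

Because the address is derived from the content, a trainer cannot silently swap a published model, which is what makes point 4 (local, private inference on a verified model) workable.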
The idea is to build a democratized Deep Learning system that anyone with Tokens can access to perform their daily activities while preserving their privacy. For example:
(1) Automatically reading email and replying.
(2) Automatically categorizing one's personal data.
(3) Advanced searching of confidential data.
(4) Conversion of confidential unstructured data into knowledge bases.
(5) Reducing information overload.
(6) Running prediction algorithms that watch markets.
The list can be very long. The main idea is to allow 'frictionless' access to these capabilities while still preserving one's own privacy. So think of an 'app store' of Deep Learning applications that you can use with your email reader or similar apps.
I hope that clarifies things for folks who have not read the whitepaper.
|
|
|
So, how much money are you going to raise for the token distribution? Will there be an ICO?
No ICO. There will be a Protocol Token sale.
|
|
|
is a special case because machine learning algos can be formalized in logic. eth's coordination is over the blockchain only, and that's part of their limitation. it's "code inside blockchain", not "blockchain inside code". on tau, this gives rise to reasoning over the network as a whole. in other words, even if eth replaced the contract language with a decidable one, they still could not assert that a contract won't break the global eth network/economy. feel free to discuss machine learning or blockchain stuff with me, here or in any other media.
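To illustrate the "decidable contract language" half of this point: a language without unbounded loops or recursion makes evaluation provably terminating, so local properties of a contract can be checked mechanically (the claim above is that this still says nothing about global network effects). The toy language below is purely hypothetical, not Tau's or Ethereum's actual design.

```python
# A toy loop-free "contract" language. Evaluation is structural
# recursion on a finite expression tree, so it always terminates --
# the property that makes the language decidable. Illustrative only.

def evaluate(expr, env):
    op = expr[0]
    if op == "lit":          # ("lit", value)
        return expr[1]
    if op == "var":          # ("var", name) -- looked up in env
        return env[expr[1]]
    if op == "add":          # ("add", left, right)
        return evaluate(expr[1], env) + evaluate(expr[2], env)
    if op == "gt":           # ("gt", left, right)
        return evaluate(expr[1], env) > evaluate(expr[2], env)
    if op == "if":           # ("if", cond, then, else)
        return evaluate(expr[2], env) if evaluate(expr[1], env) else evaluate(expr[3], env)
    raise ValueError(f"unknown op: {op}")

# A tiny contract: pay 10 if the balance exceeds 100, else pay 0.
contract = ("if", ("gt", ("var", "balance"), ("lit", 100)),
            ("lit", 10), ("lit", 0))
assert evaluate(contract, {"balance": 150}) == 10
assert evaluate(contract, {"balance": 100}) == 0
```

Since every program in such a language halts, one can enumerate or symbolically check its behavior over its inputs; what remains open, per the point above, is reasoning about how many such contracts interact across a whole network.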
So if I understand your approach correctly, you want to do something like Tezos (i.e. Nomic), where all rules are immutable? That's an interesting research idea, but I'll watch the area to see how it progresses. I don't know yet if I need that kind of P2P network, but we shall see.
|
|
|
In scanning your website, it appears that you are taking a GOFAI (Good Old-Fashioned AI) approach.
That is different from Intuition Fabric (INT) that uses a Deep Learning approach.
It is the difference between the Symbolists approach to AI versus the Connectionists approach to AI.
almost. it is the difference between learning from examples using statistical methods, versus rule-based ai, i.e. just using logical inference. the former is a special case of the latter, not vice versa. further, the latter makes sense in a trustless p2p network; the former does not.

Yes, I think we can agree on the differences in approach. I would disagree, though, with the former being a special case. Humans aren't built like machines, but are capable of logical inference. I use the Ethereum network for P2P capabilities; I don't use it as part of the inference process. Anyway, best of luck with your project! I will be at the NYC Token Summit on May 25th; we can chat then if you are attending.
|
|
|
Anyway, do let me know when you get your whitepaper out. I definitely would be interested in reading it.
Mine is out, so you can always read it.
|
|
|
That is why you probably should wait as long as you can to release the whitepaper (or maybe a short version first, then the full paper when it is needed).

the plan is to finish the paper first; that'd be safe and successful by my estimation. but if buyers repeat this request due to e.g. fear of competition (i.e. if more people agree with you), i can postpone publications by putting time into code (i'll reply later with a little more detail about tezos)

I'd rather you code. It's all about the product. Most people won't even understand the whitepaper anyway. The whitepaper will only raise the price of the agoras. Product, product, and product. We move forward!

Agree. I also don't mind waiting for the white paper. This is a dog-eat-dog environment. It's not the first time I mention Microsoft here, but Billy also stole someone's ideas; look where he is today... There is a lot of hype around Tezos now http://www.cnbc.com/2017/05/05/billionaire-investor-tim-draper-backs-new-cryptocurrency.html so this is a very tough decision imo, and everything should be taken into account. I guess it also depends on how far they can go with what they already copied.

Just want to add a note that Intuition Fabric's whitepaper is out. It weighs in at 124 pages: http://intuitionfabric.launchrock.com/
Of course, if you are talking about something else, then my apologies.

generally speaking, some buzzword combination, no matter how popular, cannot possibly make sense. deep learning + blockchain is one of them. i'll be glad to be proven wrong, but i also have proofs that i'm right. machine learning was my main field of r&d before i began with crypto. if there were a sound way to combine the two, i'd do it first.

Well, I don't really know exactly what your plan here is; however, rest assured that Intuition Fabric's (i.e. INT) approach to combining Deep Learning with Blockchain technology is quite innovative. You should read the whitepaper.
Well, clearly you won't be able to do it first since you just said that 'it cannot possibly make sense'. But just to be clear, whatever you are planning with Agora Tokens, it definitely would be very different from what we are building with Intuition Fabric. So there's no conflict here. Just a very different approach.
|
|
|
|