Bitcoin Forum
May 27, 2024, 04:01:33 PM *
  Show Posts
201  Other / Archival / Re: delete on: October 04, 2014, 04:26:27 AM
I am so hungry. For example, I posited a way to continually increase the difficulty by always structuring the attacker's blocks so that the fastest block solutions land in the discarded 20% set, thus skewing the statistics of the hashrate. I wrote the caveat that I hadn't studied the implementation to see if this was feasible.

Structured how? Specifically.

According to the statistics used to choose the 20% set, and given the loose rules about the timestamps the attacker can put on his blocks. Again I said I didn't study the implementation to see if the actual calculation can be so gamed. It is just a conceptual point.

If the code were something I could quickly wrap my head around, I would have looked at it. I have not seen the algorithm described in sufficient detail; like most things in Cryptonote, you have to go look at spaghetti code instead.
202  Other / Archival / Re: delete on: October 04, 2014, 04:17:02 AM
Whether your output is included in a mix makes no difference to your ability to spend it.

Try re-reading what I wrote.
203  Other / Archival / Re: delete on: October 04, 2014, 03:56:53 AM
A minor price drop is nothing more than the weak hands pissing themselves and they will regret it soon enough and buy back in at a loss.

This thread has become a joke.

Unless ring signatures are qualitatively the wrong solution for anonymity. The jury is still out on this one. Needs more analysis.

One thing I don't like personally is IBM says we are 10-15 years from a quantum computer, and then all that anonymity history goes poof, and then governments backtrack and go after all those assets that were hidden from the coming global implosion of 2016 - 2032.

But not everyone even agrees with my pessimism about the next 15 years.

Also I don't trust those simultaneous equations that I showed which mix operations over different number fields. That is entirely new cryptography and it could potentially enable some new mathematical attack at any time. Unlike Diffie-Hellman, it has no long history of cryptanalysis behind it.

I'd rather not put my anonymity in some unproven math on the block chain that is saved forever. Eventually there will be a quantum computer and all that will be cracked.

And we still have to see what the outcome will be on the de-anonymization and respective mitigation algorithms which are already known but not yet fully explored. Might be duds, but I doubt it.

Also although smooth claims they know how to prune ring signatures to make a better scaling blockchain, and even the algorithm I presented to them in theory does some pruning, I am not yet convinced ring signatures are congruent with the decentralization I would be aiming for. But this is very vague at this point and nothing I can really do to immediately get all the specifics enumerated.

That is several different vectors of weakness for ring signatures. I never really understood why some investors think they found the Holy Grail. We need more analysis to know how they compare against all other options.

Edit: add that the spender of a ring is culpable for which public keys he mixes with, unlike other anonymity mixing methods, which remove this choice and thus the culpability from the spender.
204  Other / Archival / Re: delete on: October 04, 2014, 03:49:04 AM
Certainly ring sigs don't automatically cause massive numbers of otherwise-unrelated transactions to suddenly depend on a rejected fork, especially if the fork is of limited duration. Granted there are slightly more dependencies, but that is a quantitative difference, not a qualitative one.

I posited to NewLiberty upthread that the development of a Gordian knot would depend on the duration of such an attack.

I argue it is also qualitative because my outputs get mixed into rings without my permission. Thus I can't spend during such an attack without incurring the risk that my spend must be unwound. Whether I know an attack is underway is irrelevant.

Quote
Quote
Quote
3. Non-Cryptonote coins do not throw away 20% of the timestamp information upon difficulty adjustment. I know you think the vulnerability I have broad-sketched above is not sufficiently detailed to warrant any concern, but nevertheless this is a risk that doesn't exist in other coins.

More vague uncertainty and doubt without some sort of positive statement.

I have described a specific set of steps for an algorithm upthread.

I missed that. Please quote it or summarize it.

I am so hungry. For example, I posited a way to continually increase the difficulty by always structuring the attacker's blocks so that the fastest block solutions land in the discarded 20% set, thus skewing the statistics of the hashrate. I wrote the caveat that I hadn't studied the implementation to see if this was feasible.

I posited this would cause the network hashrate to drop (because miners' profits depend on difficulty), thus increasing the attacker's % of the network hashrate.

Btw, the selfish mining white paper shows the math for this effect, so you've just opened a window to make it easier with less hashrate. You can work through the equations there. I suppose 20% as BCX said wouldn't be far off because of my recent interaction with that math.
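Since I haven't verified any of this against the actual Cryptonote difficulty code, here is the statistical effect as a toy model only (not the real algorithm): a difficulty window that sorts solve-time samples and cuts the outer 20% can be biased by an attacker who controls the timestamps on his own blocks. All constants (60 s target, 30-sample window, 20% attacker share) are illustrative assumptions.

```python
import random

random.seed(1)
TARGET = 60.0   # assumed target seconds per block (illustrative)
WINDOW = 30     # solve-time samples per difficulty window (illustrative)
CUT = 3         # samples cut from each end, 20% of the window in total

def trimmed_mean(intervals, cut):
    """Sort the solve-time samples and discard `cut` from each end
    before averaging -- a stand-in for the 20% outlier cut."""
    s = sorted(intervals)
    kept = s[cut:len(s) - cut]
    return sum(kept) / len(kept)

def honest_window():
    # Honest solve times for a Poisson block process are exponential.
    return [random.expovariate(1 / TARGET) for _ in range(WINDOW)]

def attacked_window(attacker_share=0.2):
    """The attacker back-dates his blocks so they report near-zero solve
    times.  Some land in the discarded low end, but the survivors drag
    the trimmed average down, making the hashrate look higher."""
    n_att = int(WINDOW * attacker_share)
    honest = [random.expovariate(1 / TARGET) for _ in range(WINDOW - n_att)]
    return honest + [0.001] * n_att

TRIALS = 2000
plain = sum(trimmed_mean(honest_window(), CUT) for _ in range(TRIALS)) / TRIALS
gamed = sum(trimmed_mean(attacked_window(), CUT) for _ in range(TRIALS)) / TRIALS
print(f"trimmed mean solve time, honest window:   {plain:5.1f} s")
print(f"trimmed mean solve time, attacked window: {gamed:5.1f} s")
# A lower measured solve time reads as a higher hashrate, so the next
# difficulty adjustment overshoots upward.
```

Whether the real implementation's cut can be gamed this way is exactly the open question; the sketch only shows the trimming statistic itself is not robust against adversarial timestamps.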

Quote
You actually did this in describing the existence of a stronger-than-MRL-0001 deanonymization attack (though not its scope and practical effect).

Oh I see you are recognizing that. Thanks.

Exactly, and this is not meant to personalize the issue with respect to you or anyone else or Monero or any other project. Specific, well-supported and well-presented contributions are more valued than vague ones. Always and everywhere.

I agree they are valued, but I entirely disagree they are universally more valuable in every case. Sometimes just the inkling of an incredibly powerful idea is more valuable to me than some implementation of something.

I am 100% sure you agree there are such cases.

Quote
It is very intuitive to me mathematically that you've got aliasing error in your difficulty adjustment.

Show a specific example (or more general mathematical proof, but I'm guessing that proof-by-example might be easiest here).

Too hungry. Will see if something comes to me later.
205  Other / Archival / Re: delete on: October 04, 2014, 03:10:58 AM
Auroracoin (which btw rpietila invested in

He suggests that he didn't. What is your evidence?

Even though I was interested in this before the great pump in March (and would have made up to 100x gains if I had bought), now it is in a "following" mode after crashing back. If I moved to Iceland, I would probably start using it. Not an unconditional "sell" though.

Okay I thought he had because it seemed like he was very interested, but I did tell him that it had the wrong distribution model thus couldn't be anything other than a pump and dump. I had assumed he sold on the way up and wasn't left holding the bag, but now I learn he never bought.
206  Other / Archival / Re: delete on: October 04, 2014, 03:01:03 AM
...
Strictly by the definitions, the "Bitcoin Mining Problem" (BMP), the partial inversion of SHA256, is O(1), because it does not matter what the algorithm outputs when n is not 256.  Hence, strictly by the definitions, BMP is in P, and therefore is in NP; but is not NP-complete.  And, indeed, it could be solved by a giant table.  Of course, that table is too big to be built in practice; but that obstacle is not relevant for the theory.

I am not aware of any theory that would give a useful estimate of the practical difficulty of solving some problem with fixed-size inputs, like the BMP.  At best, one can prove theoretically that a certain specific shortcut will not work; but there is no way to prove that there are no practical shortcuts.  The only "proof" that BMP is hard is that many people have tried to find such a practical shortcut, for many years, with no success.

It is a terrible misconception to think that "exponential cost" is bigger (or grows faster) than "polynomial cost".  That is true only for "sufficiently large n", but the theory does not say when n is "sufficiently large"; it could be true only for n greater than 10^500...

Indeed you are correct that the theory only makes proofs about the asymptotic complexity. It is analogous to Random Oracle proofs of cryptographic properties, because it is in fact proven that no perfect Random Oracle can be created.

We can prove nothing in this universe with absolute certainty because we are not an external observer (study the assumptions of the Shannon-Nyquist theorem for example). I get into the philosophical discussion of this in my article about the Universe.

However, I can analyze at which ranges of n algorithm resource costs scale more like n^k versus n log n. Maybe I can even write a proof for a specific algorithm that it scales approximately like some function over a certain range of n. Are you saying that can't be done? I've never tried; I just intuitively looked at the code and could see it was scaling exponentially, and one line was the culprit.
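For what it's worth, the "scales like n^k over a certain range of n" claim can be made empirical without any proof: count an algorithm's operations at several values of n and fit the slope of log(cost) versus log(n). A toy example of my own with bubble sort, whose comparison count is exactly n(n-1)/2, so the fitted exponent should come out near 2:

```python
import math
import random

random.seed(0)

def bubble_ops(xs):
    """Count comparisons made by a plain bubble sort -- O(n^2)."""
    a, ops = list(xs), 0
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            ops += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return ops

def slope(ns, costs):
    """Least-squares slope of log(cost) vs log(n): the observed
    exponent k in cost ~ n**k over this range of n."""
    xs = [math.log(n) for n in ns]
    ys = [math.log(c) for c in costs]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

ns = [100, 200, 400, 800]
costs = [bubble_ops([random.random() for _ in range(n)]) for n in ns]
k = slope(ns, costs)
print(f"observed exponent over n={ns}: k ~ {k:.2f}")
```

This observes scaling over a range; it proves nothing about asymptotics, which is the whole distinction being argued here.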

You are correct that if our best known algorithms are impractical to implement with current resources, it doesn't mean there isn't any possible algorithm that will be practical. But here I want to take you back to my discovery about the edge of the universe. I was toying around with the duality of the Bottom and Top types in the two different classes of programming languages, and it made me realize that time and the universe are co-inductive and thus the finality or edge is indeterminate, which is analogous to undecidability in the Halting problem.

Thus in the real world all we have are the observations we've made, i.e. the calls we made to our co-inductive universe.

In summary, you were talking about what is provable and I was talking about what is observable. That is why we were talking past each other. And I used the wrong terminology, because O() and NP/P have very specific meanings that are essentially meaningless, because they can't ever be observed. I get your point now, yes indeed.



Edit:

I have seen several computer scientists and students who, trusting that "O(log n) is much better than O(n)", tried to use this  method for things like pattern matching, where each site is an observed pattern, represented by a point in some high-dimensional space.  Only to find that, for that application, the quad-tree method is often much slower than plain sequential search.

In the real-world there is no absolute referential transparency, thus we can never be sure there aren't other input variables other than n that impact the performance.

It turns out that the quad-tree begins to work only if the number of sites n is at least 2^d.  It works quite well for n = 1000 and d = 2 or 3; but, for d = 30, it is a total waste of time unless n is a lot greater than a billion.  Not "smallish" at all...

The big-O notation was invented by pure mathematicians, to describe the limiting behavior of functions from reals to reals, such as the solutions of differential equations.  It was adapted to algorithm analysis by Don Knuth, IIRC.  However, for Knuth it was only a working tool, that would give an idea of the type of formula that had to be determined.  For example, if the number t(n) of inner block executions was found to be O(n^2), one could write t(n) ≤ A n^2 + B n + C, and then look for suitable values of A, B, and C.  If t(n) was found to be exponential, one would look for a formula like t(n) ≤ A × B^n.  And so on.

However, finding those constants requires lots of boring algebra, so Knuth's lazy followers tacitly agreed that finding the big-O of the running time was good enough.  And then came the really lazy ones, who decided that "polynomial time" was all one needed to know.

Ah so you do agree with me that we can apply the concept of approximated scaling to analysis of algorithms, with caveats. Yeah we can't prove anything absolutely.
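Stolfi's n versus 2^d point above is easy to check numerically. One level of quad-tree splitting partitions space into 2^d cells; this sketch (my own toy, not from his post) counts how many of those cells n random points actually occupy:

```python
import random

random.seed(0)

def occupied_cells(points):
    """Map each point to one of the 2**d cells formed by splitting every
    axis once at 0.5 (one level of a quad/hyper-octree)."""
    return len({tuple(int(x >= 0.5) for x in p) for p in points})

n = 1000
results = {}
for d in (2, 3, 30):
    pts = [[random.random() for _ in range(d)] for _ in range(n)]
    occ = occupied_cells(pts)
    results[d] = occ
    print(f"d={d:2d}: {n} points occupy {occ} of {2**d} cells "
          f"({n / occ:.1f} points per occupied cell)")
# For d=30 nearly every point sits alone in its own cell, so the tree
# groups nothing and the search degenerates to a slow sequential scan.
```

With d = 2 or 3 every cell holds hundreds of points, so subdividing prunes real work; with d = 30 there are over a billion cells for a thousand points, which is the "n must be at least 2^d" failure mode in miniature.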
207  Other / Archival / Re: delete on: October 04, 2014, 02:31:41 AM
You ignored my point that each independent coin toss trial outcome is uniformly distributed whereas the Poisson distribution is exponentially distributed.

That is why I asserted that your and xulescu's analogies are inapplicable. Rare trial outcomes in a Poisson distribution occur less often than less rare ones (look at the area under the distribution curve at the tails). Whereas all trial outcomes in a coin toss occur with the same probability.

Ah *snooze*.

Do you really think we're idiots? The analogy was for an error in your modelling that you yourself accepted as valid, not for the numbers.

Er, no I didn't. I said I made no proof whether Poisson distribution is applicable. But note every research paper I read about Bitcoin claims the block chain is a Poisson process.

To my models it makes no difference whatsoever if they're coin tosses, loaded D20's or, indeed, exponentials or Poisson.

And I posited to you that your regression model is blind because you don't know the distribution of the process. You claim there isn't enough entropy to establish one, but that is your assumption. I don't see any proof from you to validate your assumption that block chains are not approximated by Poisson distributions. Why should I believe you when the research papers claim otherwise?

To address your point directly, a uniform distribution CANNOT have a tail by definition.

Exactly I made that point.

In terms of counts, both Poisson and binary distributions have normal tails.

If you mean the Binomial distribution, the random variable is the # of outcomes in a series of trials. I was referring to the distribution of a single trial. In the p=0.5 Bernoulli trial coin toss, the distribution of a trial is uniform—all possible outcomes for each independent trial are equally probable, aka uniform. Whereas, for the Poisson trial, the probabilities of the possible outcomes have an exponential tail.
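That single-trial distinction can be simulated directly (my own sketch; the 60 s mean is an illustrative stand-in for a block target): a fair coin's two outcomes are equally likely, while exponential solve times put geometrically less mass in each successive time bin.

```python
import math
import random

random.seed(0)
N = 100_000
MEAN = 60.0   # illustrative mean solve time in seconds

# Fair-coin trial: the single-trial distribution is uniform over {H, T}.
p_heads = sum(random.random() < 0.5 for _ in range(N)) / N

# Poisson-process trial: solve (inter-arrival) times are exponential,
# so each successive mean-width time bin holds ~e^-1 of the mass of the
# one before it -- that is the tail a coin toss doesn't have.
times = [random.expovariate(1 / MEAN) for _ in range(N)]
bins = [0] * 5
for t in times:
    bins[min(int(t // MEAN), 4)] += 1   # last bin is open-ended
fractions = [c / N for c in bins]

print(f"P(heads) ~ {p_heads:.3f} (flat: no tail by definition)")
print("P(solve time in k-th bin):", [f"{f:.3f}" for f in fractions])
print(f"bin-to-bin decay ~ {fractions[1] / fractions[0]:.3f}"
      f" (theory: e^-1 = {math.exp(-1):.3f})")
```

The decay ratio is the tail structure being argued about; whether real block chains deviate from this ideal Poisson model is the separate, unsettled question.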

Both also completely fail because they assume complete independence between the samples.

You make two afaik unproven assumptions:

1. Blocks are not significantly independent.

2. Imperfect independence makes the Poisson model a useless approximation.

In the meanwhile you keep wasting your time on this triviality.

I will not let myself be insulted on the math, unless I deserve it (which can happen) in which case I will mea culpa.
208  Other / Archival / Re: delete on: October 04, 2014, 02:14:38 AM
Someone wanna sum up whats going on right now?

Can anyone dig up that link to the blog that was summarizing the thread? Has the author updated his summary?
209  Other / Archival / Re: delete on: October 04, 2014, 02:13:16 AM
Come-In-My-Behind, please.

Fine I will stop, only because I respect you.

I was just cracking a joke. Carry on.
210  Other / Archival / Re: delete on: October 04, 2014, 02:08:04 AM
Come-In-My-Behind, please.
211  Other / Archival / Re: delete on: October 04, 2014, 02:03:08 AM
1. Some coins have much higher network hashrate (difficulty) thus can't be as realistically attacked by someone with BCX's level of alleged resources.

As you have said before, BCX doesn't really matter. If there is a vulnerability, and BCX doesn't exploit it, someone else may, and probably will. I find hash rate attacks uninteresting, and by now they should be well understood by all cryptocoin participants (if not, then caveat emptor applies).

Hashrate attacks combined with alleged new vulnerabilities in Cryptonote are not yet fully understood.

Quote
2. Non-Cryptonote coins do not have ring signatures, which make the block chain untraceable and thus make it implausible (or very difficult) to do manual repair by segregating the stolen coinbase and double-spent traces from the transactions you'd like to keep.

So you are talking about blacklisting. Because otherwise there is no manual repair. One fork wins, the other fork loses.

You are referring to the block hash being immutable with the transactions in the block. That immutable relationship is not commutative, because the transactions are orthogonal to the block hashes.

Thus you can add the good transactions (from the bad fork) back to your good fork if you can untangle them. No blacklisting needed.

Quote
3. Non-Cryptonote coins do not throw away 20% of the timestamp information upon difficulty adjustment. I know you think the vulnerability I have broad-sketched above is not sufficiently detailed to warrant any concern, but nevertheless this is a risk that doesn't exist in other coins.

More vague uncertainty and doubt without some sort of positive statement.

I have described a specific set of steps for an algorithm upthread.

Nothing in life is entirely certain. There are degrees of contribution and certainty. Apparently you think my contribution on that is immaterial?

Quote
4. BCX killed Auroracoin

This is disputed, but again you are personalizing the issue with respect to BCX. I don't.

You don't evaluate people based on their performance thus their likelihood of achieving their stated goals?

I am not aware of it being disputed. How certain is that dispute you claim?

Quote
Thus from my perspective at least, it gives the appearance you are still doing FUD control and refusing to be open-minded, rational, and objective. And this is the cultural problem of Monero.

Again personalizing. I disagree with your characterizations but they don't really matter.

Show some actual work, shut up, or continue to FUD. There is no fourth way.

Past days I have been discussing ideas about potential attacks. I have no idea why you would characterize this as not being actual work. I've already explained to you that it is part of my actual work. If XMR can benefit too, great. I share for any eyeball that wants to avail. Perhaps you think I am doing this to intentionally hurt XMR, or perhaps you think I am doing this to waste my time. The former is not true because I can't hurt XMR with my words long-term, and the long-term is all that matters to me in crypto-currency. As for wasting my time, the exploration of attack ideas is very productive work for me, but this silly arguing with you is a waste of our time. I am surprised you came to the feeling that I was trying to be non-constructive. I expect that you can see I am trying to do useful work. But apparently you've become convinced that only code is useful (or something like that).

Scala, math, pseudocode, even precise English that doesn't rely on phrases such as "it might be possible to" or "it can't be proven that this isn't a flaw." Or a simple precise example of a set of actions that can be taken by an attacker to accomplish something. It doesn't matter which.

I have given you pseudocode for the bounty. I never heard back if it needed more details.

You actually did this in describing the existence of a stronger-than-MRL-0001 deanonymization attack (though not its scope and practical effect).

Oh I see you are recognizing that. Thanks.

Since then you have contributed no substantive information to this thread, just repeated over and over again the same vague warnings about time warps, simultaneous equations, entanglement and similarly ill-defined and underdefined notions.

Or to borrow one of your favorite quotes, "Talk is cheap. Show me the code." I don't even ask for actual code, just specifics.

Look it is a process man. You don't eat the pie before it is cooked.

Interaction spawns insights.

I have provided a lot of specifics over the past 2 days. Maybe not the specifics you want, but they are leading somewhere (I think).

I would have removed that 20% crap from the difficulty adjustment immediately.

All the difficulty attacks have had one common denominator. They all were based on exploiting information that was thrown away. For example, KGW has a weakness that you can push the difficulty way up instantly, then bring it down fast but stay under the threshold for adjustment.

It is very intuitive to me mathematically that you've got aliasing error in your difficulty adjustment.

I don't have to code a damn thing to see that.
212  Other / Archival / Re: delete on: October 04, 2014, 01:32:11 AM
Believe me - I'm not posting because I'm in XMR. I'm posting because I was forced out of XMR.

No one can force you if you know what you are doing in the first place.

And more importantly I'm posting because I hate wankers and cunts that try to make themselves out to be some kind of protagonists of truth and virtuous social salvation while just being narcissistic wankers and cunts.

Your inadequate technical acumen appears to be the cause of your misplaced angst.
213  Other / Archival / Re: delete on: October 04, 2014, 01:20:41 AM
You have unfounded assumptions about shills being around.

Seems like you may have gotten the word shill and supporter mixed up. Or are they the same to you?  Grin Grin Grin

Evidence:


214  Other / Archival / Re: delete on: October 04, 2014, 01:13:14 AM
I am 80% certain that this cultural stance is going to be why XMR is beaten by another effort that understands better how to spur innovation by not suppressing or expecting a Cathedral style of progression.

The cultural stance is not significantly distinguished from the range of cultures I've seen on other open source projects on which and with which I've worked, which is quite a few.

Your characterization of it as a cathedral is a straw man.

Perhaps it is a cathedral compared to your internal mental model of what constitutes bazaar-style development, but by the standards of reality it is not.

Reality is that the bazaar has been ongoing on this forum for years now. And you can't stop it. Don't you see all the experimentation in altcoins and all the wide-open discussion of ideas?

Your microcosm is your reality, but it is not the only reality.

For example, you were overruled on a higher level of perpetual debasement. Some other coin will do what XMR doesn't.

My point is also that the Inverse Commons is not just intra-project, rather it is also inter-projects (as in the distinction between intrastate and interstate).

XMR is competing for developer resources, not developers competing to code for XMR as is the case for Linus.

Quote
You are equating the ability for a user to wait for say 6 confirmations to have a mathematically quantifiable probability of assurance, with the risk of a coin vulnerability allowing spends of any age to be double-spent (reverting a spend is double-spending).

No, I'm saying that any recipient of any coin is vulnerable to a chain fork, which is a judgement of the recipient that they have waited "long enough" for such a chain fork to become unlikely. I said nothing about 6 confirms. In practice recipients make their own judgement about number of confirms. In Bitcoin many use less, and in altcoins many require far more. In all cases there is likely some sort of implicit or explicit fraud scoring, but that is really none of my business and is up to the recipients to sort out. Ring signatures do nothing to change this except what they are intended to do (inhibit tracing, and thus indirectly blacklisting), and nothing here is remotely specific to Monero.

Afaics, you entirely missed my point.

You had a category error. You compare quantifiable (known a priori) probability with attacks that have no quantifiable (no known a priori) probability of assurance. You compare from the perspective of the user a knowable and user-selected risk with an unknowable risk and no possible user choice (other than to divest).

There is no such error. The same exact risk in kind exists with every other coin (including Bitcoin). The only difference is one quantity (how plausible such a fork is to occur). That is a judgement of the coin recipient. If your suggestion to them is to not accept any such coins ever, that's a valid point of view, but it applies equally to other coins as it does to Monero. Nothing presented is specific to Monero, except as I said the relative impossibility of blacklisting. Perhaps your advice is to not accept coins that don't allow for blacklisting, or perhaps I am misinterpreting.

Sorry there is a category error.

You erroneously equated different categories of risk as I already explained.

Now you are moving the goal posts and saying that all coins have this risk. That is a different point. And that point fails for numerous reasons:

1. Some coins have much higher network hashrate (difficulty) thus can't be as realistically attacked by someone with BCX's level of alleged resources.

2. Non-Cryptonote coins do not have ring signatures, which make the block chain untraceable and thus make it implausible (or very difficult) to do manual repair by segregating the stolen coinbase and double-spent traces from the transactions you'd like to keep.

3. Non-Cryptonote coins do not throw away 20% of the timestamp information upon difficulty adjustment. I know you think the vulnerability I have broad-sketched above is not sufficiently detailed to warrant any concern, but nevertheless this is a risk that doesn't exist in other coins.

4. BCX killed Auroracoin (which btw rpietila invested in and asked my opinion about and I warned him it would be a pump and dump) and now he tells you what the vulnerabilities of XMR are, so these have to be taken as slightly more credible than if randomjoeblow said it.

Thus from my perspective at least, it gives the appearance you are still doing FUD control and refusing to be open-minded, rational, and objective. And this is the cultural problem of Monero.
215  Other / Archival / Re: delete on: October 04, 2014, 12:46:13 AM
This post is just my personal preference. It should not be taken as representative of most developers...

If the XMR code weren't a shitload of C and were written in some higher-level language such as Scala, I would enjoy coding on it. Sorry, I don't use C any more except for the optimized smallish portion of a code base. I wrote 10s of 1000s of lines of assembly and C code in the 1980s. Enough of that.
And thus I also like to code in high-level paradigms.

1. There is exceedingly little C except the crypto libraries (essentially copied from another existing crypto library project), which we aren't working on because existing crypto libraries are fine (other than verifying that the code has not been tampered with and tracks upstream fixes if any).  The bulk of the project is C++

Worse stench imo.

2. Simulations, prototypes or other contributions can be in any language. Tacotime's simulation used Python. Prototypes have also been done in Python and then converted to C++. Other work has been done with Matlab, and probably some others I don't remember. Scala or well-presented math or anything else is welcome and useful.

I am the type of programmer who makes sure I understand all the code. I don't program blinded because I feel too handicapped if I attempt to do so (because I am always thinking total paradigm shifts, out-of-box, and global cascade, i.e. I load up the entire design in my head). So I don't just pop in and offer some code without investing myself in the code base of the project. I don't like to offer some half-assed simulation, for example, and then not be able to follow through with the implications found, etc.

I also like K.I.S.S. Nothing irritates me more than projects that have so many dependencies I can't wrap my mind around them all. I believe in breaking large projects into smaller orthogonal projects with well-defined APIs when desirable.

Vague assertions of uncertainty and doubt are close to the definition of FUD.

I don't care about any spin on the definition for political reasons. All I care about is what is productive or not towards the goal of refining crypto-currency.

No one is trying to suppress that even (futile in practice to even try anyway),

I have a copy of an IRC log in my private messages which refutes any claim there is no effort to suppress "FUD" (which might not necessarily mean trying to suppress my posts but rather other less informed n00bs).

Any way, I don't care about that. The last point below is more relevant to me.

but I am pointing out that it isn't useful to Monero or any other project.

If you are stating that the effort I put into this thread is not useful to XMR, then that confirms for me my pet theory of the reason XMR, despite its formidable brain power, won't be winning the crypto-currency race.

If you are claiming that no other project will benefit from my effort in this thread, I think that is myopic because for example I can see how the concepts I tossed around in this thread have aided my own project. Now you might mean any project that will ever be seen by the public, nevertheless it still seems highly presumptuous.
216  Other / Archival / Re: delete on: October 04, 2014, 12:27:18 AM
I am 80% certain that this cultural stance is going to be why XMR is beaten by another effort that understands better how to spur innovation by not suppressing or expecting a Cathedral style of progression.

The cultural stance is not significantly distinguished from the range of cultures I've seen on other open source projects on which and with which I've worked, which is quite a few.

Your characterization of it as a cathedral is a straw man.

Perhaps it is a cathedral compared to your internal mental model of what constitutes bazaar-style development, but by the standards of reality it is not.

Reality is that the bazaar has been ongoing on this forum for years now. And you can't stop it. Don't you see all the experimentation in altcoins and all the wide-open discussion of ideas?

Your microcosm is your reality, but it is not the only reality.

For example, you were overruled on a higher level of perpetual debasement. Some other coin will do what XMR doesn't.

My point is also that the Inverse Commons is not just intra-project, rather it is also inter-projects (as in the distinction between intrastate and interstate).

XMR is competing for developer resources, not developers competing to code for XMR as is the case for Linus.

Quote
You are equating the ability for a user to wait for say 6 confirmations to have a mathematically quantifiable probability of assurance, with the risk of a coin vulnerability allowing spends of any age to be double-spent (reverting a spend is double-spending).

No, I'm saying that any recipient of any coin is vulnerable to a chain fork, which is a judgement of the recipient that they have waited "long enough" for such a chain fork to become unlikely. I said nothing about 6 confirms. In practice recipients make their own judgement about number of confirms. In Bitcoin many use less, and in altcoins many require far more. In all cases there is likely some sort of implicit or explicit fraud scoring, but that is really none of my business and is up to the recipients to sort out. Ring signatures do nothing to change this except what they are intended to do (inhibit tracing, and thus indirectly blacklisting), and nothing here is remotely specific to Monero.

Afaics, you entirely missed my point.

You had a category error. You compare quantifiable (known a priori) probability with attacks that have no quantifiable (no known a priori) probability of assurance. You compare from the perspective of the user a knowable and user-selected risk with an unknowable risk and no possible user choice (other than to divest).
217  Other / Archival / Re: delete on: October 04, 2014, 12:01:54 AM
so they're paying u to find flaws and ur just badmouthing them and saying they don't care?

No that is a mischaracterization. What happened was that a bounty was offered for revealing an exploit that was shown to be stronger than what we had already published. The exploit turned out to have no obvious relevance to BCX, but nevertheless the requirements were met and the bounty was paid.

Agreed, except "a bounty" was paid and so far not "the bounty" — or at least last time I checked only 7.5 of 10 BTC had been paid. Note smooth's group and jl777 paid instantly. It was explained to me that we have a Christian leader who doesn't follow the Biblical instruction to pay wages daily upon agreed completion of work. A rich man floating on a poorer man's income is a sin according to the Bible. I used to do payables net 30 etc., until I became aware of this Biblical verse and it shamed me.

The exploit turned out to have no obvious relevance to BCX...

Smooth is referring to the fact that I have not shown any proof that the private keys can be cracked; I only pointed to some simultaneous equations that are revealed when the spender of the ring is identified per my (or their published) algorithm. One case of the simultaneous equations has been shown by their mathematicians to be equivalent to a Diffie-Hellman exchange, and is thus proven uncrackable under current assumptions. However, not all of the cases were ruled out. Thus he is factually stating that any relevance to BCX is unproven and not obvious. He is not saying there is no possible relevance, although I assume he thinks the probability is exceedingly small, thus the onus is on someone other than them to prove it is relevant.

Note afaics the mathematical point of Taleb's Antifragility is that many improbable long tail events accumulate.

Yet I think the algorithm they paid a bounty for may have an obvious relevance to BCX. If a BCX chain attack gets hopelessly wound in a Gordian knot, my algorithm may be able in some cases to untangle the anonymity so that it can be determined which transactions actually spent the stolen coin base rewards. I don't know how realistic it is until it is coded, tested, and characterized.

Useful contributions would likely take the form of code or well formed mathematical analysis. That is exceedingly rare here, but it does happen.

In case the issue of Linux comes up again, I will point out that I have personally made contributions to Linux. They took the form of recommended code changes coupled with test cases that clearly and explicitly demonstrated the benefit of the proposed code changes. If I simply posted about how this or that was a possible flaw without doing the work to support my claims, I would likely be ignored or ridiculed, as I have seen happen to many people in the Linux developer community.

Agreed. But you are conflating apples (Linux) and quarks (crypto-currency), because Linux is an effort to copy an existing operating system (Unix) and a concept which was already fleshed out in the real world. Whereas crypto-currency is still very much in the formative, research, experimental, conceptual stage.

I am 80% certain that this cultural stance is going to be why XMR is beaten by another effort that understands better how to spur innovation by not suppressing or expecting a Cathedral style of progression.

TFM: "Entanglement" is no more an issue than reversal of the often-untraceable transfer of coins after a long (but ultimately temporary) chain fork, particularly through exchanges and other intermediaries. In both cases, when the fork is resolved one way or another, some transactions will be unwound, and those who accepted coins as valid on the wrong fork without a sufficient number of confirmations (always a judgement call) are out of luck.

Shocking.  Shocked

You are equating the ability for a user to wait for say 6 confirmations to have a mathematically quantifiable probability of assurance, with the risk of a coin vulnerability allowing spends of any age to be double-spent (reverting a spend is double-spending).
218  Other / Archival / Re: delete on: October 03, 2014, 11:16:12 PM
One person's FUD is another person's eureka instigator.

I'm waiting for that to happen even once. Then I will be convinced.

Some Bitcoin devs thought selfish mining was FUD.

I understand you want BCX to walk them through an attack with more details, but recall that even very smart Bitcoin devs didn't believe selfish mining was real after the white paper was published, until they built simulations to disprove it and ended up confirming it.
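Those simulations are easy to reproduce in miniature. Here is a rough Monte Carlo sketch of the Eyal–Sirer selfish-mining state machine — a simplification under stated assumptions (the 1-vs-1 race is resolved by extra random draws, any unpublished lead at the end is discarded, and the function name is mine), not a validated simulator:

```python
import random

def selfish_revenue_share(alpha: float, gamma: float, blocks: int, seed: int = 1) -> float:
    """Estimate the selfish pool's share of blocks on the final chain.
    alpha: selfish pool's hashrate fraction; gamma: fraction of honest
    miners that build on the selfish block during a 1-vs-1 race."""
    rng = random.Random(seed)
    lead = 0                 # private chain's lead over the public chain
    selfish = honest = 0     # blocks each side lands on the final chain
    for _ in range(blocks):
        if rng.random() < alpha:      # selfish pool finds a block, keeps it private
            lead += 1
        else:                         # honest network finds a block
            if lead == 0:
                honest += 1
            elif lead == 1:           # pool publishes; race between two 1-block tips
                if rng.random() < alpha:      # pool mines next, wins both blocks
                    selfish += 2
                elif rng.random() < gamma:    # an honest miner extends the pool's tip
                    selfish += 1
                    honest += 1
                else:                          # honest miners extend their own tip
                    honest += 2
                lead = 0
            elif lead == 2:           # pool publishes everything, orphans honest block
                selfish += 2
                lead = 0
            else:                     # pool releases one block, keeps a safe lead
                selfish += 1
                lead -= 1
    total = selfish + honest
    return selfish / total if total else 0.0
```

With alpha = 0.4 and gamma = 0.5 the pool's chain share comes out well above its 40% hashrate, which is what the disbelieving devs ended up confirming.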


In this space the reasonable a priori belief is that FUD is just FUD.

That does not mean it can't be reevaluated in light of new evidence, but without evidence, no, and without reevaluation the a priori remains rationally useful.

Irrational.

FUD (fear, uncertainty, doubt) is the reality because you don't know how vulnerable XMR is or is not. Thus it is healthy and should be priced in. Controlling information flow to prevent credible FUD dissemination is very unhealthy and prevents Antifragility. Note I am distinguishing credible and rational discussion of concepts from pure chicken little noise from n00bs.

http://unheresy.com/Information%20Is%20Alive.html#Knowledge_Anneals

Quote from: myself, whoever I am
Knowledge Anneals

Unsophisticated thinkers have an incorrect understanding of knowledge creation, idolizing a well-structured top-down sparkling academic cathedral of vastly superior theoretical minds. Rather, knowledge primarily spawns from accretive learning due to unexpected random chaotic fitness created from multitudes of random path dependencies that can only exist in the bottom-up free market. Top-down systems are inherently fragile because they overcommit to egregious error (link to Taleb's simplest summary of the math).

Note I traded emails with Taleb and it seemed he basically concurred.


The code is open to all eyeballs, just as is Linux. Github, including developer's forks on github are open. Discussions on #monero-dev and other suitable channels are also open.

There is nothing being hidden, but we await actionable competently analyzed information. So far there has been none.

If the XMR code weren't a shitload of C and were instead written in some higher-level language such as Scala, I would enjoy coding on it. Sorry, I don't use C any more except for the small optimized portion of a code base. I wrote tens of thousands of lines of assembly and C code in the 1980s. Enough of that.



I sat on the developer IRC for a while and I see mostly talk about the nitty-gritty details of the source code. Sorry, I am a high-level thinker, and thus I also prefer to code in high-level paradigms.
219  Other / Archival / Re: delete on: October 03, 2014, 10:50:19 PM
bite the hand that feeds u much?

Expending my scarce time to, for example, explain why checkpoints are not a complete solution is not helping?

That some people think I am not helping means they are biting the hand that feeds them.

So you agree ring signature entanglement from the coinbase transactions of the attacker's fork could in theory occur?

How to unwind those?
This is a good question.
I'd like to think about it a bit, but I am still cleaning up the NEFT that spurted out of my face a while back.
After that, I'm going to feed the alpacas and see if anything comes to me.
It is a really good question.  Just too many distractions at the moment.

220  Other / Archival / Re: delete on: October 03, 2014, 10:38:19 PM
-blah-

Jimbob... you read our Monero Research Lab's very first publication, right? You know, the one where we spoke about a cascading privacy failure if an attacker owned sufficient outputs? Here's a link for you to save yourself. At any rate, this could occur in a CryptoNote coin where persons unknown to everyone else controlled, to thumb-suck an example, 82% of all the outputs. That would be an exceedingly unsafe CryptoNote coin to use, as those person(s) could easily reveal the actual signer of just about any transaction, thus negating any benefit of ring signatures.

When choosing a currency to shill for, you really should choose one that doesn't have that flaw.

And XMR paid me (7.5 BTC thus far, 2.5 BTC in arrears) to supplement that with a potential amplification and mitigation, where for example in some cases the attacker doesn't even need any of the outputs that are in the ring signature. Omitting my contribution (which y'all paid for, thank you) could possibly be construed as subconscious "not invented here bias" (hope not).
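The cascade referenced above can be illustrated with a toy fixed-point model — my own back-of-envelope, not the MRL-0001 analysis itself. It assumes decoys are drawn uniformly and independently, and that any deanonymized output counts as "known" for later rings:

```python
def traceable_fraction(owned: float, mixin: int, rounds: int = 50) -> float:
    """Toy chain-reaction estimate: a ring signature is fully deanonymized
    when all `mixin` decoys are already known (attacker-owned or previously
    deanonymized outputs), so the known fraction feeds back on itself."""
    known = owned
    for _ in range(rounds):
        # Non-attacker outputs get exposed when every decoy in their ring
        # is already known, which happens with probability known ** mixin.
        known = owned + (1.0 - owned) * known ** mixin
    return known

# An attacker owning 82% of outputs, with mixin 3:
print(round(traceable_fraction(0.82, 3), 3))  # → 1.0
```

In this toy model, 82% ownership cascades to essentially total traceability, while a small attacker (say 20% ownership) barely amplifies at all — consistent in spirit with MRL-0001's conclusion that high output ownership combined with low mixin is devastating.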

The extent of that vulnerability has not been quantified and thus nothing of substance is demonstrated; we merely have a proof of concept and a supposition. By contrast, the MRL-0001 analysis shows numerically that 82% control combined with low typical mixin usage on the network is clearly devastating.

FUD (aka repeatedly proclaiming "this might be a flaw/vulnerability" ) is easy. Proving something to be an actual practical vulnerability is much harder.

Agreed.

That is certainly not "not invented here." We've uncovered various potential issues that are also in need of in depth analysis. We consider both to be in the same category, ...

Commendable.

... except that we don't go around shouting about a "possible flaw" before we actually know the validity and extent of what we are looking at.

Controlled information is never as efficient as an Inverse Commons. The reason is Linus's Law: "given enough eyeballs, all bugs are shallow".

One person's FUD is another person's eureka instigator.

Others and I have tried many times to suggest that the culture of controlling information flow is what is causing such a big problem for XMR's public perception. I also privately speculate (my pet theory) that it limits technical innovation, diluting the impressive brain power in the group.

Information and truth want to be free(dom).