Bitcoin Forum
  Show Posts
121  Alternate cryptocurrencies / Altcoin Discussion / Re: Anti ASIC/GPU/FPGA POW-algorithm. New (2019). on: December 14, 2019, 08:59:35 PM
S-boxes are very common in algorithms like this :) but yeah, if you pick one from an existing algorithm off the shelf there can always be a backdoor, and they are hard to design properly; maybe it's possible to find a simple one that can fit here. But it's just in case there are too many weak numbers on a long ring, maybe it could improve things, especially since the brute force can be run on parallel (//) cores.

But the old ones like GOST/DES have already been studied many times, and their inner workings are well known by now.

Blowfish can even generate its S-boxes from the key data directly. But still, you can never be 100% sure with an already-made algorithm.

Maybe even simple Huffman coding could remove some problems when there are lots of zeros or repetitive bit sequences, to make the number "more compact" so to speak. Or maybe only as a test: if the number can easily be compressed with Huffman, that means it has low entropy and should be changed.
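That compressibility test can be sketched quickly. A minimal sketch in Python (the helper names `byte_entropy`/`looks_weak` and the 3-bits-per-byte threshold are my own assumptions, not from the thread): a number whose bytes have low Shannon entropy would compress well under Huffman coding, so it can be flagged as a candidate weak number without actually building the Huffman tree.

```python
from collections import Counter
from math import log2

def byte_entropy(n: int, width: int = 32) -> float:
    """Shannon entropy (bits per byte) of n's fixed-width big-endian bytes."""
    data = n.to_bytes(width, "big")
    total = len(data)
    counts = Counter(data)
    return -sum(c / total * log2(c / total) for c in counts.values())

def looks_weak(n: int, threshold: float = 3.0) -> bool:
    """Flag numbers whose byte representation is highly compressible."""
    return byte_entropy(n) < threshold
```

A 256-bit number with a single bit set scores near 0 bits per byte (almost all zero bytes), while a number with all-distinct bytes scores the maximum for its length.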
122  Alternate cryptocurrencies / Altcoin Discussion / Re: Anti ASIC/GPU/FPGA POW-algorithm. New (2019). on: December 14, 2019, 07:58:37 PM
S-boxes are like simple substitution tables, lookup tables that make a sequence less repetitive; you find them in all serious block ciphers (DES, GOST, etc.).

https://en.m.wikipedia.org/wiki/S-box


https://who.paris.inria.fr/Leo.Perrin/pi.html

The cryptographic properties of the S-box play a crucial role in the security of the algorithm because they are the only source of non-linearity. They are also at the center of the security arguments given by algorithm designers. In fact, designers are expected to explain how the S-box they used was designed and why they chose the structure their S-box has. For example, the AES has an S-box which is based on the multiplicative inverse in the finite field GF(2^8). This choice is motivated by the fact that both the linearity and the differential uniformity of this permutation are the lowest known to be possible.

It's essentially there to improve security when the data can be predictable. If there is a run of zeros or a repetitive pattern, it changes it into something less predictable, which can deter certain kinds of analysis.

Even a simple compression algorithm could reduce "blanks" or repetitive sequences that could be exploited, but on short keys it's not very efficient; I think an S-box would be more efficient.

I'm not sure it will help a lot here, but it could improve things I guess. Normally it's supposed to make bit operations like these less predictable.
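For illustration, the lookup mechanics are just a table indexed by the input byte. This sketch uses a seeded random permutation as a stand-in S-box, which is NOT cryptographically sound (real S-boxes are chosen for low linearity and differential uniformity); it only shows that the substitution is a bijection and therefore invertible. All names here are mine, not from the proposal.

```python
import random

def make_sbox(seed: int = 1) -> list[int]:
    # Toy stand-in: a seeded random permutation of 0..255.
    # A real S-box is carefully designed, never drawn at random.
    table = list(range(256))
    random.Random(seed).shuffle(table)
    return table

def invert_sbox(sbox: list[int]) -> list[int]:
    # Build the inverse lookup table so substitution can be undone.
    inv = [0] * 256
    for i, v in enumerate(sbox):
        inv[v] = i
    return inv

def substitute(data: bytes, sbox: list[int]) -> bytes:
    # Apply the lookup table byte by byte.
    return bytes(sbox[b] for b in data)
```

Note that applied byte-by-byte on its own, an S-box maps a run of identical bytes to another run of identical bytes; in a real cipher it is combined with diffusion steps (rotations, XOR with round values) so that repetition actually breaks up.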

I know many programming languages (C, C++, Java, JS, PHP, assembler).

I don't have a strong background in cryptography, but I studied maths and was in cracking groups before, so I know the basics :) I can never resist a grid of numbers like this, I have to make sense of it :D

The attack would be: given a start number and the keys, the set of possible signatures after X rounds could be narrowed down and brute forced. Even against good algorithms it's sometimes possible to cut down the effective key size; with a simple algorithm like this one, attacks on weak numbers may well exist. But you have to check whether the brute force takes more time than the computation itself; since this is a "real-time" race, I don't think there is a huge risk.

Not sure how I can help, but the idea is interesting :)
123  Alternate cryptocurrencies / Altcoin Discussion / Re: Anti ASIC/GPU/FPGA POW-algorithm. New (2019). on: December 14, 2019, 01:43:38 PM
Anyway, even if there is a problem with that, normally things like S-boxes can solve it easily. With a good S-box applied to the input number, you could even use 0 or the like as the input number and it would still be safer.

Maybe I'm wrong to think this, but if there is, say, only 1 bit set in the start number, then even if it's a big number, the sequence produced by the bit operations is going to be more predictable.

In a simple text cipher, if both the key and the data are "weak" it can be exploited by certain attacks; here, since the number is both the data and the key, a "weak" number might allow some attack that predicts the sequence.

That's why adding some salt / an initialization vector, or an S-box rolled over the sequence of signatures at each block, could improve that, and it wouldn't complexify the algorithm too much. But maybe it's not necessary.

Normally this kind of thing makes block cipher algorithms safer, and since this algorithm uses the same principles, it would not cost much and would make it safer. Not 100% sure though :)

Maybe it's not necessary because the cost of the attack is higher than the PoW cost; this matters more when cracking encryption, where the attack can run for a long time and the goal is to narrow down the candidate numbers for brute force. Maybe even with a weak number and some analysis, the brute force still costs more than the computation.
Oh, I understand what you're talking about, but this is a vain concern. All input data will be hashed, therefore, no “weak” numbers will be input to the algorithm. In addition, even one weak ring out of 1000 (for example) can in no way affect the result of the entire chain of rings.

Ah yes, true, it's hashed first, and that's what I was thinking: over hundreds of rings, even if there is a weaker number it shouldn't matter too much. An S-box doesn't cost much either, if it can improve security. If a very long ring landed on a weak number it would still be a bit of a waste, but I don't think it's a big problem.

But keep in mind the brute-force attack can be run on parallel (//) units, so very weak rings could still be vulnerable.
124  Alternate cryptocurrencies / Altcoin Discussion / Re: Anti ASIC/GPU/FPGA POW-algorithm. New (2019). on: December 14, 2019, 08:34:44 AM
Anyway, even if there is a problem with that, normally things like S-boxes can solve it easily. With a good S-box applied to the input number, you could even use 0 or the like as the input number and it would still be safer.

Maybe I'm wrong to think this, but if there is, say, only 1 bit set in the start number, then even if it's a big number, the sequence produced by the bit operations is going to be more predictable.

In a simple text cipher, if both the key and the data are "weak" it can be exploited by certain attacks; here, since the number is both the data and the key, a "weak" number might allow some attack that predicts the sequence.

That's why adding some salt / an initialization vector, or an S-box rolled over the sequence of signatures at each block, could improve that, and it wouldn't complexify the algorithm too much. But maybe it's not necessary.

Normally this kind of thing makes block cipher algorithms safer, and since this algorithm uses the same principles, it would not cost much and would make it safer. Not 100% sure though :)

Maybe it's not necessary because the cost of the attack is higher than the PoW cost; this matters more when cracking encryption, where the attack can run for a long time and the goal is to narrow down the candidate numbers for brute force. Maybe even with a weak number and some analysis, the brute force still costs more than the computation.
125  Alternate cryptocurrencies / Altcoin Discussion / Re: Anti ASIC/GPU/FPGA POW-algorithm. New (2019). on: December 14, 2019, 04:40:32 AM
Ring Bit Function, part 2.
https://www.youtube.com/watch?v=Ir9Ptfg0Nbg&feature=youtu.be
In this part we make a chain of RBF rings.


The thing with the ring diagram is that you want to show two things at the same time.

There is the ring as the total work needed to complete it, and there is the distribution of the numbers; but the ring in the diagram shows the total amount of work to do, not the whole space of possible 256-bit numbers. Each round still advances linearly through the ring of total work, even if the distribution of numbers along the steps is not linear.
Yes, you are right - I did not immediately realize this. But I have a lot of work, so I try to explain as much as I can. Perhaps the video will be more clear. Moreover, the function is extremely simple ...


As far as I can tell, functions like XOR/ROR are the bread and butter of most simple cipher algorithms, so I would think the sequence cannot easily be simulated with a linear function. BUT the amount of work for a particular ring is still linear, so the progression along the ring that represents the amount of work should still be linear :)
Here I do not quite understand what you were talking about. Indeed, for each ring, the amount of work is known in advance. However, we can change the complexity of the calculations, making the rings larger, as well as complicating the task. In addition, the most important thing is that we cannot predict in advance how much work is needed to calculate the signature. That is - how long will the chain of rings be.
In this sense, everything works exactly the same as with the usual SHA256 algorithm.


There should be a way to calculate the number of cycles needed for a given combination of "keys" (the numbers used in the ROR/ROL), no?

In any case, for a given combination of keys the amount of work is determined.

The way I see it, it works like a simple cipher algorithm. The simplest cipher is just XORing a number with a key to encrypt, and XORing again to decrypt; here it's like using the number itself as the key, with bit rotations that cancel themselves out after a certain number of iterations, because rotation is cyclic.
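Both properties mentioned here are easy to check concretely. A minimal sketch with 32-bit words and arbitrary example values: XOR with the same key is its own inverse, and a fixed left-rotation repeated enough times comes back to the start, because rotation is cyclic modulo the word size.

```python
MASK32 = 0xFFFFFFFF

def rotl32(x: int, r: int) -> int:
    """Rotate a 32-bit word left by r bits (cyclic)."""
    r %= 32
    return ((x << r) | (x >> (32 - r))) & MASK32

def xor_encrypt(x: int, key: int) -> int:
    # The simplest cipher: XOR with the key; applying it twice decrypts.
    return (x ^ key) & MASK32
```

For example, rotating left by 5 bits 32 times is a total rotation of 160 bits, which is 0 mod 32, so the word returns to its original value.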

So that's why I tend to think it's not easy to reverse, even if some numbers will be weaker than others; maybe a "salt" or initialization vector could be applied to the start number to make it more random. Otherwise it can have the same problem as a plaintext cipher: if the text and the key are too repetitive, the algorithm becomes easier to crack. With certain degenerate numbers the sequence will be more predictable, but on average it shouldn't matter too much.

But it's a nice idea. It looks like the Spartan approach: when outnumbered in parallel cores, force the fight one on one :)
126  Economy / Gambling discussion / Re: How Truly Random is Random on: December 13, 2019, 02:47:58 PM
If it's a computer RNG, yes; if it's about statistics on the random occurrence of an event in nature, or a fair gambling game, then no. It depends on the purpose of the RNG: it's not the same thing to generate a secure password/key, to simulate natural patterns like fractals or Perlin noise, or to run a gambling game, etc.

But that's the purpose of the algorithm I posted before: to make sure it tends toward a mean in the long run, with the length of that run depending on the number of possible values.

If you're saying that a gambling RNG (e.g. in an online dice casino) is purposefully biased towards the expected average then I again have to disagree.

Since we can safely assume as well that at some point in the future our repetitive streak is going to end, we can also claim that with each roll this point comes closer and closer. But by deduction, we can then easily reach a conclusion that rolls are not really as independent as they seem to be, or future is not as unpredictable as it appears, either

There is a difference between a probability of a single roll and a probability of streak. The probability of a single roll is what matters for the player. The probability of a streak is just a statistical curiosity. The probability of a long streak is lower than the probability of short streak and that's just simple math - multiplying the probability of each roll. However that doesn't mean you can predict when the streak ends any more accurately than you can predict a single roll.

In other words, if after 10 losses you decide to make a large bet expecting a win (the streak must end at some point, right?) you still have the same probability of losing or winning that roll as you had at the first roll of the game.
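The quoted point can be checked by exact enumeration rather than argument. A small sketch (the function name is mine): compute the conditional probability of losing the next fair-coin roll given a losing streak so far, by listing every equally likely sequence.

```python
from itertools import product

def loss_prob_after_streak(streak: int) -> float:
    """P(next roll is a loss | the previous `streak` rolls were all losses),
    computed exactly over every equally likely 50/50 sequence."""
    seqs = product([0, 1], repeat=streak + 1)          # 0 = loss, 1 = win
    given = [s for s in seqs if all(r == 0 for r in s[:streak])]
    losses = sum(1 for s in given if s[streak] == 0)
    return losses / len(given)
```

For any streak length the result is exactly 0.5: the streak changes nothing about the next roll.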



The idea for betting is that you need to play a certain number of rolls to get close to the mean; that number depends on how many values are possible: 2 for a coin, 6 for a die, 36 for roulette. Over a certain number of rolls you can expect a given number to come up.

But if you always play the same number, the wins and losses should average out every so many rolls, so you don't win anything in the long run.

You need to know how many rolls you intend to play, and when you get above the statistical average you can expect for that many rolls, you stop.

If you can only play 10 times, forget roulette; even for dice that's not enough to be on the safe side.

If you play only one roll, no statistics can really help.

You will always end up with gains at some point if you play long enough; the whole trick is knowing when you are on a "lucky streak" and have more chances to lose than win in the rolls you have left, compared to what you've gained so far.

127  Alternate cryptocurrencies / Altcoin Discussion / Re: Anti ASIC/GPU/FPGA POW-algorithm. New (2019). on: December 13, 2019, 08:37:57 AM
If I may offer a suggestion on the way you present it, because it's confusing:

The thing with the ring diagram is that you want to show two things at the same time.

There is the ring as the total work needed to complete it, and there is the distribution of the numbers; but the ring in the diagram shows the total amount of work to do, not the whole space of possible 256-bit numbers. Each round still advances linearly through the ring of total work, even if the distribution of numbers along the steps is not linear.

As far as I can tell, functions like XOR/ROR are the bread and butter of most simple cipher algorithms, so I would think the sequence cannot easily be simulated with a linear function. BUT the amount of work for a particular ring is still linear, so the progression along the ring that represents the amount of work should still be linear :)

Well, it's just my 2 cents to make it clearer; I can try to make some diagrams to explain it better :)
128  Bitcoin / Bitcoin Discussion / Re: France to launch their own cryptocurrency on: December 13, 2019, 08:23:06 AM
For all I know, if they do a crypto in France it's not going to be something you can speculate on. All the people from institutions I've talked to, their eyes go :o :o when they hear "speculation" or "trading". Public services are not there to make profits; at best they just want to balance the budget, and they won't use tax money to develop a system for speculative profit. Maybe they are looking into it as a way to access public services or various other things, but it's not going to be something to make profits on.
129  Alternate cryptocurrencies / Altcoin Discussion / Re: Is Crypto/Blockchain hype over? on: December 13, 2019, 07:52:46 AM
With ever-declining prices on the crypto market since 2018, there has been a decline in the number of ICOs and the emergence of new crypto projects within the mainstream world. 2017 was believed to be the golden year of crypto and Blockchain technology as people invested massively across the market. Crypto startups, companies, and businesses have been largely successful since then. Due to the hype, we've seen many projects come up with a Blockchain solution for nearly anything in life. But the truth is that Blockchain is not needed for absolutely everything.

Today, there seems to be a decline in crypto interest among people in the mainstream world, and the number of crypto startups and companies has shrunk too. Does this mean that the crypto/Blockchain hype is finally over? Or is it still going on?

What are your thoughts?

For me, what you call hype is what I would call a bubble. Many people compared it to the 2000 internet bubble: a far-west period where everything seems possible, the limits of the system are not well known, people build huge expectations, everyone wants to take their chance, and many startups exploit that.

I've heard many times in past years that this early period will look like a huge waste of time and money, but that crypto is still going to gain ground, even if only a few projects survive in the end.

Like there is going to be a crypto 2.0, the way there was a web 2.0; at the very least we are going to see some systems built on blockchain and crypto in the coming years, even if it will probably take some time.
130  Bitcoin / Bitcoin Discussion / Re: Who needs Satoshi Nakamoto principles? on: December 13, 2019, 07:22:21 AM
It's not miners who are driving fees up. They have strong incentive to fill blocks as much as possible. Due to limited block space, fees will always rise as transaction volume increases.

It's users employing horrible fee estimation -- including exchanges like Bitmex -- that drives fees up so much. Lots of exchanges weren't batching transactions in 2017 (and still aren't), which greatly exacerbated the congestion.
In theory, with more decentralized mining you could increase the chance of a tx with small or no fees being mined at some point, either because some miners don't have the same txs, or because they don't sort the mempool by tx fees, because they are not mining 100% to maximize profits, etc.

I'm not sure that's a function of mining centralization. I think it has more to do with how capital-intensive mining is, which in turn is a function of market demand for bitcoins.

The extreme example is when Bitcoin was CPU-minable and the mining costs were almost imperceptible. Profitability is not a concern when sunk costs barely exist. That dynamic drastically changes when it costs thousands of dollars just to buy a single miner, not to mention the other overheads involved. Naturally, hobby miners get forced out of the market, and a purely profit-motivated industry emerges.

For me it's the product of the total hash rate skyrocketing above what the combined hash rate of the network's users would be, which is only possible due to the centralization of mining power.

The thing is, the mining cost for those who own the specialized hardware is probably very low, so it's not that mining is really more expensive in absolute terms; it's expensive on general-purpose hardware.
131  Economy / Gambling discussion / Re: How Truly Random is Random on: December 12, 2019, 03:05:48 PM
It's not about being far away from the previous number; it's about the number of rolls since the last occurrence of the number.

If there have been 50 rolls without a one, you can still have a better chance of winning by playing one in the next rolls.

But it will rarely even reach 50 rolls, and over a large number of rolls you will still come back to the average.

That's incorrect. You have the same chance to roll a 1 at any point during the game regardless of what (or how often) was rolled or not rolled before, otherwise the RNG would be flawed. There is no purposeful coming back to average. The average is the consequence of a good RNG, not something the RNG tries to simulate.

In other words, if you hit an "unusual" streak of below-expected-average numbers the RNG will not generate above-expected-average numbers to compensate. The actual average will get closer to the expected average in a (very) long run as your "unusual" streak will have less and less weight in the total.

If it's a computer RNG, yes; if it's about statistics on the random occurrence of an event in nature, or a fair gambling game, then no. It depends on the purpose of the RNG: it's not the same thing to generate a secure password/key, to simulate natural patterns like fractals or Perlin noise, or to run a gambling game, etc.

But that's the purpose of the algorithm I posted before: to make sure it tends toward a mean in the long run, with the length of that run depending on the number of possible values.
132  Economy / Gambling discussion / Re: How Truly Random is Random on: December 12, 2019, 02:22:14 PM

You state this as a fact but it's backwards. If there was any kind of "memory" in a dice game it could be exploited by the casino or by the player, who could keep betting on numbers "far away" from the previous number to increase their chances. It would be over very quickly, most likely due to the casino going bankrupt. But fortunately it doesn't work like that.

Do you think the roulette wheel has memory too?

It's not about being far away from the previous number; it's about the number of rolls since the last occurrence of the number.

If there have been 50 rolls without a one, you can still have a better chance of winning by playing one in the next rolls.

But it will rarely even reach 50 rolls, and over a large number of rolls you will still come back to the average.


It cannot be exploited by the casino because it stays constant over a long period of time for any number.

I hope you're not saying that casinos should use something like this. In the long run a good PRNG should approximate Poisson distribution and I believe certified RNGs are tested against it as well as many other statistical tests. But the RNG algorithm itself should not be based on it.

An online casino should use a better algorithm; I'm not sure what the regulations are, or when an algorithm is considered fair. A standard RNG will not necessarily give a Poisson distribution; maybe it does, maybe not.
133  Economy / Gambling discussion / Re: How Truly Random is Random on: December 12, 2019, 12:51:48 PM
You likely won't hit the same number again

But why? Is it because we are likely to not roll any two numbers in a row, or is there any other reason? What, in your opinion, makes the appearance of a certain number less likely? The fact that this number was just rolled? There are various ways of generating a random number: measuring the radioactive decay of an atom; measuring the atmospheric noise; measuring other processes which can create sufficient entropy needed to generate a random number. Can you imagine a process(among those used for RNG) where the appearance of a certain number becomes less likely for the reason of its recent appearance?

You actually raise valid concerns

Unsurprisingly, I've been thinking about that too (I guess we, the gambling folks, all have been thinking or feeling something to that tune at some point). The existential question is, well, how random is random? I mean if you see two allegedly random distributions but they are distinctively different from each other, can we actually consider them truly random, or at least one of them as not random?

And that gives you an answer to your questions. If two genuinely random distributions are distinctly different in certain ways, we could in fact draw a valid conclusion that there is some form of "memory" involved in the process, which makes it look like certain numbers are more probable after you hit a certain number (e.g. grouping). That's why the casinos should actually be looking for a random distribution that behaves more like a uniform one

For gambling, what matters is that you know the odds of winning, so you can establish a strategy against the other players in a fair game.

If the distribution is biased, then the game becomes about knowing the bias and the distribution to place more profitable bets; a game like that cannot last long as gambling.

But actually it's exactly the core principle of the Poisson distribution that the chances of seeing a number depend on the previous sequence, because there is always a constant mean in the results over the long term; so the further things drift from the mean, the more chance they have to rebalance in the next numbers.

Essentially, the chance of having seen a given number grows with each roll since its last occurrence, with a rate set by the mean: short for coin tossing (few possible values), longer for dice or roulette.
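For concreteness: with independent rolls at per-roll probability p, the gap to the next occurrence follows a geometric distribution (the discrete analogue of the Poisson process's exponential waiting time). The chance of seeing the number *within* the next k rolls grows with k, even though each individual roll stays at p. A quick sketch (function names are mine):

```python
def gap_prob(p: float, k: int) -> float:
    """P(the next occurrence is exactly k rolls away): geometric distribution."""
    return (1 - p) ** (k - 1) * p

def seen_within(p: float, k: int) -> float:
    """P(the number appears at least once in the next k rolls)."""
    return 1.0 - (1 - p) ** k
```

For a die (p = 1/6), `seen_within` grows from about 0.17 at k = 1 to about 0.84 by k = 10, while each single roll remains at 1/6.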
134  Alternate cryptocurrencies / Altcoin Discussion / Re: Anti ASIC/GPU/FPGA POW-algorithm. New (2019). on: December 12, 2019, 12:33:59 PM
Maybe you should add a time scale on the vertical axis, with the amount of work done on the horizontal axis, to show that it's not going to take less time to compute with parallel (//) units.
135  Economy / Gambling discussion / Re: How Truly Random is Random on: December 12, 2019, 10:24:36 AM
For example, we can't understand what self-awareness and consciousness as one's mind and thoughts are because we simply don't have such an ability in us. Simply put, it is not so much for the lack of knowledge as for the lack of required capacity to process and interpret this knowledge. The same may be equally true for randomness (and probably a host of other phenomena). We are like ants trying to figure out things outside an anthill

So, is there a hope? I mean, currently, it is considered that with any truly random processes absolutely anything can happen next. If you just hit 99.99 on dice it doesn't mean you won't hit the same number again in the very next roll

You likely won't hit the same number again


But why? Is it because we are likely to not roll any two numbers in a row, or is there any other reason? What, in your opinion, makes the appearance of a certain number less likely? The fact that this number was just rolled? There are various ways of generating a random number: measuring the radioactive decay of an atom; measuring the atmospheric noise; measuring other processes which can create sufficient entropy needed to generate a random number. Can you imagine a process(among those used for RNG) where the appearance of a certain number becomes less likely for the reason of its recent appearance?

It's the Poisson distribution that works this way, and it's supposed to model the occurrence of natural phenomena well, whether it's shooting stars or the number of cars passing down a street. As long as there is a meaningful "mean" rate of occurrence it's supposed to fit, and for gambling games like dice or coins it's supposed to fit too; I'm not sure there is a real demonstration of why that is so, though.

There are algorithms to generate a Poisson distribution from a uniform RNG:

https://wiki.q-researchsoftware.com/wiki/How_to_Generate_Random_Numbers:_Poisson_Distribution

which corresponds to the ideal distribution you want in a fair gambling game.
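One classic way to get Poisson variates from a uniform RNG is Knuth's method; a minimal sketch (seeded RNG passed in for reproducibility, names are mine):

```python
import math
import random

def poisson_knuth(lam: float, rng: random.Random) -> int:
    """Knuth's method: multiply uniform draws until the running product
    falls below exp(-lam); the number of draws minus one is Poisson(lam)."""
    limit = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= limit:
            return k
        k += 1
```

This works well for small lam; for large lam the product underflows and the loop gets long, so other methods (e.g. transformed rejection) are preferred.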
136  Alternate cryptocurrencies / Altcoin Discussion / Re: Anti ASIC/GPU/FPGA POW-algorithm. New (2019). on: December 12, 2019, 08:54:08 AM
For several years I was thinking about how to make a POW algorithm that will be stable not only against ASIC devices, but also against GPU miners.

Have you ever heard of RandomX? RandomX is not against anything; it just doesn't give an advantage to anything. No one can build an ASIC with double the efficiency of the top CPUs you can buy in computer stores all around the world.


Since Monero is one of the coins that changes its algo to stay anti-ASIC I've posted in their thread a link to this.

Monero should not change its mining algo anymore. If RandomX works as intended, that is it.

ASICs only win because the proof of work is based on each hash computation having a certain probability of earning a reward at a fixed cost, and you can compute an unlimited number of them in parallel. That is the only thing that gives an ASIC its advantage.

With a sequential computation like this one, ASICs will be much less powerful and cost-efficient than even a smartphone, because 90% of their transistors are useless for a simple sequential computation. Any CPU, even the cheapest microcontroller, can do a XOR/ROR in one cycle, so it's only a question of clock rate, which is pretty much capped now; common hardware already runs close to the maximum frequency, and ASICs don't have particularly high clock frequencies.
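The sequential dependency is the whole point: each step needs the previous step's output, so N steps cost N step-latencies no matter how many parallel units you own. A toy sketch (the step function and its constant are made up for illustration, not the author's RBF):

```python
MASK32 = 0xFFFFFFFF

def step(x: int) -> int:
    # Toy round: XOR with a constant, then rotate left by 7 -- each is
    # a one-cycle operation on virtually any CPU.
    # (The constant 0x9E3779B9 is an arbitrary choice, nothing special here.)
    x ^= 0x9E3779B9
    return ((x << 7) | (x >> 25)) & MASK32

def chain(seed: int, rounds: int) -> int:
    # Each iteration consumes the previous output: inherently serial.
    x = seed & MASK32
    for _ in range(rounds):
        x = step(x)
    return x
```

Since `step` is a bijection (XOR and rotation are both invertible), different seeds always stay on different trajectories, and there is no way to compute round k without first computing round k-1.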

A ring algorithm like this still makes it easy to prove the work a miner has done.
137  Bitcoin / Bitcoin Discussion / Re: Who needs Satoshi Nakamoto principles? on: December 12, 2019, 06:58:19 AM
@squatter it doesn't have to be 51% attacks. Just look at what happened with bitcoin tx fees when we were "mooning". Miners can definitely extort whatever fees they want... they can basically hold the network hostage.

It's not miners who are driving fees up. They have strong incentive to fill blocks as much as possible. Due to limited block space, fees will always rise as transaction volume increases.

It's users employing horrible fee estimation -- including exchanges like Bitmex -- that drives fees up so much. Lots of exchanges weren't batching transactions in 2017 (and still aren't), which greatly exacerbated the congestion.


In theory, with more decentralized mining you could increase the chance of a tx with small or no fees being mined at some point, either because some miners don't have the same txs, or because they don't sort the mempool by tx fees, because they are not mining 100% to maximize profits, etc.

In times of congestion, fees are almost the only way to establish priority among transactions, and miners with an enterprise mentality will always seek to maximize their profits from block mining. If the mining work can be capped as in this proposal, driving mining costs down and keeping them low, or at least reasonable on common hardware, the pressure on profitability isn't the same either.
138  Bitcoin / Bitcoin Discussion / Re: Who needs Satoshi Nakamoto principles? on: December 12, 2019, 05:29:53 AM
Quote
Satoshi might have hoped for an ecosystem where mining was less concentrated

At least we are "identifying" that mining has become "concentrated"; the top mining pools will, BTW, disagree with your statement :P The first step towards fixing something is identifying that something is wrong... The worst thing that could happen is letting these things slip by until it's too late to fix.

Whether it's even a problem at all is an eternal question. In 2014, GHash.IO exceeded 51% of the hash rate. There was no 51% attack, but the community took it as a wake-up call and GHash.IO's share of the hash rate perpetually fell thereafter.

For years, trolls have been fearmongering about the likelihood of collusion among the top pools to attack Bitcoin. Yet, this threat has never materialized.

Miners can quickly leave malicious pools, and mining pool administrators have strong financial incentives not to destroy their business with such an attack. These economic realities have been enough to secure the system thus far.

Remember also, 51% attacks don't give miners that much power. At worst, they can censor transactions and perform difficult-to-coordinate double spend attacks -- at great cost.

A 51% attack is not really the problem; as long as miners' interests are aligned with users' interests, it's fine.

The real issue is that it makes the network weaker against government control, raids and so on; if bitcoin really depended on mining farms to work well, it would still be vulnerable to crackdowns and censorship, like Napster was.

And the other problem is mining profitability for small miners, and the politics around it.
139  Bitcoin / Bitcoin Discussion / Re: Who needs Satoshi Nakamoto principles? on: December 11, 2019, 06:27:44 PM
"Proof-of-work is essentially one-CPU-one-vote."
Satoshi nakamoto



First of all. Satoshi is smart enough to have come up with bitcoin. But let's not forget that Satoshi is, well, human. Just like us! Stop looking at him/her/them as someone who is all-knowing and someone that doesn't make mistakes.
These people made me laugh. What do they think Satoshi is? A god? Someone omniscient? Nakamoto was very clever to create such a digital currency, and we have it right now, but it has flaws, because nothing in this world is created perfect or to be perfect. There will always be imperfect things that still do the job, and that's what makes us human.

I'm not entirely convinced satoshi did get it wrong.  I think people are just taking those words too literally.  Note the use of the word "essentially".  To me, that implies satoshi was merely outlining a general concept, not issuing some sort of decree that it had to mean each user could only use a single CPU.  To the best of my knowledge, there has never been anything in the code that attempts to restrict usage in that way.

True, it's likely satoshi didn't envision Pools, GPU mining or ASICs happening quite as rapidly as they did.  Mining did become an arms race rather suddenly.  But I'm sure those words were never intended to be some sort of mantra to explicitly follow.

At the beginning this is still how it was understood by a lot of users: mining is equal to voting, and that is what made Bitcoin attractive to the broader public, not the idea that it would become a sort of mining oligarchy that extracts all the profits.

What Satoshi wanted, and to what degree he anticipated how things turned out, is never very clear. But a few years ago, when Bitcoin started to soar, this is mostly how it was presented, even if Satoshi may already have anticipated that mining would become a specialised industry.

https://bitcointalk.org/index.php?topic=1319681.msg13494495#msg13494495 thread resurrection Smiley
140  Other / Politics & Society / Re: Health and Religion on: December 11, 2019, 06:08:55 PM



Gαs

Gαs activates the cAMP-dependent pathway by stimulating the production of cyclic AMP (cAMP) from ATP. This is accomplished by direct stimulation of the membrane-associated enzyme adenylate cyclase. cAMP can then act as a second messenger that goes on to interact with and activate protein kinase A (PKA). PKA can phosphorylate a myriad of downstream targets.

The cAMP-dependent pathway is used as a signal transduction pathway for many hormones, including:

ADH – Promotes water retention by the kidneys (created by the magnocellular neurosecretory cells of the posterior pituitary)
GHRH – Stimulates the synthesis and release of GH (somatotropic cells of the anterior pituitary)
GHIH – Inhibits the synthesis and release of GH (somatotropic cells of the anterior pituitary)
CRH – Stimulates the synthesis and release of ACTH (anterior pituitary)
ACTH – Stimulates the synthesis and release of cortisol (zona fasciculata of the adrenal cortex in the adrenal glands)
TSH – Stimulates the synthesis and release of the majority of T4 (thyroid gland)
LH – Stimulates follicular maturation and ovulation in women; or testosterone production and spermatogenesis in men
FSH – Stimulates follicular development in women; or spermatogenesis in men
PTH – Increases blood calcium levels. This is accomplished via the parathyroid hormone 1 receptor (PTH1) in the kidneys and bones, or via the parathyroid hormone 2 receptor (PTH2) in the central nervous system and brain, as well as the bones and kidneys.
Calcitonin – Decreases blood calcium levels (via the calcitonin receptor in the intestines, bones, kidneys, and brain)
Glucagon – Stimulates glycogen breakdown in the liver
hCG – Promotes cellular differentiation, and is potentially involved in apoptosis.[21]
Epinephrine – Released by the adrenal medulla during the fasting state, when the body is under metabolic duress. It stimulates glycogenolysis, in addition to the actions of glucagon.

Gαi

Gαi inhibits the production of cAMP from ATP, e.g. somatostatin, prostaglandins.

Gαq/11

Gαq/11 stimulates the membrane-bound phospholipase C beta, which then cleaves PIP2 (a minor membrane phosphoinositol) into two second messengers, IP3 and diacylglycerol (DAG). The inositol phospholipid dependent pathway is used as a signal transduction pathway for many hormones, including:

ADH (Vasopressin/AVP) – Induces the synthesis and release of glucocorticoids (zona fasciculata of the adrenal cortex); induces vasoconstriction (via V1 receptors)
TRH – Induces the synthesis and release of TSH (anterior pituitary)
TSH – Induces the synthesis and release of a small amount of T4 (thyroid gland)
Angiotensin II – Induces aldosterone synthesis and release (zona glomerulosa of the adrenal cortex)
GnRH – Induces the synthesis and release of FSH and LH (anterior pituitary)

Gα12/13

Gα12/13 are involved in Rho family GTPase signaling (see Rho family of GTPases), through the RhoGEF superfamily (involving the RhoGEF domain of the proteins' structures). These are involved in control of cell cytoskeleton remodeling, and thus in regulating cell migration.

Gβγ

The Gβγ complexes sometimes also have active functions. Examples include coupling to and activating G protein-coupled inwardly rectifying potassium channels.


Hmm ok  Roll Eyes


Still, it's mostly empirical study.

In a study of 203 male and female university students, participants with short (308-325 bp) vs. long (327-342) versions of RS3 were less generous, as measured by lower scores on both money allocations in the dictator game, as well as by self-report with the Bardi-Schwartz Universalism and Benevolence Value-expressive Behavior Scales; although the precise functional significance of longer AVPR1A RS3 repeats is not known, they are associated with higher AVPR1A postmortem hippocampal mRNA levels.[9]


The brain is not a mathematical theorem.


Not yet, but it's getting closer and closer Smiley

https://en.m.wikipedia.org/wiki/Mathematical_and_theoretical_biology