Bitcoin Forum
November 07, 2024, 02:37:07 AM *
Author Topic: Economic Devastation  (Read 504797 times)
TPTB_need_war (Sr. Member; Activity: 420, Merit: 262)
April 25, 2015, 08:11:42 PM  #1341

Gold and BTC are preparing (within the next several months) to dump down to a final capitulation low some 20% or more lower.

I have been and remain a seller until next year.

RAJSALLIN (Hero Member; Activity: 665, Merit: 500)
April 26, 2015, 02:39:23 AM  #1342

Quote from: TPTB_need_war
Gold and BTC are preparing (within the next several months) to dump down to a final capitulation low some 20% or more lower.

I have been and remain a seller until next year.

So you think the lows will be slightly after 2015.75?

CoinCube (OP) (Legendary; Activity: 1946, Merit: 1055)
April 26, 2015, 08:25:20 AM (last edit: April 26, 2015, 09:30:02 AM by CoinCube)  #1343

Discussions about entropic frame-of-reference, referential transparency, and closed systems will have to wait, because that is going to require me to go much deeper than I want to write today.

Anonymint, you will have a hard time writing that essay on entropy you mentioned upthread, because it is in your definition of entropy that you err. As far as I can tell your misconception regarding entropy is the following:

Increased Entropy -> Increased Freedom-of-Action -> Increased Potential Energy

This argument, while not necessarily wrong, is an oversimplification.
It skips several intermediate steps, and it is in the skipping of these transitions (your mind seems to jump immediately to end states) that you err.

When you are ready, I have written the essay below, which may (or may not) bring us closer to consensus.  Cool

CoinCube (OP) (Legendary; Activity: 1946, Merit: 1055)
April 26, 2015, 08:28:39 AM (last edit: October 31, 2015, 03:04:19 AM by CoinCube)  #1344

Entropy

Entropy is both beloved ally and mortal enemy of life.

To understand the dichotomy of entropy we must delve deep into the heart of thermodynamics.
There are two fundamental laws of thermodynamics:

Law #1: The total quantity of energy in the universe must remain constant.
Law #2: The quality of that energy is irreversibly degraded over time.

From these laws we can derive some general principles:
1) Ordered energy -> Disorganized energy
2) High quality energy -> Low-grade energy (heat)
3) Order -> Disorder
4) Improbability -> Probability

These principles outline a grim universe. At first glance they seem more compatible with a barren wasteland than a vibrant jungle. Thermodynamics demands constant and progressive degradation, yet somehow we live in a world teeming with life and growth. Let's explore why.

The Genius of Life

Life is able to increase its internal order while simultaneously satisfying thermodynamics. At first glance this appears to violate the laws of thermodynamics. Instead of disorder and death, life forms order and birth. Instead of probability and cessation, it does the improbable and continues. Rather than disorganized heat, it forms ordered thought and action. Life is able to do this because it is a dissipative structure: a structure that achieves a reproducible state operating far from thermodynamic equilibrium in an environment with which it exchanges energy and matter.

Chemists can create complex high energy molecules in reactions that would not occur naturally by coupling those reactions with others that degrade other high energy molecules into low energy ones. As long as the combination of both reactions leads to an overall higher level of entropy, the laws of thermodynamics are satisfied. Life has mastered this same process with stunning majesty. By coupling its existence to reactions that increase the entropy of the universe, life is able to swim upstream against the tide of entropy. Plants harvest the energy of the sun. Animals consume that same energy indirectly.
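The entropy bookkeeping behind coupled reactions can be sketched with a toy calculation. The numbers below are invented for illustration, not measured values for any real reaction:

```python
# Illustrative entropy bookkeeping for a coupled reaction (made-up numbers).
# A "building" step lowers the system's entropy; a coupled "degrading" step
# dumps more than enough entropy into the surroundings to compensate.

dS_building = -50.0    # J/(mol*K): the ordering step (e.g. polymer synthesis)
dS_degrading = +120.0  # J/(mol*K): the coupled energy-releasing step

dS_total = dS_building + dS_degrading
print(f"Total entropy change: {dS_total:+.1f} J/(mol*K)")

# The second law constrains only the *sum*, so the ordering step is allowed:
assert dS_total > 0, "coupling would be thermodynamically forbidden"
```

The point of the sketch is that the sign of the individual step does not matter, only the sign of the total.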

Entropy is Mixedupness

There are numerous definitions of Entropy. When talking about the mechanics of life the most useful is the one given by statistical mechanics.

Entropy is the amount of additional information needed to specify the exact physical state of a system, given its thermodynamic specification.

Entropy is a measure of the uncertainty which remains about a system after its observable macroscopic properties, such as temperature, pressure and volume, have been taken into account. For a given set of macroscopic variables, the entropy measures the degree to which the probability of the system is spread out over different possible microstates.

The simple system of four balls traveling in the same direction has less entropy than an otherwise identical system with four balls traveling in random directions, as it takes more information to describe the exact physical state of the second system.
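The four-ball example can be made concrete by computing the Shannon entropy of the empirical distribution of directions. This is a toy sketch; the `shannon_entropy` helper and the compass-direction labels are illustrative assumptions, not part of the original argument:

```python
import math
from collections import Counter

def shannon_entropy(states):
    """Shannon entropy (in bits) of the empirical distribution over states."""
    counts = Counter(states)
    n = len(states)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

aligned = ["N", "N", "N", "N"]       # four balls, all traveling the same way
scattered = ["N", "E", "S", "W"]     # four balls in different directions

print(shannon_entropy(aligned))      # 0.0 bits: one pattern, nothing to specify
print(shannon_entropy(scattered))    # 2.0 bits: four equally likely directions
```

The scattered system needs two extra bits per ball's direction to pin down, which is exactly the "additional information" in the definition above.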

Entropy Devours Life

All life struggles to avoid its eventual guaranteed entropic end.

The conditions of death, decay, and cessation are higher entropy than the conditions of breathing, growth, and bodily integrity. Therefore life is in constant danger of death, able to delay its destruction only by constant feeding. Deprived of energy for a prolonged length of time, life quickly falls to the laws of thermodynamics.

In reproduction this gives rise to a great need for fidelity. When reproducing, life must protect the integrity of its information. Unless both the ability to gather energy and the ability to reproduce are successfully transmitted, that branch of life will cease.

The genetic information transmitted from parent to child is not immune to entropy. Random mutations introduce variations into the genetic code. These mutations increase entropy as they increase the spread over different possible microstates. Mutation is very dangerous to life, as the vast majority of mutations either have no effect or a detrimental one. Life acts to minimize the danger by purposefully limiting this entropy: most multicellular organisms have DNA repair enzymes that constantly repair and correct damage. Fidelity of information is thus largely maintained between generations.

Fidelity, however, can never be 100%. The environment is not static but dynamic, and life must be able to adapt in response or it will cease. An organism with 100% fidelity of reproduction would never change, improve, or evolve. It would stand still while its predators and competitors grew more efficient. Long-term survival requires mutation and change. For this, life needs entropy.

The tradeoff between fidelity and adaptability can best be thought of as the balance between search and exploitation. If replication were without entropy, no mutants would arise and evolution would cease. On the other hand, evolution would also be impossible if the entropy/error rate of replication were too high (only a few mutations produce an improvement; most lead to deterioration). Increasing the entropy risks sacrificing previously acquired information in an attempt to find superior information. Life must master the deadly dance of harvesting entropy. Absorb too much and the species succumbs to mutation, tumors, and death. Absorb too little and the species stagnates and succumbs to more agile competitors. Life, it seems, walks the razor's edge.
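The fidelity/adaptability tradeoff can be demonstrated with a deliberately crude evolutionary simulation. Everything here is an invented illustration: the fitness function (count of 1-bits), the rates, and the elitist keep-the-better-half selection scheme are assumptions, not a model of real genetics:

```python
import random

def evolve(mutation_rate, generations=200, pop_size=50, genome_len=20, seed=0):
    """Toy evolution toward an all-ones genome; returns the best final fitness."""
    rng = random.Random(seed)
    pop = [[0] * genome_len for _ in range(pop_size)]  # start far from optimum
    for _ in range(generations):
        pop.sort(key=sum, reverse=True)       # fitness = number of 1-bits
        parents = pop[: pop_size // 2]        # keep the better half (fidelity)
        children = [[bit ^ (rng.random() < mutation_rate) for bit in p]
                    for p in parents]         # copy with mutation (entropy)
        pop = parents + children
    return max(sum(g) for g in pop)

for rate in (0.0, 0.02, 0.5):
    print(rate, evolve(rate))
```

With zero mutation the population can never leave its starting state; with a rate near 0.5 each child is close to random noise; a small rate lets selection accumulate improvements, which is the "razor's edge" in miniature.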

Multicellular Organisms and Collectivism

The single celled organism is an anarchist. The multicelled organism is a collectivist.

Life is in constant search of frontiers, for it is only at the frontiers that competitive advantage can be found. The single celled organism is in a constant war for survival. It lives in the base state of nature, and any advantage may mean the difference between life and death. The cell with improved locomotion may find food or escape predators, the efficient cell may avoid starvation in lean times, and the larger cell may eat its smaller competitors. As a cell increases its internal complexity, however, diminishing returns accumulate. A single flagellum allows a cell to move, but having two does not double cellular speed. A larger size may be advantageous, but cellular volume increases at a faster rate than surface area, making it difficult to transport enough materials across cellular membranes. Once a cell reaches this point it is economically more efficient to form multicellular organisms and specialize.

High levels of specialization require collectives composed of many cells. In the multicellular organism, cells trade independence and degrees-of-freedom in exchange for the benefits of size, specialization, and efficiency. Cells in a multicellular organism lose the freedom to independently move and reproduce, and their survival becomes dependent on their fellow cells. In exchange they get to be part of something larger and can benefit from the development of specialization, including specialized neural tissues.

Not all cells toe the collective line. Some cells throw off their chains and do whatever they want. When the rebel cells decide they want to divide and keep dividing, the process is called cancer. In multicelled organisms cancer is simply the result of accumulated entropy gone wrong. Multicelled organisms, like their simpler cousins, need to adapt, change, and evolve. A species with 100% fidelity would have no cancer, but it would also never change.

Civilization and Collectivism

Civilization is a collective of mutually interdependent multicellular organisms.

Civilization represents the next stage of evolution beyond the multicellular organism. Like the transition from the single to the multicelled organism, it arises from the specialization and resultant interdependence of the sentient organisms that comprise it. With the onset of civilization, environmental selection gives way to the selection of self-organization. The organization of the system increases spontaneously without this increase being controlled by an external system. Civilization is a state of vastly higher organization and specialization. This increase in organization can be viewed objectively as an increase in potential energy.

Civilizations must change, grow, and adapt or face stagnation, decay, and collapse. They must maintain fidelity (stability over time) while also allowing for adaptability (growth). Self-organization to higher levels of potential energy in a self-organizing system is triggered by internal fluctuations, or noise, a.k.a. entropy. These processes produce selectively retained ordered configurations; this is the order-from-noise principle. Search and adaptability must be maximized subject to the constraint of maintaining fidelity through time and not losing the information that has already been gained. It is only through balance that optimal outcomes are achieved.

The Future

The next stage in evolution may be the transition to an interstellar species.

If we achieve that goal we will create a system of yet higher order: the entity formed by the interaction of multiple interdependent interstellar civilizations. Such a creation will have a potential energy that dwarfs that of our current society. It will only form if we find ways to vastly improve our technology and significantly improve our current dissipative structures. These improvements will be made possible by the very entropy we seek to overcome as we make the climb from probability to improbability.

Edit: Post edited 10/30/2015 for clarity and brevity

CoinCube (OP) (Legendary; Activity: 1946, Merit: 1055)
April 26, 2015, 08:32:24 AM  #1345

I am waiting for TPTB to come back and admit that I am correct on all points  Grin

TPTB_need_war (Sr. Member; Activity: 420, Merit: 262)
April 26, 2015, 12:00:08 PM (last edit: April 26, 2015, 11:47:28 PM by TPTB_need_war)  #1346

CoinCube,

That is horrendous. I give you a D grade.

You characterize increasing information content as degradation.

I can see now why you are so blinded and deluded about free markets. Your conceptualization is all wrong. More microstates do not imply less of anything. For example, we can get more macrostates at the human level by increasing the population: the wisdom of the Bible's "go forth and multiply", combined with the fact that every human is unique.

Entropy is the information content of a system, or in other words the minimum amount of information required to specify the state of the system. In the digital world, it is the maximum theoretical compression of some data.
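That compression view of entropy is easy to demonstrate. A general-purpose compressor such as zlib is only a practical stand-in for the theoretical limit, not the limit itself, but the contrast is stark:

```python
import os
import zlib

# Highly ordered data: low information content, compresses enormously.
ordered = b"AB" * 5000          # 10,000 bytes of pure repetition
# High-entropy data: random bytes are essentially incompressible.
noise = os.urandom(10_000)

print(len(zlib.compress(ordered)))  # a few dozen bytes
print(len(zlib.compress(noise)))    # about 10,000 bytes, plus a little overhead
```

The repetitive string can be specified by a tiny description ("AB, 5000 times"), while the random bytes cannot be specified by anything much shorter than themselves.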

Who taught you that nonsense? At the university?

Quote from: CoinCube
Law #2: The quality of that energy is irreversibly degraded over time.

The quality of the matter of the universe is irreversibly trending toward maximum information content.

Quote from: CoinCube
From these laws we can derive some general principles:
1) Ordered energy -> Disorganized energy
2) High quality energy -> Low-grade energy (heat)

Low information content organization of matter -> Higher information content organization of matter

Quote from: CoinCube
3) Order -> Disorder

Low information content -> Higher information content

Quote from: CoinCube
4) Improbability -> Probability

Certainly few chances -> More chances

Quote from: CoinCube
These principles outline a grim universe. At first glance they seem more compatible with a barren wasteland than a vibrant jungle.

These principles outline a beautiful universe, competing to create more diversity and more chances for life to prosper.

Quote from: CoinCube
Thermodynamics demands constant and progressive degradation, yet somehow we live in a world teeming with life and growth. Let's explore why.

Thermodynamics demands a constant and progressive increase in the diversity of systems, i.e. higher information content, giving rise to a universe teeming with life and growth.

Quote from: CoinCube
The Genius of Life

Life is able to increase its internal order while simultaneously satisfying thermodynamics. At first glance this appears to violate the laws of thermodynamics. Instead of disorder and death, life forms order and birth. Instead of probability and cessation, it does the improbable and continues. Rather than disorganized heat, it forms ordered thought and action. Life is able to do this because it is a dissipative structure: a structure that achieves a reproducible state operating far from thermodynamic equilibrium in an environment with which it exchanges energy and matter.

Chemists can create complex high energy molecules in reactions that would not occur naturally by coupling those reactions with others that degrade other high energy molecules into low energy ones. As long as the combination of both reactions leads to an overall higher level of entropy, the laws of thermodynamics are satisfied. Life has mastered this same process with stunning majesty. By coupling its existence to reactions that increase the entropy of the universe, life is able to swim upstream against the tide of entropy. Plants harvest the energy of the sun. Animals consume that same energy indirectly.

The fractal nature of life is such that high information content microstates can also form lower information content macrostates without lowering the overall information content of the system, thus remaining congruent with the irreversible trend of entropy towards maximum information content.
 
Quote from: CoinCube
Entropy is Mixedupness

There are numerous definitions of Entropy. When talking about the mechanics of life the most useful is the one given by statistical mechanics.

Entropy is the amount of additional information needed to specify the exact physical state of a system, given its thermodynamic specification.

Entropy is a measure of the uncertainty which remains about a system after its observable macroscopic properties, such as temperature, pressure and volume, have been taken into account. For a given set of macroscopic variables, the entropy measures the degree to which the probability of the system is spread out over different possible microstates.

The simple system of four balls traveling in the same direction has less entropy than an otherwise identical system with four balls traveling in random directions, as it takes more information to describe the exact physical state of the second system.

Correct, except you use scary terms that deceive and I stated it more simply at the top of this post.

Entropy is not mixedupness. You are trying to misconstrue information content as being disorganized. The randomness is simply because there are more chances for diversity. The system is still optimally organized to maximize efficiency. The only way we get efficiency is by being congruent with thermodynamics.

There is no uncertainty per se. The uncertainty exists only because information about the microstates has been excluded, i.e. that is the information content.

Quote from: CoinCube
Entropy Devours Life

All life struggles to avoid its eventual guaranteed entropic end.

The conditions of death, decay, and cessation are higher entropy than the conditions of breathing, growth, and bodily integrity. Therefore life is in constant danger of death, able to delay its destruction only by constant feeding. Deprived of energy for a prolonged length of time, life quickly falls to the laws of thermodynamics.

You've definitely proven that you've been indoctrinated with Marxist bullshit at the university!

Life is in competition to create more diversity and information content. You made a similar error to Ray Kurzweil's, in that you are only looking at the body as a physical system and failing to factor in the information content of the uniqueness of each individual brain, and thus the resultant diversity and information content that spawns from it. You are welcome to reread my essay Information Is Alive! more carefully.

What I mean specifically is that you think the higher entropy state is returning the body back to dust, because you may suppose that dust has more information content  Huh  Your simpleton mind is conflating micrograins with microstates. Dust may not have more microstates (i.e. may not need more information to describe its states) than the internal entropy of the body. Moreover (the same error as Kurzweil), dust is static and non-interacting, while life is dynamic (with exponential interaction via network effects and Reed's Law), thus the information required to describe life is orders of magnitude greater. Thus death without reproduction is actually the trend that is not congruent with the 2nd Law of Thermodynamics.

You Marxists don't seem to understand that the natural function of life and renewal is already perfected by nature. It isn't in any danger, except when you Marxists think you need to save it (and then you destroy it by lowering the information content of the system).

Perhaps someone forgot to inform you that we don't live in a 3D universe (and the earth isn't flat).

I am sure you agree that entropy is dimensionless. So why did you forget that spacetime is 4D? Besides, the interaction of larger things via network effects (and over the time domain) is much more complex than just breaking 3D shapes into smaller parts.

You've approached entropy as a 5-year-old would (except even when I was 5, I was probably already thinking about the fact that a 3D shape has much less information content than time, relativity, and wave interference).

Quote from: CoinCube
In reproduction this gives rise to a great need for fidelity. When reproducing, life must protect the integrity of its information. Unless both the ability to gather energy and the ability to reproduce are successfully transmitted, that branch of life will cease.

The genetic information transmitted from parent to child is not immune to entropy. Random mutations introduce variations into the genetic code. These mutations increase entropy as they increase the spread over different possible microstates. Mutation is very dangerous to life, as the vast majority of mutations either have no effect or a detrimental one. Life acts to minimize the danger by purposefully limiting this entropy: most multicellular organisms have DNA repair enzymes that constantly repair and correct damage. Fidelity of information is thus largely maintained between generations.

Fidelity, however, can never be 100%. The environment is not static but dynamic, and life must be able to adapt in response or it will cease. An organism with 100% fidelity of reproduction would never change, improve, or evolve. It would stand still while its predators and competitors grew more efficient. Long-term survival requires mutation and change. For this, life needs entropy.

The tradeoff between fidelity and adaptability can best be thought of as the balance between search and exploitation. If replication were without entropy, no mutants would arise and evolution would cease. On the other hand, evolution would also be impossible if the entropy/error rate of replication were too high (only a few mutations produce an improvement; most lead to deterioration). Increasing the entropy risks sacrificing previously acquired information in an attempt to find superior information. Life must master the deadly dance of harvesting entropy. Absorb too much and the species succumbs to mutation, tumors, and death. Absorb too little and the species stagnates and succumbs to more agile competitors. Life, it seems, walks the razor's edge.

Genetic repair exists because the result is higher entropy than without it. As you correctly point out, with unconstrained mutation the species would mutate away from acquired evolutionary functionality, because the survival-of-the-fittest feedback loop cannot anneal too fast given the roughly constant gestation period and lifespan. Note that species with much shorter lifespans can mutate faster.

The key point that you are missing is the underlined one.

You attempt to paint entropy as dangerous on a microstate level without factoring in that the overall entropy is higher with the genetic repair in place.

Life isn't on any razor's edge. You Marxists (and your “primitive, post-paleozoic, hunter-gatherer” contrived false flags and FUD) are![1]

This is why I stated you are misapplying a theory about genetic microstates to the overall entropy of the society. Major, major, major error on your part!

This is what I meant upthread when I said I would need to talk about the entropic frame-of-reference. I knew you were committing this error in your thinking. I was going to address it in my proposed future essay. Anyway, now you know.

[1] https://www.youtube.com/watch?v=7W33HRc1A6c&t=73

Quote from: CoinCube
Multicellular Organisms and Collectivism

The single celled organism is an anarchist. The multicelled organism is a collectivist.

Simpleton nonsense. Also, you may mean to say that the cells in the organism are collectivists (which is not the same as the nonsense that the organism is a collectivist).

Quote from: CoinCube
Life often tries to get bigger. Bigger things can avoid getting eaten and eat smaller things. However, as a cell increases in size its volume increases at a faster rate than its surface area. Beyond a threshold this results in an inability to transport enough materials across cellular membranes to accommodate the cellular volume.

Correct that in order to grow larger in 3D, microstates must be amalgamated into fractal structures, but again, focusing only on 3D size is very myopic and off-topic.

Quote from: CoinCube
Single celled organisms therefore cannot get big. Getting big requires collectives of multicellular organisms. Cells trade independence and degrees-of-freedom in exchange for the benefits of size, specialization, and efficiency. Cells in a multicellular organism lose the freedom to independently move and reproduce and critically their survival becomes dependent on their fellow cells. In exchange they get to be a part of something big and can benefit from the development of specialization including specialized neural tissues.

This is a relevant analogy because it elucidates why my theory about the Knowledge Age says that collectivization of humans will soon become unnecessary and undesirable.

Cells can't accomplish much by themselves. An individual cell without its brethren in the body can't even produce. Thus a cell has no choice and its optimal choice is to be in a collective.

Humans in the Industrial Age were in a similar predicament. They couldn't produce much by themselves. The home production (cottage industry, i.e. the Luddites) of the 1800s couldn't produce at the same low cost as factories, and individuals couldn't start their own factories because of the large upfront capital cost. Naturally this capital had to be concentrated.

Whereas production in the Knowledge Age only requires a brain and a computer. Thus humans no longer benefit much from, nor need, top-down control. And the exponential network effects of Reed's Law promise exponentially higher entropy once we break away from the paradigm we have now.
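For reference, Reed's Law counts the possible subgroups a network of n members can form, which grows as 2^n, versus n^2 pairwise links under Metcalfe's Law and plain n under Sarnoff's. A quick sketch of the scaling (the count 2^n - n - 1, the number of subsets of size two or more, is the usual statement of Reed's Law):

```python
# Three classic network-value scaling laws for a network of n members.

def sarnoff(n):
    return n                # broadcast value: one link per member

def metcalfe(n):
    return n * n            # pairwise connections scale as n^2

def reed(n):
    return 2 ** n - n - 1   # possible subgroups of size >= 2 (Reed's Law)

for n in (5, 10, 20, 30):
    print(n, sarnoff(n), metcalfe(n), reed(n))
```

Already at n = 30 the group-forming term is in the billions while the pairwise term is only 900, which is the sense in which Reed-style network effects are "exponentially higher".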

Quote from: CoinCube
Not all cells toe the collective line. Some cells throw off their chains and do whatever they want. When the rebels decide they want to divide and keep dividing the process is called cancer. The body's immune system can destroy cancer cells it is able to identify. Sometimes it succeeds and the cancer is destroyed. Other times, however, rebel cells are able to make themselves invisible to the immune system. When that happens the cancer cells get to keep doing whatever they want... for a time.

In multicelled organisms cancer is simply the result of accumulated entropy gone wrong. Multicelled organisms, like their simpler cousins, need to adapt, change, and evolve. A species with 100% fidelity would have no cancer but it would also never change. Warning: never ever use this argument to comfort anyone with cancer! It won't go over well.

Again attempting to misapply microstate biology to the entropy of human society is simpleton nonsense.

Quote from: CoinCube
Civilization and Collectivism

Civilization is a collective of mutually interdependent multicellular organisms.

Civilization represents the next stage of evolution beyond the multicellular organism. Like the transition from the single to the multicelled organism, it arises from the specialization and resultant interdependence of the sentient organisms that comprise it.

With the onset of civilization, environmental selection begins to give way to the selection of self-organization. The organization of the system begins to increase spontaneously without this increase being controlled by the environment or any other external system. Civilization is a state of vastly higher potential energy. This increase in organization can be looked at objectively as a decrease in statistical entropy.

Except human organization is not a decrease in entropy. Again you are myopically focused only on 3D shape. Refer to my prior explanation on your myopia.

Collectivized society in an Industrial Age is a decrease in entropy, and that is why it cannot last forever. Fortunately we now have the technology to move to the next paradigm of decentralized production.

Quote from: CoinCube
All self-organizing systems which decrease their thermodynamic entropy must export that entropy into their surroundings. Thus civilization, like the single cell, is a dissipative structure. Entropy in the multicellular organism produces mutations and cancer. Entropy in the higher order civilization produces rape, murder, human-trafficking, and terrorism, yet allows for growth, change, and progress. In the multicelled organism cancer is suppressed by the immune system. The functional equivalent in the higher order civilization is the police.

Hahaha.  Cry That is entirely nonsense and bullshit. Since that is your worldview, it explains all your stubborn nonsense upthread.

As I already explained, this randomness is naturally constrained as necessary to maximize overall entropy and not just 3D entropy but entropy measured over every dimension including the time and network effects domains (and many other domains). There is a natural entropic maximizing reason that only about 2.5% of the population are sociopaths.

No, the police don't do a damn thing. The functional equivalent of the immune system is your neighbors and their baseball bats. Remember the police never arrive at the crime on time (except in the Philippines, where they are somehow always there before the crime begins  Cheesy, although I read about Civil Asset Forfeiture and other instances where this phenomenon appears to be spreading Westward at an accelerating rate  Cool).

Quote from: CoinCube
Civilizations must change, grow, and adapt or face stagnation, decay, and collapse. They must maintain fidelity (stability over time) while also allowing for adaptability (growth). Self-organization to higher levels of potential energy in a self-organizing system is triggered by internal fluctuations, or noise, a.k.a. entropy. These processes produce selectively retained ordered configurations; this is the order-from-noise principle of Heinz von Foerster. Search and adaptability must be maximized subject to the constraint of maintaining fidelity through time and not losing the information that has already been gained. It is only through balance that optimal outcomes are achieved.

Quote from: CoinCube
The Future

The next stage in evolution will be the transition to an interstellar species.

If we achieve that goal we will create a system of yet higher order: the entity formed by the interaction of multiple interdependent interstellar civilizations. Such a creation will have a potential energy that dwarfs that of our current society. It will only form if we find ways to vastly improve our technology and significantly improve our current dissipative structures. These improvements will be made possible by the very entropy we seek to overcome as we make the climb from probability to improbability.


Don't assume you know how the entropy will be maximized. It could even be virtual worlds.

In summary, you are oversimplifying the concept of entropy as being one about 3D spatial order. The information content required to describe higher forms of life is not dominated by the 3D states.

Sorry. I told you that you will never win this debate. I knew already what your myopia was even before you wrote this post.

rpietila (Donator, Legendary; Activity: 1722, Merit: 1036)
April 26, 2015, 12:32:34 PM  #1347

I think TPTB is being too strict in defining the word. CoinCube's points and explanations are very admissible from the Wikipedia standpoint.

We can say that block hashes are valuable because they have low entropy. It costs a lot (25 BTC  Cheesy ) to produce one, even though it is only a short string of characters, compressible to half its length because the other half is zeros.

In contrast, generating new private keys is not very costly. Their entropy is high. A private key is even harder to crack than a block hash. In this case entropy actually corresponds to information content.
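The contrast can be illustrated by measuring byte-level Shannon entropy of a hash-like string with leading zeros versus a fresh random key. The `block_hash` below is a stand-in with zeroed leading bytes, not a real block hash:

```python
import math
import os
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    """Shannon entropy (bits per byte) of the empirical byte distribution."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A proof-of-work hash has many leading zero bytes: low byte-level entropy.
block_hash = bytes(16) + os.urandom(16)   # illustrative stand-in, not a real hash
private_key = os.urandom(32)              # fresh random key: near-maximal entropy

print(entropy_bits_per_byte(block_hash))
print(entropy_bits_per_byte(private_key))
```

The half-zeroed string needs far less information to describe than the fully random key, matching the point that a key's security cost is entropy itself while a hash's cost is the work spent forcing its entropy down.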

But if a book is generated via a random method, it does not contain information and is not interesting to readers. Nor is it interesting if it has too little entropy.

The number of possible states of a system is per se not much indicative of anything. Life has a balance: it's not excessively low entropy, but it's also wrong to say that higher entropy always makes things better. My computer has higher entropy if I smash it into pieces, but then I cannot use it. It also has lower entropy if it is powdered and its elements separated, but it's equally useless that way.

So the intermediate result is that both have brought good points into the discussion.

HIM TVA Dragon, AOK-GM, Emperor of the Earth, Creator of the World, King of Crypto Kingdom, Lord of Malla, AOD-GEN, SA-GEN5, Ministry of Plenty (Join NOW!), Professor of Economics and Theology, Ph.D, AM, Chairman, Treasurer, Founder, CEO, 3*MG-2, 82*OHK, NKP, WTF, FFF, etc(x3)
TPTB_need_war
Sr. Member
****
Offline Offline

Activity: 420
Merit: 262


View Profile
April 26, 2015, 12:42:31 PM
Last edit: April 26, 2015, 11:51:20 PM by TPTB_need_war
 #1348

rpietila, wait, I am not done editing that post. Continue reloading the page until I have addressed every point in his post.

Being admissible in Wikipedia is not always a badge of honor. Wikipedia is often incorrect.

I will also address your post soon after.

CoinCube (OP)
Legendary
*
Offline Offline

Activity: 1946
Merit: 1055



View Profile
April 26, 2015, 02:03:50 PM
 #1349

CoinCube,

That is horrendous. I give you a D.

TPTB I am amazed you did not give me an F.  Cheesy

Having earned a 4.0 in simultaneous biochemistry and mathematics degrees I can assure you that I know when I write at an A level.


rpietila, wait, I am not done editing that post. Continue reloading the page until I have addressed every point in his post.

You look like you are still composing your rebuttal so I will reserve judgement until you are complete.

Unfortunately I have used up all my Bitcointalk time for this week. I will be back next weekend. Hopefully thaaanos or l3552 can carry the torch for me until I return, but if not I will continue to try to help you see your logical blind spot next week.

TPTB_need_war
Sr. Member
****
Offline Offline

Activity: 420
Merit: 262


View Profile
April 26, 2015, 02:07:52 PM
 #1350

Having earned a 4.0 in simultaneous biochemistry and mathematics degrees I can assure you that I know when I write at an A level.

Good slaves always give the "correct", indoctrinated answers.

Quote from: Jason Hommel
I have two examples from school I'd like to share.  In my High School Junior English class back in 1987, I was getting discouraged.  I kept getting B's on my essays, despite my best efforts at analyzing the literature up for discussion.  I didn't know what else to do, and one day I just gave up.  Instead of analysis, I simply said how great the literature was, and I parroted back the same exact analysis that was discussed in class with absolutely zero new insights.  To hide the lack of real discussion and analysis in my essay, I enlarged my handwriting to fill the page.  I was expecting a D minus, or even an F.  I was almost ashamed of myself.

Some of you might guess what happened next.  I got an A.  My first A.  I was simply astounded.  Flabbergasted.  Surprised beyond belief.  I could not believe it.  I seriously wondered why.  I went to the teacher.  I explained myself.  I admitted there was no analysis.  She rebuked me.  Of course there was analysis; the same one we discussed in class, she said.  Exactly, I said.  Exactly, she said.  What?  I don't get it, don't you want us to analyse it?  But you did, she said.  And you kept it short, simple, to the point, and you were exactly on point, and understood the class discussion exactly, she said.  But I felt I didn't analyse anything; I felt like a tape recorder with zero brain activity or real analysis.  I brought no new insights to the table, nothing original, no indication that I was thinking about what we read.  But I showed I was paying attention in class, she said.  That's thinking about it.  Wow.  I don't know if the goal of my teacher was an intent to crush my spirit, but wow.

My second example is from my college days.  I was three credits short to graduate, and so I took one final class, stretching out graduation another semester.  (I now realize I should have taken two classes that last semester on the rare event that I failed a class.)  Anyway, it was some sort of political science class that I thought would be easy and interesting.  But it was more like political indoctrination, and I ended up hating the class, and during discussions, I mostly was just working on keeping my mouth shut so I could get through it.  I could sense that arguing against the indoctrination was risking being failed, and I really didn't want to risk not getting my diploma for another semester!

For the final exam, I had to write an essay.  The topic was something like "fairness in education".  The basic thrust of the essay was to parrot back the views presented in class, that equality of funding was the only way to really ensure fairness of opportunity in education to be able to allow the potential geniuses of the world the chance to better themselves to allow them to make the maximal contribution to world society.  So, my innovation that was not any sort of innovation, was to advocate a world government, and equality of funding for all children all over the world, to ensure the most fair educational environment to most greatly assist in the development of humanity.  In other words, I had to pretend to be a socialist!  I hereby admit my guilt, and let me pay my penance and make up for that essay now.

CoinCube (OP)
Legendary
*
Offline Offline

Activity: 1946
Merit: 1055



View Profile
April 26, 2015, 02:23:19 PM
Last edit: April 26, 2015, 02:37:07 PM by CoinCube
 #1351

Having earned a 4.0 in simultaneous biochemistry and mathematics degrees I can assure you that I know when I write at an A level.

Good slaves always give the "correct", indoctrinated answers.

You again demonstrate the classic ENFP weakness.

http://www.16personalities.com/enfp-strengths-and-weaknesses

Quote
Independent to a Fault - ENFPs loathe being micromanaged and restrained by heavy-handed rules - they want to be possessors of an altruistic wisdom that goes beyond draconian law.

The challenge for ENFPs is that they live in a world of checks and balances, a pill they are not happy to swallow.

TPTB_need_war
Sr. Member
****
Offline Offline

Activity: 420
Merit: 262


View Profile
April 26, 2015, 02:25:27 PM
Last edit: April 27, 2015, 12:09:00 AM by TPTB_need_war
 #1352

Having earned a 4.0 in simultaneous biochemistry and mathematics degrees I can assure you that I know when I write at an A level.

Good slaves always give the "correct", indoctrinated answers.

You again demonstrate the classic ENFP weakness.

http://www.16personalities.com/enfp-strengths-and-weaknesses

Quote
Independent to a Fault - ENFPs loathe being micromanaged and restrained by heavy-handed rules - they want to be possessors of an altruistic wisdom that goes beyond draconian law.

The challenge for ENFPs is that they live in a world of checks and balances, a pill they are not happy to swallow.

You've made no point. My accusation was backed by my prior factual rebuttal. You've tried to say entropy is only about 3D information. Duh!

P.S. I am very much aware of my rebellious nature and also I am aware of where I need to toe the line and where I don't. In the case of the Knowledge Age and the fall of the old world into a one-world eugenics and the Knowledge Age breaking away into the new world, I am confident I am aligned with reality. My correct understanding of entropy and economics tells me this.

Add: Attaining a 4.0 GPA (and with a dual major!) is not easy and I applaud your achievement. Surely you have learned many things that can be usefully applied to the real world. And since you have a real major in Math and I only dabbled in a minor in math, surely you can run rings around me in areas where I haven't formally studied, e.g. Topology, etc. Before I got ill, I had intended to go back and complete all the academic books for a Math major on my own. But unfortunately life robbed me. I suppose I am getting too old now (although this may be more the effects of the Multiple Sclerosis). The mental dexterity for Math declines precipitously after age 40. I never claimed that I don't have weaknesses, nor am I trying to boast. My emphatic point is that you are debating me on a topic I spent years thinking about and even wrote 3 essays on. And this information content topic falls right into my career vocation of computer science. So please don't feel bad if you lose this debate.

TPTB_need_war
Sr. Member
****
Offline Offline

Activity: 420
Merit: 262


View Profile
April 26, 2015, 02:33:32 PM
Last edit: April 26, 2015, 02:56:29 PM by TPTB_need_war
 #1353

I think TPTB is being too strict to define the word. CoinCube's points and explanations are very admissible from the Wikipediatic standpoint.

We can say that the block hashes are valuable because they have low entropy. It costs a lot (25 BTC  Cheesy ) to produce one, even though it is only a short string of characters and compressible to half its length due to the other half being zeros.

In contrast, generating new private keys is not very costly. Their entropy is high. A private key is even harder to crack than a block hash. In this case entropy actually corresponds to information content.

But if a book is generated via a random method, it does not contain information and is not interesting to readers. Nor is it if it has too little entropy.

The number of possible states of the system is per se not much indicative of anything. Life has a balance, it's not excessively low entropy but it's also wrong to say that higher entropy always makes things better. My computer has higher entropy if I smash it into pieces, but then I cannot use it. Also it has a lower entropy if it is powdered and elements separated, but it's equally useless this way.

So the intermediate result is that both have brought good points into the discussion.

Again the frame-of-reference is critical: the entities that are creating the most entropy in their system decide what is information content and what is not, because they are maximizing the overall entropy.

Does a tree fall in the forest if no one ever sees it and it decays before anyone does? In what sense was it relevant?

A random generator could spit out zillions of "books" but do they exist if no one reads them?

But even that is not the rebuttal (well, actually relevance is exactly what I address below, so the above is the same point as below, but stated too abstractly for most readers).

The actual rebuttal cuts straight to the meat of the issue. All those random books can be described by the instructions on how to create the random generator.

Again you are committing a similar error to CoinCube's.

It is the humans who are creating the entropy, not the random generator.

TADA!  Wink

(Don't even try to win a debate that I've already told you I can't lose)

(For any readers who didn't understand the above point: remember entropy is the minimum information required to describe the outcome; thus the instructions for building the random generator are the compressed information content and thus the actual information content)
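That compressed-description point can be made concrete with a toy sketch (a few lines of Python using the stock PRNG; the seed and sizes are arbitrary choices, not anything from the thread):

```python
import random
import zlib

def generator(seed, n):
    """A few lines of code whose output is a 100,000-byte 'random book'."""
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(n))

book = generator(seed=42, n=100_000)

# A byte-level compressor sees the book as pure noise and cannot shrink it...
compressed = zlib.compress(book, 9)

# ...yet the minimum description of the book is just the generator program
# plus its inputs -- a few dozen characters, not 100,000 bytes.
description = "generator(seed=42, n=100000)"

print(len(book), len(compressed), len(description))
```

The zillions of possible "books" collapse to one short program plus a seed, which is exactly why the generator, not its output, carries the information content.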

TPTB_need_war
Sr. Member
****
Offline Offline

Activity: 420
Merit: 262


View Profile
April 26, 2015, 02:51:09 PM
Last edit: April 27, 2015, 12:22:16 AM by TPTB_need_war
 #1354

CoinCube,

That is really brave of you. You have done a service by showing how academics are trained to view humans as dumbed-down cells and thus why they think we have to be controlled like cows in a corral.

Your academic cathedral entirely missed the point that the potential network effects of billions of unique human minds constitute an unfathomably high entropy. I suspect the potential state space exceeds the number of atoms in the universe, but I haven't tried to calculate it. Perhaps I should try to think about how to calculate the human entropy. It would be a very interesting thought experiment.
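A back-of-the-envelope version of that thought experiment (the population figure is a rough assumption): counting only the presence or absence of pairwise links among n minds already gives 2^(n(n-1)/2) possible interaction graphs, whose exponent alone dwarfs the ~10^80 atoms in the observable universe.

```python
import math

n = 7_000_000_000          # assumed: roughly 7 billion human minds (2015)
pairs = n * (n - 1) // 2   # number of possible pairwise links

# Each link can independently be present or absent, so there are
# 2**pairs possible interaction graphs. Compare exponents rather than
# computing the astronomically large number itself.
log10_graphs = pairs * math.log10(2)

print(f"~10^{log10_graphs:.3g} possible graphs vs ~10^80 atoms")
```

Even this ignores the internal states of each mind, so it is a loose lower bound on the combinatorics being gestured at.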

rpietila
Donator
Legendary
*
Offline Offline

Activity: 1722
Merit: 1036



View Profile
April 26, 2015, 02:59:53 PM
 #1355

I think TPTB is being too strict to define the word. CoinCube's points and explanations are very admissible from the Wikipediatic standpoint.

We can say that the block hashes are valuable because they have low entropy. It costs a lot (25 BTC  Cheesy ) to produce one, even though it is only a short string of characters and compressible to half its length due to the other half being zeros.

In contrast, generating new private keys is not very costly. Their entropy is high. A private key is even harder to crack than a block hash. In this case entropy actually corresponds to information content.

But if a book is generated via a random method, it does not contain information and is not interesting to readers. Nor is it if it has too little entropy.

The number of possible states of the system is per se not much indicative of anything. Life has a balance, it's not excessively low entropy but it's also wrong to say that higher entropy always makes things better. My computer has higher entropy if I smash it into pieces, but then I cannot use it. Also it has a lower entropy if it is powdered and elements separated, but it's equally useless this way.

So the intermediate result is that both have brought good points into the discussion.

Again the frame-of-reference is critical: the entities that are creating the most entropy in their system decide what is information content and what is not, because they are maximizing the overall entropy.

Does a tree fall in the forest if no one ever sees it and it decays before anyone does? In what sense was it relevant?

A random generator could spit out zillions of "books" but do they exist if no one reads them?

But even that is not the rebuttal (well, actually relevance is exactly what I address below, so the above is the same point as below, but stated too abstractly for most readers).

The actual rebuttal cuts straight to the meat of the issue. All those random books can be described by the instructions on how to create the random generator.

Again you are committing a similar error to CoinCube's.

It is the humans who are creating the entropy, not the random generator.

TADA!  Wink

(Don't even try to win a debate that I've already told you I can't lose)

Your lucid points are drowned out by beta-males' insistence on using the common definition of entropy instead of yours.

In common usage, entropy has been defined as equal to "randomness"/"lower energy state" (producible by thermodynamic deterioration or an RNG). The definition as "information content" (producible by humans) is secondary.

l3552
Newbie
*
Offline Offline

Activity: 31
Merit: 0


View Profile
April 26, 2015, 07:52:49 PM
 #1356

CoinCube, what an essay! It seems that the SPIRAL POWER is always looking for frontiers.

*CLAP CLAP*

So, when cells first started to trade info back and forth in multicellular organisms, they were no more than clones of each other floating in the soup. The core information was present in every cell, and the environmental pressure was so huge and indistinct that it guaranteed their high fidelity. They were tribes. There were the ones that avoided an entropic end by gathering energy from sunlight, the farmers, and others by gathering energy from the former, the hunter-gatherers.

Then the "Eukaryote" incorporated the "Prokaryote" and its genetic pool into the multicellular organization, forming a less fragile system. Reproduction behaviors changed and more energy could be spent on size.
Came the time when the primitive systems of the multicellular organism couldn't handle the new stress of size. Vessels (transport) and receptors (money/politics) were improved, and the multicellular systems were born with new solutions and energy provisions. The circulatory system allowed the development of the urinary, digestive, musculoskeletal, nervous, immune, and respiratory systems...

You know what I think. Our civilization is now facing immune and urinary failure due to stress caused by mis-signalling. A reform of our signaling system should remove the costs from separating politics and market, renewing a stronger individual pressure on politics by customizing jurisdiction, legislation and administration settings and the collective interest on markets by making the unit of trust traceable and not a commodity.

Both sides tremble at the dream. Anonymint fears political responsibility. Big Brother fears competition.
thaaanos
Sr. Member
****
Offline Offline

Activity: 370
Merit: 250


View Profile
April 26, 2015, 10:24:22 PM
Last edit: April 26, 2015, 10:35:15 PM by thaaanos
 #1357

Efficiency and knowledge age,
Efficiency is not a capital-accumulation or monetary thing; it is simply a ratio of result / work.
Example: when Microsoft integrated all their various Windows technologies, i.e. eVC, VC++, MFC, the Windows API, and VB, under .NET, Microsoft increased efficiency. Every time a programmer makes a choice of platform or technology he sacrifices degrees of freedom and makes a commitment to increase efficiency. Thus communities grow, and due to network effects some technologies or platforms dominate. This domination is the result of efficiency-seeking actors, so efficiency matters even in the knowledge age, in some form or another, because in order to produce anything you *have* to make choices, and the more efficient a choice is, the more actors will select it, effectively lowering their cumulative entropy.

Capital and knowledge age
If we think of capital as the means of production, then in the knowledge age, knowledge and knowledge workers are the capital, essentially threatening "capitalism", because the means of production are now free to walk out of the factory! So capitalism will be reinvented in a less centralized and more distributed fashion. But old money will play a role too: just as the landlords became the first capitalists, likewise the capitalists will play a role in the knowledge age.
Consider that knowledge is education; not long ago education was an elites-only privilege, and we may see a regression here.
Consider that knowledge is data, and see the disconcerting aggregation of data into the likes of Google and Facebook etc.
Consider that knowledge is patents; what is the trend? Do you see many garage inventors applying, or are the legal costs a barrier?
It is not as rosy as you think, but I cannot deny it is an opportunity for a new start.

Entropy and networks
Consider N actors, each with an information content of C possible states; see it as a variable that holds the state of each actor. So total actor entropy is S = N·k·lg C. Now let those actors communicate (link up in a network) over, say, a 1-bit channel. This means that between 2 actors a single bit is shared, halving the receiver's state space and reducing his entropy to k·lg(C/2). From this exercise one can see that the disconnected agents have maximum entropy, while the maximally connected have minimum entropy.
So in a network, if an agent simply tries to maximize his entropy he ends up disconnected; put simply, network participation limits your freedom. This is fine for immortal agents, but mortal agents also seek to hedge death, or to achieve immortality of their content, by communication and network participation.
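Treating C as a count of equally likely states and working in bits (units of k·ln 2), the exercise above can be checked numerically; the actor and state counts below are illustrative choices, not values from the thread:

```python
import math

def actor_entropy_bits(states, shared_bits=0):
    """Entropy of one actor in bits: lg(states), minus the bits
    already pinned down by communication over the network."""
    return math.log2(states) - shared_bits

N, C = 10, 256  # illustrative: 10 actors, 256 possible states each

disconnected = N * actor_entropy_bits(C)                   # maximum entropy
one_bit_links = N * actor_entropy_bits(C, shared_bits=1)   # each actor shares 1 bit
fully_shared = N * actor_entropy_bits(C, shared_bits=int(math.log2(C)))  # minimum

print(disconnected, one_bit_links, fully_shared)
```

Each shared bit trades one bit of an actor's entropy for connectivity, which is the claimed trade-off between freedom and network participation.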

Annealing is slowly lowering the temperature; how does that process, which you promote, increase entropy?

CoinCube, it is my belief that life cheats the second law by producing more entropy outside to offset the lower entropy inside, pumping entropy outward so to speak; a global rule leaves room for local violation, much like the warp drive!
OROBTC
Legendary
*
Offline Offline

Activity: 2926
Merit: 1863



View Profile
April 26, 2015, 10:40:33 PM
 #1358

...

Ain´t my thread, so take this for what it is worth (very little).

(Foreign keyboard, please excuse errors)

But, it looks like the varying uses of the terms "entropy" and "Marxism" (not to mention comments on four-letter personality types) are not really advancing us any further in discussing "Economic Devastation", and in fact the current discussion approaches that of how many angels can dance on the head of a pin...

My eyes glaze at the long, intricate and convoluted discussions of TERMS and ideas not really focused on CoinCube´s title of this thread, now well over 100,000 views...  l3552 seems to have nailed how I feel in the last sentence of his recent post re the end result of too much vs. too little "entropy".

Or possibly my just above remarks mean I don´t belong here...?

*   *   *

I remember reading in Gleick´s book on chaos (about 1990) that life itself skirts close to the edge of too much order ("low entropy") vs. too much random disorder ("high entropy").  Maybe that is good enough for the purposes of exploring how and why the S seems to be heading for TF...

I suppose that we have multiple reasons for the crisis in the Western economies: too much debt, too much government, too much dependence, etc.

And I have to thank TPTB very much for his suggestions to explore both his own work (a bit I have seen: brilliant) and Martin Armstrong´s work, however right or wrong he may be.  Armstrong is working a niche that looks like will become very productive and useful in examining our plight, and perhaps what we as individuals can do about it.

(Thanks for the SQL coding too, TPTB, it is hard for me to think hard while we are preparing for our daughter´s wedding down here)

*   *   *

Lately all I have seen re defending one´s self vs. an overly greedy .gov in need is the need for advanced education (TPTB) and HODLING assets like gold and BTC (from me for example).

1)  I would like to see other suggestions on what we can do as individuals to keep ourselves & families OK in the devastating storms to come.  And other than that of acquiring advanced programming skills WAY beyond me and most others.

2)  I am also interested in finding and examining SCENARIOS of Economic Devastation, and the probabilities of each and possible resolutions (at an individual and perhaps (dare I suggest it) at some sort of Collective level).

I do understand that this may be kind-of a thread-jack, and if so voted, I will STFU and retire from this most interesting thread.
thaaanos
Sr. Member
****
Offline Offline

Activity: 370
Merit: 250


View Profile
April 26, 2015, 10:46:00 PM
 #1359


The actual rebuttal cuts straight to the meat of the issue. All those random books can be described by the instructions on how to create the random generator.
(For any readers that didn't understand the above point, remember entropy is the minimum information required to describe the outcome, thus instructions on building the random generator is the compressed information content and thus the actual information content)

We can create random generators by simply tapping into the randomness of the universe; see the implementation of some chips: TRNGs.
Same as creating offspring, creation does not imply complete knowledge.
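In practice one can tap that physical randomness through the operating system, which mixes hardware noise sources into its entropy pool (a minimal sketch; whether a dedicated TRNG chip feeds the pool depends on the machine):

```python
import os

# os.urandom draws from the kernel's entropy pool, which on most systems
# is seeded by physical noise (interrupt timing, hardware RNG chips).
# Unlike a seeded pseudorandom generator, there is no short program that
# reproduces these bytes -- the "generator" is the physical universe.
sample_a = os.urandom(32)
sample_b = os.urandom(32)

print(sample_a.hex())
```

This is exactly the case where the output cannot be compressed down to instructions for its generator, which is the distinction being argued over above.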
thaaanos
Sr. Member
****
Offline Offline

Activity: 370
Merit: 250


View Profile
April 26, 2015, 11:12:14 PM
 #1360

CoinCube,

My points are irrefutable.

I do not believe they are. However, to go further we will need to take this discussion to a more technical level and clarify some terms.

Please provide your definition of entropy.
Here is mine: the information you don't have access to.

If that data is irrelevant to a particular actor’s contribution to maximum entropy then it is not information.


Data irrelevant to an actor is noise; information that cannot be accessed is environmental entropy.