CoinCube (OP)
Legendary
Offline
Activity: 1946
Merit: 1055
|
|
June 16, 2017, 03:23:29 PM Last edit: June 16, 2017, 03:39:44 PM by CoinCube |
|
1) Regarding the metric of entropy, here are two posts you may find interesting:

Entropy is Information
Entropy and Freedom

The first is an informative discussion by Anonymint of the relationship between entropy and information. The second is an excerpt from the book Knowledge and Power by George Gilder, in which the relationship between entropy and freedom is explored.

Quote from: IadixDev
The above posts seem to be confusing this concept of emerging properties and entropy =) In the context of thermodynamics, the level of entropy is measured as a defect from the expected output.

The definition of entropy used here is not thermodynamic entropy but information entropy, or Shannon entropy. Shannon entropy is a measure of the unpredictability of a state or, equivalently, of its average information content (a small sketch below makes this concrete). The exact relationship between thermodynamic entropy and informational entropy is complex. Here is an excerpt from Wikipedia:

"At an everyday practical level the links between information entropy and thermodynamic entropy are not evident. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution... Furthermore, in classical thermodynamics the entropy is defined in terms of macroscopic measurements and makes no reference to any probability distribution, which is central to the definition of information entropy.

In the view of Jaynes (1957), thermodynamic entropy, as explained by statistical mechanics, should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being proportional to the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics."

Thus, if Jaynes is correct, classical thermodynamic entropy is simply a special case of broader information theory, much as Newton's laws of motion emerge as a special case of general relativity. Both Anonymint and George Gilder take the position that Jaynes is correct, and this in turn supports their use of the term entropy. To make your case you need to show either:

1) that Jaynes is incorrect, or
2) that Anonymint and George Gilder are incorrectly using Shannon entropy when they should instead be using your concept of "emerging properties", which you need to define for us.

I think you will have a very difficult time showing either of these things, as I am of the opinion that Anonymint and Mr. Gilder are correct. That said, if you can prove them wrong I would be very interested to see it.

Quote from: IadixDev
IMO if you are waiting on other people to build a structure you agree to be free, you are never really free at all. Society, tradition, religion: all about voluntary bondage.

I am not really following you here. If you are arguing that absolute freedom is impossible then I agree. The best we can do is minimize the restrictions on our collective freedom. If you are interested in my thoughts on how we can best accomplish this, I have outlined them here: Faith and Future
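(To make the definition concrete, here is a minimal sketch of Shannon entropy estimated from observed symbol frequencies, written in Haskell since that language comes up later in the thread. The function and the examples are illustrative, not taken from the linked essays.)

Code:
import Data.List (group, sort)

-- Shannon entropy in bits: H = sum over i of p_i * log2(1/p_i),
-- estimated from the empirical frequencies of the symbols in xs.
-- High entropy = unpredictable symbols = high average information content.
shannonEntropy :: Ord a => [a] -> Double
shannonEntropy xs = sum [p * logBase 2 (recip p) | p <- probs]
  where
    n     = fromIntegral (length xs)
    probs = [fromIntegral (length g) / n | g <- group (sort xs)]

main :: IO ()
main = do
  print (shannonEntropy "aaaaaaaa")  -- 0.0: fully predictable, zero information
  print (shannonEntropy "abababab")  -- 1.0: one bit per symbol
  print (shannonEntropy "abcdabcd")  -- 2.0: two bits per symbol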
|
|
|
|
IadixDev
Full Member
Offline
Activity: 322
Merit: 151
They're tactical
|
|
June 16, 2017, 05:21:52 PM |
|
Quote
In information theory, systems are modeled by a transmitter, channel, and receiver. The transmitter produces messages that are sent through the channel. The channel modifies the message in some way. The receiver attempts to infer which message was sent. In this context, entropy (more specifically, Shannon entropy) is the expected value (mean) of the information contained in each message. 'Messages' can be modeled by any flow of information.

In that case, the input is the original message, the output is the transmitted message, and the entropy is the unpredictable difference between the two: the level of entropy measures how well the output fits the expected values.
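(A concrete instance of this picture, assuming the textbook binary symmetric channel: a channel that flips each bit with probability q adds exactly H2(q) bits of unpredictable difference per symbol. The channel choice and function names are an illustration, not something from the posts above.)

Code:
-- Binary entropy function H2(q): the Shannon entropy, in bits per symbol,
-- that a binary symmetric channel with flip probability q adds to a message.
-- This is the "unpredictable difference" between input and output.
h2 :: Double -> Double
h2 q
  | q <= 0 || q >= 1 = 0    -- deterministic channel: no noise entropy
  | otherwise        = q * logBase 2 (recip q)
                     + (1 - q) * logBase 2 (recip (1 - q))

main :: IO ()
main = mapM_ (\q -> putStrLn (show q ++ " -> " ++ show (h2 q))) [0.0, 0.1, 0.5]
-- 0.0 -> 0.0   : noiseless channel, the receiver recovers the message exactly
-- 0.1 -> ~0.47 : mildly noisy channel
-- 0.5 -> 1.0   : pure coin flips, the output says nothing about the input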
Quote from: CoinCube
...your concept of "emerging properties" which you need to define for us.

https://en.wikipedia.org/wiki/Emergence

"In philosophy, systems theory, science, and art, emergence is a phenomenon whereby larger entities arise through interactions among smaller or simpler entities such that the larger entities exhibit properties the smaller/simpler entities do not exhibit. Emergence is central in theories of integrative levels and of complex systems. For instance, the phenomenon of life as studied in biology is an emergent property of chemistry, and psychological phenomena emerge from the neurobiological phenomena of living things. In philosophy, theories that emphasize emergent properties have been called emergentism. Almost all accounts of emergentism include a form of epistemic or ontological irreducibility to the lower levels.[1]

Emergent properties and processes

An emergent behavior or emergent property can appear when a number of simple entities (agents) operate in an environment, forming more complex behaviors as a collective. If emergence happens over disparate size scales, then the reason is usually a causal relation across different scales. In other words, there is often a form of top-down feedback in systems with emergent properties.[21] The processes from which emergent properties result may occur in either the observed or observing system, and can commonly be identified by their patterns of accumulating change, most generally called 'growth'. Emergent behaviours can occur because of intricate causal relations across different scales and feedback, known as interconnectivity. The emergent property itself may be either very predictable or unpredictable and unprecedented, and represent a new level of the system's evolution. The complex behaviour or properties are not a property of any single such entity, nor can they easily be predicted or deduced from behaviour in the lower-level entities, and might in fact be irreducible to such behavior. The shape and behaviour of a flock of birds[3] or a school of fish are good examples of emergent properties."
|
|
|
|
CoinCube (OP)
Legendary
Offline
Activity: 1946
Merit: 1055
|
|
June 16, 2017, 07:15:49 PM |
|
Thanks, that is helpful. Yes, it appears we are more or less talking about the same thing. Information entropy, or Shannon entropy, is simply a way to empirically measure and quantify what you are calling emergence. Here are a couple of papers on this if you are interested in reading more.

Measuring Emergence, Self-organization, and Complexity Based on Shannon Entropy
http://journal-cdn.frontiersin.org/article/244727/files/pubmed-zip/versions/1/pdf

"We present a set of Matlab/Octave functions to compute measures of emergence, self-organization, and complexity applied to discrete and continuous data. These measures are based on Shannon's information and differential entropy. Examples from different datasets and probability distributions are provided to show how to use our proposed code.

...

Complexity has generated interest in recent years (Bar-Yam, 1997; Mitchell, 2009; Haken and Portugali, 2017). A complex system can be understood as one composed by many elements, which acquire functional/spatial/temporal structures without a priori specifications (Haken and Portugali, 2017). It has been studied in several disciplines, as one can try to measure the complexity of almost any phenomenon (Lopez-Ruiz et al., 1995; Bandt and Pompe, 2002; Prokopenko et al., 2009; Lizier, 2014; Soler-Toscano et al., 2014; Haken and Portugali, 2017). Thus, there exist a broad variety of measures of complexity where Shannon's entropy and its generalizations have played a crucial role (Haken and Portugali, 2017)."

Information Entropy As a Basic Building Block of Complexity Theory
http://www.mdpi.com/1099-4300/15/9/3396/pdf

"Abstract: What is information? What role does information entropy play in this information exploding age, especially in understanding emergent behaviors of complex systems? To answer these questions, we discuss the origin of information entropy, the difference between information entropy and thermodynamic entropy, the role of information entropy in complexity theories, including chaos theory and fractal theory, and speculate new fields in which information entropy may play important roles."
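(For readers who do not want to dig through the Matlab/Octave code: as I read the first paper, its discrete measures reduce to emergence E = H/Hmax, self-organization S = 1 - E, and complexity C = 4*E*S. Below is a rough Haskell transcription of that reading; treat it as a paraphrase under that assumption, not the authors' implementation.)

Code:
import Data.List (group, sort)

-- Discrete measures in the spirit of the paper (as I read it), for symbols
-- drawn from an alphabet of k possible values:
--   emergence         E = H / Hmax, in [0,1]
--   self-organization S = 1 - E
--   complexity        C = 4 * E * S, maximal when order and change balance
emergence :: Ord a => Int -> [a] -> Double
emergence k xs = h / logBase 2 (fromIntegral k)
  where
    n     = fromIntegral (length xs)
    probs = [fromIntegral (length g) / n | g <- group (sort xs)]
    h     = sum [p * logBase 2 (recip p) | p <- probs]

selfOrganization :: Ord a => Int -> [a] -> Double
selfOrganization k xs = 1 - emergence k xs

complexity :: Ord a => Int -> [a] -> Double
complexity k xs = 4 * e * (1 - e) where e = emergence k xs

main :: IO ()
main = do
  print (complexity 2 "aaaaaaaa")   -- 0.0: all order, no change (E = 0)
  print (complexity 2 "aabbabba")   -- 0.0: uniform randomness (E = 1)
  print (complexity 2 "aaaaaaaab")  -- ~1.0: E is near 1/2, order and change balance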
|
|
|
|
carlerha
|
|
June 16, 2017, 07:27:07 PM |
|
I think economic devastation is due to the monopoly of wealth. In my country the economy is controlled by ethnic minorities; their influence is too strong, and they control it all the way.
In most places we are facing the same problem: money is restricted to a few select families, and their monopoly is becoming stronger over time. The poor are becoming poorer while the richest hold all the money. Workers, skilled and educated people are not receiving their due; they are just working for these select families, and all the credit goes to them. I hope bitcoin will overcome this kind of situation very soon.
|
|
|
|
IadixDev
Full Member
Offline
Activity: 322
Merit: 151
They're tactical
|
|
June 16, 2017, 08:22:39 PM |
|
Quote from: CoinCube on June 16, 2017, 07:15:49 PM
Thanks, that is helpful. Yes, it appears we are more or less talking about the same thing. Information entropy, or Shannon entropy, is simply a way to empirically measure and quantify what you are calling emergence. Here are a couple of papers on this if you are interested in reading more. [...]

Emergent properties can be predicted; sometimes they are not entropic. But entropy is only measurable against the expected behavior of a constructed system, as a measure of how well the data fit the theory behind the system's design.

This connection between emergent properties and entropy works for properties emerging from a designed/constructed system, not for measuring "natural" behavior outside the context of a fabricated system. It's only entropy if it's measured as a difference from a predicted outcome.
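(The "difference from a predicted outcome" idea has a standard name in information theory: relative entropy, or Kullback-Leibler divergence, the number of extra bits needed to describe observed behavior when a model predicted a different distribution. A minimal sketch follows; framing the point as KL divergence is an editorial gloss, not a claim from either paper.)

Code:
-- Kullback-Leibler divergence D(P || Q) in bits: the extra information needed
-- to describe outcomes drawn from the observed distribution P when the model
-- predicted the distribution Q. It is zero exactly when the system behaves
-- as predicted. Assumes aligned lists that each sum to 1, with q > 0 wherever p > 0.
klDivergence :: [Double] -> [Double] -> Double
klDivergence observed predicted =
  sum [p * logBase 2 (p / q) | (p, q) <- zip observed predicted, p > 0]

main :: IO ()
main = do
  print (klDivergence [0.5, 0.5] [0.5, 0.5])  -- 0.0: behavior matches the prediction
  print (klDivergence [0.9, 0.1] [0.5, 0.5])  -- ~0.53 bits of "unexpected" behavior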
|
|
|
|
LegendOF45
|
|
June 17, 2017, 06:29:33 AM |
|
Economic devastation is due to many factors, but I think mainly government policies that don't support the people and instead favor the entrepreneurs: many businessmen bribe the government so they can control the economy.
|
|
|
|
mzforfree
Newbie
Offline
Activity: 42
Merit: 0
|
|
June 17, 2017, 06:47:17 AM |
|
Quote
Source: http://www.coolpage.com/

Stopped reading there. The market will adjust just like it always has; just google how many market panics there have been since centralized banking alone, as well as the shifting of the job market and labor force in regards to employment and the implementation of new technologies. What happened during the industrial revolution, when a huge % of the labor force shifted from agrarian to industrial? Or when the industrial companies of the U.S. all got outsourced to 3rd world countries for cheap later? The market and labor force adjusted again and we saw a huge rise in the service industry. Or now, when IT and newer, lower-education-level medical/care-taker jobs are becoming a larger % of the labor force?

Robots will never totally replace humans, and vice versa. It isn't a competition and never has been. Can anyone list any SINGLE precedent in which a fundamental new technology "devastated" ANY labor force, much less what the biggest fluctuation was? People, the market, supply, demand, and the education/skill needed for the labor force at large are always adjusting within any dynamic system such as a labor market. More technological advancement has ALWAYS meant a higher standard of living in the longer run. The invention of the cotton gin didn't outlaw slaves, but it made more people realize slavery wasn't necessary and was more of a burden and "way of life" than an actual long-term feasible commodity. Every labor advancement from the assembly line to the internet has made things easier for human beings, and in every scenario we've adjusted which duties are ours and which duties are no longer necessary because of technology.

This was written by some crank on some obscure website "projecting" something that will happen in 2033, 15 years down the line. Who could have predicted bitcoin's increase to a peak of 3000 even 3 months ago?
|
|
|
|
IadixDev
Full Member
Offline
Activity: 322
Merit: 151
They're tactical
|
|
June 17, 2017, 10:43:32 AM Last edit: June 17, 2017, 11:06:29 AM by IadixDev |
|
Quote from: IadixDev
IMO if you are waiting on other people to build a structure you agree to be free, you are never really free at all. Society, tradition, religion: all about voluntary bondage.

Quote from: CoinCube
I am not really following you here. If you are arguing that absolute freedom is impossible then I agree. The best we can do is minimize the restrictions on our collective freedom. If you are interested in my thoughts on how we can best accomplish this, I have outlined them here.

My idea is more that freedom is to be thought of in terms of capacity or skill, rather than in terms of whether other people or the system are cooperative with your own aspirations. Buddha said things along those lines: that real freedom is necessarily a path of loneliness. It's quite close to Jung's concept of individuation on the personal level: how you need to get rid of the archetypes and limits imposed by society to become more unique and individuated. And by definition, it's by getting out of copycat behavior, or learned behavior, that you become more individuated.

At the same time I completely get what he means; it's just that the term entropy is not necessarily the best to employ to describe what he talks about lol. It's seeing the issue from the wrong side IMO lol. You can't get a positive definition of the process of individuation if you only see it as entropy, as how it deviates from socially expected behavior.

"Reducing the human mind to an electric signal is a perversion"
|
|
|
|
Michhotdog
|
|
June 17, 2017, 12:58:59 PM |
|
Quote from: IadixDev on June 17, 2017, 10:43:32 AM
[...] You can't get a positive definition of the process of individuation if you only see it as entropy, as how it deviates from socially expected behavior.

"Reducing the human mind to an electric signal is a perversion"

You must understand that all of our freedom is an illusion. We use cryptocurrency today and believe that thanks to it we have decentralization and are free in our choice. But this is not so. If you completely understood all the subtleties of Bitcoin's structure and how it all was created, you would not be surprised if on this question there were structures that control everything and everywhere.
|
|
|
|
IadixDev
Full Member
Offline
Activity: 322
Merit: 151
They're tactical
|
|
June 17, 2017, 01:33:59 PM |
|
Quote from: Michhotdog on June 17, 2017, 12:58:59 PM
[...] You must understand that all of our freedom is an illusion. We use cryptocurrency today and believe that thanks to it we have decentralization and are free in our choice. But this is not so. If you completely understood all the subtleties of Bitcoin's structure and how it all was created, you would not be surprised if on this question there were structures that control everything and everywhere.

In a way it can be said that the realization of freedom goes through the concept of efficiency, and we are at a stage where centralized structures are not efficient, because of too diversified a population and needs. With the democratization of computers and the internet, it will probably become more optimal for life to organize around small specialized structures, rather than around big national-scale collectives and objectives. The society of today is becoming much more multipolar, and there is no one big corporation or organization that can be efficient at everything.

https://hermetic.com/bey/quantum

Quantum Mechanics & Chaos Theory: Anarchist Meditations on N. Herbert's Quantum Reality: Beyond the New Physics
By Hakim Bey
1. Scientific worldviews or “paradigms” can influence — or be influenced by — social reality. Clearly the Ptolemaic universe mirrors theocentric & monarchic structures. The Newtonian/Cartesian/mechanical universe mirrors rationalistic social assumptions, which in turn underlie nationalism, capitalism, communism, etc. As for Relativity Theory, it has only recently begun to reflect — or be reflected by — certain social realities. But these relations are still obscure, embedded in multinational conspiracies, the metaphysics of modern banking, international terrorism, & various newly emergent telecommunications-based technologies.
2. Which comes first, scientific paradigm or social structure? For our purpose it seems unnecessary to answer this question — and in any case, perhaps impossible. The relation between them is real, but acts in a manner infinitely more complex than mere cause-&-effect, or even warp-&-weft.
3. Quantum Mechanics (QM), considered as the source of such a paradigm, at first seems to lack any social ramifications or parallels, almost as if its very weirdness deprives it of all connections with “everyday” life or social reality. However, a few authors (like F. Capra, or Science-Fictioneers like R. Rucker or R. Anton Wilson) have seen Quantum Theory both as a vindication of certain “oriental philosophies” & also as prophetic of certain social changes which might loosely & carelessly be lumped under the heading “Aquarian.”
4. The “mystical” systems evoked by our contemplation of Quantum facts tend to be non-dualist and non-theocentric, dynamic rather than static: Advaita Vedanta, Taoism, Tantra (both Hindu & Buddhist), alchemy, etc. Einstein, who opposed Quantum theory, believed in a God who refused to play dice with the universe, a basically Judeo-Protestant deity who sets up a cosmic speed limit for light. The Quantum enthusiasts, by contrast, prefer a dancing Shiva, a principle of cosmic play.
5. Perhaps “oriental wisdom” will provide a kind of focusing device, or set of metaphors, or myth, or poetics of QM, which will allow it to realize itself fully as a “paradigm” & discover its reflection on the level of society. But it does not follow that this paradigm will simply recapitulate the social complexes which gave rise to Taoism, Tantra or alchemy. There is no “Eternal Return” in the strict Nietzschean sense: each time the gyre comes round again it describes a new point in space/time.
6. Einstein accused Quantum Theory (QT) of restoring individual consciousness to the center of the universe, a position from which “Man” was toppled by “Science” 500 years ago. If QT can be accused of retrogression, however, it must be something like the anarchist P. Goodman's “Stone Age Reaction” — a turning-back so extreme as to constitute a revolution.
7. Perhaps the development of QM and the rediscovery of “oriental wisdom” (with its occidental variations) stem from the same social causes, which have to do with information density, electronic technology, the ongoing collapse of Eurocentrism & its “Classical” philosophies, ideologies & physics. Perhaps the syncresis of QT & oriental wisdom will accelerate these changes, even help direct them.

For me it's more along this trend: large-scale organized structures stop being really efficient at facilitating development, as opposed to the decentralized solutions that are coming more and more. Bitcoin is only partially a good thing for this imo, because it's not really modular; the code is quite monolithic and not that easy to adapt. And to really do something useful with it business-wise, you also need other things, like web servers and other applications, which makes it still quite hard for it to reach its advertised goal of decentralization/fungibility, of maximizing fluidity in utility for trading or otherwise; for the moment it still stays quite centralized.

But it's already a completely novel approach, in the sense that the node implements both the server and client sides, and it's already a completely new way to see distributed application development, with shared data validated in a trustless manner and an RPC API to exploit the data in third-party applications. Already it switches away from centralization of all the data in data centers, with all the monopoly and exploitation of information that leads to; and that model is mostly milking user-driven content, sometimes without even the users' full consent lol. Meanwhile the cost of processing/storage/bandwidth is becoming quite low, and there are lots of operations done by data centers that could be done in a decentralized manner.

It's why I'm working on my solution, which I think can help a lot to really be able to run blockchain-based applications more complex than just a wallet and miner, with HTML5 in an all-in-one node, with much more modularity, and with a script engine to be able to customize everything easily, to tailor it toward more fully reaching the objective of decentralized trading and decentralized applications.
|
|
|
|
CoinCube (OP)
Legendary
Offline
Activity: 1946
Merit: 1055
|
|
June 17, 2017, 02:34:20 PM Last edit: June 17, 2017, 03:17:04 PM by CoinCube |
|
Quote from: IadixDev
At the same time I completely get what he means; it's just that the term entropy is not necessarily the best to employ to describe what he talks about lol. It's seeing the issue from the wrong side IMO lol. You can't get a positive definition of the process of individuation if you only see it as entropy, as how it deviates from socially expected behavior.

I am actually somewhat sympathetic to this position. Ideally it would be Anonymint here defending his definitions, but he is boycotting the forum at the moment after getting his most recent incarnation banned, so I will do my best to defend his thesis in his absence.

Quote from: IadixDev
Emergent properties can be predicted; sometimes they are not entropic. But entropy is only measurable against the expected behavior of a constructed system, as a measure of how well the data fit the theory behind the system's design. This connection between emergent properties and entropy works for properties emerging from a designed/constructed system, not for measuring "natural" behavior outside the context of a fabricated system. It's only entropy if it's measured as a difference from a predicted outcome.

Let's dive into the definitions and see where that takes us. From the papers I linked above:

– Emergence can be understood as new global patterns which are not present in the system's components.
– Self-organization, in its most general form, can be seen as a reduction of entropy. Self-organization is the complement of emergence and a metric of order and regularity.
– Complexity comes from the Latin plexus, which means interwoven; something complex is difficult to separate. Complexity represents a balance between change (emergence) and regularity (self-organization), which allows systems to adapt in a robust fashion. Regularity ensures that information survives, while change allows the exploration of new possibilities, essential for adaptability. In this sense, complexity can also be used to characterize living systems or artificial adaptive systems, especially when comparing their complexity with that of their environment. More precisely, complexity describes a system's behavior in terms of the average uncertainty produced by emergent and regular global patterns, as described by its probability distribution.

So where does entropy come in? Information entropy is a deterministic complexity measure, since it quantifies the degree of randomness. If we can measure the degree of randomness, we can also quantify emergence, with some limitations. As you said, entropy is only measurable against expected behavior in a constructed system. The complexity of different phenomena can be calculated using entropy-based measures; however, to obtain meaningful results, we must first determine the adequate function to be employed for the problem. In the case of "natural" behavior outside the context of a human-fabricated system, this becomes problematic, as we don't know the underlying function.

I believe it was Gödel who said the world is either a perfect order of God, or chaos. The difference is in the belief that infinity comes before entropy. If we go with infinity, then we can assume that some underlying function exists for all systems, including "natural" ones. Lacking an understanding of the system, we may not be able to measure the entropy and the associated emergence, but we can assume the relationship persists outside of our knowledge.

What you call the process of individuation is not just about maximizing emergence, aka maximizing entropy.
The "process of individuation" is the maximization of emergence while maintaining overall self-organization. It is the long-term maximization of the complexity of the system.
|
|
|
|
CoinCube (OP)
Legendary
Offline
Activity: 1946
Merit: 1055
|
|
June 17, 2017, 02:45:28 PM |
|
That's too bad you missed out on an interesting essay. Wisdom is not limited to the ivory tower.
|
|
|
|
IadixDev
Full Member
Offline
Activity: 322
Merit: 151
They're tactical
|
|
June 17, 2017, 05:54:12 PM |
|
Quote from: CoinCube on June 17, 2017, 02:34:20 PM
[...] What you call the process of individuation is not just about maximizing emergence, aka maximizing entropy. The "process of individuation" is the maximization of emergence while maintaining overall self-organization. It is the long-term maximization of the complexity of the system.

Well, the concept of entropy is mostly relevant in the context of engineering, where one builds a system and has to consider that what matters, in terms of right or wrong, is that the machine works as planned/wanted by the designer. In this perspective, entropy is always something unwanted, defined negatively as a divergence from the expected/wanted result.

Overall I'm not a big fan of Newton's theories lol; I prefer Leibniz =) Newton is OK if really limited to experimental physics, but the second law of thermodynamics already shows where it wants to get at: that what is to be considered good in an environment is the things that work as planned by a designer, and 'chaos' is seen by definition as something to be avoided or disregarded, or as some kind of waste of energy that makes the system not as efficient as it should be to fulfill its purpose.

There are the social theories, with social darwinism, which say that before, it was fitness to the laws of nature that was driving evolution, but then, since maybe Rome or so, it's more social selection, how one fits into one culture or another, that is the driving force of evolution, or even now, how one fits into the economic agenda. This can give a sense of an 'ideal individual' in the context of social or economic development, such that society has to shape its citizens to fit this ideal, as if human beings were just passively waiting to be programmed to exist as persons, and anything that pops up from a person not being programmed is seen as entropy; it sees individuals only as targets for social engineering.
|
|
|
|
IadixDev
Full Member
Offline
Activity: 322
Merit: 151
They're tactical
|
|
June 17, 2017, 06:34:05 PM |
|
Quote from: mzforfree on June 17, 2017, 06:47:17 AM
[...] This was written by some crank on some obscure website "projecting" something that will happen in 2033, 15 years down the line. Who could have predicted bitcoin's increase to a peak of 3000 even 3 months ago?

The problem imo is not so much the technology, but the whole obscurantism surrounding it, with the whole lot of proprietary closed technology and deceptive marketing; it leaves the understanding of its benefits, its operation, decision power, etc., in selected hands. Assange explains well how feudalism always emerges from control of a technology that increases production and becomes vital to sustaining a certain population; it was windmills in the middle ages, and the same goes for CPUs & IT now.
|
|
|
|
CoinCube (OP)
Legendary
Offline
Activity: 1946
Merit: 1055
|
|
June 17, 2017, 09:21:44 PM |
|
Quote from: IadixDev
Well, the concept of entropy is mostly relevant in the context of engineering, where one builds a system and has to consider that what matters, in terms of right or wrong, is that the machine works as planned/wanted by the designer. In this perspective, entropy is always something unwanted, defined negatively as a divergence from the expected/wanted result.

Quote from: IadixDev
...what is to be considered good in an environment is the things that work as planned by a designer, and 'chaos' is seen by definition as something to be avoided or disregarded, or as some kind of waste of energy that makes the system not as efficient as it should be to fulfill its purpose.

This is because we have, until very recently, focused our engineering efforts on creating predictable or "dumb" devices. In the context of the discussion upthread, the goal has been to accomplish a fixed task and then maximise the self-organisation of the system, ideally driving emergence to zero, aka minimising informational entropy. In this context entropy represents loss or misdirected effort.

However, if we want an adaptive machine capable of responding to unanticipated environmental changes or improving over time, then we need a component of emergence and thus Shannon entropy. This can be seen when looking at one of the more famous new machines: Google's Go-playing machine AlphaGo. This machine must respond to unpredictable responses from opponents and still win, in a game that is too complex to play via simple brute force. Christopher Burger, who has a Ph.D. in machine learning, wrote this interesting analysis of how AlphaGo works:

https://www.tastehit.com/blog/google-deepmind-alphago-how-it-works/

AlphaGo uses a Monte Carlo Tree Search (MCTS).

"Monte Carlo Tree Search is an alternative approach to searching the game tree. The idea is to run many game simulations. Each simulation starts at the current game state and stops when the game is won by one of the two players. At first, the simulations are completely random: actions are chosen randomly at each state, for both players. At each simulation, some values are stored, such as how often each node has been visited, and how often this has led to a win. These numbers guide the later simulations in selecting actions (simulations thus become less and less random). The more simulations are executed, the more accurate these numbers become at selecting winning moves. It can be shown that as the number of simulations grows, MCTS indeed converges to optimal play."
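(A toy version of the quoted idea: run random playouts after each candidate move, keep win counts, and pick the move with the best win rate. This is flat Monte Carlo on single-heap Nim, with a toy random generator so the sketch stays dependency-free; AlphaGo's real MCTS adds a search tree, UCB-style selection, and policy/value networks on top. The game choice and all names are illustrative, not from the blog post.)

Code:
import Data.List (maximumBy)
import Data.Ord (comparing)

type Seed = Int

-- Toy linear congruential generator (a real program would use System.Random).
nextRand :: Seed -> (Int, Seed)
nextRand s = let s' = (1103515245 * s + 12345) `mod` 2147483648 in (s', s')

-- Random legal move in Nim: take 1 .. min 3 heap stones (high bits mix better).
randMove :: Int -> Seed -> (Int, Seed)
randMove heap s = let (r, s') = nextRand s
                  in (1 + (r `div` 65536) `mod` min 3 heap, s')

-- One fully random game; True iff the player whose perspective we track wins.
-- Taking the last stone wins, so an empty heap means the previous mover won.
playout :: Int -> Bool -> Seed -> (Bool, Seed)
playout 0 myTurn s = (not myTurn, s)
playout heap myTurn s =
  let (m, s') = randMove heap s
  in playout (heap - m) (not myTurn) s'

-- Estimate each move's win rate over n random playouts and pick the best,
-- exactly the "win counts guide move selection" idea from the quote above.
bestMove :: Int -> Int -> Seed -> Int
bestMove n heap s0 =
  fst (maximumBy (comparing snd) [(m, winRate m) | m <- [1 .. min 3 heap]])
  where
    winRate :: Int -> Double
    winRate m = fromIntegral (wins (heap - m)) / fromIntegral n
    wins h = go n s0 (0 :: Int)
      where
        go 0 _ acc = acc
        go k s acc = let (won, s') = playout h False s  -- opponent moves next
                     in go (k - 1) s' (if won then acc + 1 else acc)

main :: IO ()
main = print (bestMove 2000 9 42)
-- Expected output: 1 (taking one stone leaves 8, a multiple of 4,
-- which is a losing position for the opponent in this variant of Nim).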
|
|
|
|
IadixDev
Full Member
Offline
Activity: 322
Merit: 151
They're tactical
|
|
June 18, 2017, 07:52:52 AM Last edit: June 18, 2017, 08:22:08 AM by IadixDev |
|
Quote from: CoinCube on June 17, 2017, 09:21:44 PM
[...] However, if we want an adaptive machine capable of responding to unanticipated environmental changes or improving over time, then we need a component of emergence and thus Shannon entropy. [...] It can be shown that as the number of simulations grows, MCTS indeed converges to optimal play.

Yes, I saw this kind of discussion about non-deterministic algorithms. But for me, one needs to distinguish between chaotic functions, predictability, and entropy =)

I saw a good in-depth video about this kind of algorithm, but I will never find it back lol. It was digging very deep into the core philosophy of science to show how determinism like Newton's is always based on analysis of the components of a system, defining a system as the sum of its parts and seeing each part as somehow immutable and ideal, but failing to integrate emergent properties. These new kinds of non-deterministic algorithms are more holistic: they consider the information as a whole, without trying to fit it to a predetermined template or structure. It's something result/reward-driven, estimating the efficiency of the algorithm, rather than something based on a predetermined ontology and induced properties, like Newtonian physics.

In a way, this whole distinction between self-organization and entropy is very subjective, and mostly in the eye of the beholder. Maybe the whole universe is in a process of self-organization, and there is not one particle or quantum in the whole thing that is not participating in this auto-organization. But it's a bit the philosophical problem I have with Newtonian-based theories in general: they tend to see everything in terms of what is understood by the person, and it's very easy to fall into the intellectual trap of categorizing things as entropic or self-organized based on whether you understand their purpose and how they serve you.

Basically, if you can understand the purpose of something and how it serves you in a predictable manner, it's not entropic; if you don't understand it, or if it gets in the way of a predictable positive outcome, then it becomes entropy. But it's all very subjective in the end.

Chaos theories are also different from entropy, in the sense that with chaotic functions, the function is already supposed to be unpredictable to begin with, so there is not really a concept of entropy as how the function's result will deviate from the expected outcome. Even a high degree of variation due to complexity is not really to be called entropy. Entropy can actually be quite regular: if you take a wheel spinning on its axis, the entropy would be how it's not exactly spinning in a circle, but the variations will still be statistically simple.

Even if, most of the time, I guess what engineers measure as entropy in a system will mostly be emergent properties, quantum stuff, etc., it's mostly a concept that applies to linear systems, because linear systems are never accurate in physics; which can make one wonder why it's even called science to begin with, as its interest is mostly for the industrial economy. After all, you can see a tree or a child as just noise (actually children often are just this), compared to the beautiful dystopia the megalomaniacs at Goldman Sachs are trying to concoct. A parking lot is certainly 'less entropic' than a forest.
|
|
|
|
CoinCube (OP)
Legendary
Offline
Activity: 1946
Merit: 1055
|
|
June 18, 2017, 09:04:47 PM Last edit: June 19, 2017, 04:28:05 AM by CoinCube |
|
Quote from: IadixDev
But for me, one needs to distinguish between chaotic functions, predictability, and entropy... In a way, this whole distinction between self-organization and entropy is very subjective, and mostly in the eye of the beholder. Maybe the whole universe is in a process of self-organization, and there is not one particle or quantum in the whole thing that is not participating in this auto-organization.
... Chaos theories are also different from entropy, in the sense that with chaotic functions, the function is already supposed to be unpredictable to begin with, so there is not really a concept of entropy as how the function's result will deviate from the expected outcome. ... A parking lot is certainly 'less entropic' than a forest.

IadixDev, I would actually agree with your description of the universe above, but would also argue that it is incomplete, as it focuses only on self-organisation and neglects the other aspects of complexity. This is a similar objection to the one you raised against the term entropy. I take the position that the entire universe is in a process of ever increasing complexity, and there is not one particle or quantum in the whole thing that is not participating in this growing complexity.

Anonymint, the author of the essays linked in the opening post, is a self-described anarchist and focuses on emergence, entropy, and freedom. You seem to view the world more as a process of self-organisation. I believe both of these conceptions can be brought into harmony under the broader umbrella of complexity.

Complex systems exhibit four characteristics:
– Self-organization
– Non-linearity
– Order/chaos dynamic
– Emergence

Informational entropy provides a way to empirically measure emergence, but emergence is only one aspect of complexity. Self-organization can be looked at as a process that actually reduces entropy, yet it undeniably also increases complexity. Chaos in this context is an observation of system dynamics. Systems exist on a spectrum ranging from equilibrium to chaos. A system in equilibrium does not have the internal dynamics to enable it to respond to its environment and will slowly (or quickly) die. A system in chaos ceases to function as a system. A system on the edge of chaos will exhibit maximum variety and creativity, leading to new possibilities (see the sketch below). The field of complexity analysis is new and still in its infancy.

"God chose to give all the easy problems to the physicists."
—Charles Lave & James March, Introduction to Models in the Social Sciences
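(The classic toy model of that equilibrium-to-chaos spectrum is the logistic map x <- r*x*(1-x): small r gives equilibrium, intermediate r gives regular oscillation, and r = 4 gives chaos. A minimal sketch, chosen editorially as an illustration; it is not from Anonymint's essays.)

Code:
-- Logistic map x <- r*x*(1-x): one equation that walks the whole spectrum
-- from equilibrium to chaos as the parameter r increases.
logistic :: Double -> Double -> [Double]
logistic r = iterate (\x -> r * x * (1 - x))

-- Discard a transient of 1000 steps, then sample the long-run behavior.
longRun :: Double -> [Double]
longRun r = take 5 (drop 1000 (logistic r 0.2))

main :: IO ()
main = mapM_ (print . longRun) [2.8, 3.2, 4.0]
-- r = 2.8: one repeated value    (equilibrium: the dynamics die out)
-- r = 3.2: two alternating values (regular internal dynamics)
-- r = 4.0: values never repeat    (chaos: the system is unpredictable)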
|
|
|
|
OROBTC
Legendary
Offline
Activity: 2926
Merit: 1863
|
|
June 19, 2017, 08:47:44 AM |
|
... Nice quotation by Lave and March, CC. Maybe add statistics to those easy problems. Social sciences do not seem to offer easy solutions. Nor is it as easy to predict the future with all of its confounding variables and unknown Swans out there. Seems we're about due for a Swan, think I'll head over to an ATM now and pull out some dough...
|
|
|
|
IadixDev
Full Member
Offline
Activity: 322
Merit: 151
They're tactical
|
|
June 19, 2017, 10:57:30 AM Last edit: June 19, 2017, 06:20:26 PM by IadixDev |
|
Quote from: CoinCube on June 18, 2017, 09:04:47 PM
[...] I believe both of these conceptions can be brought into harmony under the broader umbrella of complexity. [...] The field of complexity analysis is new and still in its infancy.

I will take an example with Turing machines and OO programming; maybe it will be clearer what I'm talking about =) The concept of entropy is quasi nonexistent with Turing machines, and this way we know we are not talking about something mystical. And I think it can also interest shelby, because he is into this sort of problematics with language design lol.

The problem is this conception from metaphysics of organizing the world based on fundamental 'objects' with properties and 'entelechy', which is abstracted in the OO semantics of having classes of objects with properties, and 'entelechy' through the alteration of their state by their methods. So far so good, but then the problem comes when you want to program the interactions between all the different types of objects that can be present in the world; with OO programming this generally quickly becomes a design problem.

Either you add a member function in each class to program the interaction with every other class, with specialized functions for each type, but then you either have to implement the interaction twice, once in each class, or one class doesn't know or contain the interactions it can have with the other class, which is bogus from a metaphysical standpoint. Or you write a visitor class for each pair of objects, and then each time you add a new type of object, you need to add visitor classes for all the combinations the new object can interact with; but it's still bogus from a metaphysical point of view, because it means the interactions between the objects are not contained in the objects themselves, but applied from the exterior through a visitor class that visits the two objects in question.

Even programming a physics simulation like Bullet physics is not a small problem, when you need to compute the mutual gravity of two objects, for example, and that's only a simple case. And even if there is no entropy in a Turing machine, it's easy to see that if you run a complex real-time physics simulation two times, you will never have the same result at the end; but it's not really entropy, nor really a chaotic function; it's not a fractal or a strange attractor, only plain linear Newtonian physics. I think the three-body problem (Gauss) runs into this issue. This whole design of hard-typed objects makes emergent properties very hard to program and conceptualize.

These past days I've been digging more into Haskell; already, through reading shelby's discussions on the Git, I'm starting to get where they want to get at. In fact, in Haskell there is this concept of monads, which are generic base objects that can be used in generic code ('type classes') and can be specialized into pretty much anything, and the language allows you to do metaprogramming very easily based on monad interactions. This allows writing generic code that can apply to any type, and I think I saw somewhere they are doing stuff to handle emergent-property kinds of things based on type classes like this (see the sketch below).

http://www.haskellforall.com/2012/08/the-category-design-pattern.html

But it's the same principle I wanted to get at with my framework: to have a generic placeholder for holding references to meta-typed objects, with monomorphized access functions, so that the code is independent of the type of the data it manipulates, as long as the node can convert its data to the type required in the code. It allows manipulating collections of heterogeneous objects and applying generic functions to them, without having to write a specialized function for each specific combination of classes.

I'm not sure if type classes are considered a Turing-complete language before they are compiled/monomorphized to concrete types, but if they are, it could allow programming things based on runtime dynamic data and the emergent properties between them, and it would probably lead to unpredictable results in the end, even if it's not chaotic functions or entropy, but something in the realm between Turing undecidability and the mutual-interaction problem in physics.

http://number-none.com/product/Predicate%20Logic/index.html
http://number-none.com/product/My%20Friend,%20the%20Covariance%20Body/
https://books.google.fr/books?id=HBZADQAAQBAJ&pg=PA103&lpg=PA103&dq=haskell+monad+emergent+property&source=bl&ots=Rk20Gl6GEy&sig=iJyXEi8DbY4LuTgFjmFNgeOxpgo&hl=en&sa=X&ved=0ahUKEwi_r9iIusrUAhWMKMAKHbThAEcQ6AEIKjAF#v=onepage&q&f=false
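(A compressed illustration of the contrast described above: with a Haskell type class, any type that exposes a small interface, here a hypothetical mass and position, participates in one generic interaction function, so adding a new kind of body means writing one instance rather than a visitor per pair of types. All names are made up for the example.)

Code:
-- A type class as a generic "placeholder": any body that can report mass and
-- position participates in the same generic interaction code, with no visitor
-- class per pair of concrete types. Names here are illustrative only.
class Body a where
  mass     :: a -> Double
  position :: a -> (Double, Double)

data Planet   = Planet   Double (Double, Double)
data Asteroid = Asteroid Double (Double, Double)

instance Body Planet where
  mass     (Planet m _)   = m
  position (Planet _ p)   = p

instance Body Asteroid where
  mass     (Asteroid m _) = m
  position (Asteroid _ p) = p

-- One generic function covers every pair of body types, present or future:
-- Newtonian gravitational attraction magnitude (G taken as 1 for brevity,
-- with a clamp on the squared distance to avoid division by zero).
gravity :: (Body a, Body b) => a -> b -> Double
gravity x y = mass x * mass y / max 1e-9 (dx * dx + dy * dy)
  where
    (x1, y1) = position x
    (x2, y2) = position y
    (dx, dy) = (x2 - x1, y2 - y1)

main :: IO ()
main = do
  print (gravity (Planet 100 (0, 0)) (Asteroid 2 (3, 4)))  -- 100*2 / 25 = 8.0
  print (gravity (Planet 100 (0, 0)) (Planet 50 (0, 5)))   -- 100*50 / 25 = 200.0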
|
|
|
|
CoinCube (OP)
Legendary
Offline
Activity: 1946
Merit: 1055
|
|
June 20, 2017, 08:10:05 PM Last edit: June 20, 2017, 09:08:57 PM by CoinCube |
|
Quote from: IadixDev on June 19, 2017, 10:57:30 AM
I will take an example with Turing machines and OO programming; maybe it will be clearer what I'm talking about =) The concept of entropy is quasi nonexistent with Turing machines, and this way we know we are not talking about something mystical. [...] Or you write a visitor class for each pair of objects, and then each time you add a new type of object, you need to add visitor classes for all the combinations the new object can interact with; but it's still bogus from a metaphysical point of view, because it means the interactions between the objects are not contained in the objects themselves, but applied from the exterior through a visitor class that visits the two objects in question. [...] This whole design of hard-typed objects makes emergent properties very hard to program and conceptualize. [...]

I would agree that in Turing machines the concept of entropy is quasi nonexistent. Most of the time it is entirely absent.

Turing machines: https://en.wikipedia.org/wiki/Turing_machine

In his 1948 essay, "Intelligent Machinery", Turing wrote that his machine consisted of:

"...an unlimited memory capacity obtained in the form of an infinite tape marked out into squares, on each of which a symbol could be printed. At any moment there is one symbol in the machine; it is called the scanned symbol. The machine can alter the scanned symbol, and its behavior is in part determined by that symbol, but the symbols on the tape elsewhere do not affect the behavior of the machine. However, the tape can be moved back and forth through the machine, this being one of the elementary operations of the machine. Any symbol on the tape may therefore eventually have an innings." (Turing 1948, p. 3)

The key phrase ("the symbols on the tape elsewhere do not affect the behavior of the machine") is the reason for both the lack of emergence and, subsequently, the lack of conceptual entropy in Turing machines. In a standard Turing machine, the symbols on the tape do not ultimately change the nature of the machine (even if those symbols have been previously read). This is because the typical Turing machine draws from a finite table of instructions which is ultimately fixed and invariant. Thus a Turing machine with a fixed and finite table is a simple system, regardless of how complex and long that table may be, unless you allow the table of instructions to be dynamically and permanently altered based on the tape readings (a minimal simulator below makes this fixed-table property concrete).

As programming languages have a fixed set of basic code, they are simple Turing machines. However, computer programming language in general is something more, and represents a complex system. The programmers using the languages are the equivalent of a tape that applies dynamic updates to the instruction table. Thus over time we have seen the progression from assembly language to C++, as discussed in your links above.

I am not going to be helpful in a technical discussion of how to add emergence to a programmed system, as I am not a programmer, but I will address one of your points. You appear to be arguing (in the section quoted above about visitor classes) that if the interactions between objects are not contained in the objects themselves, but require an external observer/visitor state, then the system is not valid from a metaphysical point of view. If I understand you correctly, you are arguing that a programmed system must be complete to be metaphysically valid. Completeness is never possible. For a discussion on this point I would refer you to an excellent write-up by Perry Marshall: The Limits of Science and Programming

"Without mathematics we cannot penetrate deeply into philosophy. Without philosophy we cannot penetrate deeply into mathematics. Without both we cannot penetrate deeply into anything."
-Leibniz
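(To make the "fixed and finite table of instructions" concrete: below is a minimal Turing machine simulator in which the transition table is an ordinary immutable value, so nothing written on the tape can ever alter the table itself, which is exactly the property described above. The example machine, a unary incrementer, is a toy chosen for illustration.)

Code:
import qualified Data.Map as M

-- Minimal Turing machine: the instruction table is a fixed, immutable Map.
-- The tape can change, but nothing on it can modify the table.
type State  = String
type Symbol = Char
data Move   = L | R
type Table  = M.Map (State, Symbol) (State, Symbol, Move)

-- Tape as (reversed left part, scanned symbol, right part); blanks are '_'.
type Tape = ([Symbol], Symbol, [Symbol])

step :: Table -> State -> Tape -> Maybe (State, Tape)
step table q (ls, c, rs) = do
  (q', c', mv) <- M.lookup (q, c) table   -- consult the fixed table; no entry = halt
  pure (q', shift mv (ls, c', rs))
  where
    shift R (l, x, r) = (x : l, headOr r, tailOr r)
    shift L (l, x, r) = (tailOr l, headOr l, x : r)
    headOr xs = case xs of { (y:_)  -> y;  [] -> '_' }
    tailOr xs = case xs of { (_:ys) -> ys; [] -> [] }

run :: Table -> State -> Tape -> Tape
run table q t = maybe t (uncurry (run table)) (step table q t)

-- Toy machine: move right over a block of 1s and append one more 1 (unary +1).
incr :: Table
incr = M.fromList
  [ (("scan", '1'), ("scan", '1', R))     -- skip existing 1s
  , (("scan", '_'), ("done", '1', R))     -- write the extra 1, then halt
  ]                                        -- no entry for "done": machine halts

main :: IO ()
main = print (run incr "scan" ([], '1', "11"))
-- The halted tape's left part holds "1111": three 1s became four.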
|
|
|
|
|