Non-Linear Evolution of Intelligence--
Dependence Upon Entropy
(2) The non-linear evolution of intelligence toward increasingly complex states bears a direct mathematical relationship to, and interdependence with, the entropic process of matter-energy. This holds when the process is viewed as a universe-wide phenomenon; it does not hold for each separate open system contained within the universe, where the relationship is instead inverse or reciprocal.
The law of entropy, although originally restricted to thermodynamics, has been more widely applied, so that it is now regarded as the most general regulator of natural activity known to science.1 As stated by A.S. Eddington, "The law that entropy always increases--the second law of thermodynamics--holds, I think, the supreme position among the laws of nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations--then so much the worse for Maxwell's equations. If it is found to be contradicted by observation--well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics, I can give you no hope; there is nothing for it but to collapse in deepest humiliation."2 Therefore, the construction of a theory of intelligence around the process of entropy would seem to be a valid place to begin, even though some difficulties do arise.
Our understanding of the law of entropy no longer seems as clear-cut as Eddington suggested. Apparent contradictions and paradoxes result from our incomplete understanding of the definition of entropy. The correlation of existing concepts is tenuous and will, in all probability, require modification in time.
At the moment of creation, currently considered best modeled by the "big bang," entropy was supposedly at its minimum. It has been increasing ever since and should continue to do so hereafter.
One contradiction arises from the assumption that entropy decreases within a system whenever information increases. This presents no problem so long as the system is a local open system, since even though the local system is decreasing in entropy, it does so at the expense of an overall increase in entropy within the surrounding environment.
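This open-system bookkeeping can be written as the standard thermodynamic inequality (a sketch of the usual textbook formulation, not notation the author supplies):

```latex
\Delta S_{\text{total}} \;=\; \Delta S_{\text{system}} + \Delta S_{\text{environment}} \;\ge\; 0,
```

so a local decrease, $\Delta S_{\text{system}} < 0$, is permitted only when the environment compensates with $\Delta S_{\text{environment}} \ge -\Delta S_{\text{system}} > 0$. The second law constrains the total, not each part.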
So where is the contradiction? Well, even though in all open local systems there is a decrease in entropy when there is an increase in information, the opposite would seem to hold true when viewing the universe as a whole. On a universal level there would appear to be increasing entropy at the same time that there is increasing information. This would seem possible if we make the assumption that neither energy nor information can be created or destroyed. What we are witnessing on a universal level is a redistribution of both, with information concentrating or focusing into all local systems as energy is being continuously dissipated or distributed into an ever-expanding space.
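The essay does not name a formal measure, but the contrast between "focused" and "dissipated" states can be illustrated with Shannon's entropy. This is an assumption on my part, since thermodynamic and informational entropy are related but not identical; the sketch only shows that concentrating probability into one state lowers the entropy of a distribution, while spreading it evenly maximizes it.

```python
import math

def shannon_entropy(probs):
    # Shannon entropy in bits of a discrete probability distribution.
    # Zero-probability outcomes contribute nothing, so they are skipped.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# "Dissipated" state: probability spread evenly over four outcomes
# gives the maximal entropy for a four-state system.
uniform = [0.25, 0.25, 0.25, 0.25]

# "Focused" state: probability concentrated on one outcome gives
# a much lower entropy for the same number of outcomes.
peaked = [0.97, 0.01, 0.01, 0.01]

print(shannon_entropy(uniform))  # 2.0 bits for four equally likely states
print(shannon_entropy(peaked))   # well under 1 bit
```

In the limiting case, a perfectly ordered distribution with all probability on a single outcome has zero entropy, which is the information-theoretic analogue of the "minimum entropy" starting point described above.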