Information is a property of all physical systems and is a measure of how highly organized they are. The information content of any system is the minimum number of bits required to encode a complete statistical description of the system. According to Shannon, the concepts of entropy and information are closely related by a conservation law, which states that the sum of the information and entropy is constant and equal to a system's maximum attainable information or entropy under a given set of conditions.
H + I = constant = Hmax = Imax
(applies to open systems only)

H = thermodynamic entropy
I = information content
Hmax = the system's maximum attainable entropy
Imax = the system's maximum attainable information
Any gain of information is compensated for by an equal loss of entropy (both heat and information).17 This applies to any open system, whether it be a star, a planet, or a living organism. As information content increases, entropy decreases locally within the system. When the universe is viewed as a whole, however, both information content and heat entropy are increasing, as would be expected in a closed system.
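The conservation law above can be sketched with a toy model. The model itself is an illustrative assumption, not from the text: a system of n two-state components, each holding one bit, so that Hmax = Imax = n bits, and every bit whose state becomes known moves from the entropy column to the information column.

```python
# A minimal sketch of H + I = Hmax for an assumed toy system of N binary cells.
# Each cell holds one bit; Hmax = Imax = N bits.
N = 8

def entropy_and_information(known_bits, n=N):
    """Split the system's Hmax bits into entropy (unknown) and information (known)."""
    i = known_bits      # I: bits whose state has been determined
    h = n - known_bits  # H: bits still statistically undetermined
    return h, i

# As information is gained, entropy falls by the same amount; the sum is fixed.
for k in range(N + 1):
    h, i = entropy_and_information(k)
    assert h + i == N  # H + I = constant = Hmax
```

In this toy picture a "gain of information" is simply learning the state of more cells, and the compensating "loss of entropy" is automatic, since the two quantities partition a fixed total.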
Shannon demonstrated that the logarithm of a number to base two determines the measure of information. Earlier, Boltzmann had found that this same logarithmic function determines thermodynamic entropy. For this reason, one can conclude that a gain in information corresponds mathematically to a loss of entropy in any open system.18 A corresponding increase in entropy can then be accorded to the surrounding universe if it is considered a closed system, so that both information content and entropy increase simultaneously throughout a closed universe.
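Shannon's log-base-two measure can be illustrated in a few lines of Python; the function name and example distributions below are assumptions for illustration, not from the text. It is the same logarithmic form Boltzmann used for thermodynamic entropy (S = k ln W), differing only in base and units.

```python
import math

def shannon_entropy_bits(probs):
    """Shannon entropy: the sum of -p * log2(p) over outcomes with p > 0, in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy_bits([0.5, 0.5]))   # 1.0  (a fair coin: one bit of uncertainty)
print(shannon_entropy_bits([1.0]))        # 0.0  (a certain outcome: no uncertainty)
print(shannon_entropy_bits([0.25] * 4))   # 2.0  (four equal outcomes: two bits)
```

Learning the outcome of the fair coin removes one bit of entropy and yields one bit of information, which is the correspondence the paragraph above describes.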
Norbert Wiener, in his work on cybernetics at MIT, also recognized the relationship of entropy to information, even stating that "information is entropy."19 I interpret this to mean that there is a net gain of information any time energy is dissipated through the entropic process. As energy dissipates, matter congeals, and a higher level of order containing greater information is achieved.
Since systems undergoing entropic decay do so exponentially, and since the accumulation of information is directly tied to the entropic process, it would seem reasonable to suggest that the evolutionary gain of information has likewise occurred exponentially. It might also follow, then, that the evolution of intelligence, which has undoubtedly occurred non-linearly, has occurred exponentially as well, since the level of intelligence of any organism is directly proportional to its information content.
The universe as a whole, which may be a closed system, was at its low point of heat entropy at the instant of its creation and has been continually gaining in entropy ever since. It is dissipating its available energy into a constantly increasing volume of space.
Every object within the universe, organic or inorganic, is an open system and, during its life cycle or period of existence, is in a state of negative heat and information entropy (syntropy). It becomes more complex and contains ever greater amounts of information. Each object (subsystem) locally decreases its entropy at the expense of an entropy increase in its surrounding environment. There would appear always to be a net increase in entropy throughout the universe, assuming that it is truly closed.