Shannon's formula, H + I = constant, would appear to apply to any open system. As a structure becomes more complex and contains a greater amount of information, it also falls to a lower state of both heat and information entropy. The greater the amount of stored information an organism has, the lower its state of entropy. The formula does not appear to apply to a closed system such as the universe, which is increasing in both heat entropy and information, unless of course the universe is not closed. There is in fact a question as to whether the universe is open or closed. If there are other universes beyond our own, then energy exchange might be taking place between them. It has been suggested that a black hole in one universe might be a white hole for an adjacent universe.
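Read as a conservation relation, the formula makes the trade-off explicit; the step below is an illustrative reading of the algebra rather than a derivation given in the cited sources:

\[ H + I = \text{constant} \;\Longrightarrow\; \Delta I = -\Delta H \]

That is, any increase in a system's stored information I must be matched by an equal decrease in its entropy H, and vice versa.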
Intelligence is the only property within the universe that is evolving in the direction of increasing complexity in opposition to entropy. Boltzmann came close to recognizing this concept in 1894 when he stated that entropy was related to "missing information." He expressed the idea mathematically as S = k log W, where S is entropy, k is Boltzmann's universal constant, and W is the number of ways in which the parts of a system can be arranged when they are so thoroughly randomized that there is no reason to expect the system to favor one particular arrangement over any of the great number of other possible arrangements.20
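To see why Boltzmann's W can be read in terms of "missing information," consider the limiting case; this is a standard illustration, not one given in the source cited above:

\[ W = 1 \;\Longrightarrow\; S = k \log 1 = 0 \]

A system with only one possible arrangement of its parts has zero entropy, since nothing about its arrangement is unknown. As the number of equally likely arrangements W grows, so does the entropy, and so does the amount of information that would be needed to specify which arrangement the system is actually in.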
Forces of chance and of anti-chance coexist in a complementary reciprocal relationship. The random element is entropy, the agent of chaos, which tends to destroy meaning. The non-random element is intelligence, which exploits the uncertainty inherent in the entropy principle so as to generate new structures, and to inform the world in new and more creative ways.21
Shannon came a step closer to recognizing the universal property of intelligence when he first presented his information theory. He indicated that sense and order could prevail against nonsense and chaos, and that the world could advance in the direction of greater information content and more complex structures, both physical and mental. He suggested that order was entirely natural. He proved that information in the form of a message could persist in the midst of haphazard disorder or noise. He also gave the first precise scientific measure of information, and showed that the equation for the amount of information present in a system had the same form as the equation devised for the entropy principle. Shannon's entropy equation suggested a compelling analogy between entropy and information. Information theory implies that as structures, living and non-living, become more complex, they also gain in information.22
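Shannon's measure, in its standard form, shows the analogy directly; the equation is not reproduced in the passage above, so it is supplied here for reference:

\[ H = -\sum_{i} p_i \log_2 p_i \]

where p_i is the probability of the i-th symbol in a message. Like Boltzmann's S = k log W, it is a logarithmic measure over the possible states of a system, which is why the two expressions take the same mathematical form.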
In 1972, Lila Gatlin applied information theory to living systems. She defined "life...as an information processing system--a structural hierarchy of functioning units--that has acquired through evolution the ability to store and process the information necessary for its own accurate reproduction."23