the third entity as substance (also called nature or God), while Merleau-Ponty referred to it as flesh [290]. My own preference is the triadic school, because this choice is supported by the following facts:

- The units of energy (e.g., kcal/mol) and information (e.g., bits) are different (see the illustrative comparison after this list).
- The energy of the Universe is constant (the first law of thermodynamics), but the information content of the Universe need not be constant (for example, extinctions can occur unpredictably, thereby reducing the information content of the biosphere and hence of the Universe).
- Energy is represented as a vector field, while information is associated with the scalar field of the cosmic plenum [422].
- In molecular machines in action (see Section 3.4), energy and information are indistinguishably intertwined into one entity called conformons (i.e., sequence-specific conformational strains of polymers) [65]. Conformons can be viewed as specific instantiations of gnergy in the living cell. The conformon concept was supported by the results of the statistical mechanical analysis of supercoiled DNA double helices in bacteria. These results are consistent with the fact that gene expression requires the storage of mechanical energy at sequence-specific sites within DNA duplexes; such sequence-specific DNA deformations are called SIDDs (stress-induced DNA duplex destabilizations) [79, 80, 226], which are synonymous with conformons [25, pp. 240–3].
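To make the first point above concrete, the following juxtaposition (a standard textbook comparison, not part of the original text) shows that molecular energies are dimensional quantities, whereas information, as quantified by Shannon's measure discussed in the next section, is a dimensionless count of bits:

\[
E \sim 1~\mathrm{kcal/mol} \approx 4.184~\mathrm{kJ/mol},
\qquad
H = -\sum_{i} p_{i}\log_{2} p_{i} \quad \text{(bits)}.
\]

For a fair coin, for instance, \(H = -\left(\tfrac{1}{2}\log_{2}\tfrac{1}{2} + \tfrac{1}{2}\log_{2}\tfrac{1}{2}\right) = 1\) bit, a pure number carrying no energy dimension, which underscores why the two quantities cannot be reduced to one another by a mere change of units.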
10.14 A “Philosophical Table” for Classifying Information, Entropy, and Energy
The concept of entropy was introduced by R. Clausius (1822–1888) in thermodynamics as a state function of a thermodynamic system, namely, a number that characterizes the physical state of the system independent of its past history [59].
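The state-function property can be stated compactly as follows (a standard textbook formulation offered here for reference, not an equation from the original text):

\[
\Delta S = S_{B} - S_{A} = \int_{A}^{B} \frac{\delta Q_{\mathrm{rev}}}{T},
\]

where the integral is evaluated along any reversible path connecting states \(A\) and \(B\); because the result depends only on the endpoints and not on the path taken, \(S\) qualifies as a state function in Clausius's sense.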
This thermodynamic concept was then extended to the field of information theory by C. Shannon (1916–2001) in 1948, who, at the (somewhat haphazard) suggestion of von Neumann (1903–1957), named his quantitative measure of information “entropy” (thus