18 The Cell Language Theory: Connecting Mind and Matter
= −log₂ (1/6)
= log₂ 6 ≈ 2.6 bits. (2.8)
The meaning of the result in Eq. (2.8) is that the amount of uncertainty we have about which of the six numbers will show up upon one throw of a die is 2.6 bits. In other words, it will take 2.6 bits of information to correctly predict which of the six numbers will appear upon throwing a fair die.
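The fair-die calculation in Eq. (2.8) can be checked numerically. The sketch below is illustrative and not from the text; the function name `shannon_entropy` is an assumption:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair die: six equally likely outcomes, p = 1/6 each.
H_fair = shannon_entropy([1/6] * 6)
print(round(H_fair, 3))  # log2(6) = 2.585 bits, rounded to 2.6 in Eq. (2.8)
```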
(2) The case of a “loaded” die. Let us assume that the die has been tampered with so that the six numbers no longer show up with equal probability, as shown in Table 2.2.
Therefore, the result in Table 2.2 indicates that the average uncertainty about which number will appear is less when the die is loaded than when it is fair (2.42 bits vs. 2.6 bits).
2.2.3 Planckian Information (I_Pl)
A new measure of information, denoted I_Pl, was introduced in 2015 [26] following the discovery of the universality of the Planckian distribution equation (PDE) [27], discussed in Section 8.1. One interesting difference between Shannon information, I_Sh, and I_Pl, defined in Section 8.5.1, is that I_Pl is suggested to be to organized complexity (studied in the biomedical sciences) what I_Sh is to Weaver’s disorganized complexity (studied in statistical mechanics) [366].
Table 2.2 Calculating the Shannon entropy, H, of a loaded die.

i-th possibility   p_i     p_i log₂ (1/p_i)                    H (bits)
1                  0.15    0.15 × log₂ (6.67) = 0.15 × 2.75    0.41
2                  0.25    0.25 × log₂ (4) = 0.25 × 2          0.50
3                  0.11    0.11 × log₂ (9.09) = 0.11 × 3.18    0.35
4                  0.25    0.25 × log₂ (4) = 0.25 × 2          0.50
5                  0.10    0.10 × log₂ (10) = 0.10 × 3.33      0.33
6                  0.10    0.10 × log₂ (10) = 0.10 × 3.33      0.33
                   Σp_i = 1.00                                 −Σp_i log₂ p_i = 2.42

Note: The Shannon entropy of a fair die is calculated to be 2.6 bits; see Eq. (2.8).