Computing Entropy

As mentioned in lectures, entropy is a measure of how much information content (“surprise”) is present in a system.

Given a set of $N$ symbols and the probability of each symbol occurring, we can compute the entropy (in bits) as:

$$ S = - \sum^{N}_{i=1}p_i \times \log_2 p_i $$

where $p_i$ is the probability of encountering a given symbol. A worked example of computing entropy was given in lectures. For another example, consider a system with five symbols $A$, $B$, $C$, $D$, and $E$, occurring with probabilities $0.2$, $0.1$, $0.3$, $0.3$, and $0.1$ respectively. The entropy of this system is $2.171$ bits; the computation is outlined in the following table:

| Symbol | $i$ | $p_i$ | $\log_2 p_i$ | $-p_i \times \log_2 p_i$ |
|--------|-----|-------|--------------|--------------------------|
| A      | 1   | 0.2   | -2.322       | 0.464                    |
| B      | 2   | 0.1   | -3.322       | 0.332                    |
| C      | 3   | 0.3   | -1.737       | 0.521                    |
| D      | 4   | 0.3   | -1.737       | 0.521                    |
| E      | 5   | 0.1   | -3.322       | 0.332                    |
|        |     |       |              | $S = 2.171$              |
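
If you want to check the arithmetic, the sum can also be scripted. The snippet below is a minimal Python sketch (the `entropy` helper is illustrative, not part of the lab materials); it reproduces the $2.171$ bits from the table above. You can reuse the same function to check your answer to the exercise below.

```python
import math

def entropy(probabilities):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    # Terms with p = 0 contribute nothing, so skip them to avoid log2(0).
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Probabilities from the worked example above (symbols A to E).
print(round(entropy([0.2, 0.1, 0.3, 0.3, 0.1]), 3))  # 2.171
```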

Exercise

Use the table below to help you compute the entropy of a system with five symbols (A, B, C, D, E) with probabilities $0.0625$, $0.25$, $0.5$, $0.0625$, and $0.125$ respectively:

Note: There is an editable worksheet document on Blackboard you can use for your workings in this lab.

| Symbol | $i$ | $p_i$  | $\log_2 p_i$ | $-p_i \times \log_2 p_i$ |
|--------|-----|--------|--------------|--------------------------|
| A      | 1   | 0.0625 |              |                          |
| B      | 2   | 0.25   |              |                          |
| C      | 3   | 0.5    |              |                          |
| D      | 4   | 0.0625 |              |                          |
| E      | 5   | 0.125  |              |                          |
|        |     |        |              | $S =$ ?                  |