## Computing Entropy

As mentioned in lectures, entropy is a measure of how much information content (“surprise”) is present in a system.

Given a set of $N$ symbols and the probability of each symbol occurring, we can compute the entropy (in bits) as:

$$H = -\sum_{i=1}^{N} p_i \log_2 p_i$$

where $p_i$ is the probability of encountering symbol $i$. A worked example of computing entropy was given in lectures. For another example, consider a system with five symbols, A, B, C, D, and E, occurring with probabilities 0.2, 0.1, 0.3, 0.3, and 0.1 respectively. The entropy of this system is 2.171 bits, the computation of which is outlined in the following table:

| Symbol | $i$ | $p_i$ | $\log_2 p_i$ | $-p_i \times \log_2 p_i$ |
|--------|-----|-------|--------------|--------------------------|
| A      | 1   | 0.2   | -2.322       | 0.464                    |
| B      | 2   | 0.1   | -3.322       | 0.332                    |
| C      | 3   | 0.3   | -1.737       | 0.521                    |
| D      | 4   | 0.3   | -1.737       | 0.521                    |
| E      | 5   | 0.1   | -3.322       | 0.332                    |
|        |     |       | $H =$        | 2.171                    |
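
As a cross-check, the formula is straightforward to evaluate programmatically. The following minimal Python sketch (the function and variable names are our own, not part of the lab materials) reproduces the worked example above:

```python
import math

def entropy(probabilities):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    # Terms with p = 0 contribute nothing, so they are skipped to avoid log2(0)
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Probabilities from the worked example above
print(f"H = {entropy([0.2, 0.1, 0.3, 0.3, 0.1]):.3f} bits")  # H = 2.171 bits
```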
### Exercise
Use the table below to help you compute the entropy of a system with five symbols (A, B, C, D, E) occurring with probabilities 0.0625, 0.25, 0.5, 0.0625, and 0.125 respectively:
Note: there is an editable worksheet document on Blackboard that you can use for your workings in this lab.
| Symbol | $i$ | $p_i$  | $\log_2 p_i$ | $-p_i \times \log_2 p_i$ |
|--------|-----|--------|--------------|--------------------------|
| A      | 1   | 0.0625 |              |                          |
| B      | 2   | 0.25   |              |                          |
| C      | 3   | 0.5    |              |                          |
| D      | 4   | 0.0625 |              |                          |
| E      | 5   | 0.125  |              |                          |
|        |     |        | $H =$        | ?                        |
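
Once you have filled in the table by hand, you can verify your total with the same approach as before (again a sketch, assuming the entropy formula given above):

```python
import math

# The exercise's probabilities; compare the printed value with your table total
probs = [0.0625, 0.25, 0.5, 0.0625, 0.125]
H = -sum(p * math.log2(p) for p in probs)
print(f"H = {H:.3f} bits")
```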