As mentioned in lectures, entropy is a measure of how much information content (“surprise”) is present in a system.
Given a set of N symbols, and the probability of each symbol occurring, we can compute the entropy (in bits) as:
{{/Labs/01/Images/entropy.svg}}
where p_i is the probability of encountering the i-th symbol. A worked example of computing entropy was given in lectures; the first table below works through a complete example for a five-symbol system with probabilities 0.2, 0.1, 0.3, 0.3, and 0.1, giving an entropy of 2.171 bits (a short code sketch for checking such computations follows that table). For another example, consider a system with five symbols A, B, C, D, and E, occurring with probabilities 0.0625, 0.25, 0.5, 0.0625, and 0.125. The entropy of this system is 1.875 bits; verify this by completing the second table below in the same way:
Symbol | i | p_i | log2(p_i) | -p_i × log2(p_i) |
---|---|---|---|---|
A | 1 | 0.2 | -2.322 | 0.464 |
B | 2 | 0.1 | -3.322 | 0.332 |
C | 3 | 0.3 | -1.737 | 0.521 |
D | 4 | 0.3 | -1.737 | 0.521 |
E | 5 | 0.1 | -3.322 | 0.332 |
Entropy (sum) | | | | 2.171 |
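
If you would like to check a hand computation, the short Python sketch below (not part of the lab code; the function name `entropy` is just illustrative) applies the same formula and reproduces the total from the table above.

```python
import math

def entropy(probabilities):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Probabilities from the worked-example table above (symbols A to E).
probs = [0.2, 0.1, 0.3, 0.3, 0.1]
print(f"Entropy = {entropy(probs):.3f} bits")  # Entropy = 2.171 bits
```

The `if p > 0` guard skips zero-probability symbols, whose contribution p × log2(p) is conventionally taken to be 0.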
Symbol | i | p_i | log2(p_i) | -p_i × log2(p_i) |
---|---|---|---|---|
A | 1 | 0.0625 | | |
B | 2 | 0.25 | | |
C | 3 | 0.5 | | |
D | 4 | 0.0625 | | |
E | 5 | 0.125 | | |
Entropy (sum) | | | | ? |
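
Once you have filled in the table by hand, you can check your total against the value stated earlier (1.875 bits) with a one-line version of the same computation (a self-contained sketch, independent of the one above):

```python
import math

# Probabilities of symbols A to E from the exercise table above.
probs = [0.0625, 0.25, 0.5, 0.0625, 0.125]
print(f"Entropy = {-sum(p * math.log2(p) for p in probs):.3f} bits")  # Entropy = 1.875 bits
```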