As mentioned in lectures, entropy is a measure of how much information content ("surprise") is present in a system. Given a set of N symbols, and the probability of each symbol occurring, we can compute the entropy (in bits) as:

{{/Labs/01/Images/entropy.svg}}

where pi is the probability of encountering a given symbol. A worked example of computing entropy was given in lectures. For another example, consider a system with five symbols (A, B, C, D, and E) occurring with probabilities 0.2, 0.1, 0.3, 0.3, and 0.1 respectively. The entropy of this system is 2.171 bits; the computation is outlined in the following table:
| Symbol | i | pi  | log2 pi | -pi x log2 pi |
|--------|---|-----|---------|---------------|
| A      | 1 | 0.2 | -2.322  | 0.464         |
| B      | 2 | 0.1 | -3.322  | 0.332         |
| C      | 3 | 0.3 | -1.737  | 0.521         |
| D      | 4 | 0.3 | -1.737  | 0.521         |
| E      | 5 | 0.1 | -3.322  | 0.332         |
|        |   |     | Sum =   | 2.171         |
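The sum in the table can be checked with a few lines of code. The following Python sketch (an illustration for checking your working, not part of the original lab materials) computes the entropy of the worked example directly from the formula:

```python
import math

# Probabilities of the five symbols A..E from the worked example above.
probabilities = [0.2, 0.1, 0.3, 0.3, 0.1]

# Entropy H = -sum(p_i * log2(p_i)) over all symbols.
entropy = -sum(p * math.log2(p) for p in probabilities)

print(round(entropy, 3))  # 2.171
```

The same snippet can be reused for the exercise below by swapping in the exercise's probabilities.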
### Exercise

Use the table below to help you compute the entropy of a system with five symbols (A, B, C, D, E) with the probabilities 0.0625, 0.25, 0.5, 0.0625, and 0.125 (respectively).

Note: There is an editable worksheet doc on Blackboard you can use for your workings in this lab.
| Symbol | i | pi     | log2 pi | -pi x log2 pi |
|--------|---|--------|---------|---------------|
| A      | 1 | 0.0625 |         |               |
| B      | 2 | 0.25   |         |               |
| C      | 3 | 0.5    |         |               |
| D      | 4 | 0.0625 |         |               |
| E      | 5 | 0.125  |         |               |
|        |   |        | Sum =   | ?             |