Entropy
Entropy is a measure of the average uncertainty about the value of a random variable before it is observed. Entropy is measured in bits.
An entropy of H bits means that, in order to provide information about the value of the as-yet-unobserved random variable, an H-bit message will be required on average.
One way to explain the meaning of the H-bit message is by the following game played between person A and person B. Person A samples at random a value v of the random variable X. Person B knows the probability distribution of the random variable X but does not know the sampled value v, and must determine v by asking person A questions that can be answered yes or no. On average, person B needs about H(X) well-chosen questions to identify v.
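As a simple illustration of the game (an example not drawn from the text above): if X is equally likely to take any one of four values, each value has probability 1/4, so H(X) = 2 bits, and person B can always identify v with two well-chosen yes/no questions.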
If P denotes the probability function of a discrete random variable X taking possible values {x_1, ..., x_N}, then the entropy H(X) of X is minus the expected value of the base-2 logarithm of P(X):
H(X) = -\mathbb{E}\left[\log_2 P(X)\right] = -\sum_{n=1}^{N} P(x_n)\,\log_2 P(x_n).
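As a quick check of this formula, the following minimal Python sketch evaluates H(X) for a few example distributions; the function name entropy and the example distributions are chosen only for illustration and are not part of the text above.

    import math

    def entropy(probs):
        """Shannon entropy in bits of a discrete distribution given as a list of probabilities."""
        # Terms with zero probability contribute nothing (0 * log 0 is taken to be 0).
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin: two equally likely outcomes -> 1 bit of uncertainty.
    print(entropy([0.5, 0.5]))                  # 1.0

    # A uniform choice among four values -> 2 bits,
    # matching the two yes/no questions needed in the guessing game.
    print(entropy([0.25, 0.25, 0.25, 0.25]))    # 2.0

    # A biased coin is more predictable, so its entropy is below 1 bit.
    print(entropy([0.9, 0.1]))                  # about 0.469

Note that the more predictable the variable, the lower the entropy: the biased coin requires less than one bit of information on average.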