2.2 Entropy of functions. Let $X$ be a random variable taking on a finite number of values. What is the (general) inequality relationship of $H(X)$ and $H(Y)$ if
(a) $Y = 2^X$?
(b) $Y = \cos X$?
Solution: Let $y = g(x)$. Then
$$p(y) = \sum_{x:\, g(x) = y} p(x).$$
Consider any set of $x$'s that map onto a single $y$. For this set
$$\sum_{x:\, g(x) = y} p(x) \log p(x) \le \sum_{x:\, g(x) = y} p(x) \log p(y) = p(y) \log p(y),$$
since $\log$ is a monotone increasing function and $p(x) \le \sum_{x:\, g(x) = y} p(x) = p(y)$. Extending this argument to the entire range of $X$ (and $Y$), we obtain
$$H(X) = -\sum_x p(x) \log p(x) = -\sum_y \sum_{x:\, g(x) = y} p(x) \log p(x) \ge -\sum_y p(y) \log p(y) = H(Y),$$
with equality iff $g$ is one-to-one with probability one.
(a) $Y = 2^X$ is one-to-one, and hence the entropy, which is just a function of the probabilities (and not of the values of the random variable), does not change, i.e., $H(X) = H(Y)$.
(b) $Y = \cos X$ is not necessarily one-to-one. Hence all that we can say is that $H(X) \ge H(Y)$, with equality if cosine is one-to-one on the range of $X$.
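As a quick numerical check (a minimal sketch, not part of the original solution; the choice of distribution is an arbitrary assumption), one can verify that a one-to-one map preserves entropy while cosine can strictly reduce it:

```python
from collections import defaultdict
from math import cos, log2, pi

def entropy(pmf):
    """Entropy in bits of a pmf given as {value: probability}."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def pushforward(pmf, g):
    """pmf of Y = g(X): p(y) is the sum of p(x) over all x with g(x) = y."""
    py = defaultdict(float)
    for x, p in pmf.items():
        py[g(x)] += p
    return dict(py)

# X uniform on {0, pi, 2*pi, 3*pi}; cosine is two-to-one on this range.
px = {0.0: 0.25, pi: 0.25, 2 * pi: 0.25, 3 * pi: 0.25}

print(entropy(px))                                           # H(X)     = 2.0 bits
print(entropy(pushforward(px, lambda x: 2 ** x)))            # H(2^X)   = 2.0 bits (one-to-one)
print(entropy(pushforward(px, lambda x: round(cos(x), 9))))  # H(cos X) = 1.0 bit
```

(The rounding in the last line just guards against floating-point noise when cosine collapses values.)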
2.16 Example of joint entropy. Let $p(x, y)$ be given by
$$\begin{array}{c|cc}
p(x, y) & y = 0 & y = 1 \\ \hline
x = 0 & 1/3 & 1/3 \\
x = 1 & 0 & 1/3
\end{array}$$
Find
(a) $H(X)$, $H(Y)$.
(b) $H(X|Y)$, $H(Y|X)$.
(c) $H(X, Y)$.
(d) $H(Y) - H(Y|X)$.
(e) $I(X; Y)$.
(f) Draw a Venn diagram for the quantities in (a) through (e).
Solution:
Fig. 1 Venn diagram
(a) $H(X) = \frac{2}{3}\log\frac{3}{2} + \frac{1}{3}\log 3 = 0.918$ bits $= H(Y)$.
(b) $H(X|Y) = \frac{1}{3}H(X|Y=0) + \frac{2}{3}H(X|Y=1) = \frac{1}{3}\cdot 0 + \frac{2}{3}\cdot 1 = 0.667$ bits $= H(Y|X)$.
(c) $H(X, Y) = 3 \times \frac{1}{3}\log 3 = 1.585$ bits.
(d) $H(Y) - H(Y|X) = 0.918 - 0.667 = 0.251$ bits.
(e) $I(X; Y) = H(Y) - H(Y|X) = 0.251$ bits.
(f) See Figure 1.
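These values are quick to reproduce numerically. The sketch below (an addition for checking purposes, assuming the joint table given above) computes (a)-(e) directly from the joint pmf via the chain rule $H(X|Y) = H(X,Y) - H(Y)$:

```python
from math import log2

# Joint pmf p(x, y) from the table above.
p = {(0, 0): 1/3, (0, 1): 1/3, (1, 0): 0.0, (1, 1): 1/3}

def H(probs):
    """Entropy in bits of an iterable of probabilities."""
    return -sum(q * log2(q) for q in probs if q > 0)

px = {x: p[(x, 0)] + p[(x, 1)] for x in (0, 1)}   # marginal of X
py = {y: p[(0, y)] + p[(1, y)] for y in (0, 1)}   # marginal of Y

HX, HY, HXY = H(px.values()), H(py.values()), H(p.values())
print(f"H(X) = {HX:.3f}, H(Y) = {HY:.3f}")                   # 0.918, 0.918
print(f"H(X|Y) = {HXY - HY:.3f}, H(Y|X) = {HXY - HX:.3f}")   # 0.667, 0.667
print(f"H(X,Y) = {HXY:.3f}")                                 # 1.585
print(f"I(X;Y) = {HY - (HXY - HX):.3f}")                     # 0.251
```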
2.29 Inequalities. Let $X$, $Y$ and $Z$ be joint random variables. Prove the following inequalities and find conditions for equality.
(a) $H(X, Y|Z) \ge H(X|Z)$.
(b) $I(X, Y; Z) \ge I(X; Z)$.
(c) $H(X, Y, Z) - H(X, Y) \le H(X, Z) - H(X)$.
(d) $I(X; Z|Y) \ge I(Z; Y|X) - I(Z; Y) + I(X; Z)$.
Solution:
(a) Using the chain rule for conditional entropy,
$$H(X, Y|Z) = H(X|Z) + H(Y|X, Z) \ge H(X|Z),$$
with equality iff $H(Y|X, Z) = 0$, that is, when $Y$ is a function of $X$ and $Z$.
(b) Using the chain rule for mutual information,
$$I(X, Y; Z) = I(X; Z) + I(Y; Z|X) \ge I(X; Z),$$
with equality iff $I(Y; Z|X) = 0$, that is, when $Y$ and $Z$ are conditionally independent given $X$.
(c) Using first the chain rule for entropy and then the definition of conditional mutual information,
$$H(X, Y, Z) - H(X, Y) = H(Z|X, Y) = H(Z|X) - I(Y; Z|X) \le H(Z|X) = H(X, Z) - H(X),$$
with equality iff $I(Y; Z|X) = 0$, that is, when $Y$ and $Z$ are conditionally independent given $X$.
(d) Using the chain rule for mutual information,
$$I(X; Z|Y) + I(Z; Y) = I(X, Y; Z) = I(Z; Y|X) + I(X; Z),$$
and therefore
$$I(X; Z|Y) = I(Z; Y|X) - I(Z; Y) + I(X; Z).$$
We see that this inequality is actually an equality in all cases.
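These identities are easy to confirm numerically. The following sketch (an addition, not part of the original solution; the random pmf on $\{0,1\}^3$ is arbitrary) checks (a)-(c) as inequalities and (d) as an equality:

```python
import itertools, random
from math import log2

def H(joint, axes):
    """Entropy in bits of the marginal of `joint` on the given axes."""
    marg = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in axes)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

# Random joint pmf on {0,1}^3 for (X, Y, Z) = axes (0, 1, 2).
random.seed(0)
w = [random.random() for _ in range(8)]
joint = {o: wi / sum(w) for o, wi in zip(itertools.product((0, 1), repeat=3), w)}
X, Y, Z = 0, 1, 2

I_XZ  = H(joint, (X,)) + H(joint, (Z,)) - H(joint, (X, Z))        # I(X;Z)
I_YZ  = H(joint, (Y,)) + H(joint, (Z,)) - H(joint, (Y, Z))        # I(Y;Z)
I_XYZ = H(joint, (X, Y)) + H(joint, (Z,)) - H(joint, (X, Y, Z))   # I(X,Y;Z)
I_XZ_gY = H(joint, (X, Y)) + H(joint, (Y, Z)) - H(joint, (Y,)) - H(joint, (X, Y, Z))  # I(X;Z|Y)
I_ZY_gX = H(joint, (X, Z)) + H(joint, (X, Y)) - H(joint, (X,)) - H(joint, (X, Y, Z))  # I(Z;Y|X)

# (a) H(X,Y|Z) >= H(X|Z)
assert H(joint, (X, Y, Z)) - H(joint, (Z,)) >= H(joint, (X, Z)) - H(joint, (Z,))
# (b) I(X,Y;Z) >= I(X;Z)
assert I_XYZ >= I_XZ
# (c) H(X,Y,Z) - H(X,Y) <= H(X,Z) - H(X)
assert H(joint, (X, Y, Z)) - H(joint, (X, Y)) <= H(joint, (X, Z)) - H(joint, (X,))
# (d) holds with equality for every joint distribution
assert abs(I_XZ_gY - (I_ZY_gX - I_YZ + I_XZ)) < 1e-9
print("all four relations hold")
```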
4.5 Entropy rates of Markov chains.
(a) Find the entropy rate of the two-state Markov chain with transition matrix
$$P = \begin{bmatrix} 1 - p_{01} & p_{01} \\ p_{10} & 1 - p_{10} \end{bmatrix}.$$
(b) What values of $p_{01}$, $p_{10}$ maximize the rate of part (a)?
(c) Find the entropy rate of the two-state Markov chain with transition matrix
$$P = \begin{bmatrix} 1 - p & p \\ 1 & 0 \end{bmatrix}.$$
(d) Find the maximum value of the entropy rate of the Markov chain of part (c). We expect that the maximizing value of $p$ should be less than $1/2$, since the 0 state permits more information to be generated than the 1 state.
Solution:
(a) The stationary distribution is easily calculated:
$$\mu_0 = \frac{p_{10}}{p_{01} + p_{10}}, \qquad \mu_1 = \frac{p_{01}}{p_{01} + p_{10}}.$$
Therefore the entropy rate is
$$H(X_2|X_1) = \mu_0 H(p_{01}) + \mu_1 H(p_{10}) = \frac{p_{10} H(p_{01}) + p_{01} H(p_{10})}{p_{01} + p_{10}}.$$
(b) The entropy rate is at most 1 bit because the process has only two states. This rate can be achieved if (and only if) $p_{01} = p_{10} = 1/2$, in which case the process is actually i.i.d. with
$$\Pr(X_i = 0) = \Pr(X_i = 1) = \frac{1}{2}.$$
(c) As a special case of the general two-state Markov chain, the entropy rate is
$$H(X_2|X_1) = \mu_0 H(p) + \mu_1 H(1) = \frac{H(p)}{p + 1},$$
since here the stationary distribution is $\mu_0 = 1/(p+1)$, $\mu_1 = p/(p+1)$, and $H(1) = 0$.
(d) By straightforward calculus, we find that the maximum value of the entropy rate of part (c) occurs for $p = (3 - \sqrt{5})/2 \approx 0.382$: setting the derivative of $H(p)/(1 + p)$ to zero gives $2\log(1 - p) = \log p$, i.e., $(1 - p)^2 = p$, whose root in $(0, 1)$ is $p = (3 - \sqrt{5})/2$. Since $(1 - p)^2 = p$ at the maximizer, $H(p) = -(1 + p)\log(1 - p)$, so the maximum value is
$$\frac{H(p)}{1 + p} = -\log(1 - p) = \log\frac{\sqrt{5} + 1}{2} \approx 0.694 \text{ bits},$$
where $(\sqrt{5} + 1)/2$ is the golden ratio. As expected, the maximizing $p \approx 0.382$ is less than $1/2$.
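A quick numerical cross-check (a sketch added here, not part of the original solution) confirms both the maximizer and the maximum on a grid:

```python
from math import log2, sqrt

def Hb(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def rate(p):
    """Entropy rate H(p)/(1 + p) of the chain in part (c)."""
    return Hb(p) / (1 + p)

# Grid search for the maximizing p in (0, 1).
best_p = max((i / 10**5 for i in range(1, 10**5)), key=rate)

print(best_p, (3 - sqrt(5)) / 2)              # ~0.38197 vs 0.381966...
print(rate(best_p), log2((1 + sqrt(5)) / 2))  # ~0.694242 vs log2 of the golden ratio
```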
5.4 Huffman coding. Consider the random variable