Solutions to Exercises in Thomas M. Cover's Elements of Information Theory (English edition)

Updated: 2023-05-17 06:39:31

2.2 Entropy of functions. Let $X$ be a random variable taking on a finite number of values. What is the (general) inequality relationship of $H(X)$ and $H(Y)$ if
(a) $Y = 2^X$?
(b) $Y = \cos X$?

Solution: Let $y = g(x)$. Then
$$p(y) = \sum_{x:\, g(x) = y} p(x).$$
Consider any set of $x$'s that map onto a single $y$. For this set,
$$\sum_{x:\, g(x) = y} p(x) \log p(x) \le \sum_{x:\, g(x) = y} p(x) \log p(y) = p(y) \log p(y),$$
since $\log$ is a monotone increasing function and $p(x) \le \sum_{x:\, g(x) = y} p(x) = p(y)$. Extending this argument to the entire range of $X$ (and $Y$), we obtain
$$H(X) = -\sum_x p(x) \log p(x) \ge -\sum_y p(y) \log p(y) = H(Y),$$
with equality iff $g$ is one-to-one with probability one.
(a) $Y = 2^X$ is one-to-one, and hence the entropy, which is just a function of the probabilities, does not change, i.e., $H(X) = H(Y)$.
(b) $Y = \cos X$ is not necessarily one-to-one. Hence all that we can say is that $H(X) \ge H(Y)$, with equality if cosine is one-to-one on the range of $X$.
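As a quick numerical sanity check (not part of the textbook solution), the inequality can be verified directly. The three-point distribution below is an arbitrary choice for illustration, picked so that $\cos$ collapses two values while $2^x$ stays one-to-one:

```python
import math
from collections import defaultdict

def entropy(probs):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def entropy_of_function(pmf, g):
    """Entropy of Y = g(X): p(y) is the sum of p(x) over x with g(x) = y."""
    py = defaultdict(float)
    for x, p in pmf.items():
        py[round(g(x), 12)] += p   # group x's that map to the same y
    return entropy(py.values())

# Arbitrary illustrative distribution on {-1, 0, 1};
# cos(-1) = cos(1), so Y = cos(X) merges two outcomes.
pmf = {-1: 0.25, 0: 0.5, 1: 0.25}

h_x   = entropy(pmf.values())                          # H(X) = 1.5 bits
h_exp = entropy_of_function(pmf, lambda x: 2.0 ** x)   # one-to-one: equals H(X)
h_cos = entropy_of_function(pmf, math.cos)             # merges -1 and 1: 1.0 bit

print(h_x, h_exp, h_cos)
```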
2.16. Example of joint entropy. Let $p(x, y)$ be given by

            Y = 0   Y = 1
    X = 0    1/3     1/3
    X = 1     0      1/3

Find
(a) $H(X)$, $H(Y)$.
(b) $H(X|Y)$, $H(Y|X)$.
(c) $H(X, Y)$.
(d) $H(Y) - H(Y|X)$.
(e) $I(X; Y)$.
(f) Draw a Venn diagram for the quantities in (a) through (e).

Solution:

Fig. 1 Venn diagram

(a) $H(X) = \frac{2}{3} \log \frac{3}{2} + \frac{1}{3} \log 3 = 0.918$ bits $= H(Y)$.
(b) $H(X|Y) = \frac{1}{3} H(X|Y=0) + \frac{2}{3} H(X|Y=1) = \frac{1}{3} \cdot 0 + \frac{2}{3} \cdot 1 = 0.667$ bits $= H(Y|X)$.
(c) $H(X, Y) = 3 \times \frac{1}{3} \log 3 = \log 3 = 1.585$ bits.
(d) $H(Y) - H(Y|X) = 0.918 - 0.667 = 0.251$ bits.
(e) $I(X; Y) = H(Y) - H(Y|X) = 0.251$ bits.
(f) See Figure 1.
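The quantities in (a) through (e) can be checked numerically from the joint table (a verification sketch, not part of the original solution):

```python
import math

def H(probs):
    """Entropy in bits, ignoring zero-probability outcomes."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

# Joint distribution p(x, y) from the table above
p = {(0, 0): 1/3, (0, 1): 1/3, (1, 0): 0.0, (1, 1): 1/3}

px = {x: p[(x, 0)] + p[(x, 1)] for x in (0, 1)}   # marginal of X: (2/3, 1/3)
py = {y: p[(0, y)] + p[(1, y)] for y in (0, 1)}   # marginal of Y: (1/3, 2/3)

h_x  = H(px.values())        # H(X)  ~ 0.918 bits
h_y  = H(py.values())        # H(Y)  ~ 0.918 bits
h_xy = H(p.values())         # H(X,Y) = log2(3) ~ 1.585 bits
h_x_given_y = h_xy - h_y     # H(X|Y) = 2/3 bit
h_y_given_x = h_xy - h_x     # H(Y|X) = 2/3 bit
i_xy = h_y - h_y_given_x     # I(X;Y) ~ 0.25 bits

print(h_x, h_y, h_xy, h_x_given_y, i_xy)
```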
2.29 Inequalities. Let $X$, $Y$ and $Z$ be joint random variables. Prove the following inequalities and find conditions for equality.
(a) $H(X, Y | Z) \ge H(X | Z)$
(b) $I(X, Y; Z) \ge I(X; Z)$
(c) $H(X, Y, Z) - H(X, Y) \le H(X, Z) - H(X)$
(d) $I(X; Z | Y) \ge I(Z; Y | X) - I(Z; Y) + I(X; Z)$

Solution:
(a) Using the chain rule for conditional entropy,
$$H(X, Y | Z) = H(X | Z) + H(Y | X, Z) \ge H(X | Z),$$
with equality iff $H(Y | X, Z) = 0$, that is, when $Y$ is a function of $X$ and $Z$.
(b) Using the chain rule for mutual information,
$$I(X, Y; Z) = I(X; Z) + I(Y; Z | X) \ge I(X; Z),$$
with equality iff $I(Y; Z | X) = 0$, that is, when $Y$ and $Z$ are conditionally independent given $X$.
(c) Using first the chain rule for entropy and then the definition of conditional mutual information,
$$H(X, Y, Z) - H(X, Y) = H(Z | X, Y) = H(Z | X) - I(Y; Z | X) \le H(Z | X) = H(X, Z) - H(X),$$
with equality iff $I(Y; Z | X) = 0$, that is, when $Y$ and $Z$ are conditionally independent given $X$.
(d) Using the chain rule for mutual information,
$$I(X; Z | Y) + I(Z; Y) = I(X, Y; Z) = I(Z; Y | X) + I(X; Z),$$
and therefore
$$I(X; Z | Y) = I(Z; Y | X) - I(Z; Y) + I(X; Z).$$
We see that this inequality is actually an equality in all cases.
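Since part (d) holds with equality for every joint distribution, it can be spot-checked on a randomly generated pmf (an illustrative sketch, not part of the original solution; the random distribution on $\{0,1\}^3$ is arbitrary):

```python
import itertools
import math
import random

random.seed(0)

# Random joint pmf on {0,1}^3 (arbitrary, for illustration)
states = list(itertools.product((0, 1), repeat=3))
w = [random.random() for _ in states]
total = sum(w)
p = {s: wi / total for s, wi in zip(states, w)}

def H(*axes):
    """Joint entropy (bits) of the marginal over the given axis indices."""
    marg = {}
    for s, q in p.items():
        key = tuple(s[a] for a in axes)
        marg[key] = marg.get(key, 0.0) + q
    return -sum(q * math.log2(q) for q in marg.values() if q > 0)

# Axis 0 = X, 1 = Y, 2 = Z; each mutual information written via entropies
I_XZ_given_Y = H(0, 1) + H(1, 2) - H(0, 1, 2) - H(1)
I_ZY_given_X = H(0, 2) + H(0, 1) - H(0, 1, 2) - H(0)
I_ZY = H(1) + H(2) - H(1, 2)
I_XZ = H(0) + H(2) - H(0, 2)

lhs = I_XZ_given_Y
rhs = I_ZY_given_X - I_ZY + I_XZ
print(lhs, rhs)   # the two sides agree up to floating-point error
```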
4.5 Entropy rates of Markov chains.
(a) Find the entropy rate of the two-state Markov chain with transition matrix
$$P = \begin{bmatrix} 1 - p_{01} & p_{01} \\ p_{10} & 1 - p_{10} \end{bmatrix}.$$
(b) What values of $p_{01}$, $p_{10}$ maximize the rate of part (a)?
(c) Find the entropy rate of the two-state Markov chain with transition matrix
$$P = \begin{bmatrix} 1 - p & p \\ 1 & 0 \end{bmatrix}.$$
(d) Find the maximum value of the entropy rate of the Markov chain of part (c). We expect that the maximizing value of $p$ should be less than $1/2$, since the 0 state permits more information to be generated than the 1 state.

Solution:
(a) The stationary distribution is easily calculated:
$$\mu_0 = \frac{p_{10}}{p_{01} + p_{10}}, \qquad \mu_1 = \frac{p_{01}}{p_{01} + p_{10}}.$$
Therefore the entropy rate is
$$H(X_2 | X_1) = \mu_0 H(p_{01}) + \mu_1 H(p_{10}) = \frac{p_{10} H(p_{01}) + p_{01} H(p_{10})}{p_{01} + p_{10}}.$$
(b) The entropy rate is at most 1 bit because the process has only two states. This rate can be achieved if (and only if) $p_{01} = p_{10} = 1/2$, in which case the process is actually i.i.d. with
$$\Pr(X_i = 0) = \Pr(X_i = 1) = \frac{1}{2}.$$
(c) As a special case of the general two-state Markov chain, the stationary distribution is $\mu_0 = \frac{1}{p + 1}$, $\mu_1 = \frac{p}{p + 1}$, and the entropy rate is
$$H(X_2 | X_1) = \mu_0 H(p) + \mu_1 H(1) = \frac{H(p)}{p + 1}.$$
(d) By straightforward calculus, we find that the maximum value of the entropy rate of part (c) occurs for $p = (3 - \sqrt{5})/2 = 0.382$. The maximum value is
$$\frac{H(p)}{p + 1} = \log_2 \frac{1 + \sqrt{5}}{2} = 0.694 \text{ bits}.$$
Note that $(\sqrt{5} - 1)/2 = 0.618$ is the golden ratio.
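As a numerical cross-check of part (d) (not part of the original solution), a simple grid search over $p$ recovers the maximizer $(3 - \sqrt{5})/2$ and the maximum rate $\log_2\big((1 + \sqrt{5})/2\big)$, the log of the golden ratio:

```python
import math

def Hb(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate(p):
    """Entropy rate H(p)/(p + 1) of the chain in part (c)."""
    return Hb(p) / (p + 1)

# Locate the maximum on a fine grid over (0, 1)
best_p = max((i / 100000 for i in range(1, 100000)), key=rate)

p_star = (3 - math.sqrt(5)) / 2        # claimed maximizer, ~0.382
golden = (1 + math.sqrt(5)) / 2
print(best_p, rate(best_p), math.log2(golden))
```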
5.4 Huffman coding. Consider the random variable
