MATLAB Implementation of Conditional Entropy and Mutual Information (2)


Assignment 1_08116649_Chaoyun_Song

Derive an expression for the mutual information I(X,Y).

Plot the mutual information I(X, Y) between the input and the output of the channel as a function of p, where p is the probability of transmitting a `1', i.e. P(X = 1).

For what values of p is the mutual information maximised? What is the value of this maximum?

The expression for the mutual information I(X, Y) is:

I(X, Y) = Σ_x Σ_y p(x, y) log2[ p(x, y) / (p(x) p(y)) ]

Equivalently, I(X, Y) = H(X) − H(X|Y) = H(Y) − H(Y|X). The second form is the convenient one here, because the channel is specified directly by the transition probabilities p(y|x).
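As a generic sanity check (not part of the original assignment), the double sum above can be evaluated directly in MATLAB from any joint distribution; the matrix Pxy below is an arbitrary illustrative example, and all variable names are our own:

% Mutual information in bits from a joint pmf Pxy (rows = x, columns = y)
Pxy = [0.4 0.1; 0.1 0.4];             % example joint distribution p(x,y)
Px  = sum(Pxy, 2);                    % marginal p(x) as a column vector
Py  = sum(Pxy, 1);                    % marginal p(y) as a row vector
T   = Pxy .* log2(Pxy ./ (Px * Py));  % termwise p(x,y)*log2(p(x,y)/(p(x)p(y)))
T(Pxy == 0) = 0;                      % convention: 0*log2(0) = 0
I   = sum(T(:))                       % for this example, about 0.278 bits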

For this channel the input probability is P(X = 1) = p, and since the input is binary, P(X = 0) = 1 − p. It is a binary non-symmetric channel with transition probabilities P(Y = 0 | X = 1) = 0.1 and P(Y = 1 | X = 0) = 0.2, so P(Y = 0 | X = 0) = 0.8 and P(Y = 1 | X = 1) = 0.9. The output distribution is therefore

P(Y = 0) = 0.8(1 − p) + 0.1p
P(Y = 1) = 0.2(1 − p) + 0.9p

(The input entropy H(X) = −p log2 p − (1 − p) log2(1 − p) is not needed in this form of the calculation.) The output entropy is

H(Y) = −P(Y = 0) log2 P(Y = 0) − P(Y = 1) log2 P(Y = 1)

and the conditional entropy of the output given the input is

H(Y|X) = −Σ_x p(x) Σ_y p(y|x) log2 p(y|x)
       = −(1 − p)(0.8 log2 0.8 + 0.2 log2 0.2) − p(0.1 log2 0.1 + 0.9 log2 0.9)

Note that 0.8, 0.2, 0.1 and 0.9 are the probabilities p(y|x), not p(x|y), so they enter H(Y|X) rather than H(X|Y). So the mutual information is

I(X, Y) = H(Y) − H(Y|X)
        = −P(Y = 0) log2 P(Y = 0) − P(Y = 1) log2 P(Y = 1)
          + (1 − p)(0.8 log2 0.8 + 0.2 log2 0.2) + p(0.1 log2 0.1 + 0.9 log2 0.9)

Using MATLAB we can evaluate and plot this:


Solution: MATLAB code:

p = 0 : 0.01 : 1;
P0 = 0.8*(1-p) + 0.1*p;                         % P(Y = 0)
P1 = 0.2*(1-p) + 0.9*p;                         % P(Y = 1)
HY = -P0.*log2(P0) - P1.*log2(P1);              % output entropy H(Y)
HYX = -(1-p)*(0.8*log2(0.8) + 0.2*log2(0.2)) ...
      - p*(0.1*log2(0.1) + 0.9*log2(0.9));      % conditional entropy H(Y|X)
I = HY - HYX;                                   % mutual information I(X, Y)
plot(p, I), xlabel('p = P(X = 1)'), ylabel('I(X, Y) in bits')

From the plot we can see:

When p = 0 or p = 1 the input is deterministic, so the mutual information is 0 (mutual information can never be negative). The maximum is approximately I(X, Y) = 0.40 bits, reached at p ≈ 0.52, slightly above 0.5 because the channel is not symmetric; this maximum value is the capacity of the channel.
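The location of this maximum can be checked numerically rather than read off the plot. This is an optional verification, not part of the original assignment; it uses MATLAB's fminbnd on the negated mutual information, and the function-handle names Hb, HYX and Iofp are our own:

Hb   = @(q) -q.*log2(q) - (1-q).*log2(1-q);   % binary entropy function
HYX  = @(p) (1-p).*Hb(0.2) + p.*Hb(0.1);      % H(Y|X) for this channel
Iofp = @(p) Hb(0.2 + 0.7*p) - HYX(p);         % I(X, Y), since P(Y = 1) = 0.2 + 0.7p
[pstar, negImax] = fminbnd(@(p) -Iofp(p), 0, 1);
fprintf('maximum I = %.4f bits at p = %.4f\n', -negImax, pstar)
% prints approximately: maximum I = 0.3977 bits at p = 0.5177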


3. Discussion

In this paper we completed three items. In the first item we wrote MATLAB functions to calculate the entropy of A, B and C using the expression -sum(A.*log2(A)). The MATLAB help also documents a built-in entropy() function, but it belongs to the Image Processing Toolbox and computes the entropy of a grayscale image rather than of a probability vector, so the explicit formula remains the appropriate choice here. The plotted entropy diagram was also not very clear. In the third item we obtained the diagram with plot in MATLAB; the maximum of the mutual information is I(X, Y) ≈ 0.40 at p ≈ 0.52, although reading this point from the diagram alone is not accurate.
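As a side note to the entropy calculation above, a small wrapper makes the -sum(A.*log2(A)) idiom safe when some probabilities are zero, since 0*log2(0) evaluates to NaN in MATLAB but should count as 0 in the entropy sum. The function name safe_entropy is our own, not a built-in:

function H = safe_entropy(P)
% SAFE_ENTROPY  Entropy in bits of a probability vector P,
% treating 0*log2(0) terms as 0 (plain -sum(P.*log2(P)) gives NaN).
    T = P .* log2(P);
    T(P == 0) = 0;
    H = -sum(T);
end

For example, safe_entropy([0.5 0.5 0]) returns 1 bit, whereas -sum([0.5 0.5 0].*log2([0.5 0.5 0])) returns NaN.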

4. Conclusion

This paper covers basic exercises on the elements of information theory. First there is a short introduction to Shannon's information content, entropy and mutual information. Then there are three assignment items: calculating the entropy of given distributions; finding the channel capacity of a binary symmetric channel (BSC) by plotting in MATLAB and locating its minimum; and plotting the mutual information I(X, Y) of a binary non-symmetric channel and finding its maximum value. Through these exercises we gained a more in-depth understanding of information theory and practised the skills of calculating entropy, mutual information and channel capacity, which will help our future study of this subject.

