信息论基础 (Fundamentals of Information Theory)
Information Theory
Y. DING
Email:
Tel: 87112490

Reference book: Principles of Digital Communications
Authors: Suili FENG et al.
Publishing house: PHEI

About the course: Material? Energy? Information?

About the course: Chapter 4. Fundamentals of information theory
- Measure of information (entropy)
- Discrete channel and capacity
- Continuous source, channel and capacity
- Source coding
- Rate distortion theory

4.1 Introduction

Message vs. Information
(1) A message can be, but is not limited to: symbols, letters, numbers, speech, images, etc.
(2) A message may contain information, or no information at all (e.g. an SMS message or a multimedia message).
(3) Amount of information: the amount of uncertainty reduced by the reception of the message.
(4) Purpose of communication: information transmission.
(5) Milestone of information theory: "A Mathematical Theory of Communication" by Claude Elwood Shannon, 1948.
4.2 Measure of information (entropy)

Measure of information
(1) Amount of information = amount of uncertainty reduced by the reception of a message.
    Uncertainty relates to likelihood, i.e. to probability, so the amount of information should be a function of the probability (or probabilities). (What is the relationship with probability?)
(2) Different messages may differ in the amount of information they carry.
    The measure should be additive in the amount of information.

Measurement of a discrete source
Description of a statistical discrete source with N possible symbols:

X:     x_1      x_2      ...   x_N
p(X):  p(x_1)   p(x_2)   ...   p(x_N)

with \sum_{i=1}^{N} p(x_i) = 1.

(Think of the symbol alphabets of BPSK, QPSK, 16QAM.)
Measurement of a discrete source
Amount of information as a function of probability: I(x_i) = f(p(x_i)).
If x_i and x_j are statistically independent, I(x_i x_j) should satisfy the additivity property:

I(x_i x_j) = f(p(x_i x_j)) = f(p(x_i) p(x_j)) = f(p(x_i)) + f(p(x_j))

Define

I(x_i) = f(p(x_i)) = \log \frac{1}{p(x_i)} = -\log p(x_i)

then we have

I(x_i x_j) = \log \frac{1}{p(x_i x_j)} = \log \frac{1}{p(x_i) p(x_j)} = \log \frac{1}{p(x_i)} + \log \frac{1}{p(x_j)} = I(x_i) + I(x_j)

Definition: the amount of information carried by the message x_i is

I(x_i) = \log \frac{1}{P(x_i)} = -\log P(x_i)

Units: base-2 log: bit; natural log: nat (nit); base-10 log: hartley (hart).
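As a quick illustration (not part of the original slides), here is a minimal Python sketch of the self-information function and of its additivity for independent symbols; the probabilities used are arbitrary:

```python
import math

def self_information(p, base=2):
    """Self-information I(x) = -log_base p(x) of an event with probability p."""
    return -math.log(p, base)

# Arbitrary probabilities, chosen only for illustration
p_i, p_j = 0.25, 0.125

# Additivity for independent events: I(x_i x_j) = I(x_i) + I(x_j)
print(self_information(p_i))          # 2.0 bits
print(self_information(p_j))          # 3.0 bits
print(self_information(p_i * p_j))    # 5.0 bits = 2.0 + 3.0

# The same event measured in different units
print(self_information(p_i, base=2))       # bits
print(self_information(p_i, base=math.e))  # nats
print(self_information(p_i, base=10))      # hartleys
```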
Measurement of a discrete source
Example: a source X follows the distribution

X:     0     1     2     3
p(X):  3/8   1/4   1/4   1/8

The four symbols are statistically independent; calculate the amount of information contained in the sequence S = "113200".

I(S) = \log \frac{1}{p(S)}
     = \log \frac{1}{p(1)} + \log \frac{1}{p(1)} + \log \frac{1}{p(3)} + \log \frac{1}{p(2)} + \log \frac{1}{p(0)} + \log \frac{1}{p(0)}
     = 2 + 2 + 3 + 2 + 1.415 + 1.415
     = 11.83 bits
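A short sketch that reproduces this calculation numerically, using the symbol probabilities from the table above:

```python
import math

# Source distribution from the example above
p = {'0': 3/8, '1': 1/4, '2': 1/4, '3': 1/8}

S = "113200"

# Independent symbols: the information adds up symbol by symbol
I_S = sum(-math.log2(p[s]) for s in S)
print(f"I(S) = {I_S:.2f} bits")   # I(S) = 11.83 bits
```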
(Question: how much information is contained in one QPSK/16QAM symbol?)

Entropy: average amount of information

Entropy of a discrete source
Definition: the entropy of the source X: x_i, i = 1, 2, ..., N is defined as

H(X) = -\sum_{i=1}^{N} p(x_i) \log p(x_i)

Physical significance: the average amount of information contained in one symbol, i.e. the statistical expectation of the self-information.
Entropy of a discrete source
Example: calculate the entropy of the source X above.

H(X) = -\sum_{i=1}^{4} p(x_i) \log p(x_i)
     = \frac{3}{8}\log\frac{8}{3} + \frac{1}{4}\log 4 + \frac{1}{4}\log 4 + \frac{1}{8}\log 8
     = 1.906 bits/symbol
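The same value can be checked with a small entropy helper (a sketch reusing the distribution of the example):

```python
import math

def entropy(probs):
    """H(X) = -sum p_i log2 p_i, skipping zero-probability symbols."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [3/8, 1/4, 1/4, 1/8]
print(f"H(X) = {entropy(probs):.3f} bits/symbol")   # about 1.906
```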
Entropy of a discrete source
Example: assume that the symbols above are statistically independent; calculate the information contained in the following sequence.

201 020 130 213 001 203 210 100 321 010 023 102 002 10 312 032 100 120 210

(1) exact computation; (2) approximation using the entropy.

Solution 1: exact computation based on the probabilities, with symbol counts n_0 = 23, n_1 = 14, n_2 = 13, n_3 = 7:

I = \sum_{i=1}^{4} n_i \log \frac{1}{p(x_i)} = 23 \log \frac{8}{3} + 14 \log 4 + 13 \log 4 + 7 \log 8 = 107.55 bits

Solution 2: approximation using the entropy:

I \approx n H(X) = (23 + 14 + 13 + 7) \times 1.906 = 108.62 bits
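A compact sketch of both computations; the symbol counts (23, 14, 13, 7) are taken directly from the formula above:

```python
import math

# Symbol probabilities and counts as used in the slide's computation
p = {'0': 3/8, '1': 1/4, '2': 1/4, '3': 1/8}
counts = {'0': 23, '1': 14, '2': 13, '3': 7}

# Solution 1: exact information of the whole sequence
I_exact = sum(n * -math.log2(p[s]) for s, n in counts.items())

# Solution 2: approximation I = n * H(X)
H = -sum(q * math.log2(q) for q in p.values())
n = sum(counts.values())
I_approx = n * H

print(f"exact:  {I_exact:.2f} bits")    # about 107.55
print(f"approx: {I_approx:.2f} bits")   # about 108.62
```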
Maximum entropy theorem

Definition (convex set): a set \Omega \subset R^n is convex if for any x_i = (x_{i1}, x_{i2}, ..., x_{in}) \in \Omega, x_j = (x_{j1}, x_{j2}, ..., x_{jn}) \in \Omega and any 0 \le \theta \le 1, we have \theta x_i + (1-\theta) x_j \in \Omega.

Definition: for x_i, x_j \in \Omega \subset R^n and 0 \le \theta \le 1,
- f is a ∪-type convex function (下凸函数, convex function) if
  f(\theta x_i + (1-\theta) x_j) \le \theta f(x_i) + (1-\theta) f(x_j)
- f is a ∩-type convex function (上凸函数, concave function) if
  f(\theta x_i + (1-\theta) x_j) \ge \theta f(x_i) + (1-\theta) f(x_j)

A convex function has a minimum; a concave function has a maximum.

[Figure: example of a concave function f(x), illustrating f(\theta x_1 + (1-\theta) x_2) \ge \theta f(x_1) + (1-\theta) f(x_2) between two points x_1 and x_2.]

Maximum entropy theorem
If f(x) is a concave function and the probability vector \vec{p} = (p_1, p_2, ..., p_N) satisfies \sum_{i=1}^{N} p_i = 1, then

f\left(\sum_{i=1}^{N} p_i x_i\right) \ge \sum_{i=1}^{N} p_i f(x_i)

Using the above conclusion, we have the following theorem.

Theorem: the entropy H(X) is a concave function of the probability vector (p(x_1), p(x_2), ..., p(x_N)).

Q: given that H(X) is concave, when does H(X) take its maximum value?
Maximum entropy theorem
Theorem: the entropy takes its maximum value when X follows the equal probability (uniform) distribution:

\max H(p(x_1), ..., p(x_N)) = H\left(\frac{1}{N}, \frac{1}{N}, ..., \frac{1}{N}\right) = \sum_{i=1}^{N} \frac{1}{N} \log N = \log N

Namely: an equally distributed source has the greatest uncertainty. (Proof?)

Example: for a binary source with P(x_1) = p and P(x_2) = 1 - p, H(X) = H_max if p = 1/2.
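A small numerical check of the binary example (the sample values of p are arbitrary):

```python
import math

def binary_entropy(p):
    """H(p) = -p log2 p - (1-p) log2 (1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"p = {p}: H = {binary_entropy(p):.3f} bits")
# The maximum, 1 bit = log2(2), occurs at p = 0.5.
```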
Hint: the transmission efficiency can be improved by a proper transformation that makes the source follow an equal probability distribution.

Joint entropy and conditional entropy

Consider the pair of statistical variables XY: x_i y_j, i = 1, 2, ..., M; j = 1, 2, ..., N, with joint probabilities p(x_i y_j) satisfying

\sum_{i=1}^{M} \sum_{j=1}^{N} p(x_i y_j) = 1

Joint entropy
Definition: the joint entropy of XY is defined as

H(XY) = -\sum_{i=1}^{M} \sum_{j=1}^{N} p(x_i y_j) \log p(x_i y_j)   (bits/symbol)

With statistically independent X and Y, p(x_i y_j) = p(x_i) p(y_j), so

H(XY) = -\sum_{i=1}^{M} \sum_{j=1}^{N} p(x_i y_j) \left[\log p(x_i) + \log p(y_j)\right]
      = -\sum_{i=1}^{M} p(x_i) \log p(x_i) - \sum_{j=1}^{N} p(y_j) \log p(y_j)
      = H(X) + H(Y)

(How about the case of multiple variables?)

- Joint entropy represents the average information brought by a group of events.
- For a group of mutually independent events, the average information equals the sum of the information brought by each single event.
- For statistically independent variables X and Y, we cannot get any information about X from Y.
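A minimal sketch (with marginal distributions assumed purely for illustration) that computes H(X), H(Y) and H(XY) for an independent pair and confirms H(XY) = H(X) + H(Y):

```python
import math
from itertools import product

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed marginal distributions, chosen only for illustration
pX = [0.5, 0.25, 0.25]
pY = [0.7, 0.3]

# Joint distribution of an independent pair: p(x_i y_j) = p(x_i) p(y_j)
pXY = [px * py for px, py in product(pX, pY)]

print(f"H(X)  = {entropy(pX):.3f} bits")
print(f"H(Y)  = {entropy(pY):.3f} bits")
print(f"H(XY) = {entropy(pXY):.3f} bits")   # equals H(X) + H(Y)
```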
Conditional entropy
Definition: the conditional entropies of XY: x_i y_j, i = 1, 2, ..., M; j = 1, 2, ..., N are defined as

H(X/Y) = -\sum_{i=1}^{M} \sum_{j=1}^{N} p(x_i y_j) \log p(x_i / y_j)

H(Y/X) = -\sum_{i=1}^{M} \sum_{j=1}^{N} p(x_i y_j) \log p(y_j / x_i)

Generally,

H(X/Y) \le H(X)

For two correlated variables, the appearance of one can reduce the uncertainty of the other (for example, the appearance of a related person).
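A sketch extending the entropy helper to a correlated pair (the joint probabilities are assumed for illustration), showing H(X/Y) ≤ H(X):

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed correlated joint distribution p(x_i y_j); rows index x, columns index y
pXY = [[0.40, 0.10],
       [0.10, 0.40]]

pX = [sum(row) for row in pXY]            # marginal of X
pY = [sum(col) for col in zip(*pXY)]      # marginal of Y

# H(X/Y) = -sum_ij p(x_i y_j) log2 p(x_i / y_j), with p(x_i/y_j) = p(x_i y_j) / p(y_j)
H_X_given_Y = -sum(pXY[i][j] * math.log2(pXY[i][j] / pY[j])
                   for i in range(2) for j in range(2) if pXY[i][j] > 0)

print(f"H(X)   = {entropy(pX):.3f} bits")   # 1.000
print(f"H(X/Y) = {H_X_given_Y:.3f} bits")   # about 0.722, smaller than H(X)
```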
4.3 Discrete channel and capacity

Discrete channel and capacity
Channel model:
- channel input X: x_i, i = 1, 2, ..., M
- channel output Y: y_j, j = 1, 2, ..., N
The channel model (its characteristics) can be described by the transition probabilities p(Y/X).

More generally, the output depends on the current input and on past inputs as well.
Channel with memory (example: a multipath channel):

p\left(y_j^{(k)} / x^{(k)} x^{(k-1)} \cdots x^{(k-n)}\right)

Memoryless channel: the output depends on the current input only.

Transition probability matrix:

p(Y/X) = \begin{bmatrix}
p(y_1/x_1) & p(y_2/x_1) & \cdots & p(y_N/x_1) \\
p(y_1/x_2) & p(y_2/x_2) & \cdots & p(y_N/x_2) \\
\vdots     &            &        & \vdots     \\
p(y_1/x_M) & p(y_2/x_M) & \cdots & p(y_N/x_M)
\end{bmatrix}

Posterior probability matrix:

p(X/Y) = \begin{bmatrix}
p(x_1/y_1) & p(x_2/y_1) & \cdots & p(x_M/y_1) \\
p(x_1/y_2) & p(x_2/y_2) & \cdots & p(x_M/y_2) \\
\vdots     &            &        & \vdots     \\
p(x_1/y_N) & p(x_2/y_N) & \cdots & p(x_M/y_N)
\end{bmatrix}
Example: a binary-input discrete memoryless channel with

p(1/1) = 0.99, p(0/0) = 0.99, p(0/1) = 0.01, p(1/0) = 0.01

Detection probability: p(y_i/x_i) = p(0/0) = p(1/1) = 0.99; and p(y_j/x_i) = p(1/0) = p(0/1) = 0.01 for i \ne j.

Transition probability matrix:

p(Y/X) = \begin{bmatrix} 0.99 & 0.01 \\ 0.01 & 0.99 \end{bmatrix}
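A sketch that derives the posterior probability matrix for this channel via Bayes' rule; the input prior is an assumption (equiprobable inputs), as it is not given on the slide:

```python
# Transition probabilities p(y/x) of the binary channel from the example
p_y_given_x = [[0.99, 0.01],   # x = 0: p(y=0/x=0), p(y=1/x=0)
               [0.01, 0.99]]   # x = 1: p(y=0/x=1), p(y=1/x=1)

p_x = [0.5, 0.5]               # assumed equiprobable inputs

# p(y_j) = sum_i p(x_i) p(y_j/x_i)
p_y = [sum(p_x[i] * p_y_given_x[i][j] for i in range(2)) for j in range(2)]

# Posterior p(x_i/y_j) = p(x_i) p(y_j/x_i) / p(y_j)   (Bayes' rule)
p_x_given_y = [[p_x[i] * p_y_given_x[i][j] / p_y[j] for i in range(2)]
               for j in range(2)]

print(p_x_given_y)   # [[0.99, 0.01], [0.01, 0.99]] for equiprobable inputs
```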
Discrete memoryless channel and capacity
Mutual information can be described by the posterior probability.
Posterior probability: the uncertainty of the transmitted symbol after the received symbol is taken into account.

Upon the reception of y_j, the remaining uncertainty of x_i is

I(x_i / y_j) = \log \frac{1}{p(x_i / y_j)}

Definition: the mutual information is defined as

I(x_i; y_j) = I(x_i) - I(x_i / y_j)

Mutual information is the elimination of uncertainty about x_i by the reception of y_j, i.e. the difference between the uncertainties before and after reception. (Examples: text transmission and image transmission, where partial errors may occur.)
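For the binary channel example above, the per-symbol mutual information can be evaluated directly; equiprobable inputs are again assumed, as in the posterior sketch:

```python
import math

p_x = 0.5             # assumed prior p(x_i) for an equiprobable binary input
post_correct = 0.99   # p(x_i/y_j) when the symbol is received correctly
post_error = 0.01     # p(x_i/y_j) when it is received in error

def mutual_information(prior, posterior):
    """I(x; y) = I(x) - I(x/y) = log2(posterior / prior)."""
    return math.log2(posterior / prior)

print(mutual_information(p_x, post_correct))  # about +0.986 bits: uncertainty reduced
print(mutual_information(p_x, post_error))    # about -5.644 bits: uncertainty increased
```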
Properties of mutual information:
(1) If p(x_i / y_j) = 1: total elimination of the uncertainty, I(x_i; y_j) = I(x_i).
(2) If p(x_i) < p(x_i / y_j) < 1: partial elimination.
(3) If p(x_i / y_j) = p(x_i): no elimination, I(x_i; y_j) = 0.
(4) If 0 < p(x_i / y_j) < p(x_i): the reception adds uncertainty, I(x_i; y_j) < 0.

Symmetry of mutual information