Lecture 3. Channel Capacity
Fundamentals of Information Theory, 2017-fall, TBSI

Outline
1. Channel Capacity and its Properties
   - Definition of Channel Capacity
   - Discrete Memoryless Channel
   - Channel Capacity and Its Properties
2. Discrete Memoryless Channel
3. Continuous and Analog Channel
4. Joint Typicality
5. Channel Coding
6. Summary
Definition of Channel Capacity: Communications and Channels

Communication is the reliable transmission of messages from one peer to another (Shannon's channel model, 1948).

Channels are classified by the continuity of their values in both time and amplitude: discrete (digital) channels, continuous channels, and analog channels.
Discrete Memoryless Channel (DMC)

Definition 3.1 A discrete memoryless channel is defined by the input alphabet X, the output alphabet Y, and the one-step transition probability matrix Q = [q(y_j | x_i)]_{i,j}. Here q(y|x) is the conditional probability of y given x, and the channel matrix Q has |X| rows and |Y| columns.

Theorem 3.1 Let the channel input be X = (X1, X2, ..., Xn) and the output be Y = (Y1, Y2, ..., Yn). A discrete memoryless channel satisfies

    I(X; Y) ≤ Σ_{i=1}^{n} I(Xi; Yi).    (1)
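Theorem 3.1 can be checked numerically. The sketch below is not from the slides; the correlated input pmf and ε = 0.1 are illustrative choices. It uses one BSC twice on a correlated bit pair and compares the block mutual information I(X;Y) with the per-letter sum I(X1;Y1) + I(X2;Y2):

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in bits from a joint pmf matrix p_xy[i, j] = P(X=i, Y=j)."""
    px = p_xy.sum(axis=1, keepdims=True)
    py = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float((p_xy[mask] * np.log2(p_xy[mask] / (px @ py)[mask])).sum())

eps = 0.1
Q = np.array([[1 - eps, eps], [eps, 1 - eps]])  # one BSC use

# A correlated input pair (X1, X2), chosen for illustration.
p_x = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Memorylessness: P(y1 y2 | x1 x2) = Q[x1, y1] * Q[x2, y2].
p_block = np.zeros((4, 4))          # joint pmf of (X, Y) over 2-bit blocks
p1 = np.zeros((2, 2))               # joint pmf of (X1, Y1)
p2 = np.zeros((2, 2))               # joint pmf of (X2, Y2)
for (x1, x2), px in p_x.items():
    for y1 in (0, 1):
        p1[x1, y1] += px * Q[x1, y1]
        p2[x2, y1] += px * Q[x2, y1]
        for y2 in (0, 1):
            p_block[2 * x1 + x2, 2 * y1 + y2] = px * Q[x1, y1] * Q[x2, y2]

I_block = mutual_information(p_block)
I_sum = mutual_information(p1) + mutual_information(p2)
print(I_block <= I_sum)  # True; equality would require independent inputs
```

Because the two input bits are correlated here, the inequality in (1) is strict; with independent inputs it holds with equality.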
Channel Capacity and Its Properties

Definition 3.2 The capacity of a channel is

    C = max_{p(x)} I(X; Y).    (2)

The information-theoretic channel capacity is the maximum mutual information between the input and the output, taken over all possible input distributions p(x) for the given channel. For discrete memoryless channels, the equivalent formulation is

    C = max_{p} I(p, Q),    (3)

where the mutual information is rewritten as a function of the input distribution p and the transition matrix Q.

Properties of Channel Capacity

Theorem 3.2 The basic properties of C are:
1. Non-negativity: C ≥ 0.
2. Boundedness: C ≤ log|X| and C ≤ log|Y|.

More importantly, I(X;Y) is a concave function of p(x), so the maximum value, i.e., C, can be found by convex optimization.

A classical example of a channel model is the noisy typewriter.

Definition 3.3 If the rows of the transition probability matrix are permutations of each other, and likewise the columns, the channel is symmetric. If the rows are permutations of each other and the column sums are equal, the channel is weakly symmetric.
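Concavity of I(X;Y) in p(x) means that even a one-dimensional grid search recovers the capacity of a binary-input channel. A minimal sketch, not from the slides (the BSC with ε = 0.1 is an illustrative choice, anticipating Theorem 3.3 below):

```python
import numpy as np

def binary_entropy(p):
    p = np.clip(p, 1e-12, 1 - 1e-12)  # guard against log(0)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def bsc_mutual_information(p, eps):
    """I(X;Y) in bits for input P(X=1) = p over a BSC with flip prob eps:
    I = H(Y) - H(Y|X) = H(p(1-eps) + (1-p)eps) - H(eps)."""
    return binary_entropy(p * (1 - eps) + (1 - p) * eps) - binary_entropy(eps)

eps = 0.1
grid = np.linspace(0.0, 1.0, 1001)        # candidate input distributions
values = bsc_mutual_information(grid, eps)
p_star = grid[np.argmax(values)]          # concave => single interior maximum
C = float(values.max())
print(p_star, C)  # maximum at p = 0.5, C = 1 - H(0.1)
```

The concave shape of `values` over the grid is exactly why convex-optimization machinery (and the KKT conditions later in this lecture) applies.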
Outline of Section 2: Discrete Memoryless Channel
- Simple Examples of DMC
- Compute the DMC Capacity
- The pmf to achieve the channel capacity
- Cascading of Channels
- Parallel Channels
Simple Examples of DMC

Theorem 3.3 The capacity of the Binary Symmetric Channel (BSC) is

    C = 1 − H(ε),    (4)

where ε is the error rate, i.e., the probability that each transmitted bit is flipped.

Theorem 3.4 The capacity of the Binary Erasure Channel (BEC) is

    C = 1 − α,    (5)

where α is the erasure probability, i.e., the bit-loss rate.

Theorem 3.5 The capacity of a weakly symmetric channel Q is

    C = log|Y| − H(a row of Q),    (6)

achieved by the uniform input distribution.
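The closed forms above can be cross-checked in a few lines. In this sketch (the helper names and the values of ε and α are ours, not the slides'), Theorem 3.5 is evaluated on the BSC, which is symmetric, and a comment notes why it does not apply to the BEC:

```python
import numpy as np

def entropy(p):
    """H(p) in bits, skipping zero-probability terms."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def weakly_symmetric_capacity(Q):
    """Theorem 3.5: C = log|Y| - H(a row of Q), in bits (uniform input)."""
    return float(np.log2(Q.shape[1]) - entropy(Q[0]))

eps = 0.1
bsc = np.array([[1 - eps, eps], [eps, 1 - eps]])
print(weakly_symmetric_capacity(bsc))  # equals Theorem 3.3: 1 - H(eps)

# The BEC (outputs 0, erasure, 1) has rows that are permutations of each
# other but unequal column sums, so it is not weakly symmetric and
# Theorem 3.5 does not apply; its capacity comes from Theorem 3.4 directly.
alpha = 0.25
bec_capacity = 1 - alpha
```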
Compute the DMC Capacity: A General Solution

Since I(p, Q) is concave in p, the channel capacity can be formulated as the maximization of the objective function

    maximize_p  I(p, Q)    (7)
    subject to  Σ_x p(x) = 1,    (8)
                p(x) ≥ 0.    (9)

Applicable methods include Lagrangian multipliers, gradient-based search, iterative algorithms, and the KKT conditions.

Convex Programming using KKT Conditions

Theorem 3.6 (Karush-Kuhn-Tucker conditions) Suppose the objective function f(x) is differentiable and concave over the n-dimensional convex set S = {x = (x1, x2, ..., xn) | xi ≥ 0, i = 1, 2, ..., n}. Then x* = (x1*, x2*, ..., xn*) ∈ S maximizes f over S if and only if

    ∂f(x)/∂xi |_{x=x*} = 0,  if xi* > 0,    (10)
    ∂f(x)/∂xi |_{x=x*} ≤ 0,  if xi* = 0.    (11)

Note that x* is an interior point of S when all xi* > 0; if some xi* = 0, x* lies on the boundary of S.
The Capacity of a DMC

Theorem 3.7 For a discrete memoryless channel with transition probability matrix Q, the mutual information I(p, Q) achieves the capacity C at p* if and only if

    I(X = xi; Y) = C,  for all xi with p*(xi) > 0,    (12)
    I(X = xi; Y) ≤ C,  for all xi with p*(xi) = 0,    (13)

where i ∈ {1, 2, ..., n} and

    I(X = xi; Y) = Σ_{j=1}^{J} q(yj | xi) log [ q(yj | xi) / p(yj) ]    (14)

is the average mutual information carried by xi.
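One concrete instance of the "iterative algorithms" mentioned above is the classical Blahut-Arimoto algorithm (our choice; the slides do not name it). The sketch below computes C for a BSC and then checks the Theorem 3.7 condition that I(X = xi; Y) = C at every input with positive probability:

```python
import numpy as np

def blahut_arimoto(Q, iters=200):
    """Capacity (bits) of a DMC with transition matrix Q via Blahut-Arimoto.
    Returns (C, p, D), where D[i] = I(X=x_i; Y) at the final input pmf p."""
    def marginal_info(p):
        q_y = p @ Q                               # output pmf p(y)
        with np.errstate(divide="ignore", invalid="ignore"):
            log_ratio = np.where(Q > 0, np.log2(Q / q_y), 0.0)
        return (Q * log_ratio).sum(axis=1)        # eq. (14) for each x_i

    p = np.full(Q.shape[0], 1.0 / Q.shape[0])     # start from uniform input
    for _ in range(iters):
        w = p * np.exp2(marginal_info(p))         # multiplicative update
        p = w / w.sum()
    D = marginal_info(p)
    return float(p @ D), p, D

eps = 0.1
Q = np.array([[1 - eps, eps], [eps, 1 - eps]])
C, p, D = blahut_arimoto(Q)
print(C, p)               # C -> 1 - H(0.1), achieved by the uniform input
print(np.allclose(D, C))  # Theorem 3.7: I(X=x_i; Y) = C where p(x_i) > 0
```

For this symmetric channel the uniform start is already optimal, so the per-letter informations D are exactly equal to C, as (12) requires.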
The pmf that Achieves the Channel Capacity: Uniqueness

Theorem 3.8 The output distribution achieving the channel capacity is unique, and every optimal input distribution (one that maximizes the mutual information) induces it; the optimal input distribution p* itself is not necessarily unique.

Theorem 3.9 Among optimal input distributions with the maximum number of zero components, the non-zero components are uniquely determined, and the number of non-zero components is no more than the number of output symbols.

Note that the optimal input distribution with the most zero components is not unique in general, but its non-zero components are permutations of each other.
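Non-uniqueness of p* is easy to exhibit: give two inputs identical rows in Q, so moving probability mass between them changes neither I(p, Q) nor the output distribution. A small sketch (the particular matrix and pmfs are illustrative choices, not from the slides):

```python
import numpy as np

def mutual_information(p, Q):
    """I(p, Q) in bits; 0*log0 terms are dropped via nansum."""
    q_y = p @ Q
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = p[:, None] * Q * np.log2(Q / q_y)
    return float(np.nansum(terms))

# Inputs x1 and x2 have identical rows, so any split of their total
# mass is equally good: many optimal inputs, one optimal output.
Q = np.array([[0.9, 0.1],
              [0.9, 0.1],
              [0.1, 0.9]])

p_a = np.array([0.5, 0.0, 0.5])    # an optimal input with the most zeros
p_b = np.array([0.25, 0.25, 0.5])  # a different optimal input
print(mutual_information(p_a, Q), mutual_information(p_b, Q))  # equal
print(p_a @ Q, p_b @ Q)  # the capacity-achieving output pmf is the same
```

Both inputs reach the same value (the capacity of the effective BSC obtained by merging x1 and x2), illustrating Theorem 3.8: unique output distribution, non-unique input distribution.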
Cascading of Channels

For a cascade of two independent channels

    X → Y → Z,    (15)

the data-processing theorem gives I(X; Z) ≤ I(Y; Z). The capacity goes to zero as the number of cascaded stages increases. The transition probability matrix of n cascaded channels is

    Q = Π_{i=1}^{n} Qi,    (16)

the product of the individual one-step transition matrices.
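Equations (15)-(16) can be illustrated by cascading identical BSCs: the n-stage matrix is the n-th power of one stage's matrix, and applying Theorem 3.3 to it shows the capacity shrinking toward zero (ε = 0.1 and the stage counts are illustrative choices):

```python
import numpy as np

def binary_entropy(p):
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

eps = 0.1
Q1 = np.array([[1 - eps, eps], [eps, 1 - eps]])  # one BSC stage

caps = []
for n in (1, 2, 5, 10):
    Qn = np.linalg.matrix_power(Q1, n)   # eq. (16): product of n stages
    eff = Qn[0, 1]                       # effective flip probability
    caps.append(1 - binary_entropy(eff)) # Theorem 3.3 on the cascade
print(caps)  # strictly decreasing toward zero
```

Each extra stage raises the effective flip probability toward 1/2, so the cascade's capacity decays toward zero, matching the claim above.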