应用信息论基础 (4).pdf (Fundamentals of Applied Information Theory, part 4)

Lecture 3. Channel Capacity
Fundamentals of Information Theory, 2017 Fall, TBSI

Outline
1. Channel Capacity and its Properties (Definition of Channel Capacity; Discrete Memoryless Channel; Channel Capacity and Its Properties)
2. Discrete Memoryless Channel (Simple Examples of DMC; Computing the DMC Capacity; The pmf Achieving the Channel Capacity; Cascading of Channels; Parallel Channels)
3. Continuous and Analog Channel (Capacity-cost Function; Additive Noise Continuous Channel; Water Filling; The Notion of Analog Channel Capacity; Shannon Formula; Insights of the Shannon Formula)
4. Joint Typicality
5. Channel Coding
6. Summary

1. Channel Capacity and its Properties

Communications and channels

Communication is the reliable transmission of messages from one peer to another, following Shannon's channel model of 1948. Channels are classified by the continuity of their values in time and amplitude: discrete (digital) channels, continuous channels, and analog channels.

Discrete Memoryless Channel (DMC)

Definition 3.1. A discrete memoryless channel is defined by an input alphabet $\mathcal{X}$, an output alphabet $\mathcal{Y}$, and a one-step transition probability matrix $Q = [q(y_j|x_i)]_{i,j}$. Note that $q(y|x)$ is the conditional probability of $y$ given $x$, and the channel matrix $Q$ has $|\mathcal{X}|$ rows and $|\mathcal{Y}|$ columns.

Theorem 3.1. Let the channel input be $X = (X_1, X_2, \ldots, X_n)$ and the output be $Y = (Y_1, Y_2, \ldots, Y_n)$. A discrete memoryless channel satisfies
$$I(X;Y) \le \sum_{i=1}^{n} I(X_i; Y_i). \tag{1}$$

Channel capacity and its properties

Definition 3.2. The capacity of a channel is defined as
$$C = \max_{p(x)} I(X;Y). \tag{2}$$
The information-theoretic channel capacity is the maximum mutual information between the input and the output of the channel, over all possible input distributions $p(x)$, for a given communication channel. For discrete memoryless channels the equivalent formulation is
$$C = \max_{p(x)} I(p, Q), \tag{3}$$
where the mutual information is rewritten as a function of the input distribution $p$ and the transition matrix $Q$.

Theorem 3.2. The basic properties of $C$ are:
1. Non-negativity: $C \ge 0$.
2. Boundedness: $C \le \log|\mathcal{X}|$ and $C \le \log|\mathcal{Y}|$.

More importantly, $I(X;Y)$ is a continuous concave function of $p(x)$, so the maximum value, i.e. $C$, can be obtained by convex optimization. A classical example of a channel model is the noisy typewriter.

Definition 3.3. If the rows of the transition probability matrix are permutations of each other, and likewise the columns, the channel is symmetric. If the rows are permutations of each other and the column sums are equal, the channel is weakly symmetric.
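To make Definition 3.2 concrete, here is a minimal Python sketch (the function name `dmc_mutual_information` and the example matrix are mine, not from the slides) that evaluates $I(p,Q)$ for a DMC and checks it on the noisy typewriter mentioned above:

```python
import numpy as np

def dmc_mutual_information(p, Q):
    """I(p, Q) in bits for a DMC with input distribution p and
    transition matrix Q of shape (|X|, |Y|), rows summing to 1."""
    p = np.asarray(p, dtype=float)
    Q = np.asarray(Q, dtype=float)
    py = p @ Q  # output distribution p(y) = sum_x p(x) q(y|x)
    # I(X;Y) = sum_{x,y} p(x) q(y|x) log2(q(y|x)/p(y)), with 0 log 0 := 0
    ratio = np.where(Q > 0, Q / np.where(py > 0, py, 1.0), 1.0)
    return float(np.sum(p[:, None] * Q * np.log2(ratio)))

# Noisy typewriter over 4 symbols: each input is received as itself or
# the next symbol, each with probability 1/2; C = log2(4) - 1 = 1 bit.
Q = 0.5 * (np.eye(4) + np.roll(np.eye(4), 1, axis=1))
print(dmc_mutual_information(np.full(4, 0.25), Q))  # -> 1.0
```

Evaluating $I(p,Q)$ at several candidate input distributions is a crude but direct way to see the concavity claimed in Theorem 3.2; the systematic optimization methods follow in Section 2.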
2. Discrete Memoryless Channel

Simple examples of DMC

Theorem 3.3. The capacity of the Binary Symmetric Channel (BSC) is
$$C = 1 - H(\varepsilon), \tag{4}$$
where $\varepsilon$ is the error rate, i.e., the flipping probability of each transmitted bit.

Theorem 3.4. The capacity of the Binary Erasure Channel (BEC) is
$$C = 1 - \delta, \tag{5}$$
where $\delta$ is the erasure probability, i.e., the bit-loss rate.

Theorem 3.5. The capacity of a weakly symmetric channel $Q$ is
$$C = \log|\mathcal{Y}| - H(\text{row of } Q), \tag{6}$$
achieved by the uniform input distribution.

Computing the DMC capacity

Since $I(p,Q)$ is concave in $p$, the channel capacity can be formulated as the optimization problem
$$\text{maximize}_{p} \quad I(p, Q), \tag{7}$$
$$\text{subject to} \quad \sum_x p(x) = 1, \tag{8}$$
$$\qquad\qquad\quad p(x) \ge 0. \tag{9}$$
Applicable methods include Lagrange multipliers, gradient-based search, iterative algorithms (a Blahut-Arimoto sketch appears at the end of this subsection), and the KKT conditions.

Convex programming using the KKT conditions

Theorem 3.6 (Karush-Kuhn-Tucker conditions). Suppose the objective function $f(x)$ is differentiable and concave over the $n$-dimensional convex set $S = \{x = (x_1, x_2, \ldots, x_n) \mid x_i \ge 0,\ i = 1, 2, \ldots, n\}$. Then $x^* = (x_1^*, x_2^*, \ldots, x_n^*) \in S$ maximizes $f$ over $S$ if and only if
$$\frac{\partial f(x)}{\partial x_i}\bigg|_{x = x^*} = 0 \quad \text{if } x_i^* > 0, \tag{10}$$
$$\frac{\partial f(x)}{\partial x_i}\bigg|_{x = x^*} \le 0 \quad \text{if } x_i^* = 0. \tag{11}$$
Note that $x^*$ is an interior point of $S$ when all $x_i^* > 0$; if some $x_i^* = 0$, then $x^*$ lies on the boundary of $S$.

The capacity of a DMC

Theorem 3.7. For a discrete memoryless channel with transition probability matrix $Q$, the mutual information $I(p,Q)$ is maximized at $p^*$, achieving the capacity $C$, if and only if
$$I(X = x_i; Y) = C \quad \text{for all } x_i \text{ with } p(x_i) > 0, \tag{12}$$
$$I(X = x_i; Y) \le C \quad \text{for all } x_i \text{ with } p(x_i) = 0, \tag{13}$$
where $i \in \{1, 2, \ldots, n\}$ and
$$I(X = x_i; Y) = \sum_{j=1}^{J} q(y_j|x_i) \log \frac{q(y_j|x_i)}{p(y_j)} \tag{14}$$
is the average mutual information carried by $x_i$.

The pmf achieving the channel capacity

Theorem 3.8. The output distribution at capacity is unique, and every optimal input distribution (i.e., every maximizer of the mutual information) induces it. The optimal input distribution $p^*$ itself is not necessarily unique.

Theorem 3.9. The non-zero components of an optimal input distribution with the maximal number of zero components are uniquely determined, and the number of non-zero components is at most the number of output symbols. Note that the optimal input distributions with the most zero components are not unique, but their non-zero components are permutations of each other.
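Among the iterative algorithms mentioned above, the classical choice is the Blahut-Arimoto algorithm. The sketch below is my own implementation, not code from the slides; it also illustrates Theorem 3.7, since at convergence $I(X = x_i; Y)$ equals $C$ on every input with $p(x_i) > 0$.

```python
import numpy as np

def blahut_arimoto(Q, tol=1e-10, max_iter=100_000):
    """Capacity (in bits) of the DMC with transition matrix Q.

    Q : array of shape (|X|, |Y|) with Q[i, j] = q(y_j | x_i), rows sum to 1.
    Returns (C, p_star)."""
    Q = np.asarray(Q, dtype=float)
    p = np.full(Q.shape[0], 1.0 / Q.shape[0])   # start from the uniform input
    logQ = np.where(Q > 0, np.log2(np.where(Q > 0, Q, 1.0)), 0.0)
    for _ in range(max_iter):
        py = p @ Q                               # output distribution p(y)
        # D[i] = I(X = x_i; Y), cf. Eq. (14)
        D = np.sum(Q * (logQ - np.log2(np.where(py > 0, py, 1.0))), axis=1)
        lower = np.log2(p @ 2.0 ** D)            # lower bound on C
        if D.max() - lower < tol:                # upper bound: max_i D[i]
            break
        p = p * 2.0 ** D                         # multiplicative update
        p /= p.sum()
    return lower, p

# BSC with flip probability 0.1: C = 1 - H(0.1) ≈ 0.5310 bits (Theorem 3.3).
C, p_star = blahut_arimoto(np.array([[0.9, 0.1], [0.1, 0.9]]))
print(round(C, 4), p_star)                       # -> 0.531 [0.5 0.5]
```

The two bounds $\log_2 \sum_i p_i 2^{D_i} \le C \le \max_i D_i$ coincide exactly when the condition of Theorem 3.7 holds, which is what the stopping rule tests.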
The cascade of independent channels

For a cascade of two independent channels,
$$X \to Y \to Z, \tag{15}$$
the data processing theorem gives $I(X;Z) \le I(Y;Z)$. The capacity goes to zero as the number of cascaded stages increases. The transition probability matrix of $n$ cascaded channels is
$$Q = \prod_{i=1}^{n} Q_i, \tag{16}$$
the product of the individual channel matrices. The data processing theorem tells us that cascading can only lose channel capacity, never gain it; a numerical sketch follows.
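A quick numerical illustration of Eq. (16) and the capacity decay, reusing `blahut_arimoto()` from the previous sketch (the values in the final comment are approximate):

```python
import numpy as np
from functools import reduce

def bsc(eps):
    """Transition matrix of a BSC with flip probability eps."""
    return np.array([[1.0 - eps, eps],
                     [eps, 1.0 - eps]])

# Cascade n identical BSC(0.1) stages: Q = Q_1 Q_2 ... Q_n, cf. Eq. (16).
for n in (1, 2, 4, 8):
    Q = reduce(np.matmul, [bsc(0.1)] * n)
    C, _ = blahut_arimoto(Q)                 # from the sketch above
    print(n, round(C, 4))
# Capacity decays toward zero: ~0.531, 0.320, 0.125, 0.020 bits per use.
```

Each cascade of $n$ BSC stages is again a BSC with flip probability $\varepsilon_n = \tfrac{1}{2}\big(1 - (1 - 2\varepsilon)^n\big)$, so the same decay can be read off Theorem 3.3 directly.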
Parallel channels

Theorem 3.10. Parallel channels sharing the same input $X$, with different outputs $Y_1, Y_2, \ldots, Y_n$:
1. The capacity of parallel channels with the same input is greater than that of any single channel and less than $\max H(X)$.
2. Diversity in wireless communications is an instance of this case.

Theorem 3.11. Channels used in parallel with multiple inputs and multiple outputs (MIMO):
1. The sub-channels are independent of each other, and $X = (X_1, X_2, \ldots, X_n)$ is transmitted through them to yield $Y = (Y_1, Y_2, \ldots, Y_n)$ simultaneously. The equivalent capacity is the sum of the capacities of the parallel sub-channels.
2. Multiplexing in communications is an instance of this case.

Theorem 3.12. The probabilistic utilization of $n$ channels, where channel $i$ is used with probability $p_i(C) = 2^{C_i - C}$, gives the equivalent capacity
$$C = \log \sum_{i=1}^{n} 2^{C_i}.$$
Opportunistic communication is an instance of this case.

3. Continuous and Analog Channel

Capacity-cost function

A continuous channel is discrete in time and continuous in values, subject to a power constraint.

Definition 3.4. For a continuous memoryless channel $\{X, q(y|x), Y\}$, define the cost function $b(\cdot) \ge 0$ for a sequence $x = (x_1, x_2, \ldots, x_n)$. Let the joint probability distribution of the random vector $X = (X_1, X_2, \ldots, X_n)$ be $p(x)$. Then the average cost is
$$E[b(X)] = \sum_{x} p(x)\, b(x). \tag{17}$$

Definition 3.5. Let $X$ and $Y$ be the $n$-dimensional input and output vectors of a continuous channel. The capacity-cost function is defined as
$$C(\beta) = \lim_{n \to \infty} \frac{1}{n} \max_{p(x):\, E[b(X)] \le n\beta} I(X;Y). \tag{18}$$

Continuous memoryless channel with additive noise

With a stationary memoryless input, the mutual information simplifies to $n\,I(X;Y)$, and the capacity-cost function becomes
$$C(\beta) = \max_{p(x):\, E[b(X)] \le \beta} I(X;Y). \tag{19}$$
The channel output is $Y = X + Z$, where $Z$ is memoryless stationary noise added to the input $X$.

Theorem 3.13. The capacity-cost function of the continuous memoryless additive-noise channel is
$$C(P_S) = \max_{p(x):\, E[X^2] \le P_S} \big[h(Y) - h(Z)\big]. \tag{20}$$

Additive White Gaussian Noise (AWGN) channel

Theorem 3.14. The capacity-cost function of the memoryless additive Gaussian noise channel is
$$C(P_S) = \frac{1}{2} \log\Big(1 + \frac{P_S}{P_N}\Big), \tag{21}$$
where $P_S/P_N$ is the signal-to-noise ratio. The AWGN channel can always be fully exploited by a Gaussian input at full transmission power.

Theorem 3.15. If the input signal $X$ is Gaussian, the mutual information is minimized by Gaussian noise under the power constraint $P_N$. That is, Gaussian noise is the worst noise when the input is Gaussian.

Continuous parallel channels and water-filling

Let the input vector be $X = (X_1, X_2, \ldots, X_n)$ with power $P_{S_i}$ allocated to the $i$-th element, and let the channel noise vector be $Z = (Z_1, Z_2, \ldots, Z_n)$ with noise powers $P_{N_i}$, $i = 1, 2, \ldots, n$. The capacity of the parallel channels is
$$C(P_S) = \max_{p(x):\, \sum_i P_{S_i} \le P_0} I(X;Y), \tag{22}$$
where $P_0$ is the total transmit-power budget. The capacity is achieved by allocating power according to water-filling, derived via Lagrange multipliers.
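The slides do not spell out the water-filling computation; below is a minimal sketch of the standard solution (the function name and the bisection approach are mine):

```python
import numpy as np

def water_filling(noise_powers, total_power, iters=100):
    """Water-filling over parallel Gaussian channels: find the water level
    mu such that P_i = max(mu - N_i, 0) and sum_i P_i = P0 (by bisection).
    Returns (power allocation, capacity in bits per channel use)."""
    N = np.asarray(noise_powers, dtype=float)
    lo, hi = N.min(), N.max() + total_power
    for _ in range(iters):
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - N, 0.0).sum() > total_power:
            hi = mu
        else:
            lo = mu
    P = np.maximum(mu - N, 0.0)
    C = 0.5 * np.log2(1.0 + P / N).sum()   # sum of per-channel capacities
    return P, C

# Three sub-channels with unequal noise powers and total budget P0 = 3:
P, C = water_filling([0.5, 1.0, 2.0], 3.0)
print(P.round(3), round(C, 3))             # -> [1.667 1.167 0.167] 1.673
```

Power pours into the quietest sub-channels first; with a small enough budget, the noisiest sub-channel receives no power at all.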
The notion of analog channel capacity

An analog channel is continuous in both time and values, e.g., fiber, wire, or electromagnetic waves. A special case is the additive white Gaussian noise (AWGN) channel:
1. Limited bandwidth: $W$.
2. Additive noise: $y(t) = x(t) + z(t)$.
3. White noise: a stationary ergodic stochastic process with uniform spectral density.
4. Gaussian noise: the noise is Gaussian at any time instant.

Problem formulation:
1. Input signal $x(t)$ with bandwidth limited to $[-W, W]$, duration $T$, and power constraint $P_S$.
2. Output signal $y(t) = x(t) + z(t)$.
3. The noise $z(t)$ is additive white Gaussian with zero mean, and its double-sided spectral density is
$$N(f) = \frac{N_0}{2} \quad \text{if } |f| \le W, \tag{23}$$
and zero otherwise.

Sampling the analog signal in time is equivalent to a parallel combination of additive Gaussian channels, so the total capacity is
$$C_T(P_S) = \frac{1}{2} \sum_{i=1}^{2WT} \log\Big(1 + \frac{P_{S_i}}{P_{N_i}}\Big) \tag{24}$$
under the noise power constraint $P_N = 2WT \cdot \frac{N_0}{2} = WTN_0$. Water-filling allocates the transmission power as
$$P_{S_i} = \frac{P_S T + P_N}{2WT} - \frac{N_0}{2} = \frac{P_S}{2W}, \tag{25}$$
so that
$$\frac{P_{S_i}}{P_{N_i}} = \frac{P_S}{W N_0}, \tag{26}$$
which leads to the Shannon formula.

Shannon formula

Theorem 3.16. The capacity of the AWGN channel is
$$C = W \log\Big(1 + \frac{P_S}{N_0 W}\Big). \tag{27}$$

Remarks: the physical dimensions of the terms in the Shannon formula are
1. capacity $C$: bps (bits per second);
2. bandwidth $W$: Hz, i.e., s$^{-1}$;
3. signal power $P_S$: Watt;
4. noise spectral density $N_0$: Watt/Hz.

There are two ways to increase the channel capacity:
1. Expand the bandwidth $W$, with the capacity bounded by
$$\lim_{W \to \infty} C(P_S) = \lim_{W \to \infty} \frac{P_S}{N_0} \cdot \frac{\log\big(1 + \frac{P_S}{N_0 W}\big)}{\frac{P_S}{N_0 W}} = \frac{P_S}{N_0} \log e \approx 1.44\, \frac{P_S}{N_0}. \tag{28}$$
2. Increase the transmission power $P_S$; in the high-SNR regime,
$$C(P_S) \approx W \log \frac{P_S}{N_0 W}. \tag{29}$$

Insights of the Shannon formula: information and thermodynamics

When approaching the capacity, the energy per bit is
$$E_b = \frac{P_S}{C} \quad \text{(J/bit)}. \tag{30}$$
The normalized SNR $E_b/N_0$ is
$$\frac{E_b}{N_0} = \frac{P_S}{N_0 C} = \frac{P_S}{N_0 W \log\big(1 + \frac{P_S}{N_0 W}\big)} = \frac{\text{SNR}}{\log(1 + \text{SNR})}, \tag{31}$$
lower bounded by the Shannon limit $\ln 2 \approx 0.693$, i.e., $-1.59$ dB, as the SNR approaches zero.

Remarks:
1. The Shannon limit indicates the minimum energy required to transmit one bit.
2. Thermodynamics gives the noise power spectral density $N_0 = kT$ in W/Hz, where $k$ is the Boltzmann constant and $T$ is the temperature in Kelvin. Therefore, from the point of view of physics, the minimum energy required per bit is $E_b \ge kT \ln 2 \approx 0.693\, kT$.
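To close, a small numerical sketch of Eqs. (27), (28) and (31); the link-budget numbers are made up for illustration:

```python
import numpy as np

def awgn_capacity_bps(W, Ps, N0):
    """Shannon formula, Eq. (27): C = W log2(1 + Ps/(N0 W)) in bit/s."""
    return W * np.log2(1.0 + Ps / (N0 * W))

Ps, N0 = 1e-9, 1e-12                 # hypothetical link budget: Ps/N0 = 1000
for W in (1e3, 1e6, 1e9):
    print(f"W = {W:8.0e} Hz -> C = {awgn_capacity_bps(W, Ps, N0):8.1f} bit/s")
# C saturates at (Ps/N0) log2(e) ≈ 1.44 Ps/N0 = 1442.7 bit/s, cf. Eq. (28).

# Eb/N0 = SNR / log2(1 + SNR) tends to the Shannon limit ln 2 as SNR -> 0:
for snr in (1.0, 0.1, 0.001):
    ebn0_db = 10 * np.log10(snr / np.log2(1.0 + snr))
    print(f"SNR = {snr:5.3f} -> Eb/N0 = {ebn0_db:+.2f} dB")
# -> +0.00 dB, -1.38 dB, -1.59 dB; the limit is 10 log10(ln 2) ≈ -1.59 dB.
```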
