Experiment 11: Multiple Regression and Ridge Regression Analysis (实验11-多元及岭回归分析.docx)

Chongqing Technology and Business University, School of Mathematics and Statistics
Lab Report, Statistics Major Experimental Course

Course: Statistics Major Experiments    Instructor: Ye Yong    Class: Statistics Class 3    Student: Huang Kunlong    Student ID: 2012101328

Experiment: Experiment 11, Multiple Regression and Ridge Regression Analysis
Date: 2015-6-10    Location: Room 81010

Objective: Master variable selection for multiple regression models and the ideas and procedures of ridge regression analysis.

Content:
1. Estimate a model of the factors influencing per-capita housing area in Beijing from the given data file, and carry out the corresponding analysis.
2. Build a model of the factors influencing per-capita housing area in Chongqing: collect and organize the indicator data from statistical yearbooks, then estimate and analyze the model.

Answers to the thinking questions:
1. What is the variance inflation factor (VIF) used for, what is its formula, and what are its decision criteria?
Answer: The VIF is used to diagnose whether an explanatory variable is involved in multicollinearity. For regressor xj it is defined as VIF_j = 1 / (1 - R_j^2), where R_j^2 is the coefficient of determination from regressing xj on all the other regressors. The larger the VIF, the more severe the multicollinearity: for 0 < VIF < 10 there is no serious multicollinearity; for 10 <= VIF < 100 there is fairly strong multicollinearity; for VIF >= 100 there is severe multicollinearity.

Programs, basic steps, and results:

1. Estimate the model of the factors influencing per-capita housing area in Beijing and analyze it.

(1) First determine the dependent and independent variables. According to the problem, the dependent variable is
y   per-capita housing area
and the independent variables are
x1  per-capita annual income
x2  per-capita disposable income
x3  urban savings deposit balance
x4  per-capita savings balance
x5  gross domestic product (GDP)
x6  per-capita GDP
x7  basic investment
x8  per-capita basic investment

(2) Then run a multiple linear regression in SPSS. The results are:

Model Summary(b)
Model   R        R Square   Adjusted R Square   Std. Error of the Estimate   Durbin-Watson
1       .994(a)  .988       .981                .24634                       1.681
a. Predictors: (Constant), x8, x7, x3, x6, x1, x2, x4
b. Dependent Variable: y

Analysis: From the fitted model, R^2 = 0.988 and adjusted R^2 = 0.981, so the explanatory variables account for about 98.1% of the variation in the dependent variable; the model fits well.

ANOVA(b)
Model 1      Sum of Squares   df   Mean Square   F         Sig.
Regression   59.608           7    8.515         140.325   .000(a)
Residual     .728             12   .061
Total        60.336           19
a. Predictors: (Constant), x8, x7, x3, x6, x1, x2, x4
b. Dependent Variable: y

Analysis: This is the test of the overall significance of the model (F test). The F statistic is 140.325 with p = 0.000 < 0.05, so the model passes the overall significance test and the fit is valid.

Excluded Variables(b)
Model 1   Beta In    t       Sig.   Partial Correlation   Tolerance   VIF         Minimum Tolerance
x5        10.462(a)  1.469   .170   .405                  1.809E-5    55278.779   1.780E-5
a. Predictors in the Model: (Constant), x8, x7, x3, x6, x1, x2, x4
b. Dependent Variable: y

Analysis: In building the multiple linear regression model, x5 was excluded because it is very strongly collinear with the other explanatory variables in the model.

Coefficients(a)
Model 1      B          Std. Error   Beta     t        Sig.   Tolerance   VIF
(Constant)   3.964      .241                  16.477   .000
x1           .000       .001         -.956    -.817    .430   .001        1361.278
x2           -.001      .001         -2.180   -2.195   .049   .001        980.463
x3           .001       .002         .749     .627     .542   .001        1418.704
x4           .000       .000         -2.480   -2.067   .061   .001        1431.296
x6           .001       .000         5.155    6.301    .000   .002        665.397
x7           3.285E-7   .000         .349     2.505    .028   .052        19.316
x8           .000       .000         .330     .972     .350   .009        114.391
a. Dependent Variable: y

Analysis: This is the significance test of the individual coefficients (t test). The constant has p = 0.000 < 0.05 and passes; x1 has p = 0.430 > 0.05 and does not pass; x2 has p = 0.049 < 0.05 and passes; x3 has p = 0.542 > 0.05 and does not pass; x4 has p = 0.061 > 0.05 and does not pass; x6 has p = 0.000 < 0.05 and passes; x7 has p = 0.028 < 0.05 and passes; x8 has p = 0.350 > 0.05 and does not pass. From the variance inflation factors, x1, x2, x3, x4, x6 and x8 all show severe multicollinearity; only x7 has a comparatively small VIF (19.316), although even that exceeds 10.

Collinearity Diagnostics(a)
Model 1                                         Variance Proportions
Dimension   Eigenvalue   Condition Index   (Constant)  x1    x2    x3    x4    x6    x7    x8
1           7.444        1.000             .00         .00   .00   .00   .00   .00   .00   .00
2           .484         3.923             .09         .00   .00   .00   .00   .00   .00   .00
3           .045         12.870            .00         .00   .00   .00   .00   .00   .45   .00
4           .023         18.096            .21         .00   .00   .00   .00   .00   .01   .08
5           .003         48.783            .30         .01   .01   .02   .02   .06   .37   .19
6           .001         99.386            .00         .14   .00   .07   .17   .17   .10   .03
7           .000         144.498           .09         .04   .95   .02   .00   .29   .05   .12
8           .000         239.240           .31         .80   .04   .89   .81   .48   .02   .58
a. Dependent Variable: y

Residuals Statistics(a)
                       Minimum   Maximum   Mean     Std. Deviation   N
Predicted Value        5.3141    11.1214   7.8620   1.77123          20
Residual               -.41181   .38168    .00000   .19577           20
Std. Predicted Value   -1.438    1.840     .000     1.000            20
Std. Residual          -1.672    1.549     .000     .795             20
a. Dependent Variable: y
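For reference, the same fit and VIF diagnostics can also be reproduced outside SPSS. The following is only a minimal Python sketch, assuming the data sit in a hypothetical CSV file beijing_housing.csv with columns y and x1 to x8 (the file name and column labels are illustrative and not part of the original report); x5 is left out of the fit because SPSS excluded it above. The large VIFs reported here are what motivate the ridge correction in the next step.

    # Illustrative Python equivalent of the SPSS steps above (not the procedure used in the report).
    # Assumes a hypothetical file "beijing_housing.csv" with columns y, x1..x8.
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    data = pd.read_csv("beijing_housing.csv")                      # hypothetical path and columns
    y = data["y"]
    X = sm.add_constant(data[["x1", "x2", "x3", "x4", "x6", "x7", "x8"]])  # x5 excluded, as in the report

    # Ordinary least squares fit: the summary reports R^2, adjusted R^2, the F test and the t tests
    ols = sm.OLS(y, X).fit()
    print(ols.summary())

    # VIF_j = 1 / (1 - R_j^2), computed for each regressor (the constant is skipped)
    for j, name in enumerate(X.columns):
        if name != "const":
            print(name, variance_inflation_factor(X.values, j))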
(3) Correct the model using ridge regression.

Ridge regression stabilizes the estimation by introducing a bias constant c, making the estimates more stable and significant. To run ridge regression in SPSS, open a new syntax window and include the ridge regression macro (the quoted string is the actual path of the file):

Include "d:Ridge regression.sps".

The format of the ridge regression command is:

ridgereg enter = <list of independent variables>
  /dep = <dependent variable>
  /start = <initial value of c, default 0>
  /stop = <final value of c, default 1>
  /inc = <step size, default 0.05>
  /k = <c, to specify a bias coefficient and output the detailed regression results>.

The command must end with a period. Enter

ridgereg enter=x1 x2 x3 x4 x6 x7 x8 /dep = y /inc=0.01.

and click the Run button. The results are:

R-SQUARE AND BETA COEFFICIENTS FOR ESTIMATED VALUES OF K

   K      RSQ       x1        x2        x3        x4        x6        x7        x8
.00000  .98793  -.955631  -2.18005   .748792  -2.47981  5.154638   .349141   .329859
.01000  .94831   .378142   .176599  -.612495  -.498101  1.173739   .185817   .140657
.02000  .93217   .308957   .200793  -.400480  -.301644   .779982   .112638   .242594
.03000  .92303   .270773   .197581  -.290430  -.203683   .608333   .085146   .273692
.04000  .91693   .246958   .192037  -.221381  -.143939   .510876   .073335   .282129
.05000  .91246   .230606   .186853  -.173260  -.103246   .447625   .068238   .281821
.06000  .90897   .218606   .182354  -.137464  -.073540   .403059   .066384   .277872
.07000  .90614   .209373   .178488  -.109634  -.050802   .369855   .066208   .272429
.08000  .90378   .202011   .175147  -.087294  -.032788   .344093   .066928   .266472
.09000  .90176   .195980   .172235  -.068922  -.018140   .323481   .068126   .260469
.10000  .90001   .190929   .169671  -.053524  -.005982   .306587   .069571   .254643
.11000  .89847   .186626   .167394  -.040419   .004278   .292467   .071127   .249094
.12000  .89710   .182904   .165354  -.029124   .013054   .280476   .072714   .243863
.13000  .89588   .179646   .163513  -.019285   .020647   .270154   .074287   .238957
.14000  .89477   .176764   .161841  -.010636   .027280   .261166   .075818   .234368
.15000  .89376   .174190   .160313  -.002974   .033125   .253263   .077291   .230079
.16000  .89283   .171875   .158908   .003862   .038311   .246253   .078698   .226069
.17000  .89197   .169776   .157611   .009996   .042943   .239989   .080036   .222318
.18000  .89118   .167863   .156407   .015531   .047103   .234353   .081304   .218805
.19000  .89045   .166108   .155285   .020549   .050859   .229252   .082503   .215509
.20000  .88976   .164491   .154236   .025117   .054264   .224610   .083636   .212414
.21000  .88911   .162995   .153252   .029293   .057364   .220365   .084705   .209501
.22000  .88850   .161603   .152325   .033124   .060197   .216467   .085713   .206756
.23000  .88792   .160304   .151449   .036648   .062795   .212871   .086664   .204165
.24000  .88738   .159088   .150620   .039902   .065183   .209544   .087561   .201715
.25000  .88686   .157946   .149833   .042913   .067386   .206453   .088407   .199395
.26000  .88636   .156870   .149084   .045706   .069423   .203573   .089205   .197194
.27000  .88588   .155853   .148370   .048304   .071311   .200883   .089958   .195104
.28000  .88543   .154890   .147687   .050725   .073064   .198362   .090669   .193116
.29000  .88499   .153975   .147033   .052985   .074695   .195994   .091340   .191221
.30000  .88457   .153105   .146406   .055100   .076216   .193764   .091975   .189415
.31000  .88416   .152276   .145802   .057082   .077637   .191660   .092574   .187689
.32000  .88376   .151483   .145222   .058942   .078966   .189671   .093141   .186039
.33000  .88338   .150724   .144662   .060690   .080210   .187786   .093676   .184458
.34000  .88301   .149997   .144122   .062336   .081378   .185997   .094183   .182944
.35000  .88264   .149298   .143599   .063888   .082475   .184296   .094662   .181490
.36000  .88229   .148626   .143093   .065353   .083507   .182675   .095116   .180094
.37000  .88194   .147979   .142603   .066736   .084478   .181130   .095546   .178751
.38000  .88160   .147355   .142127   .068045   .085394   .179654   .095952   .177458
.39000  .88127   .146752   .141665   .069285   .086258   .178241   .096338   .176212
.40000  .88095   .146169   .141215   .070460   .087073   .176889   .096702   .175011
.41000  .88063   .145604   .140778   .071574   .087844   .175591   .097048   .173851
.42000  .88031   .145057   .140351   .072633   .088573   .174345   .097375   .172731
.43000  .88000   .144526   .139936   .073639   .089263   .173148   .097685   .171648
.44000  .87970   .144011   .139530   .074595   .089916   .171995   .097979   .170599
.45000  .87939   .143510   .139133   .075506   .090535   .170884   .098257   .169584
.46000  .87910   .143023   .138746   .076373   .091123   .169813   .098520   .168600
.47000  .87880   .142548   .138367   .077200   .091680   .168779   .098770   .167646
.48000  .87851   .142085   .137996   .077988   .092209   .167780   .099006   .166720
.49000  .87822   .141634   .137632   .078740   .092711   .166813   .099229   .165820
.50000  .87794   .141193   .137276   .079458   .093188   .165878   .099441   .164946
.51000  .87765   .140763   .136926   .080144   .093642   .164972   .099641   .164096
.52000  .87737   .140342   .136583   .080799   .094073   .164094   .099830   .163269
.53000  .87709   .139931   .136247   .081426   .094484   .163241   .100009   .162464
.54000  .87681   .139528   .135916   .082026   .094874   .162414   .100178   .161679
.55000  .87653   .139133   .135591   .082599   .095245   .161610   .100337   .160915
.56000  .87626   .138747   .135271   .083148   .095598   .160828   .100488   .160169
.57000  .87598   .138368   .134956   .083674   .095935   .160067   .100630   .159442
.58000  .87571   .137996   .134646   .084178   .096255   .159327   .100763   .158732
.59000  .87544   .137631   .134341   .084661   .096560   .158606   .100889   .158039
.60000  .87517   .137273   .134041   .085124   .096850   .157903   .101007   .157361
.61000  .87489   .136921   .133745   .085568   .097126   .157217   .101118   .156699
.62000  .87462   .136575   .133453   .085993   .097390   .156548   .101222   .156051
.63000  .87435   .136234   .133165   .086402   .097640   .155895   .101319   .155417
.64000  .87408   .135900   .132881   .086793   .097879   .155257   .101410   .154796
.65000  .87381   .135570   .132600   .087169   .098106   .154634   .101495   .154189
.66000  .87355   .135246   .132324   .087530   .098322   .154024   .101574   .153594
.67000  .87328   .134926   .132050   .087876   .098527   .153428   .101647   .153011
.68000  .87301   .134611   .131780   .088209   .098723   .152844   .101715   .152439
.69000  .87274   .134301   .131513   .088528   .098909   .152273   .101778   .151878
.70000  .87247   .133995   .131250   .088835   .099086   .151713   .101836   .151328
.71000  .87220   .133694   .130989   .089129   .099254   .151165   .101889   .150788
.72000  .87193   .133396   .130731   .089412   .099413   .150627   .101938   .150258
.73000  .87166   .133102   .130476   .089684   .099565   .150100   .101982   .149738
.74000  .87139   .132812   .130224   .089945   .099709   .149583   .102021   .149227
.75000  .87112   .132526   .129974   .090195   .099845   .149075   .102057   .148724
.76000  .87085   .132243   .129727   .090436   .099974   .148577   .102089   .148230
.77000  .87058   .131964   .129482   .090667   .100097   .148088   .102116   .147745
.78000  .87031   .131688   .129240   .090889   .100213   .147607   .102141   .147267
.79000  .87004   .131415   .129000   .091102   .100322   .147135   .102161   .146798
.80000  .86976   .131145   .128762   .091307   .100426   .146670   .102179   .146335
.81000  .86949   .130878   .128527   .091503   .100523   .146214   .102193   .145880
.82000  .86922   .130614   .128294   .091692   .100615   .145764   .102203   .145432
.83000  .86894   .130353   .128062   .091873   .100702   .145322   .102211   .144991
.84000  .86867   .130095   .127833   .092047   .100783   .144887   .102216   .144556
.85000  .86840   .129839   .127606   .092213   .100860   .144459   .102218   .144128
.86000  .86812   .129586   .127380   .092373   .100931   .144038   .102217   .143706
.87000  .86784   .129335   .127157   .092526   .100998   .143622   .102213   .143290
.88000  .86757   .129087   .126935   .092673   .101060   .143213   .102207   .142880
.89000  .86729   .128841   .126715   .092814   .101118   .142810   .102199   .142476
.90000  .86701   .128598   .126497   .092949   .101172   .142412   .102188   .142077
.91000  .86673   .128357   .126280   .093078   .101221   .142021   .102174   .141683
.92000  .86645   .128118   .126065   .093202   .101267   .141634   .102159   .141295
.93000  .86617   .127881   .125852   .093320   .101309   .141253   .102141   .140912
.94000  .86589   .127646   .125640   .093433   .101347   .140877   .102121   .140533
.95000  .86561   .127413   .125430   .093541   .101382   .140506   .102099   .140160
.96000  .86532   .127182   .125221   .093645   .101414   .140139   .102075   .139791
.97000  .86504   .126953   .125013   .093743   .101442   .139778   .102050   .139427
.98000  .86475   .126726   .124808   .093837   .101466   .139421   .102022   .139067
.99000  .86447   .126501   .124603   .093927   .101488   .139068   .101993   .138711
1.0000  .86418   .126277   .124400   .094012   .101507   .138720   .101962   .138360

It can be seen that when the bias coefficient c = 0.04 the parameter estimates become stable, the variance inflation factors fall below 10, and the collinearity is eliminated. Run the detailed ridge regression estimate: enter

ridgereg enter=x1 x2 x3 x4 x6 x7 x8 /dep = y /k=0.04.

and click the Run button. The results are:

* Ridge Regression with k = 0.04 *

Mult R     .9575649365
RSquare    .9169306076
Adj RSqu   .8684734620
SE         .6462778971

ANOVA table
            df        SS       MS
Regress    7.000    55.324    7.903
Residual  12.000     5.012     .418

   F value        Sig F
 18.92250558    .00001362

------------ Variables in the Equation ------------
        B            SE(B)          Beta         B/SE(B)
x1    .00011390    .00003901     .24695791     2.91987225
x2    .00010380    .00003940     .19203674     2.63494995
x3   -.00044223    .00024457    -.22138060    -1.80816742
x4   -.00002525    .00001708    -.14393913    -1.47795434
x6    .00013360    .00002858
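For readers without the SPSS ridgereg macro, a ridge trace like the one above can be approximated in Python. The sketch below uses the same hypothetical data file and column names as before and mimics the macro's standardization of the variables. Note that scikit-learn's alpha penalty is not on the same scale as the macro's bias coefficient k, so the numbers will not reproduce the table above; the point is only to show how a trace of R-square and standardized coefficients against the bias constant is generated and inspected for stabilization, analogous to choosing c = 0.04 above.

    # Illustrative Python sketch of a ridge trace (not the SPSS macro actually used in the report).
    # Hypothetical file "beijing_housing.csv" with columns y, x1..x8; x5 excluded as in the report.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import Ridge
    from sklearn.preprocessing import StandardScaler

    data = pd.read_csv("beijing_housing.csv")                  # hypothetical path and columns
    predictors = ["x1", "x2", "x3", "x4", "x6", "x7", "x8"]
    Xs = StandardScaler().fit_transform(data[predictors])      # standardized regressors
    ys = StandardScaler().fit_transform(data[["y"]]).ravel()   # standardized response

    # Sweep the bias constant from 0 to 1 in steps of 0.01, as with /inc=0.01 in the macro.
    # sklearn's alpha is not on the same scale as the macro's k, so values differ numerically.
    for k in np.arange(0.0, 1.01, 0.01):
        model = Ridge(alpha=k).fit(Xs, ys)
        print(f"k={k:.2f}  RSQ={model.score(Xs, ys):.5f}  betas={np.round(model.coef_, 6)}")

    # A single detailed fit at the chosen bias constant, analogous to /k=0.04 in the macro.
    chosen = Ridge(alpha=0.04).fit(Xs, ys)
    print("standardized betas at the chosen bias constant:", chosen.coef_)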
