Artificial Intelligence Overview.pdf

Artificial intelligence is the apex technology of the information era. In the latest in our Profiles in Innovation series, we examine how advances in machine learning and deep learning have combined with more powerful computing and an ever-expanding pool of data to bring AI within reach for companies across industries. The development of AI-as-a-service has the potential to open new markets and disrupt the playing field in cloud computing. We believe the ability to leverage AI will become a defining attribute of competitive advantage for companies in coming years and will usher in a resurgence in productivity.

Heath P. Terry, CFA (212) 357-1849, Goldman, Sachs

... identify clusters of unusual behavior. Predictive: predict the likelihood of customer or employee churn based on web activity and other metadata; predict health issues based on wearable data.

What is General, Strong or True Artificial Intelligence?
General, Strong, or True Artificial Intelligence are terms used for machine intelligence that fully replicates human intelligence, including independent learning and decision making. While techniques like Whole Brain Emulation are being used to work toward the goal of General AI, the amount of compute power required is still considered far beyond current technologies, making General AI largely theoretical for the time being.

Key drivers of value creation

We believe profit pool creation (and destruction) related to the AI theme is best analyzed by first breaking AI down into four key inputs: talent, data, infrastructure and silicon. These inputs also double as barriers to adoption.

Talent

AI (deep learning in particular) is hard. Per our conversations with VCs and companies in the space, this has created a talent shortage and a competition for this talent among large internet and cloud computing vendors (Exhibit 5). AI talent is in high enough demand that "acquihires" are still a common means to acquire necessary talent. As the technology and tooling mature, talent may become less of a bottleneck. However, we believe talent will migrate to interesting, differentiated data sets. Due to this, we believe large differentiated data sets are the most likely driver of growth and incremental profit dollars as we move into an AI-centric world.

Exhibit 5: A Scarcity of AI Talent is Driving M... [chart not reproduced]

Data

Most deep learning today is either supervised or semi-supervised, meaning all or some of the data utilized to train the model must be labeled by a human. Unsupervised machine learning is the current "holy grail" in AI, as raw, unlabeled data could be utilized to train models. Broad adoption of deep learning will likely be tied to growth in large data sets (which is happening due to mobile and IoT) and to advances in unsupervised machine learning. However, we believe large differentiated data sets (electronic health records, omics data, geological data, weather data, etc.) will likely be a core driver of profit pool creation over the next decade. The amount of information created worldwide is expected to increase at a CAGR of 36% through 2020, reaching 44 Zettabytes (44 trillion GB), according to IDC.
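As a concrete illustration of the supervised-learning point above, the short Python sketch below fits a classifier to human-labeled examples and shows accuracy improving as the labeled training set grows. It uses scikit-learn and its bundled digits data purely for illustration; neither appears in the report.

# Minimal supervised-learning sketch: a model is fit to human-labeled
# examples, then scored on held-out data. Accuracy generally improves as
# the labeled training set grows, which is why large data sets matter.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)            # inputs X, human-assigned labels y
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

for n in (50, 200, 1000):                      # progressively larger labeled subsets
    model = LogisticRegression(max_iter=2000)
    model.fit(X_train[:n], y_train[:n])        # "supervised": training uses the labels
    print(n, "labeled examples -> accuracy", round(model.score(X_test, y_test), 3))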
Increases in connected devices (consumer and industrial), machine-to-machine communication, and remote sensors are combining to create large data sets that can then be mined for insights and used to train adaptive algorithms. Availability of data has also increased dramatically in the last decade, with census, labor, weather, and even genome data available for free online in large quantities. We are also seeing increased availability of satellite imagery, which requires a great deal of compute to fully analyze. The US Geological Survey's Landsat 7 and Landsat 8 satellites image the entire Earth every 8 days, and the USGS makes those images available for free, though even when compressed, the ultra-high-definition images are approximately 1 GB each in file size. Other companies, like Orbital Insight, are aggregating image data and creating commercial solutions across multiple industries.

Infrastructure

Hardware and infrastructure software are necessary to make AI work. We believe infrastructure to support AI will rapidly become commoditized. This view is based on two observations: 1) cloud computing vendors are well positioned to extend their offerings into AI infrastructure, and 2) open source (TensorFlow, Caffe, Spark, etc.) has emerged as the primary driver of software innovation in AI. To spur adoption of AI, we believe large cloud vendors will continue to open source infrastructure capabilities, limiting the potential for profit pool creation.

Deep learning accuracy by anatomical region vs. training data set size (0 = least accurate, 100 = most accurate; Source: Department of Radiology at Massachusetts General Hospital and Harvard Medical School)

Training data set size     5        10       20       50       100      200
Brain                      0.3      3.39     45.71    59.07    72.82    98.44
Neck                       21.3     30.63    79.97    99.34    99.74    99.33
Shoulder                   2.98     21.39    69.64    86.57    95.53    92.94
Chest                      23.39    34.45    62.53    96.18    95.25    99.61
Abdomen                    0.1      3.23     35.4     65.83    91.01    95.18
Pelvis                     0        1.15     15.99    55.9     83.7     88.45
Average                    8.01     17.37    51.54    77.15    89.68    95.67

Exhibit 7: Internet Giants (such as Google) are spurring interest in AI via open sourcing technologies (such as TensorFlow): GitHub repositories most starred 2015-2016, charting stars and forks for Facebook React Native (a native app framework), Apple Swift (a programming language), and TensorFlow (a library for machine learning) (Source: GitHub) [chart not reproduced]

Silicon

The repurposing of GPUs for deep learning has been one of the key drivers of our current "AI Spring". Within the AI/ML ecosystem, there are two primary applications that determine the performance of a neural network, each requiring a different resource setup. The first is the construction and use of a training algorithm. The training algorithm leverages a large (usually the larger, the better) data set to find correlations and build a model that can determine the probability of an output given a new input. Training is very resource-intensive, and most modern training is done on GPU-powered systems. The use of models and algorithms once they have been trained is referred to as inference. Inference requires far less computing power, and typically combs through smaller, incremental data input sets. While some GPUs are optimized for inference (Nvidia's P4 and M4 series, for example), the single-purpose nature of inference means specialized silicon is also being developed for that application: FPGAs (Field Programmable Gate Arrays) and ASICs (Application Specific Integrated Circuits). FPGAs were originally developed for prototyping CPUs, but are increasingly being used for inference in artificial intelligence. Google's Tensor Processing Unit is an example of an ASIC purpose-built for AI and machine learning. Microsoft has been using FPGA chips for inference as well.
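To make the training-versus-inference distinction concrete, here is a hedged sketch using TensorFlow's Keras API on synthetic data (the report names TensorFlow as an open-source project but does not prescribe this, or any, code): training iterates over a large data set to fit the model's weights, while inference is a single cheap forward pass over new inputs.

# Training vs. inference sketch (synthetic data, illustrative only).
import numpy as np
import tensorflow as tf

X = np.random.rand(10_000, 32).astype("float32")   # stand-in for a large training set
y = (X.sum(axis=1) > 16.0).astype("float32")       # stand-in labels

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Training: many passes over the data, heavy on compute (typically run on GPUs).
model.fit(X, y, epochs=5, batch_size=128, verbose=0)

# Inference: a single forward pass through the trained model over new,
# incremental inputs; far cheaper, and a candidate for specialized silicon.
new_inputs = np.random.rand(3, 32).astype("float32")
print(model.predict(new_inputs, verbose=0))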
Intel acquired FPGA manufacturer Altera in 2015 on the view that, by 2020, a third of data centers could be leveraging FPGAs for specialized use cases. Xilinx, which pioneered commercially viable FPGAs in the 1980s, has pointed to the cloud and large data centers as a significant avenue of growth going forward, having announced a strategic customer relationship with Baidu. Data centers make up roughly 5% of Xilinx's revenue now.

Exhibit 8: Evolution of AI: 1950-Present (Source: Company data, Goldman Sachs Global Investment Research) [timeline graphic not reproduced]

Fueling the future of productivity

Labor productivity growth in the U.S. has come to a halt in recent years after modest growth in the past decade and significant growth in the mid-to-late 1990s. We believe the proliferation of consumable machine learning and AI has the potential to dramatically shift the productivity paradigm across global industries, in a way similar to the broad-scale adoption of internet technologies in the 1990s. Across industries, we see a roughly 0.5%-1.5% reduction in labor hours spurred by automation and efficiency gains brought to bear by AI/ML technologies, resulting in a +51-154bp impact on productivity growth by 2025. While we expect AI/ML to improve both the denominator and numerator of productivity over time, we believe the most significant early impacts will be on the automation of lower-wage tasks, driving similar levels of output growth with fewer labor hours. Our base case AI/ML-driven improvement of 97bp implies a 2025 productivity growth IT contribution of 1.61%, or 11bp higher than 1995-2004 (Exhibits 9, 10).

Exhibit 9: Productivity analysis ($ millions, assumes linear nominal GDP growth beyond 2019; Source: OECD, US Bureau of Labor Statistics, Goldman Sachs Global Investment Research)

US                          2016E    2017E    2018E    2019E    2020E    2021E    2022E    2023E    2024E    2025E
Output
  US Nominal GDP* ($bn)     18,552   19,300   20,045   20,757   21,470   22,183   22,895   23,608   24,321   25,034
  yoy growth (%)            2.9%     4.0%     3.9%     3.6%     3.4%     3.3%     3.2%     3.1%     3.0%     2.9%
Productivity
  Labor productivity        69.0     70.4     71.8     73.1     74.3     75.4     76.5     77.6     78.6     79.7
  yoy growth (%)            0.9%     2.1%     2.0%     1.7%     1.6%     1.6%     1.5%     1.4%     1.3%     1.3%
  Labor hours (mn)          268,958  273,992  279,026  284,060  289,094  294,128  299,162  304,196  309,230  314,264

ML/AI impact                        Low        Base       High
  Labor hours reduction (mn)        (1,571)    (2,969)    (4,714)
  Reduction                         0.5%       1%         1.5%
  2025 Labor hours (mn)             312,693    311,295    309,550
  2025 GDP ($bn)                    25,034     25,034     25,034
  Labor productivity                80.1       80.4       80.9
  yoy growth (%)                    1.8%       2.2%       2.8%
  Improvement (bps)                 51         97         154
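To make the arithmetic behind Exhibit 9 explicit, the short Python sketch below reproduces the 51/97/154bp figures from the table's 2024E-2025E estimates, treating labor productivity as nominal GDP per labor hour (a simplification used here only to show the mechanics).

# Worked example of the Exhibit 9 productivity math, using the table's
# 2024E/2025E estimates. Productivity = nominal GDP per labor hour.
gdp_2024, gdp_2025 = 24_321e3, 25_034e3      # nominal GDP in $mn ($bn x 1,000)
hours_2024, hours_2025 = 309_230, 314_264    # labor hours in mn

baseline_growth = (gdp_2025 / hours_2025) / (gdp_2024 / hours_2024) - 1   # ~1.3%

# AI/ML-driven labor hour reductions for the low / base / high cases (mn hours)
for case, cut in {"low": 1_571, "base": 2_969, "high": 4_714}.items():
    growth = (gdp_2025 / (hours_2025 - cut)) / (gdp_2024 / hours_2024) - 1
    uplift_bps = (growth - baseline_growth) * 10_000
    print(case, round(uplift_bps))           # ~51, 97, 154 bps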
Technology and productivity growth

The 1990s technology boom saw unusual amplification of each of the two primary components of productivity, capital deepening and multifactor productivity (MFP), and was strongly correlated with rising equity valuations.

Capital Deepening. GS economist Jan Hatzius has provided recent analysis on the anti-cyclical tendency of capital deepening (capital stock per labor hour), as labor hours historically tend to rise during expansionary periods without an equal surge in capital stock (see Jan's report "Productivity Paradox v2.0 Revisited", published on 09/2/2016). In the 1990s, capital deepening increased markedly, highlighted by atypical capital investment increases that outpaced growth in the labor market.

Multifactor productivity (MFP). A March 2013 Federal Reserve study by David Byrne et al. suggests that the simultaneous diffusion of technology into IT-producing and general operations processes contributed to creating a threefold spike in growth (output per labor hour) during the 1990s, with IT-producing sectors responsible for at most 49% of the average annual increase in annual productivity growth from the pre-boom period to the period between 1995 and 2004 (Exhibit 10).

Exhibit 10: Late 90s: IT-producing sectors contribute nearly half of productivity growth, but lose value and share in growth post-tech boom (% growth; Source: Federal Reserve Board, Goldman Sachs Global Investment Research)

                          1974-1995   1995-2004   2004-2012
IT contribution           0.77        1.5         0.64
Other nonfarm business    0.79        1.56        0.92

Note: the 1.56% total average growth in 2004-2012 is virtually equal to the 1995-2004 average IT contribution.

Post-millennium stagnation. During the past decade, capital deepening growth related to IT applications (computer hardware, software, and telecom) has stagnated. IT capital, relative to broader market capital, has contributed less to overall growth in this component than the average contributions during and even before the tech boom. Aggregate labor hours have been increasing, but the contribution of capital intensity to productivity has drastically underperformed versus the 1990s. The introduction of increasingly sophisticated, consumable machine learning and AI may be a catalyst in bringing capital intensity back to the forefront, in our view, significantly increasing the productivity of labor in a way similar to the cycle we saw in the 1990s.

We're more optimistic on the MFP side of the equation. GS economists have highlighted (Productivity Paradox v2.0 Revisited, 9/2/2016) that upward biases in ICT prices and a growing share of inputs devoted to unmonetized outputs (free online content, back-end processes, etc.) add to the understatement of real GDP and productivity growth. The evolution of internet giants like Facebook and Google highlights the idea that complex labor and capital inputs aren't necessarily converted into the traditional consumer product monetization captured in standard productivity metrics.

AI/ML-induced productivity could impact investment

We believe that one of the potential impacts of increasing productivity from AI/ML could be a shift in the way companies allocate capital. Since mid-2011, the growth in dividends and share repurchases has significantly exceeded capex growth, as reluctance among management teams to invest in capital projects remains post-recession.

Exhibit 11: Companies are hesitant to sacrifice dividends; clear shift in cash utilization strategy [chart not reproduced]

Exhibit 12: Cyclically adj. P/E ratios in a sluggish recovery; valuations only just hitting pre-recession levels (Source: Shiller S...) [chart not reproduced]
... processor speeds, more memory, better attributes in computer hardware, which led to large increases in the measured contribution of the technology sector. The technology sector was very central to the pickup in the productivity numbers from the 1990s lasting into the early and mid-2000s.

Terry: We've seen a lot of technology development over the last 10-15 years. Why hasn't there been a similar impact on productivity from technologies such as the iPhone, Facebook, and the development of cloud computing?

Hatzius: We don't have a full answer to it, but I do think an important part of the answer is that the statistical ability to measure improvement in quality, and the impact of new products, in the economic statistics is limited. It's relatively easy to measure nominal GDP; that's basically a matter of adding up receipts. There is room for measurement error, as there is in almost everything, but I don't have a real first-order concern that measurement is getting worse in terms of measuring nominal GDP. Converting nominal GDP numbers into real GDP numbers by deflating them with a quality-adjusted overall price index is where I think things get very difficult. If you look, for example, at the way software sectors enter the official numbers, if you believe the official numbers, $1,000 of outlay on software now buys you just as much real software as $1,000 of outlay bought you in the 1990s. The ...
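Hatzius's deflator point can be made concrete with a toy calculation: real output is nominal outlay divided by a price index, so any quality gain the index misses never shows up in measured real GDP or productivity. The figures below are illustrative only and are not from the report.

# Illustrative only: nominal-to-real conversion with and without a
# quality adjustment in the price deflator.
nominal_software_outlay = 1_000.0        # $ spent on software, then and now

flat_deflator = 1.0                      # index that ignores quality improvement
quality_adjusted_deflator = 0.5          # same $ assumed to buy 2x the "real" software

real_output_official = nominal_software_outlay / flat_deflator              # 1,000
real_output_adjusted = nominal_software_outlay / quality_adjusted_deflator  # 2,000

print(real_output_official, real_output_adjusted)
# The unadjusted index understates measured real output (and productivity) by 2x here.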
