AI Now: The Artificial Intelligence (AI) Diversity Crisis (English), April 2019, 33 pages (PDF)

DISCRIMINATING SYSTEMS
Gender, Race, and Power in AI

Sarah Myers West, AI Now Institute, New York University
Meredith Whittaker, AI Now Institute, New York University; Google Open Research
Kate Crawford, AI Now Institute, New York University; Microsoft Research

APRIL 2019

Cite as: West, S.M., Whittaker, M. and Crawford, K. (2019). Discriminating Systems: Gender, Race and Power in AI. AI Now Institute. Retrieved from https://ainowinstitute.org/discriminatingsystems.html.

This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.

CONTENTS
RESEARCH FINDINGS
RECOMMENDATIONS
INTRODUCTION
WHICH HUMANS ARE IN THE LOOP? HOW WORKFORCES AND AI SYSTEMS INTERACT
WHO MAKES AI?
  Diversity Statistics in the AI Industry: Knowns and Unknowns
FROM WORKFORCES TO AI SYSTEMS: THE DISCRIMINATION FEEDBACK LOOP
CORPORATE DIVERSITY: BEYOND THE PIPELINE PROBLEM
  Core Themes in Pipeline Research
  Limitations of Pipeline Research
  Pipeline Dreams: After Years of Research, the Picture Worsens
WORKER-LED INITIATIVES
THE PUSHBACK AGAINST DIVERSITY
CONCLUSION

RESEARCH FINDINGS

There is a diversity crisis in the AI sector across gender and race. Recent studies found only 18% of authors at leading AI conferences are women,[i] and more than 80% of AI professors are men.[ii] This disparity is extreme in the AI industry:[iii] women comprise only 15% of AI research staff at Facebook and 10% at Google. There is no public data on trans workers or other gender minorities. For black workers, the picture is even worse. For example, only 2.5% of Google's workforce is black, while Facebook and Microsoft are each at 4%. Given decades of concern and investment to redress this imbalance, the current state of the field is alarming. The AI sector needs a profound shift in how it addresses the current diversity crisis.
The AI industry needs to acknowledge the gravity of its diversity problem, and admit that existing methods have failed to contend with the uneven distribution of power, and the means by which AI can reinforce such inequality. Further, many researchers have shown that bias in AI systems reflects historical patterns of discrimination. These are two manifestations of the same problem, and they must be addressed together.

The overwhelming focus on "women in tech" is too narrow and likely to privilege white women over others. We need to acknowledge how the intersections of race, gender, and other identities and attributes shape people's experiences with AI. The vast majority of AI studies assume gender is binary, and commonly assign people as "male" or "female" based on physical appearance and stereotypical assumptions, erasing all other forms of gender identity.

Fixing the "pipeline" won't fix AI's diversity problems. Despite many decades of "pipeline studies" that assess the flow of diverse job candidates from school to industry, there has been no substantial progress in diversity in the AI industry. The focus on the pipeline has not addressed deeper issues with workplace cultures, power asymmetries, harassment, exclusionary hiring practices, unfair compensation, and tokenization that are causing people to leave or avoid working in the AI sector altogether.

The use of AI systems for the classification, detection, and prediction of race and gender is in urgent need of re-evaluation. The histories of race science are a grim reminder that race and gender classification based on appearance is scientifically flawed and easily abused. Systems that use physical appearance as a proxy for character or interior states are deeply suspect, including AI tools that claim to detect sexuality from headshots,[iv] predict "criminality" based on facial features,[v] or assess worker competence via "micro-expressions."[vi] Such systems are replicating patterns of racial and gender bias in ways that can deepen and justify historical inequality. The commercial deployment of these tools is cause for deep concern.

Notes:
[i] Element AI. (2019). Global AI Talent Report 2019. Retrieved from https://jfgagne.ai/talent-2019/.
[ii] AI Index 2018. (2018). Artificial Intelligence Index 2018. Retrieved from http://cdn.aiindex.org/2018/AI%20Index%202018%20Annual%20Report.pdf.
[iii] Simonite, T. (2018). AI is the future - but where are the women? WIRED. Retrieved from https://www.wired.com/story/artificial-intelligence-researchers-gender-imbalance/.

WHICH HUMANS ARE IN THE LOOP? HOW WORKFORCES AND AI SYSTEMS INTERACT

From this perspective, locating individual biases within a given technical system, and attempting to fix them by tweaking the system, becomes an exercise in futility. Only by examining discrimination through the lens of its social logics (who it benefits, who it harms, and how) can we see the workings of these systems in the context of existing power relationships.

In addition to asking when and how AI systems favor some identities over others, we might also ask: what is the logic through which artificial intelligence "sees" and constructs gender and race to begin with? How does it engage in the production and enactment of new classifications and identities?[40] And how do AI systems replicate historical hierarchies by rendering people along a continuum of least to most "valuable"? These questions point to the larger problem: it is not just that AI systems need to be fixed when they misrecognize faces or amplify stereotypes. It is that they can perpetuate existing forms of structural inequality even when working as intended.

To tackle these questions, our research traces the way gender and race surface in AI systems and workforces, and their interrelationship. First, we review what is known and not known about diversity in the field of AI, focusing particularly on how frames devoted to the STEM-field "pipeline" have dominated the discourse. Then, we provide a brief summary of existing literature on gender and racial bias in technologies and where this literature could be extended.
Finally, we look at how calls for diversity in tech have been ignored or resisted, and how these discriminatory views have permeated many AI systems. We conclude by sharing new research findings that point to ways in which a deeper analysis of gender, race, and power in the field of AI can help to redress inequalities in the industry and in the tools it produces.

WHO MAKES AI?

The current data on the state of gender diversity in the AI field is dire, in both industry and academia. For example, in 2013, the share of women in computing dropped to 26%, below their level in 1960.[41] Almost half the women who go into technology eventually leave the field, more than double the percentage of men who depart.[42] As noted above, a report produced by the research firm Element AI found that only 18% of authors at the leading 21 conferences in the field are women,[43] while the 2018 Artificial Intelligence Index reports 80% of AI professors are men.[44] This imbalance is replicated at large tech firms like Facebook and Google, whose websites show even greater imbalances, with women comprising only 15% and 10% of their AI research staff, respectively.[45][46] There is no reported data on trans workers or other gender minorities.

The state of racial diversity in AI is even worse. Only 2.5% of Google's full-time workers are black, and 3.6% Latinx, with black workers having the highest attrition rate of all racial categories.[47] Facebook isn't much better: the company reported that, with 4% black workers and 5% Hispanic workers in 2018, the company's diversity is improving.[48] Microsoft reflects similar levels as Facebook, with 4% black workers and 6% Latinx workers.[49] Machine vision researcher and co-founder of Black in AI, Timnit Gebru, said that when she first attended the preeminent machine learning conference NeurIPS in 2016, she was one of six black people out of 8,500 attendees.[50] "We are in a diversity crisis for AI," Gebru explains. "In addition to having technical conversations, conversations about law, conversations about ethics, we need to have conversations about diversity in AI. This needs to be treated as something that's extremely urgent."[51]

Of course, artificial intelligence is a sub-field of computer science, and the broader discipline is experiencing an historic low point for diversity: as of 2015, women made up only 18% of computer science majors in the United States, a decline from a high of 37% in 1984.[52] No other professional field has experienced such a sharp decline in the number of women in its ranks.[53] At present, women make up 24.4% of the computer science workforce, and receive median salaries that are only 66% of the salaries of their male counterparts. These figures are similarly pronounced when race is taken into account: the proportion of bachelor's degree awards in engineering to black women declined 11% between 2000 and 2015.[54] The number of women and people of color decreased at the same time that the tech industry was establishing itself as a nexus of wealth and power. This is even more significant when we recognize that these shocking diversity figures are not reflective of STEM as a whole: in fields outside of computer science and AI, racial and gender diversity has shown a marked improvement.[55]

Notes:
[40] Kloppenburg, S. and van der Ploeg, I. (2018). Securing Identities: Biometric Technologies and the Enactment of Human Bodily Differences. Science as Culture.
[41] Thompson, C. (2019, Feb. 13). The Secret History of Women in Coding. New York Times Magazine. Retrieved from https://www.nytimes.com/2019/02/13/magazine/women-coding-computer-programming.html?linkId=65692573.
[42] Ashcraft, C., McLain, B. and Eger, E. (2016). Women in Tech: The Facts. National Center for Women in Information Technology. Retrieved from https://www.ncwit.org/sites/default/files/resources/womenintech_facts_fullreport_05132016.pdf.
[43] Element AI. (2019). Global AI Talent Report 2019. Retrieved from https://jfgagne.ai/talent-2019/.
[44] AI Index 2018. (2018). Artificial Intelligence Index 2018. Retrieved from http://cdn.aiindex.org/2018/AI%20Index%202018%20Annual%20Report.pdf.
[45] Simonite, T. (2018). AI is the future - but where are the women? WIRED. Retrieved from https://www.wired.com/story/artificial-intelligence-researchers-gender-imbalance/.
[46] The World Economic Forum's 2018 Global Gender Gap Report includes a section on diversity in AI that places its estimate much higher, at 22%. However, the methodology for obtaining this figure raises some questions: it relies on LinkedIn users' inclusion of AI-related skills in their profiles as the primary data source. This requires several causal leaps: first, that a sample of LinkedIn users is representative of the global population of workers in the field of AI, and that these users accurately represented their skill set. Moreover, the study used a flawed mechanism to attribute gender on a binary basis to users on the basis of inference from their first name, a practice that is not only trans-exclusionary, but is particularly problematic in an analysis that includes names in non-English languages.
[47] Google. (2018). Google Diversity Annual Report 2018. Retrieved from https://static.googleusercontent.com/media/diversity.google/en/static/pdf/Google_Diversity_annual_report_2018.pdf.
[48] Williams, M. (2018, July 12). Facebook 2018 Diversity Report: Reflecting on Our Journey. Retrieved from https://newsroom.fb.com/news/2018/07/diversity-report/.
[49] Microsoft. (2019).
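Note 46's critique of name-based gender attribution can be made concrete. The sketch below is a deliberately naive, hypothetical name-to-gender classifier of the kind the report criticizes; the lookup table, names, and labels are invented for illustration, not drawn from any real dataset or tool. It shows the two failure modes the note identifies: every person is forced into a binary label, and any name outside the English-centric table is simply dropped.

```python
# A deliberately naive name-based gender classifier, illustrating the
# methodology criticized in note 46 (hypothetical data, not a real tool).

# English-centric lookup table: an illustrative assumption, not real data.
LOOKUP = {
    "mary": "female",
    "james": "male",
    "linda": "female",
    "robert": "male",
}

def infer_gender(first_name: str) -> str:
    """Force a binary label from a first name, or give up entirely."""
    return LOOKUP.get(first_name.lower(), "unknown")

# Names absent from the table (e.g. many non-English names), and anyone
# whose gender does not match their name's assumed association, are either
# dropped or mislabeled -- skewing any aggregate built on this output.
names = ["Mary", "Robert", "Ngozi", "Wei", "Alex"]
results = {n: infer_gender(n) for n in names}
```

Any workforce-diversity estimate computed over such output inherits both flaws: the forced binary erases non-binary and trans workers, and the systematic undercounting of uncovered names biases the headline percentage.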
"[...] whatever claim to a right to privacy that we may have is diminished by a state that believes that we must always be watched and seen."[99]

Asking this question is particularly important given that practices involved in correcting such biases sometimes lead those developing such technologies (most often large corporations) to conduct invasive data collection on communities that are already marginalized, with the goal of ensuring that they're represented. For example, facial recognition systems often have a challenging time recognizing the faces of people undergoing gender transition. This error has been a problem for trans Uber drivers: the facial recognition system built in as a security feature by Uber has led their accounts to be suspended, preventing them from being able to work while they seek to get their accounts restored.[100]

These harms should be balanced against remedies that rely on unethical practices, or that propose mass data collection as the solution to bias. One approach that received particular pushback collected videos from transgender YouTubers without their consent in order to train facial recognition software to more accurately recognize people undergoing the process of transitioning.[101] In this case, allowing alternate means of account verification may be a better "fix" than continuing to rely on a system whose efficacy demands increased surveillance and worker control.

Notes:
[96] Ibid.
[97] Powles, J. (2018, Dec. 7). The Seductive Diversion of Solving Bias in Artificial Intelligence.
