DISCRIMINATING SYSTEMS
Gender, Race, and Power in AI

Sarah Myers West, AI Now Institute, New York University
Meredith Whittaker, AI Now Institute, New York University, Google Open Research
Kate Crawford, AI Now Institute, New York University, Microsoft Research

APRIL 2019

Cite as: West, S.M., Whittaker, M. and Crawford, K. (2019). Discriminating Systems: Gender, Race and Power in AI. AI Now Institute. Retrieved from https://ainowinstitute.org/discriminatingsystems.html.

This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.

CONTENTS

RESEARCH FINDINGS
RECOMMENDATIONS
INTRODUCTION
WHICH HUMANS ARE IN THE LOOP? HOW WORKFORCES AND AI SYSTEMS INTERACT
WHO MAKES AI?
Diversity Statistics in the AI Industry: Knowns and Unknowns
FROM WORKFORCES TO AI SYSTEMS: THE DISCRIMINATION FEEDBACK LOOP
CORPORATE DIVERSITY: BEYOND THE PIPELINE PROBLEM
Core Themes in Pipeline Research
Limitations of Pipeline Research
Pipeline Dreams: After Years of Research, The Picture Worsens
WORKER-LED INITIATIVES
THE PUSHBACK AGAINST DIVERSITY
CONCLUSION

RESEARCH FINDINGS

There is a diversity crisis in the AI sector across gender and race. Recent studies found that only 18% of authors at leading AI conferences are women,i and more than 80% of AI professors are men.ii This disparity is extreme in the AI industry:iii women comprise only 15% of AI research staff at Facebook and 10% at Google. There is no public data on trans workers or other gender minorities. For black workers, the picture is even worse. For example, only 2.5% of Google's workforce is black, while Facebook and Microsoft are each at 4%. Given decades of concern and investment to redress this imbalance, the current state of the field is alarming.

The AI sector needs a profound shift in how it addresses the current diversity crisis. The AI industry needs to acknowledge the gravity of its diversity problem, and admit that existing methods have failed to contend with the uneven distribution of power, and the means by which AI can reinforce such inequality. Further, many researchers have shown that bias in AI systems reflects historical patterns of discrimination. These are two manifestations of the same problem, and they must be addressed together.

The overwhelming focus on "women in tech" is too narrow and likely to privilege white women over others. We need to acknowledge how the intersections of race, gender, and other identities and attributes shape people's experiences with AI. The vast majority of AI studies assume gender is binary, and commonly assign people as "male" or "female" based on physical appearance and stereotypical assumptions, erasing all other forms of gender identity.

Fixing the "pipeline" won't fix AI's diversity problems. Despite many decades of "pipeline studies" that assess the flow of diverse job candidates from school to industry, there has been no substantial progress in diversity in the AI industry. The focus on the pipeline has not addressed deeper issues with workplace cultures, power asymmetries, harassment, exclusionary hiring practices, unfair compensation, and tokenization that are causing people to leave or avoid working in the AI sector altogether.

The use of AI systems for the classification, detection, and prediction of race and gender is in urgent need of re-evaluation. The histories of "race science" are a grim reminder that race and gender classification based on appearance is scientifically flawed and easily abused. Systems that use physical appearance as a proxy for character or interior states are deeply suspect, including AI tools that claim to detect sexuality from headshots,iv predict "criminality" based on facial features,v or assess worker competence via "micro-expressions."vi Such systems are replicating patterns of racial and gender bias in ways that can deepen and justify historical inequality. The commercial deployment of these tools is cause for deep concern.
i. Element AI. (2019). Global AI Talent Report 2019. Retrieved from https://jfgagne.ai/talent-2019/.
ii. AI Index 2018. (2018). Artificial Intelligence Index 2018. Retrieved from http://cdn.aiindex.org/2018/AI%20Index%202018%20Annual%20Report.pdf.
iii. Simonite, T. (2018). AI is the future - but where are the women? WIRED.
iv. Wang, Y. and Kosinski, M. (2018). Deep neural networks are more accurate than humans at detecting sexual orientation from facial images. Journal of Personality and Social Psychology, 114(2): 246-257.
WHICH HUMANS ARE IN THE LOOP? HOW WORKFORCES AND AI SYSTEMS INTERACT

From this perspective, locating individual biases within a given technical system, and attempting to fix them by tweaking the system, becomes an exercise in futility. Only by examining discrimination through the lens of its social logics (who it benefits, who it harms, and how) can we see the workings of these systems in the context of existing power relationships.

In addition to asking when and how AI systems favor some identities over others, we might also ask: what is the logic through which artificial intelligence "sees" and constructs gender and race to begin with? How does it engage in the production and enactment of new classifications and identities?40 And how do AI systems replicate historical hierarchies by rendering people along a continuum of least to most "valuable"? These questions point to the larger problem: it is not just that AI systems need to be fixed when they misrecognize faces or amplify stereotypes. It is that they can perpetuate existing forms of structural inequality even when working as intended.