{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2024,11,19]],"date-time":"2024-11-19T18:18:31Z","timestamp":1732040311339,"version":"3.28.0"},"publisher-location":"New York, NY, USA","reference-count":86,"publisher":"ACM","content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":[],"published-print":{"date-parts":[[2021,3,3]]},"DOI":"10.1145\/3442188.3445888","type":"proceedings-article","created":{"date-parts":[[2021,3,3]],"date-time":"2021-03-03T01:26:24Z","timestamp":1614734784000},"page":"249-260","update-policy":"http:\/\/dx.doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":70,"title":["What We Can't Measure, We Can't Understand"],"prefix":"10.1145","author":[{"given":"McKane","family":"Andrus","sequence":"first","affiliation":[{"name":"Partnership on AI"}]},{"given":"Elena","family":"Spitzer","sequence":"additional","affiliation":[{"name":"Partnership on AI"}]},{"given":"Jeffrey","family":"Brown","sequence":"additional","affiliation":[{"name":"Partnership on AI, Minnesota State University, Mankato"}]},{"given":"Alice","family":"Xiang","sequence":"additional","affiliation":[{"name":"Sony AI, Partnership on AI"}]}],"member":"320","published-online":{"date-parts":[[2021,3]]},"reference":[{"volume-title":"Proceedings of the 2019 AAAI\/ACM Conference on AI, Ethics, and Society. ACM, Honolulu HI USA, 445--451","author":"Andrus McKane","key":"e_1_3_2_1_1_1","unstructured":"McKane Andrus and Thomas K. Gilbert. 2019. Towards a Just Theory of Measurement: A Principled Social Measurement Assurance Program for Machine Learning. In Proceedings of the 2019 AAAI\/ACM Conference on AI, Ethics, and Society. ACM, Honolulu HI USA, 445--451. https:\/\/doi.org\/10.1145\/3306618.3314275"},{"key":"e_1_3_2_1_2_1","unstructured":"McKane Andrus, Elena Spitzer, and Alice Xiang. 2020. Working to Address Algorithmic Bias? Don't Overlook the Role of Demographic Data. https:\/\/www.partnershiponai.org\/demographic-data\/"},{"key":"e_1_3_2_1_3_1","doi-asserted-by":"publisher","DOI":"10.1177\/146879410100100307"},{"volume-title":"Americans and privacy: Concerned, confused and feeling lack of control over their personal information","year":"2019","author":"Auxier Brooke","key":"e_1_3_2_1_4_1","unstructured":"Brooke Auxier, Lee Rainie, Monica Anderson, Andrew Perrin, Madhu Kumar, and Erica Turner. 2019. Americans and privacy: Concerned, confused and feeling lack of control over their personal information. Pew Research Center: Internet, Science & Tech (blog). November 15 (2019)."},{"volume-title":"Beyond Bias: Re-Imagining the Terms of 'Ethical AI' in Criminal Law. SSRN Electronic Journal","year":"2019","author":"Barabas Chelsea","key":"e_1_3_2_1_5_1","unstructured":"Chelsea Barabas. 2019. Beyond Bias: Re-Imagining the Terms of 'Ethical AI' in Criminal Law. SSRN Electronic Journal (2019). https:\/\/doi.org\/10.2139\/ssrn.3377921"},{"volume-title":"Conference on Fairness, Accountability and Transparency. PMLR, 62--76","year":"2018","author":"Barabas Chelsea","key":"e_1_3_2_1_6_1","unstructured":"Chelsea Barabas, Madars Virza, Karthik Dinakar, Joichi Ito, and Jonathan Zittrain. 2018. Interventions over predictions: Reframing the ethical debate for actuarial risk assessment. In Conference on Fairness, Accountability and Transparency. PMLR, 62--76."},{"volume-title":"Michael Katell, P. M. Krafft, Jennifer Lee, Shankar Narayan, Franziska Putz, Daniella Raz, Brian Robick, Aaron Tam, Abiel Woldu, and Meg Young.","year":"2020","author":"Barghouti Bissan","key":"e_1_3_2_1_7_1","unstructured":"Bissan Barghouti, Corinne Bintz, Dharma Dailey, Micah Epstein, Vivian Guetler, Bernease Herman, Pa Ousman Jobe, Michael Katell, P. M. Krafft, Jennifer Lee, Shankar Narayan, Franziska Putz, Daniella Raz, Brian Robick, Aaron Tam, Abiel Woldu, and Meg Young. 2020. Algorithmic Equity Toolkit. https:\/\/www.acluwa.org\/AEKit"},{"key":"e_1_3_2_1_8_1","first-page":"671","article-title":"Big Data's Disparate Impact","volume":"104","author":"Barocas Solon","year":"2016","unstructured":"Solon Barocas and Andrew D. Selbst. 2016. Big Data's Disparate Impact. California Law Review 104 (2016), 671.
https:\/\/heinonline.org\/HOL\/Page?handle=hein.journals\/calr104&id=695&div=&collection=","journal-title":"California Law Review"},{"volume-title":"John Richards, Diptikalyan Saha, Prasanna Sattigeri, Moninder Singh, Kush R. Varshney, and Yunfeng Zhang.","year":"2018","author":"Bellamy Rachel K. E.","key":"e_1_3_2_1_9_1","unstructured":"Rachel K. E. Bellamy, Kuntal Dey, Michael Hind, Samuel C. Hoffman, Stephanie Houde, Kalapriya Kannan, Pranay Lohia, Jacquelyn Martino, Sameep Mehta, Aleksandra Mojsilovic, Seema Nagar, Karthikeyan Natesan Ramamurthy, John Richards, Diptikalyan Saha, Prasanna Sattigeri, Moninder Singh, Kush R. Varshney, and Yunfeng Zhang. 2018. AI Fairness 360: An Extensible Toolkit for Detecting, Understanding, and Mitigating Unwanted Algorithmic Bias. arXiv:1810.01943 [cs] (Oct. 2018). http:\/\/arxiv.org\/abs\/1810.01943"},{"volume-title":"Bennett and Os Keyes","year":"2020","author":"Cynthia","key":"e_1_3_2_1_10_1","unstructured":"Cynthia L. Bennett and Os Keyes. 2020. What Is the Point of Fairness?: Disability, AI and the Complexity of Justice. ACM SIGACCESS Accessibility and Computing 125 (March 2020), 1--1. https:\/\/doi.org\/10.1145\/3386296.3386301"},{"key":"e_1_3_2_1_11_1","first-page":"803","article-title":"Is Algorithmic Affirmative Action Legal","volume":"108","author":"Bent Jason R","year":"2020","unstructured":"Jason R Bent. 2020. Is Algorithmic Affirmative Action Legal? Georgetown Law Journal 108 (2020), 803.","journal-title":"Georgetown Law Journal"},{"volume-title":"Proceedings of the Conference on Fairness, Accountability, and Transparency - FAT* '19","author":"Benthall Sebastian","key":"e_1_3_2_1_12_1","unstructured":"Sebastian Benthall and Bruce D. Haynes. 2019. Racial Categories in Machine Learning. In Proceedings of the Conference on Fairness, Accountability, and Transparency - FAT* '19. ACM Press, Atlanta, GA, USA, 289--298. https:\/\/doi.org\/10.1145\/3287560.3287575"},{"key":"e_1_3_2_1_13_1","doi-asserted-by":"publisher","DOI":"10.1145\/3351095.3372877"},{"key":"e_1_3_2_1_14_1","unstructured":"Consumer Financial Protection Bureau. 2014. Using publicly available information to proxy for unidentified race and ethnicity. (2014). https:\/\/www.consumerfinance.gov\/data-research\/research-reports\/using-publicly-available-information-to-proxy-for-unidentified-race-and-ethnicity\/"},{"key":"e_1_3_2_1_15_1","unstructured":"Brandee Butler. 2020. For the EU to Effectively Address Racial Injustice We Need Data. Al Jazeera."},{"volume-title":"Jennifer Rode, Anna Lauren Hoffmann, Niloufar Salehi, and Lisa Nakamura.","year":"2019","author":"Cifor Marika","key":"e_1_3_2_1_16_1","unstructured":"Marika Cifor, Patricia Garcia, TL Cowan, Jasmine Rault, Tonia Sutherland, Anita Say Chan, Jennifer Rode, Anna Lauren Hoffmann, Niloufar Salehi, and Lisa Nakamura. 2019. Feminist data manifest-no. https:\/\/www.manifestno.com\/"},{"key":"e_1_3_2_1_17_1","unstructured":"US Equal Employment Opportunity Commission et al. 1979. Questions and answers to clarify and provide a common interpretation of the uniform guidelines on employee selection procedures."},{"volume-title":"The Measure and Mismeasure of Fairness: A Critical Review of Fair Machine Learning. arXiv:1808.00023 [cs] (Aug","year":"2018","author":"Corbett-Davies Sam","key":"e_1_3_2_1_18_1","unstructured":"Sam Corbett-Davies and Sharad Goel. 2018. The Measure and Mismeasure of Fairness: A Critical Review of Fair Machine Learning. arXiv:1808.00023 [cs] (Aug. 2018). http:\/\/arxiv.org\/abs\/1808.00023"},{"key":"e_1_3_2_1_19_1","unstructured":"Kimberl\u00e9 Crenshaw. 1989. Demarginalizing the intersection of race and sex: A black feminist critique of antidiscrimination doctrine, feminist theory and antiracist politics. U. Chi. Legal F. (1989), 139."},{"key":"e_1_3_2_1_20_1","unstructured":"Cara Crotty. 2020. Revised form for self-identification of disability released. https:\/\/www.constangy.com\/affirmative-action-alert\/revised-form-forself-identification-of-disability"},{"key":"e_1_3_2_1_21_1","unstructured":"d4bl. 2020. Data 4 Black Lives. https:\/\/d4bl.org\/"},{"volume-title":"A Broader View on Bias in Automated Decision-Making: Reflecting on Epistemology and Dynamics. arXiv:1807.00553 [cs, math, stat] (July","year":"2018","author":"Dobbe Roel","key":"e_1_3_2_1_22_1","unstructured":"Roel Dobbe, Sarah Dean, Thomas Gilbert, and Nitin Kohli. 2018. A Broader View on Bias in Automated Decision-Making: Reflecting on Epistemology and Dynamics. arXiv:1807.00553 [cs, math, stat] (July 2018). http:\/\/arxiv.org\/abs\/1807.00553"},{"key":"e_1_3_2_1_23_1","doi-asserted-by":"publisher","DOI":"10.1145\/3351095.3372878"},{"key":"e_1_3_2_1_24_1","doi-asserted-by":"publisher","DOI":"10.1007\/s10742-009-0047-1"},{"key":"e_1_3_2_1_25_1","unstructured":"European Parliament and Council of European Union. 2016. Regulation (EU) 2016\/679 (General Data Protection Regulation). https:\/\/eur-lex.europa.eu\/legalcontent\/EN\/TXT\/HTML\/?uri=CELEX:32016R0679&from=EN"},{"volume-title":"Federal Data Protection Act of","year":"2017","author":"German Bundestag","key":"e_1_3_2_1_26_1","unstructured":"German Bundestag. 2017. Federal Data Protection Act of 30 June 2017 (BDSG). 2097 pages. https:\/\/www.gesetze-im-internet.de\/englisch_bdsg\/englisch_bdsg.html"},{"key":"e_1_3_2_1_27_1","volume-title":"Proceedings of the AAAI Conference on Artificial Intelligence","volume":"33","author":"Ghili Soheil","year":"2019","unstructured":"Soheil Ghili, Ehsan Kazemi, and Amin Karbasi. 2019. Eliminating latent discrimination: Train then mask. In Proceedings of the AAAI Conference on Artificial Intelligence, Vol. 33. 3672--3680."},{"volume-title":"29th conference on Neural Information Processing Systems (NIPS","year":"2016","author":"Goodman Bryce W","key":"e_1_3_2_1_28_1","unstructured":"Bryce W Goodman. 2016. A step towards accountable algorithms?
algorithmic discrimination and the european union general data protection. In 29th conference on Neural Information Processing Systems (NIPS 2016), Barcelona. NIPS foundation."},{"key":"e_1_3_2_1_29_1","doi-asserted-by":"publisher","DOI":"10.1145\/3351095.3372840"},{"volume-title":"Mahdi Milani Fard, and Serena Wang","year":"2018","author":"Gupta Maya","key":"e_1_3_2_1_30_1","unstructured":"Maya Gupta, Andrew Cotter, Mahdi Milani Fard, and Serena Wang. 2018. Proxy Fairness. arXiv:1806.11212 [cs, stat] (June 2018). http:\/\/arxiv.org\/abs\/1806.11212"},{"key":"e_1_3_2_1_31_1","doi-asserted-by":"publisher","DOI":"10.1007\/s10618-014-0393-7"},{"volume-title":"Morgan Klaus Scheuerman, and Stacy M. Branham","year":"2018","author":"Hamidi Foad","key":"e_1_3_2_1_32_1","unstructured":"Foad Hamidi, Morgan Klaus Scheuerman, and Stacy M. Branham. 2018. Gender Recognition or Gender Reductionism?: The Social Implications of Embedded Gender Recognition Systems. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems - CHI '18. ACM Press, Montreal QC, Canada, 1--13. https:\/\/doi.org\/10.1145\/3173574.3173582"},{"key":"e_1_3_2_1_33_1","doi-asserted-by":"publisher","DOI":"10.1145\/3351095.3372826"},{"key":"e_1_3_2_1_34_1","doi-asserted-by":"publisher","DOI":"10.2307\/3178066"},{"volume-title":"Stretching human laws to apply to machines: The dangers of a 'Colorblind' Computer","year":"2019","author":"Harned Zach","key":"e_1_3_2_1_35_1","unstructured":"Zach Harned and Hanna Wallach. 2019. Stretching human laws to apply to machines: The dangers of a 'Colorblind' Computer. Florida State University Law Review, Forthcoming (2019)."},{"volume-title":"Fairness Without Demographics in Repeated Loss Minimization. arXiv:1806.08010 [cs, stat] (July","year":"2018","author":"Hashimoto Tatsunori B.","key":"e_1_3_2_1_36_1","unstructured":"Tatsunori B. Hashimoto, Megha Srivastava, Hongseok Namkoong, and Percy Liang. 2018. Fairness Without Demographics in Repeated Loss Minimization. arXiv:1806.08010 [cs, stat] (July 2018). http:\/\/arxiv.org\/abs\/1806.08010"},{"key":"e_1_3_2_1_37_1","doi-asserted-by":"publisher","DOI":"10.1080\/1369118X.2019.1573912"},{"key":"e_1_3_2_1_38_1","doi-asserted-by":"publisher","DOI":"10.1145\/3290605.3300830"},{"key":"e_1_3_2_1_39_1","first-page":"1","article-title":"So What \"Should","volume":"3","author":"Howell Junia","year":"2017","unstructured":"Junia Howell and Michael O. Emerson. 2017. So What \"Should\" We Use? Evaluating the Impact of Five Racial Measures on Markers of Social Inequality. Sociology of Race and Ethnicity 3, 1 (Jan. 2017), 14--30.
https:\/\/doi.org\/10.1177\/2332649216648465","journal-title":"Evaluating the Impact of Five Racial Measures on Markers of Social Inequality. Sociology of Race and Ethnicity"},{"key":"e_1_3_2_1_40_1","doi-asserted-by":"publisher","DOI":"10.1145\/3351095.3375674"},{"volume-title":"Crossing the Quality Chasm: A New Health System for the 21st Century","author":"Institute of Medicine (US) Committee on Quality of Health Care in America. 2001.","key":"e_1_3_2_1_41_1","unstructured":"Institute of Medicine (US) Committee on Quality of Health Care in America. 2001. Crossing the Quality Chasm: A New Health System for the 21st Century. National Academies Press (US), Washington (DC). http:\/\/www.ncbi.nlm.nih.gov\/books\/NBK222274\/"},{"volume-title":"Jacobs and Hanna Wallach","year":"2019","author":"Abigail","key":"e_1_3_2_1_42_1","unstructured":"Abigail Z. Jacobs and Hanna Wallach. 2019. Measurement and Fairness. arXiv:1912.05511 [cs] (Dec. 2019). http:\/\/arxiv.org\/abs\/1912.05511"},{"volume-title":"International Conference on Machine Learning. PMLR, 3000--3008","year":"2019","author":"Jagielski Matthew","key":"e_1_3_2_1_43_1","unstructured":"Matthew Jagielski, Michael Kearns, Jieming Mao, Alina Oprea, Aaron Roth, Saeed Sharifi-Malvajerdi, and Jonathan Ullman.
2019. Differentially private fair learning. In International Conference on Machine Learning. PMLR, 3000--3008."},{"key":"e_1_3_2_1_44_1","unstructured":"LLana James. 2020. Race-Based COVID-19 Data May Be Used to Discriminate against Racialized Communities. http:\/\/theconversation.com\/race-based-covid-19-data-may-be-used-to-discriminate-against-racialized-communities-138372"},{"volume-title":"Advances and Open Problems in Federated Learning. arXiv:1912.04977 [cs, stat] (Dec","year":"2019","author":"Kairouz Peter","key":"e_1_3_2_1_45_1","unstructured":"Peter Kairouz and Others. 2019. Advances and Open Problems in Federated Learning. arXiv:1912.04977 [cs, stat] (Dec. 2019). http:\/\/arxiv.org\/abs\/1912.04977"},{"key":"e_1_3_2_1_46_1","doi-asserted-by":"publisher","DOI":"10.1109\/ICDMW.2011.83"},{"volume-title":"Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency. ACM, Barcelona Spain, 45--55","author":"Katell Michael","key":"e_1_3_2_1_47_1","unstructured":"Michael Katell, Meg Young, Dharma Dailey, Bernease Herman, Vivian Guetler, Aaron Tam, Corinne Binz, Daniella Raz, and P. M. Krafft. 2020. Toward Situated Interventions for Algorithmic Equity: Lessons from the Field. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency.
ACM, Barcelona Spain, 45--55. https:\/\/doi.org\/10.1145\/3351095.3372874"},{"volume-title":"Blind Justice: Fairness with Encrypted Sensitive Attributes. arXiv:1806.03281 [cs, stat] (June","year":"2018","author":"Kilbertus Niki","key":"e_1_3_2_1_48_1","unstructured":"Niki Kilbertus, Adri\u00e0 Gasc\u00f3n, Matt J. Kusner, Michael Veale, Krishna P. Gummadi, and Adrian Weller. 2018. Blind Justice: Fairness with Encrypted Sensitive Attributes. arXiv:1806.03281 [cs, stat] (June 2018). http:\/\/arxiv.org\/abs\/1806.03281"},{"volume-title":"Fair Decision Making Using Privacy-Protected Data. arXiv:1905.12744 [cs] (Jan","year":"2020","author":"Kuppam Satya","key":"e_1_3_2_1_49_1","unstructured":"Satya Kuppam, Ryan Mckenna, David Pujol, Michael Hay, Ashwin Machanavajjhala, and Gerome Miklau. 2020. Fair Decision Making Using Privacy-Protected Data. arXiv:1905.12744 [cs] (Jan. 2020). http:\/\/arxiv.org\/abs\/1905.12744"},{"volume-title":"Chi","year":"2020","author":"Lahoti Preethi","key":"e_1_3_2_1_50_1","unstructured":"Preethi Lahoti, Alex Beutel, Jilin Chen, Kang Lee, Flavien Prost, Nithum Thain, Xuezhi Wang, and Ed H. Chi. 2020. Fairness without Demographics through Adversarially Reweighted Learning. arXiv:2006.13114 [cs, stat] (June 2020). http:\/\/arxiv.org\/abs\/2006.13114"},{"volume-title":"How Cambridge Analytica Sparked the Great Privacy Awakening. Wired (March","year":"2019","author":"Lapowsky Issie","key":"e_1_3_2_1_51_1","unstructured":"Issie Lapowsky. 2019. How Cambridge Analytica Sparked the Great Privacy Awakening. Wired (March 2019). https:\/\/www.wired.com\/story\/cambridge-analytica-facebook-privacy-awakening\/"},{"key":"e_1_3_2_1_52_1","unstructured":"LinkedIn. [n.d.]. LinkedIn Recruiter: The Industry-Standard Recruiting Tool. https:\/\/business.linkedin.com\/talent-solutions\/recruiter"},{"key":"e_1_3_2_1_53_1","first-page":"11","article-title":"Innovative Methodologies in Qualitative Research","volume":"12","author":"Maramwidze-Merrison Efrider","year":"2016","unstructured":"Efrider Maramwidze-Merrison. 2016. Innovative Methodologies in Qualitative Research: Social Media Window for Accessing Organisational Elites for Interviews. 12, 2 (2016), 11.
","journal-title":"Social Media Window for Accessing Organisational Elites for Interviews."},{"volume-title":"Andrew Smart, and William S. Isaac","year":"2020","author":"Jr Donald Martin","key":"e_1_3_2_1_54_1","unstructured":"Donald Martin Jr., Vinodkumar Prabhakaran, Jill Kuhlberg, Andrew Smart, and William S. Isaac. 2020. Participatory Problem Formulation for Fairer Machine Learning Through Community Based System Dynamics. arXiv:2005.07572 [cs, stat] (May 2020). http:\/\/arxiv.org\/abs\/2005.07572"},{"key":"e_1_3_2_1_55_1","unstructured":"Microsoft. 2020. Fairlearn. https:\/\/github.com\/fairlearn\/fairlearn"},{"key":"e_1_3_2_1_56_1","doi-asserted-by":"publisher","DOI":"10.1177\/1527476419837739"},{"volume-title":"Proceedings of the ACM on Human-Computer Interaction 3, CSCW (Nov. 2019","year":"1909","author":"Mulligan Deirdre K.","key":"e_1_3_2_1_57_1","unstructured":"Deirdre K. Mulligan, Joshua A. Kroll, Nitin Kohli, and Richmond Y. Wong. 2019. This Thing Called Fairness: Disciplinary Confusion Realizing a Value in Technology. Proceedings of the ACM on Human-Computer Interaction 3, CSCW (Nov. 2019), 1--36. https:\/\/doi.org\/10.1145\/3359221 arXiv:1909.11869"},{"key":"e_1_3_2_1_58_1","unstructured":"Mimi Onuoha. 2020. When Proof Is Not Enough. https:\/\/fivethirtyeight.com\/features\/when-proof-is-not-enough\/"},{"key":"e_1_3_2_1_59_1","first-page":"37","article-title":"Our Data Bodies","volume":"15","author":"Petty Tawana","year":"2018","unstructured":"Tawana Petty, Mariella Saba, Tamika Lewis, Seeta Pe\u00f1a Gangadharan, and Virginia Eubanks. 2018. Our Data Bodies: Reclaiming Our Data. June 15 (2018), 37.","journal-title":"Reclaiming Our Data"},{"volume-title":"Jennifer Walker, and Per Axelsson.","year":"2019","author":"Rainie Stephanie Carroll","key":"e_1_3_2_1_60_1","unstructured":"Stephanie Carroll Rainie, Tahu Kukutai, Maggie Walter, Oscar Luis Figueroa-Rodr\u00edguez, Jennifer Walker, and Per Axelsson. 2019. Indigenous data sovereignty. The State of Open Data: Histories and Horizons (2019), 300."},{"volume-title":"Where Responsible AI Meets Reality: Practitioner Perspectives on Enablers for Shifting Organizational Practices. arXiv:2006.12358 [cs] (July","year":"2020","author":"Rakova Bogdana","key":"e_1_3_2_1_61_1","unstructured":"Bogdana Rakova, Jingying Yang, Henriette Cramer, and Rumman Chowdhury. 2020. Where Responsible AI Meets Reality: Practitioner Perspectives on Enablers for Shifting Organizational Practices. arXiv:2006.12358 [cs] (July 2020).
http:\/\/arxiv.org\/abs\/2006.12358"},{"key":"e_1_3_2_1_62_1","unstructured":"Nani Jansen Reventlow. [n.d.]. Data collection is not the solution for Europe's racism problem. https:\/\/www.aljazeera.com\/opinions\/2020\/7\/29\/data-collection-is-not-the-solution-for-europes-racism-problem\/?gb=true"},{"volume-title":"What's in a Name? Reducing Bias in Bios without Access to Protected Attributes. arXiv:1904.05233 [cs, stat] (April","year":"2019","author":"Romanov Alexey","key":"e_1_3_2_1_63_1","unstructured":"Alexey Romanov, Maria De-Arteaga, Hanna Wallach, Jennifer Chayes, Christian Borgs, Alexandra Chouldechova, Sahin Geyik, Krishnaram Kenthapadi, Anna Rumshisky, and Adam Tauman Kalai. 2019. What's in a Name? Reducing Bias in Bios without Access to Protected Attributes. arXiv:1904.05233 [cs, stat] (April 2019). http:\/\/arxiv.org\/abs\/1904.05233"},{"volume-title":"Aequitas: A Bias and Fairness Audit Toolkit.
arXiv:1811.05577 [cs] (April","year":"2019","author":"Saleiro Pedro","key":"e_1_3_2_1_64_1","unstructured":"Pedro Saleiro , Benedict Kuester , Loren Hinkson , Jesse London , Abby Stevens , Ari Anisfeld , Kit T. Rodolfa , and Rayid Ghani . 2019 . Aequitas: A Bias and Fairness Audit Toolkit. arXiv:1811.05577 [cs] (April 2019). http:\/\/arxiv.org\/abs\/1811.05577 Pedro Saleiro, Benedict Kuester, Loren Hinkson, Jesse London, Abby Stevens, Ari Anisfeld, Kit T. Rodolfa, and Rayid Ghani. 2019. Aequitas: A Bias and Fairness Audit Toolkit. arXiv:1811.05577 [cs] (April 2019). http:\/\/arxiv.org\/abs\/1811.05577"},{"key":"e_1_3_2_1_65_1","doi-asserted-by":"publisher","DOI":"10.1147\/JRD.2019.2945519"},{"volume-title":"Proceedings of the ACM on Human-Computer Interaction 4, CSCW1 (May","year":"2020","author":"Scheuerman Morgan Klaus","key":"e_1_3_2_1_66_1","unstructured":"Morgan Klaus Scheuerman , Kandrea Wade , Caitlin Lustig , and Jed R. Brubaker . 2020. How We've Taught Algorithms to See Identity: Constructing Race and Gender in Image Databases for Facial Analysis . Proceedings of the ACM on Human-Computer Interaction 4, CSCW1 (May 2020 ), 1--35. https:\/\/doi.org\/10.1145\/3392866 10.1145\/3392866 Morgan Klaus Scheuerman, Kandrea Wade, Caitlin Lustig, and Jed R. Brubaker. 2020. How We've Taught Algorithms to See Identity: Constructing Race and Gender in Image Databases for Facial Analysis. Proceedings of the ACM on Human-Computer Interaction 4, CSCW1 (May 2020), 1--35. https:\/\/doi.org\/10.1145\/3392866"},{"key":"e_1_3_2_1_67_1","doi-asserted-by":"publisher","DOI":"10.1145\/3287560.3287598"},{"key":"e_1_3_2_1_68_1","unstructured":"Suranga Seneviratne. 2019. The Ugly Truth: Tech Companies Are Tracking and Misusing Our Data and There's Little We Can Do. http:\/\/theconversation.com\/the-ugly-truth-tech-companies-are-tracking-and-misusing-our-data-and-theres-little-we-can-do-127444 Suranga Seneviratne. 2019. 
The Ugly Truth: Tech Companies Are Tracking and Misusing Our Data and There's Little We Can Do. http:\/\/theconversation.com\/the-ugly-truth-tech-companies-are-tracking-and-misusing-our-data-and-theres-little-we-can-do-127444"},{"key":"e_1_3_2_1_69_1","unstructured":"Sachil Singh. 2020. Collecting race-based data during pandemic may fuel dangerous prejudices. https:\/\/www.queensu.ca\/gazette\/stories\/collecting-race-based-data-during-pandemic-may-fuel-dangerous-prejudices Sachil Singh. 2020. Collecting race-based data during pandemic may fuel dangerous prejudices. https:\/\/www.queensu.ca\/gazette\/stories\/collecting-race-based-data-during-pandemic-may-fuel-dangerous-prejudices"},{"key":"e_1_3_2_1_70_1","doi-asserted-by":"publisher","DOI":"10.21105\/joss.01904"},{"key":"e_1_3_2_1_71_1","doi-asserted-by":"publisher","DOI":"10.1177\/2053951717736335"},{"volume-title":"Regulating artificial intelligence","author":"Tischbirek Alexander","key":"e_1_3_2_1_72_1","unstructured":"Alexander Tischbirek . 2020. Artificial intelligence and discrimination: Discriminating against discriminatory systems . In Regulating artificial intelligence . Springer , 103--121. Alexander Tischbirek. 2020. Artificial intelligence and discrimination: Discriminating against discriminatory systems. In Regulating artificial intelligence. Springer, 103--121."},{"key":"e_1_3_2_1_73_1","unstructured":"UK Information Commissioner's Office. 2020. What do we need to do to ensure lawfulness fairness and transparency in AI systems? https:\/\/ico.org.uk\/for-organisations\/guide-to-data-protection\/key-data-protection- themes\/guidance-on-ai-and-data-protection\/what-do-we-need-to-do-to-ensure-lawfulness-fairness-and-transparency-in-ai-systems\/ UK Information Commissioner's Office. 2020. What do we need to do to ensure lawfulness fairness and transparency in AI systems? 
https:\/\/ico.org.uk\/for-organisations\/guide-to-data-protection\/key-data-protection- themes\/guidance-on-ai-and-data-protection\/what-do-we-need-to-do-to-ensure-lawfulness-fairness-and-transparency-in-ai-systems\/"},{"volume-title":"LiFT: A Scalable Framework for Measuring Fairness in ML Applications. arXiv:2008.07433 [cs] (Aug","year":"2020","author":"Vasudevan Sriram","key":"e_1_3_2_1_74_1","unstructured":"Sriram Vasudevan and Krishnaram Kenthapadi . 2020. LiFT: A Scalable Framework for Measuring Fairness in ML Applications. arXiv:2008.07433 [cs] (Aug . 2020 ). https:\/\/doi.org\/10.1145\/3340531.3412705 10.1145\/3340531.3412705 Sriram Vasudevan and Krishnaram Kenthapadi. 2020. LiFT: A Scalable Framework for Measuring Fairness in ML Applications. arXiv:2008.07433 [cs] (Aug. 2020). https:\/\/doi.org\/10.1145\/3340531.3412705"},{"key":"e_1_3_2_1_75_1","doi-asserted-by":"publisher","DOI":"10.1177\/2053951717743530"},{"key":"e_1_3_2_1_76_1","doi-asserted-by":"publisher","DOI":"10.1145\/3173574.3174014"},{"volume-title":"Democratic Data: A Relational Theory For Data Governance. SSRN Scholarly Paper ID 3727562. Social Science Research Network","year":"2020","author":"Viljoen Salome","key":"e_1_3_2_1_77_1","unstructured":"Salome Viljoen . 2020 . Democratic Data: A Relational Theory For Data Governance. SSRN Scholarly Paper ID 3727562. Social Science Research Network , Rochester, NY . https:\/\/doi.org\/10.2139\/ssrn.3727562 10.2139\/ssrn.3727562 Salome Viljoen. 2020. Democratic Data: A Relational Theory For Data Governance. SSRN Scholarly Paper ID 3727562. Social Science Research Network, Rochester, NY. https:\/\/doi.org\/10.2139\/ssrn.3727562"},{"key":"e_1_3_2_1_78_1","first-page":"1","article-title":"The What-If Tool: Interactive Probing of Machine Learning Models","volume":"26","author":"Wexler James","year":"2020","unstructured":"James Wexler , Mahima Pushkarna , Tolga Bolukbasi , Martin Wattenberg , Fernanda Vi\u00e9gas , and Jimbo Wilson . 2020 . 
The What-If Tool: Interactive Probing of Machine Learning Models . IEEE Transactions on Visualization and Computer Graphics 26 , 1 (Jan. 2020), 56--65. https:\/\/doi.org\/10.1109\/TVCG.2019.2934619 10.1109\/TVCG.2019.2934619 James Wexler, Mahima Pushkarna, Tolga Bolukbasi, Martin Wattenberg, Fernanda Vi\u00e9gas, and Jimbo Wilson. 2020. The What-If Tool: Interactive Probing of Machine Learning Models. IEEE Transactions on Visualization and Computer Graphics 26, 1 (Jan. 2020), 56--65. https:\/\/doi.org\/10.1109\/TVCG.2019.2934619","journal-title":"IEEE Transactions on Visualization and Computer Graphics"},{"key":"e_1_3_2_1_79_1","doi-asserted-by":"publisher","DOI":"10.5325\/jinfopoli.8.2018.0078"},{"volume-title":"Fair Lending: Race and Gender Data are Limited for Non-Mortgage Lending. Subcommittee on Oversight and Investigations, Committee on Financial Services","year":"2008","author":"Williams Orice M","key":"e_1_3_2_1_80_1","unstructured":"Orice M Williams . 2008 . Fair Lending: Race and Gender Data are Limited for Non-Mortgage Lending. Subcommittee on Oversight and Investigations, Committee on Financial Services , House of Representatives ( 2008). arXiv:GAO-08-1023T Orice M Williams. 2008. Fair Lending: Race and Gender Data are Limited for Non-Mortgage Lending. Subcommittee on Oversight and Investigations, Committee on Financial Services, House of Representatives (2008). arXiv:GAO-08-1023T"},{"key":"e_1_3_2_1_81_1","doi-asserted-by":"publisher","DOI":"10.1007\/s13347-019-00355-w"},{"key":"e_1_3_2_1_82_1","article-title":"Reconciling legal and technical approaches to algorithmic bias","volume":"88","author":"Xiang Alice","year":"2021","unstructured":"Alice Xiang . 2021 . Reconciling legal and technical approaches to algorithmic bias . Tennessee Law Review 88 , 3 (2021). Alice Xiang. 2021. Reconciling legal and technical approaches to algorithmic bias. 
Tennessee Law Review 88, 3 (2021).","journal-title":"Tennessee Law Review"},{"volume-title":"Manuel Gomez Rogriguez, and Krishna P. Gummadi","year":"2017","author":"Zafar Muhammad Bilal","key":"e_1_3_2_1_83_1","unstructured":"Muhammad Bilal Zafar , Isabel Valera , Manuel Gomez Rogriguez, and Krishna P. Gummadi . 2017 . Fairness Constraints : Mechanisms for Fair Classification. In Artificial Intelligence and Statistics. PMLR , 962--970. http:\/\/proceedings.mlr.press\/v54\/zafar17a.html Muhammad Bilal Zafar, Isabel Valera, Manuel Gomez Rogriguez, and Krishna P. Gummadi. 2017. Fairness Constraints: Mechanisms for Fair Classification. In Artificial Intelligence and Statistics. PMLR, 962--970. http:\/\/proceedings.mlr.press\/v54\/zafar17a.html"},{"key":"e_1_3_2_1_84_1","first-page":"1375","article-title":"Understanding discrimination in the scored society","volume":"89","author":"Zarsky Tal Z","year":"2014","unstructured":"Tal Z Zarsky . 2014 . Understanding discrimination in the scored society . Washington Law Review 89 (2014), 1375 . Tal Z Zarsky. 2014. Understanding discrimination in the scored society. Washington Law Review 89 (2014), 1375.","journal-title":"Washington Law Review"},{"key":"e_1_3_2_1_85_1","first-page":"1","article-title":"Assessing Fair Lending Risks Using Race\/Ethnicity Proxies","volume":"64","author":"Zhang Yan","year":"2016","unstructured":"Yan Zhang . 2016 . Assessing Fair Lending Risks Using Race\/Ethnicity Proxies . Management Science 64 , 1 (Nov. 2016), 178--197. https:\/\/doi.org\/10.1287\/mnsc.2016.2579 10.1287\/mnsc.2016.2579 Yan Zhang. 2016. Assessing Fair Lending Risks Using Race\/Ethnicity Proxies. Management Science 64, 1 (Nov. 2016), 178--197. 
https:\/\/doi.org\/10.1287\/mnsc.2016.2579","journal-title":"Management Science"},{"key":"e_1_3_2_1_86_1","doi-asserted-by":"publisher","DOI":"10.1007\/s10506-016-9182-5"}],"event":{"name":"FAccT '21: 2021 ACM Conference on Fairness, Accountability, and Transparency","sponsor":["ACM Association for Computing Machinery"],"location":"Virtual Event Canada","acronym":"FAccT '21"},"container-title":["Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency"],"original-title":[],"link":[{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3442188.3445888","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2023,2,2]],"date-time":"2023-02-02T20:05:31Z","timestamp":1675368331000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3442188.3445888"}},"subtitle":["Challenges to Demographic Data Procurement in the Pursuit of Fairness"],"short-title":[],"issued":{"date-parts":[[2021,3]]},"references-count":86,"alternative-id":["10.1145\/3442188.3445888","10.1145\/3442188"],"URL":"https:\/\/doi.org\/10.1145\/3442188.3445888","relation":{},"subject":[],"published":{"date-parts":[[2021,3]]},"assertion":[{"value":"2021-03-01","order":2,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}