Using Algorithms to Tame Discrimination: A Path to Diversity, Equity, and Inclusion

Deven R. Desai - Georgia Institute of Technology and Yale Law School Information Society Project
Swati Gupta - Georgia Institute of Technology
Vol. 56
April 2023
Page 1703

Companies that try to address inequality in employment face a paradox. Failing to address disparities regarding protected classes in a company’s workforce can result in legal sanctions; yet proactive steps to address and avoid such disparities can draw legal scrutiny and sanctions as well. After the summer of 2020, companies such as Microsoft announced large programs to address inequity in employment. They soon received letters from the Labor Department’s Office of Federal Contract Compliance Programs (“OFCCP”) expressing concern that the plans would end up discriminating based on race. At the same time, the OFCCP announced a settlement with Microsoft on September 19, 2020, for $3 million in back pay and interest to address hiring disparities “against Asian applicants” for several positions from December 2015 to November 2018. These examples are not isolated and are likely to persist. Any company seeking to identify talent will likely use data and algorithms to screen and hire employees. That practice raises the same tension: how to increase diversity without reproducing embedded inequity or making decisions that are prohibited because they rest on protected class status. We offer a potential path forward to resolve this paradox by drawing on current advances in Computer Science and Operations Research.

By carefully acknowledging uncertainty in candidates’ data (using the framework of partially ordered sets), a hiring entity can improve its equal opportunity practices. The solution is to embed mitigation of errors arising from uncertainty or bias into an algorithmic decision-making process without crossing into illegal discriminatory practices (e.g., without enforcing quotas). In short, this Article explains a way to design fair screening methods that account for biases and uncertainties in data and abide by anti-discrimination law.
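To make the partially-ordered-set idea concrete, the sketch below models each candidate’s evaluation as a score interval and ranks one candidate above another only when the comparison holds under every resolution of the uncertainty. This is a minimal illustration of the general framework, not the Article’s algorithm; the `Candidate` class, the interval bounds, and the shortlisting rule are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    low: float   # lower bound of the estimated score under uncertainty
    high: float  # upper bound of the estimated score under uncertainty

def dominates(a: Candidate, b: Candidate) -> bool:
    """a ranks above b only if a beats b under every resolution of the
    uncertainty, i.e., a's worst case exceeds b's best case. Overlapping
    intervals leave the pair incomparable (a partial, not total, order)."""
    return a.low > b.high

def shortlist(candidates: list[Candidate], k: int) -> list[Candidate]:
    """Select up to k candidates by repeatedly taking the maximal elements
    of the partial order. Incomparable candidates are never ranked against
    each other, so noisy evaluations cannot silently exclude anyone."""
    remaining = list(candidates)
    selected: list[Candidate] = []
    while remaining and len(selected) < k:
        # Maximal elements: not dominated by any other remaining candidate.
        maximal = [c for c in remaining
                   if not any(dominates(o, c) for o in remaining if o is not c)]
        # Tie-breaking among incomparable maximal elements could use a
        # lottery or other neutral rule; here we take them in input order.
        for c in maximal:
            if len(selected) < k:
                selected.append(c)
                remaining.remove(c)
    return selected

if __name__ == "__main__":
    pool = [
        Candidate("A", 0.80, 0.90),
        Candidate("B", 0.55, 0.85),  # wide interval: a noisier evaluation
        Candidate("C", 0.30, 0.50),
    ]
    # A and B overlap, so neither dominates; both are shortlisted.
    print([c.name for c in shortlist(pool, 2)])  # -> ['A', 'B']
```

Because incomparable candidates are never forced into a total order, measurement noise cannot masquerade as merit: a candidate with a wide interval (say, due to a biased or less informative evaluation) is not automatically ranked below one with a narrow interval, yet the procedure still yields a shortlist of fixed size without imposing any quota.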
