Abstract:
The growing use of algorithms in social and economic life has raised a concern: that they may inadvertently discriminate against certain groups. Because the data used to train these algorithms are themselves tinged with stereotypes and past discrimination, it is natural to worry that biases are being "baked in." We consider this problem in the context of a specific but important case, one that is particularly amenable to economic analysis: using algorithmic predictions to guide decisions.

(Paper by Jon Kleinberg, Jens Ludwig, Sendhil Mullainathan, and Ashesh Rambachan.)