When Small Decisions Have Big Impact: The Hidden Consequences of Algorithmic Decision-Making

About this Session

Thu. 07.04. 10:30

Speaker: Ruben Bach, Co-Authors: Christoph Kern, Frauke Kreuter

Statistical profiling is increasingly used in public administration to inform high-stakes policy decisions such as the allocation of scarce public resources. One example is statistical profiling of jobseekers’ risk of becoming long-term unemployed (LTU). To date, such risk assessments have been based on human experience or on a small set of pre-defined rules. The hope is that statistical and algorithmic profiling will increase both the effectiveness and the objectivity of the decision-making process by producing better risk assessments that eventually translate into better decisions. However, concerns have been raised that statistical profiling may exacerbate existing inequalities and result in unfair, discriminatory decisions. Against this backdrop, we evaluate statistical profiling models under realistic conditions using German administrative labor market data on jobseekers’ (un)employment histories. We show that these data can be used to predict LTU with high accuracy. However, different classification policies based on our estimated risk scores have very different fairness implications, as our models tend to exacerbate pre-existing differences between societal groups. Automating policies via statistical profiling can thus perpetuate structural inequalities in the labor market if such differences go unnoticed. We therefore call for rigorous auditing processes before profiling models are put into practice.
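The point that different classification policies carry different fairness implications can be illustrated with a minimal sketch. The data and group labels below are simulated and hypothetical, not the authors' administrative data or model: two groups receive risk scores from slightly shifted distributions (standing in for pre-existing differences), and the share flagged as high-risk then diverges by group depending on where the decision threshold is set.

```python
# Hypothetical sketch: the same risk scores yield different group-level
# selection rates depending on the classification threshold chosen.
# Groups "A" and "B" and their score distributions are invented for
# illustration only.
import random

random.seed(0)

# Simulated LTU risk scores in [0, 1]; group B's distribution is shifted
# upward, mimicking pre-existing differences between societal groups.
group_a = [random.betavariate(2, 5) for _ in range(1000)]
group_b = [random.betavariate(3, 4) for _ in range(1000)]

def selection_rate(scores, threshold):
    """Share of jobseekers flagged as high-risk at a given threshold."""
    return sum(s >= threshold for s in scores) / len(scores)

for threshold in (0.3, 0.5, 0.7):
    rate_a = selection_rate(group_a, threshold)
    rate_b = selection_rate(group_b, threshold)
    print(f"threshold={threshold}: group A {rate_a:.2%}, "
          f"group B {rate_b:.2%}, gap {abs(rate_a - rate_b):.2%}")
```

The gap between the two groups' selection rates is one simple fairness diagnostic (a demographic-parity difference); an audit of a profiling model would compare such group-wise rates, and error rates, across candidate thresholds before deployment.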