TY - JOUR
T1 - Fairness in Algorithmic Profiling: The AMAS Case
AU - Achterhold, Eva
AU - Mühlböck, Monika
AU - Steiber, Nadia
AU - Kern, Christoph
N1 - Publisher Copyright:
© The Author(s) 2025.
PY - 2025/1/29
Y1 - 2025/1/29
N2 - We study a controversial application of algorithmic profiling in the public sector, the Austrian AMAS system. AMAS was supposed to help caseworkers at the Public Employment Service (PES) Austria to allocate support measures to job seekers based on their predicted chance of (re-)integration into the labor market. Shortly after its release, AMAS was criticized for its apparent unequal treatment of job seekers based on gender and citizenship. We systematically investigate the AMAS model using a novel real-world dataset of young job seekers from Vienna, which allows us to provide the first empirical evaluation of the AMAS model with a focus on fairness measures. We further apply bias mitigation strategies to study their effectiveness in our real-world setting. Our findings indicate that the prediction performance of the AMAS model is insufficient for use in practice, as more than 30% of job seekers would be misclassified in our use case. Further, our results confirm that the original model is biased with respect to gender as it tends to (incorrectly) assign women to the group with high chances of re-employment, which is not prioritized in the PES’ allocation of support measures. However, most bias mitigation strategies were able to improve fairness without compromising performance and thus may form an important building block in revising profiling schemes in the present context.
KW - Statistical discrimination
KW - Bias mitigation
KW - Algorithmic profiling
KW - Artificial intelligence
KW - Public employment services
UR - http://www.scopus.com/inward/record.url?scp=85218129922&partnerID=8YFLogxK
U2 - 10.1007/s11023-024-09706-9
DO - 10.1007/s11023-024-09706-9
M3 - Article
SN - 0924-6495
VL - 35
SP - 1
EP - 30
JO - Minds and Machines
JF - Minds and Machines
IS - 1
M1 - 9
ER -