FAIR-EDU: Promote Fairness in Education Institutions
Overview
When bias affects individuals or groups characterized by legally protected sensitive attributes (e.g., gender), the inequalities reinforced by search and recommendation algorithms can lead to severe societal consequences, such as discrimination and unfairness.
FAIR-EDU tests and estimates the algorithmic bias present in staff-related data generated and used by the University of L’Aquila, in alignment with the institution’s Gender Equality Plan.
The project identified seven research questions, addressed in four interleaved phases carried out with the support of the Gender Equality Plan's Working Group and the IT staff.
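To make the notion of "estimating algorithmic bias" concrete, the sketch below computes a statistical (demographic) parity difference, a standard group-fairness measure: the gap in positive-outcome rates between two groups. This is an illustrative example only; the function name, the toy data, and the choice of metric are assumptions for illustration, not the project's actual methodology.

```python
def parity_difference(outcomes, groups, reference_group, protected_group):
    """Statistical parity difference between two groups.

    outcomes: list of 0/1 decisions (e.g., promoted / not promoted)
    groups:   list of group labels, aligned with outcomes
    Returns the positive-outcome rate of reference_group minus that
    of protected_group; 0 indicates parity on this metric.
    """
    def rate(g):
        selected = [o for o, grp in zip(outcomes, groups) if grp == g]
        return sum(selected) / len(selected)

    return rate(reference_group) - rate(protected_group)


# Hypothetical staff records: 1 = positive decision, labels are made up.
outcomes = [1, 0, 1, 1, 0, 1, 0, 0]
groups = ["M", "M", "M", "M", "F", "F", "F", "F"]

# Rate for "M" is 0.75, for "F" is 0.25, so the difference is 0.5.
print(parity_difference(outcomes, groups, "M", "F"))
```

A nonzero value flags a disparity worth investigating; in practice such a raw gap would be examined alongside confounders (role, seniority, department) before drawing conclusions.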
Related Publications
- Uncovering Gender Gap in Academia – Journal of Systems and Software (2024)
- Enhancing Fairness in Classification Tasks – BIAS@ECIR 2022
- Debiaser for Multiple Variables – Information Processing & Management (2023)