The increasing prevalence of automated decision-making increases the risk associated with models that can discriminate against disadvantaged groups. The Fairness Measures project contributes to the development of fairness-aware algorithms and systems by providing relevant datasets and software.
This website provides pointers to a series of datasets that we have collected and/or prepared. The datasets correspond to various fields and applications (e.g., finance, law, and human resources).
This website also provides code implementing a series of measures introduced in the literature to analyze and quantify discrimination. We include common tests as well as more specialized methods.
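As a simple illustration of what such a measure looks like (a hypothetical sketch, not code from the project itself), statistical parity difference is one common group-fairness metric: it compares the rate of positive outcomes received by a protected group with the rate received by everyone else, where a value of 0 indicates parity.

```python
def statistical_parity_difference(outcomes, protected):
    """Compute the statistical parity difference of binary decisions.

    outcomes:  list of 0/1 decisions (1 = positive outcome)
    protected: list of 0/1 flags (1 = member of the protected group)
    Returns rate(positive | protected) - rate(positive | not protected).
    """
    pos_protected = [y for y, p in zip(outcomes, protected) if p == 1]
    pos_rest = [y for y, p in zip(outcomes, protected) if p == 0]
    rate_protected = sum(pos_protected) / len(pos_protected)
    rate_rest = sum(pos_rest) / len(pos_rest)
    return rate_protected - rate_rest

# Example: the protected group receives positive outcomes at rate 0.75,
# the rest of the population at rate 0.25, so the difference is 0.5.
decisions = [1, 0, 1, 1, 0, 1, 0, 0]
group = [1, 1, 1, 1, 0, 0, 0, 0]
print(statistical_parity_difference(decisions, group))  # 0.5
```

The function names and data here are illustrative only; the actual software linked from this site implements its own set of measures.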
We would love to hear your comments and suggestions; please contact Meike Zehlike with any feedback you may have.