30 May 2018

#BigData: Discrimination in data-supported decision making

We live in a world of big data, in which developments in machine learning and artificial intelligence have changed everyday life. Decisions and processes concerning everyday life are increasingly automated and based on data, which affects fundamental rights in various ways. This focus paper deals specifically with discrimination, a fundamental rights area particularly affected by these technological developments.

The intersection of rights and technological developments warrants closer examination, prompting the Fundamental Rights Agency to research this theme.

When algorithms are used for decision making, there is potential for discrimination against individuals. The principle of non-discrimination, as enshrined in Article 21 of the Charter of Fundamental Rights of the European Union (EU), needs to be taken into account when applying algorithms to everyday life. This paper explains how such discrimination can occur and suggests possible solutions. The overall aim is to contribute to our understanding of the challenges encountered in this increasingly important field.