Automated decision-making (ADM) is the process by which a machine makes a decision without human involvement, based on factual data or digitally created profiles. Profiling takes place when personal aspects are evaluated in order to make predictions about individuals, even if no decision is made. This is the case, for example, when a company or an organisation assesses a person's characteristics (such as income, political views or personal preferences) or classifies them into categories. Organisations obtain personal information about individuals from a variety of sources, such as internet searches, buying habits, and lifestyle and behavioural data gathered from mobile phones, social networks or video surveillance.
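To make the distinction between profiling and ADM concrete, below is a minimal, purely hypothetical Python sketch: a profile is built from personal data, a person is classified into a category, and a decision is taken by a rule with no human in the loop. All names, thresholds and categories are invented for illustration and do not describe any real company's system.

```python
from dataclasses import dataclass

# Hypothetical profile built from personal data (illustration only).
@dataclass
class Profile:
    income: int            # yearly income in EUR
    owns_smartphone: bool  # behavioural signal, e.g. from app usage data
    late_payments: int     # number of late payments on record

def profile_category(p: Profile) -> str:
    """Profiling: classify a person into a category based on personal aspects."""
    if p.income > 50_000 and p.late_payments == 0:
        return "low_risk"
    if p.late_payments >= 3:
        return "high_risk"
    return "medium_risk"

def automated_decision(p: Profile) -> bool:
    """ADM: the loan decision is taken by the rule alone, without human involvement."""
    return profile_category(p) != "high_risk"

# A few data points decide the outcome; the affected person never speaks to anyone.
applicant = Profile(income=28_000, owns_smartphone=True, late_payments=3)
print(automated_decision(applicant))  # False: application rejected automatically
```

Article 22 GDPR, discussed below, addresses precisely this type of decision taken solely by a machine.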
The rise of artificial intelligence (AI) has led to an explosion in the number of algorithms used by companies and the public sector all over the world. However, these systems often follow a messy logic and are therefore error-prone. Mistakes can seriously impact people's lives, whether someone is not invited to a job interview, receives a bad credit score or is subjected to personalised pricing. Beyond limiting the use of ADM itself, key issues arising from ADM include bias, fairness, accountability and transparency around the reasons for a decision, as well as the ability to explain the basis on which a machine made a decision.
According to Article 22 GDPR, users have the right not to be subject to a decision based solely on automated means. Users also have the right to express their point of view, challenge the automated decision and receive meaningful human intervention.
In this context, noyb:
- observes a trend that ADM usually targets the most vulnerable groups (such as "gig workers", people reporting hate speech, job applicants or consumers), where companies do not want to invest the necessary resources to make reasonable decisions,
- has filed several complaints against tech giants like Amazon and Airbnb because they fail to comply with user rights provided for by the GDPR, and
- researches how ADM and profiling work, which data is used and where this data originates.
| Case | Controller | DPA | Status (expected duration) | Filed |
|---|---|---|---|---|
| C040 | Amazon (Luxembourg) | DSB (Austria), CNPD (Luxembourg) | Pending (3-4 years) | 3 years ago |
| C052 | Airbnb | BlnBDI (Berlin), DPC (Ireland) | Pending (2-3 years) | 2 years 11 months ago |
| C053 | Amazon (Luxembourg) | CNPD (Luxembourg) | Lost | 2 years 10 months ago |
| C063 | Telesign, BICS, Proximus PLC | APD/GBA (Belgium) | Pending (12-18 months) | 1 year 4 months ago |
| C088 | KSV 1870, Unsere Wasserkraft (easy green energy GmbH & Co KG) | DSB (Austria) | Pending (0-6 months) | 2 months 2 weeks ago |