{"id":2592,"title":"Monitoring algorithmic systems will need more than the EU\u2019s GDPR","link":"https:\/\/www.reframetech.de\/en\/2019\/01\/24\/monitoring-algorithmic-systems-will-need-more-than-the-eus-gdpr\/","date":"01\/24\/2019","date_unix":1548311496,"date_modified_unix":1723454034,"date_iso":"2019-01-24T06:31:36+00:00","content":"<p><em>Algorithmic systems evaluate people \u2013 which poses risks \u2013 for us as individuals, for groups and for society as a whole. It is therefore important that algorithmic processes be auditable. Can the EU\u2019s General Data Protection Regulation (GDPR) help foster this kind of oversight and protect us from the risks inherent to algorithmic decision-making? Answers to these questions and more are provided by Wolfgang Schulz and Stephan Dreyer in an analysis commissioned by the Bertelsmann Stiftung.<\/em><\/p>\n<p>In algorithmic processes, the processing of data itself poses less of a risk to users than the decisions made on its basis. These decisions can have an impact not only on us as individuals, in terms of our autonomy and personal rights, but also on groups of people, in the form of discrimination. So far, however, the use of algorithms has continued largely without any form of social control. Against this background, Wolfgang Schulz and Stephan Dreyer explore the extent to which the GDPR, which went into effect in May 2018, supports oversight of algorithmic decision-making and thus protects us from the risks involved.<\/p>\n<p>Their analysis shows that the EU regulation has limited impact for two reasons. First, the GDPR\u2019s scope of application is narrow. It prohibits only those decisions that are fully automated and have an immediate legal impact or other significant effects. It still allows systems that prepare decisions and recommendations for humans. 
Fully automated decisions are those made without any human involvement \u2013 for example, when software screens out applicants before a human recruiter has even looked at their documents. However, if a human being makes the final decision with the support of an algorithmic system \u2013 for example, in granting credit \u2013 the GDPR does not apply. There are also exceptions to the ban \u2013 for example, when the person concerned gives their consent. Ultimately, this means algorithmic decision-making is becoming commonplace in our everyday lives.<\/p>\n<p><strong>GDPR does not protect against all risks associated with algorithmic decision-making<\/strong><\/p>\n<p>Second, the provisions of the GDPR strengthen the information rights of individual users to some extent and, through stricter documentation obligations, foster greater awareness of data subjects\u2019 rights among those responsible for data processing. The regulation, however, does not protect against socially relevant risks to principles such as fairness, non-discrimination and social inclusion, which extend beyond the data protection rights of any one individual. The GDPR\u2019s transparency requirements are not broad enough in scope to address systematic errors or discrimination against entire groups of people.<\/p>\n<p>Additional measures are therefore needed to improve the auditability of algorithmic systems. One such measure involves data protection authorities drawing attention to \u2013 and thereby raising public awareness of \u2013 the social risks inherent to algorithmic decision-making systems through the audits they conduct within the framework of the GDPR. Another option involves steps taken outside the framework of the GDPR that can foster oversight. 
These include reviews of algorithmic decision-making systems by external third parties and class-action suits pursued under consumer protection laws.<\/p>\n<figure id=\"attachment_2611\" aria-describedby=\"caption-attachment-2611\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" class=\" wp-image-2611\" src=\"https:\/\/www.reframetech.de\/en\/wp-content\/uploads\/sites\/23\/2019\/01\/DSGVO_ABB_5_en.jpg\" alt=\"\" width=\"658\" height=\"799\" srcset=\"https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2019\/01\/DSGVO_ABB_5_en.jpg 2067w, https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2019\/01\/DSGVO_ABB_5_en-247x300.jpg 247w, https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2019\/01\/DSGVO_ABB_5_en-843x1024.jpg 843w, https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2019\/01\/DSGVO_ABB_5_en-768x933.jpg 768w, https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2019\/01\/DSGVO_ABB_5_en-600x729.jpg 600w, https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2019\/01\/DSGVO_ABB_5_en-1265x1536.jpg 1265w, https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2019\/01\/DSGVO_ABB_5_en-1687x2048.jpg 1687w, https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2019\/01\/DSGVO_ABB_5_en-642x780.jpg 642w\" sizes=\"auto, (max-width: 658px) 100vw, 658px\" \/><figcaption id=\"caption-attachment-2611\" class=\"wp-caption-text\">Bertelsmann Stiftung<\/figcaption><\/figure>\n<p><strong>A PDF of the study \u201cWhat Are the Benefits of the General Data Protection Regulation for Automated Decision-Making Systems\u201d with a cover page (not under a Creative Commons license) is available <a href=\"https:\/\/www.bertelsmann-stiftung.de\/de\/publikationen\/publikation\/did\/the-general-data-protection-regulation-and-automated-decision-making-will-it-deliver\/\" target=\"_blank\" aria-label=\"Opens in a new tab\" rel=\"noopener 
noreferrer\">here<\/a>.<\/strong><\/p>\n<p><strong>If you wish to re-use or distribute the working paper, a Creative Commons-licensed version without the cover page is available <a href=\"https:\/\/www.reframetech.de\/en\/wp-content\/uploads\/sites\/23\/2019\/01\/GDPR_withoutCover-1.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">here<\/a>.<\/strong><\/p>\n<hr \/>\n<p>This text is licensed under a\u202f<a href=\"http:\/\/creativecommons.org\/licenses\/by\/4.0\/\" target=\"_blank\" aria-label=\"Opens in a new tab\" rel=\"noopener noreferrer\"><strong>Creative Commons Attribution 4.0 International License<\/strong><\/a><\/p>\n","excerpt":"<p>Algorithmic systems evaluate people \u2013 which poses risks \u2013 for us as individuals, for groups and for society as a [&hellip;]<\/p>\n","thumbnail":"https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2018\/04\/shutterstock_520256158_Premier_Cover_4_185x247mm-002-780x373.jpg","thumbnailsquare":"https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2018\/04\/shutterstock_520256158_Premier_Cover_4_185x247mm-002-370x370.jpg","authors":[{"id":1477,"name":"Stephan Dreyer","link":"https:\/\/www.reframetech.de\/blogger\/stephan-dreyer\/"},{"id":1476,"name":"Prof. Dr. Wolfgang Schulz","link":"https:\/\/www.reframetech.de\/blogger\/prof-dr-wolfgang-schulz\/"}],"categories":[{"id":698,"name":"Political decision-makers","link":"https:\/\/www.reframetech.de\/en\/category\/political-decision-makers\/"}],"tags":[{"id":504,"name":"EU","link":"https:\/\/www.reframetech.de\/en\/tag\/eu-en\/"},{"id":368,"name":"GDPR","link":"https:\/\/www.reframetech.de\/en\/tag\/gdpr\/"},{"id":639,"name":"Publications","link":"https:\/\/www.reframetech.de\/en\/tag\/publications\/"}]}