Editorial Material

The dual function of explanations: Why it is useful to compute explanations

Journal

COMPUTER LAW & SECURITY REVIEW
Volume 41 (2021), Article 105527

Publisher

ELSEVIER ADVANCED TECHNOLOGY
DOI: 10.1016/j.clsr.2020.105527

Keywords

Automated decisions; Artificial intelligence; Explainability; Explainable AI; GDPR

Funding

  UK Engineering and Physical Sciences Research Council (EPSRC) [EP/S027238/1, EP/S027254/1] (Funding Source: UKRI)

Abstract

Whilst the legal debate concerning automated decision-making has focused mainly on whether a 'right to explanation' exists in the GDPR, the emergence of 'explainable Artificial Intelligence' (XAI) has produced taxonomies for the explanation of Artificial Intelligence (AI) systems. However, various researchers have warned that transparency of algorithmic processes is not in itself enough; better and easier tools are needed for the assessment and review of the socio-technical systems that incorporate automated decision-making. The PLEAD project suggests that, aside from fulfilling the obligations set forth by Article 22 of the GDPR, explanations can also contribute to a holistic compliance strategy if used as detective controls. PLEAD aims to show that computable explanations can facilitate monitoring and auditing, and make compliance more systematic. Automated computable explanations can be key controls in fulfilling accountability and data-protection-by-design obligations, empowering both controllers and data subjects. This opinion piece presents the work undertaken by the PLEAD project towards facilitating the generation of computable explanations. PLEAD leverages provenance-based technology to compute explanations as external detective controls to the benefit of data subjects and as internal detective controls to the benefit of the data controller. © 2021 Niko Tsakalakis, Sophie Stalla-Bourdillon, Laura Carmichael, Trung Dong Huynh, Luc Moreau, Ayah Helal. Published by Elsevier Ltd. All rights reserved.
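
The abstract does not spell out the underlying technology, but provenance-based explanation systems of this kind are typically built on the W3C PROV data model: the decision pipeline is recorded as a graph of entities, activities, and agents, from which explanations can later be computed. Below is a minimal sketch using the Python `prov` package; the namespace and every identifier (the loan application, the scoring activity, the bank) are hypothetical illustrations, not details taken from the PLEAD project.

```python
from prov.model import ProvDocument

# Build a small provenance record for one automated decision.
doc = ProvDocument()
doc.add_namespace('ex', 'http://example.org/loan#')  # hypothetical namespace

# The input to the decision and the decision itself.
application = doc.entity('ex:application-42')
decision = doc.entity('ex:decision-42', {'ex:outcome': 'refused'})

# The decision-making activity and the responsible agent (the controller).
scoring = doc.activity('ex:credit-scoring')
controller = doc.agent('ex:acme-bank')

# Relate them: the scoring used the application, generated the decision,
# was carried out under the controller's responsibility, and the decision
# derives from the application data.
doc.used(scoring, application)
doc.wasGeneratedBy(decision, scoring)
doc.wasAssociatedWith(scoring, controller)
doc.wasDerivedFrom(decision, application)

print(doc.serialize(indent=2))  # PROV-JSON representation of the record
```

Because the record is a machine-readable graph rather than free text, the same data can serve both roles described above: queried by (or on behalf of) a data subject, it answers "what led to this decision?"; queried in bulk by the controller, it supports systematic monitoring and auditing.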

Authors

Niko Tsakalakis, Sophie Stalla-Bourdillon, Laura Carmichael, Trung Dong Huynh, Luc Moreau, Ayah Helal
