Facial recognition and GDPR compliance: the impossible convergence?
On 21 February 2020, several sources reported that the national police forces of a number of countries had asked the European Commission for the ability to share their facial recognition databases for EU-wide investigations.
In France, experiments are already under way and have even given rise to legal proceedings. In a judgment of 31 October 2019, the Criminal Court of Lyon convicted an individual of the theft of a lorry and its cargo on the basis of CCTV images cross-referenced with images from both the criminal records (TAJ) database1 and the anthropometric photographs (GASPARD) database2. Although the investigation also gathered other evidence of the defendant's guilt, his lawyer did not fail to point out that the software used to cross-reference the data had not been disclosed during the proceedings and that, even though evidence may be provided by any means under French criminal law, the lawfulness of such software was doubtful.
Facial recognition software is based on algorithms, and algorithms are based on human-directed choices. As a rule, such software should comply with the General Data Protection Regulation (GDPR) and the related Police and Criminal Justice Data Protection Directive, and should ensure fairness and dignity through a 'data subject protection by design' approach. Otherwise, facial recognition software will certainly lack the transparency expected of this type of tool, for at least four reasons:
- the source of the data used (particularly large-scale cross-checking of filing systems);
- the purpose of the processing: using your image to unlock your smartphone is not the same thing as signing your tax return with it (which will soon be possible in France through the ALICEM3 facial recognition ID app project), still less being subjected to constant and general judicial monitoring;
- the risk of racial or ethnic bias;
- the technical limitations (both false positives and false negatives reduce performance to less than 40% correct results, to mention only the latest experiment conducted in the streets of London) and the vulnerability to computer attacks (the American company Clearview recently had the bitter experience of seeing 2,200 of the 3 billion profiles it collected on social networks stolen4).
In France, the CNIL has stayed the course, rejecting the projects submitted by the cities of Nice and Marseille5 to use facial recognition to control access to high schools. Amid the Covid-19 pandemic, tracking apps are currently high on the agenda of European supervisory authorities, which are concerned with similar issues, including the large-scale monitoring of individuals' movements and their probability of exposure to the virus. Following the opinion of the European Data Protection Board (EDPB), the European Commission, together with the Member States, has just published a toolbox6 describing best practices for this type of app. It would be equally urgent for a convergent European position to be taken on how to reconcile facial recognition with respect for the rights of individuals.
1 TAJ: traitement d'antécédents judiciaires (criminal records processing system)
2 GASPARD: gestion automatisée des signalements et des photos anthropométriques répertoriées et distribuables (automated management of reported descriptions and of indexed, distributable anthropometric photographs)
4 www.infosecurity-magazine.com/news/facial-recognition-clearview-ai/
6 ec.europa.eu/commission/presscorner/detail/fr/ip_20_670
Article provided by: Eric Le Quellenec (Alain Bensoussan Avocats Lexing, France)
Dr. Tobias Höllwarth (Managing Director INPLP)