The new guidelines issued by the European Data Protection Board on May 16, 2022 on the use of facial recognition reiterate a deliberately measured position, a warning that now seems firmly established.
On May 16, 2022, the European Data Protection Board (EDPB) adopted guidelines on the use of facial recognition technology by law enforcement and judicial authorities (prevention, investigation, evidence-gathering, enforcement of sanctions, etc.). The full text can be found here and will shortly be submitted to a six-week public consultation. Stay tuned: while the consultation naturally targets the actors concerned, it is open to everyone, and all contributions will be analyzed.
As a reminder, the EDPB (known in French as the CEPD) is an independent European body whose mission is to ensure the consistent application of the GDPR and to promote cooperation between the European Union's data protection authorities; it should not be confused with the EDPS (European Data Protection Supervisor), whose French acronym is also CEPD.
The EDPB's position on facial recognition has remained stable and consistent since the beginning of the debate, a debate that has inevitably accompanied the growing number of uses of this technology. The EDPB takes a cautious approach to facial recognition in particularly sensitive areas, notably its use by law enforcement and judicial authorities.
In this context, the joint opinion published by the EDPB and the EDPS (European Data Protection Supervisor) on June 18, 2021 on the proposed regulation on artificial intelligence had already recommended banning, in principle, the use of facial recognition technology in certain cases (which we will detail below), a targeted ban that the EDPB reiterates in the guidelines published on May 16. For the other areas, which are not subject to such a ban in principle, the EDPB sets out appropriate requirements based on existing data protection law to prevent any misuse.
This caution is understandable, as facial recognition can seriously affect the fundamental rights of European citizens: the right to privacy, of course, but more broadly the protection of human dignity and the freedoms of thought, conscience and religion. The latest scandal, surrounding the practices of the company Clearview AI, has led to multiple proceedings, including a fine of £7.5 million (approximately €8.85 million) imposed on May 23 by the British data protection authority, together with an order to delete the personal data of UK residents. The company offers its customers, including police and judicial authorities, a service that finds a person's online images from a submitted photograph. The British Information Commissioner, John Edwards, criticized the company, stating that it "does not only allow the identification of people whose pictures it has collected, but also monitors their behavior and offers it as a commercial service". Clearview AI is accused of having collected more than 20 billion images from across the web without establishing the slightest partnership with the publishers of the sites, services and apps in question, and therefore without the consent of the people concerned.
Prohibition in principle
In its guidelines, the EDPB reiterates its call for a complete ban on the use of facial recognition technology in certain areas where the risk of infringement of fundamental rights is considered especially high. The EDPB lists four such areas:
- The use of facial recognition for the remote identification of individuals in public spaces,
- The use of facial recognition to infer, from individuals' biometric data, their racial or ethnic origin, gender, political opinions or sexual orientation, or any other information that could serve as a basis for discrimination,
- The use of facial recognition to predict a person’s emotions,
- The use of facial recognition in a law enforcement context relying on a database built by collecting personal data on a large scale and indiscriminately, such as photographs or facial images gathered from the Internet.
In these different circumstances, the EDPB considers that identification by facial recognition is more likely to infringe the fundamental rights of European citizens than any competing interest, private or public, could justify.
The EDPB relies on many existing regulatory tools. It stresses that the use of facial recognition by state authorities must comply with the various obligations arising from the Law Enforcement Directive (LED), which defines the European framework for the protection of personal data processed for police and judicial purposes.
Let us recall here, to avoid any confusion, the relationship between the Law Enforcement Directive (LED) and the GDPR. Both texts are part of the European data protection framework, but they have separate and complementary scopes of application. The GDPR applies only to data processing carried out in the context of activities falling within the scope of EU law, which excludes processing for police and judicial purposes. The data protection framework for those particular sectors therefore rests on a separate instrument, the LED. Most of the data protection principles laid down by the LED are closely aligned with those of the GDPR, although some obligations are adapted to the specific context of law enforcement and state security.
According to the EDPB, using facial recognition in compliance with the LED means implementing the following safeguards:
- A legal basis resting on a legislative measure whose terms are sufficiently clear to give citizens an adequate indication of the conditions and circumstances in which the authorities are empowered to resort to facial recognition.
- The drafting of this legal basis must be submitted to the competent data protection authority for its opinion.
- The use of facial recognition systems must be strictly necessary for the purposes set out in the legislation. In this respect, a general aim of public interest is not in itself enough to justify resorting to facial recognition: measures must be limited to targeting and fighting specific serious offences, such as terrorism, and must not be general or indiscriminate.
- The requirements of necessity and proportionality must be respected; in general, they are unlikely to be met where data are processed systematically by law enforcement and judicial authorities without the knowledge of the data subjects.
- The fact that a person has published a photograph and thereby made it public does not give law enforcement and judicial authorities the right to use the related biometric data in a facial recognition system.
- Particular attention must be paid to the rights of data subjects over their personal data processed by a facial recognition system. Authorities must therefore closely respect the right of the persons concerned to be informed, as well as their right of access to the data and their right to rectify incorrect information stored in a database.
- A data protection impact assessment (DPIA) must be carried out systematically before implementing any facial recognition system.
The EDPB's position in these guidelines may seem strict, recommending a complete ban in certain sectors and a strong framework for the others. Such warnings, however, appear reasonable in light of the major effects that the uncontrolled spread of such technologies could have on the economic, political and social spheres of our European community. As we noted in our in-depth dossier dedicated to facial recognition in 2020, the opportunities offered by this technology are real, but so are the risks it poses to the fundamental rights of European citizens. A cautious approach therefore seems the right way to harness the potential of facial recognition without drifting toward the kind of dystopian surveillance society that less prudent countries, such as China, seem to be heading for.
A few more personal conclusions
As part of our commitment to promoting responsible innovation, we are wary of stifling innovation through an unrealistic regulatory approach.
Practice, beginning with the responses to the public consultation and continuing with the implementation of the new obligations, will no doubt add some nuance to these guidelines.