
Use of Drones by the French Police: Warning from the State Council

Recent demonstrations and protests around the world (the “Yellow Jackets” – Gilets Jaunes – in France, Black Lives Matter, etc.) have gone hand in hand with the increasing use of surveillance technologies by police forces to monitor participants. More generally, new technologies have given law enforcement authorities the opportunity to expand their control over the population, raising fears that these technologies may be used in ways that violate citizens’ fundamental rights and freedoms (think of the controversial debate over Covid-19 contact-tracing apps). Amazon also announced on 10 June that it would stop providing its facial recognition software (Rekognition) to police in the US, to give Congress time to put in place “stronger regulations to govern the ethical use of facial recognition technology”.

In this context, the French State Council – the highest administrative jurisdiction in France – issued an interesting decision on 18 May, ordering the Paris Police Department to immediately cease using drones to monitor the population’s compliance with the Covid-19 deconfinement measures.

In its ruling, the State Council underscores the implications of such “technopolice” practices for privacy by recalling the broad scope of the concept of personal data, which covers not only information relating to an identified individual, but also information that can lead to the identification of an individual through additional processing that might reasonably be employed. For instance, in the context of video surveillance of a public site, although the images recorded may not always allow the identification of every individual captured (e.g. if an individual is filmed from behind), it may nevertheless be possible to identify him or her on the basis of his or her appearance by cross-checking the images with other information held, for instance, by the site’s ticket office.

The facts underlying the State Council’s decision are as follows: from 18 March – the first day of confinement in France – the Paris Police Department used, on a daily basis, a drone to live-stream aerial video footage of several public sites in Paris. The images captured were transmitted in real time to a control centre, where police officers then decided what measures, if any, should be taken to ensure compliance with confinement rules. Possible measures included broadcasting messages via a loudspeaker mounted on the drone to the individuals located at one of the recorded sites, and/or deploying an intervention unit to the site to ensure dispersion or evacuation. The instructions given by the Paris Police Department to the teams operating the drone indicated that the drone’s zoom function should not be used and that the images should not be recorded. This practice continued after 11 May, to monitor compliance with the deconfinement rules.

On 2 May, two civil liberties associations (La Quadrature du Net and the Ligue des Droits de l’Homme) brought a case before an administrative tribunal by means of urgency proceedings, arguing that the deployment of the drone breached privacy and data protection laws. The associations maintained that, insofar as the practice involved processing of personal data by the State for public safety purposes, such processing should have been authorized by a ministerial or Council of State decree issued after an opinion from the French Data Protection Authority (CNIL), in accordance with Article 31 of the French Data Protection Act (Loi Informatique et Libertés). The associations also alleged a number of additional breaches of data protection rules, in particular the absence of data storage limitations, of transparency vis-à-vis the data subjects, and of organisational measures to ensure the confidentiality and security of the processed data.

On 5 May, the administrative court rejected the associations’ request with rather expeditious reasoning, holding that the collection and transmission of the images by the Paris Police Department could not be considered processing of personal data, since nothing indicated that the drone had been used in a manner allowing the identification of individuals on the ground.

The State Council reversed the decision of the first instance court and held that, contrary to the police force’s explanation, the drone “may collect identifiable data” and does not include any technical functionality to prevent the identification of the individuals concerned. As a consequence, the live-streaming of the recorded images constitutes processing of personal data that should have been authorized by a ministerial decree (or, more likely, a decree of the State Council, given the potentially sensitive nature of the data at stake, which may include, for instance, data revealing religious beliefs if the surroundings of a place of worship are filmed) taken after obtaining an opinion from the CNIL. In the absence of such authorization, the drone should have been equipped with technical features making it impossible for individuals to be identified, thus ensuring that the data processed did not qualify as personal data. In other words, the State Council did not forbid the use of drones by the police force per se, but recalled that such use must be subject to specific safeguards given its significant privacy and data protection implications.

The State Council’s decision is not surprising given the broad scope of the concept of personal data as defined by data protection laws and guidance. Although individuals could not actually be identified from the data captured by the drone, given the height from which recordings were made, the issue lay in the fact that the Police Department had the ability, via the zoom function or by flying close to the ground, to allow for such identification. Merely instructing police officers not to use the zoom function was therefore not a sufficient safeguard, and the mere technical ability to capture personal data, whether or not it is actually used, appears to be enough to give rise to data protection obligations.

This decision aligns with the positions of the CNIL and of the European Data Protection Board (EDPB) regarding data protection and the use of video surveillance technologies. The EDPB’s recent guidance provides that the use of video surveillance technology renders all individuals who enter the monitored space potentially identifiable on the basis of their appearance or other specific elements, and that such processing should therefore be subject to particular vigilance given the potential for misuse of the data. The risks increase with the size of the monitored area and the number of individuals who frequent it. In December 2018, the CNIL had likewise issued a cautious opinion on the use of “pedestrian cameras” (caméras-piétons), which allow the French police to live-record their actions, because of the potentially disproportionate personal data collection such cameras entail.

Another point of interest in this decision is the procedural strategy adopted by the associations: they chose urgency proceedings before the administrative tribunal rather than a complaint to the CNIL, which would normally be the preferred route in personal data-related litigation. Although this choice may be justified by the urgency of the situation (the first instance decision was rendered within three days), the French Data Protection Act also provides for urgency proceedings before the CNIL – which, however, cannot take less than eight days.

Interestingly, the CNIL issued a public statement immediately following the State Council’s decision, indicating that it had initiated a series of investigations with the French Interior Ministry relating to the use of drones. Its conclusions are expected in the coming weeks.
