
Censorship and surveillance: a legislative overload in the French Parliament
Contrary to the prevailing narrative, French parliamentarians are not only talking about the budget. Every year, they also revisit a familiar theme: an authoritarian drift, marked by increased security, surveillance, and censorship. After several months of relative inactivity, and with upcoming municipal elections in which these issues may carry political weight, the number of security-oriented bills under debate is rising. This provides an opportunity to take stock of the issues currently under discussion in the French Parliament. It is no secret that decisions taken in the French Parliament can have a global impact, particularly within the EU, where countries often look to one another for precedent when justifying controversial legislation.

2030 OLYMPIC GAMES: POSTPONING THE APPROVAL OF VIDEO SURVEILLANCE ALGORITHMS

In May 2025, the government presented a bill to the Senate to organize the 2030 Winter Olympics in the French Alps. The first relevant provision in this law: the postponement of the authorization of algorithmic video surveillance until 2027. As a reminder, a 2024 law concerning the Paris Olympic Games authorized live behavioral recognition tools in public spaces. Despite questionable evidence of the usefulness of this technology, the government shows no intention of halting it and is ready to use the 2030 Olympics as a pretext to continue experimentation (together with the security industry) until 2027. We have a dedicated article on this topic.

This bill also plans to add a new regime of prohibitions on appearing in public space during major events: the ministry considers that the current framework (the “MICAS”, administrative control and surveillance measures targeting individuals1) is not coercive enough to meet its needs. These new prohibitions will allow state prefects to ban an individual from attending any public event, without judicial approval or any specific criminal justification.
These measures were first introduced by a law on narcotics trafficking in June 2025. The 2030 Olympic Games law aims to extend them to “any person for whom there are serious reasons to believe that their behavior poses a particularly serious threat to public security.” It is difficult to be more vague and more sweeping.

And where do we stand? This bill was adopted by the Senate in May 2025 and by the National Assembly in January 2026. The text will most likely be submitted to the Constitutional Council, which will assess whether the law violates human rights and liberties. If the Council is not seized, the text will come into force in March 2026.

FACILITATING THE USE OF AUTOMATED SYSTEMS THAT DETECT LICENSE PLATES

Automatic License Plate Recognition systems (ALPR, or LAPI in French) are used by Customs and police authorities. This technology mostly helps the authorities detect car license plates. In October 2025, a bill was proposed to relax the rules governing the use of such systems. These devices have multiplied across French cities over the past decade and are linked to street and toll cameras. Such deployment constitutes mass surveillance: it enables the identification of license plates, and therefore of the car owners themselves, in public space during their daily lives.

The same reason is invoked to justify its extension: the tools are practical but too limited under the current framework. The proposal (most likely influenced by the ministry or security lobbies) seeks to broaden the purposes that can justify the use of such technologies, extend the period during which the data can be kept, and ease the transmission of that data among different authorities.

And where do we stand? The proposal was adopted by the Senate in December 2025 and sent to the National Assembly; a voting date has not yet been scheduled.
EXTENDING THE POWERS OF MUNICIPAL POLICE AND RURAL GUARDS

A bill presented by the Interior Ministry plans to extend the powers of municipal police and rural guards. These authorities would be given prerogatives over the aforementioned ALPR technologies, authorization to use surveillance drones and body-worn cameras, and the authority to issue fines for misdemeanors (whose number is on the rise, mostly in drug use cases). The bill also enables the regions to fund local security equipment (a demand of Valérie Pécresse, a French representative whose “Security Shield” risks being struck down by the courts).

The impact of this bill on human rights and liberties is quite massive, as these agents would also gain identity-check powers. We will keep you updated on its details. It should be noted that this is part of a continuing trend, particularly since the so-called “Global Security” law: a shift of police powers toward agents who are increasingly less trained and less public: judicial police officers, municipal police, rural wardens, and private security personnel.

And where do we stand? The text was adopted by the Senate in October 2025 and was submitted to the National Assembly on February 11, 2026. The date of its vote has not yet been scheduled.

EXPANDING THE SUPERVISORY POWERS OF SOCIAL ADMINISTRATIONS

Again, this is a bill that was first presented to the Senate. According to the ministry, its goal is to enforce measures and detect “fiscal and social fraud.” We have written extensively on this topic (here): in this context, we criticize the extension of access, by numerous social administration officers, to large-scale data, including files on airline passengers and telephone communications. This bill is yet another example of the unfortunate proliferation, over the past 20 years, of mass surveillance and control in the name of “combating social fraud”.

And where do we stand?
The bill was adopted by the Senate in November 2025 and will be voted on around February 24–27, 2026.

PROHIBITION OF SOCIAL MEDIA

A bill presented by the government aims to prohibit social media for people under the age of 15. This bill, introduced by parliamentarian Laure Miller with the support of the French government, seeks to ban individuals under the age of 15 from creating or using social media accounts. Additionally, it would require all social media platforms to implement mandatory age verification for every user, ensuring that underage individuals cannot access these services. The Conseil d’État (Council of State) has given a negative opinion on the text, but the government and President Macron are strongly in favor of these measures.

And where do we stand? The bill was voted on by the French National Assembly on January 26–27, 2026 and is scheduled to be considered by the Senate in the near future. If approved, the law is expected to come into effect in September 2026. Among all the new bills currently being considered in France, this one stands out as particularly likely to influence other countries, especially with the rise of hateful content on social media.

AUTOMATED SURVEILLANCE IN SUPERMARKETS

The latest item in this seemingly endless list is automated surveillance in stores. This is a bill proposed by EPR deputy Paul Midy, one of the leading advocates of the “French Tech”, France’s ecosystem of young, innovative digital startups. The idea is simple: legalize algorithmic video surveillance in supermarkets. In reality, this serves the interests of the French security industry, which seeks to deploy its tools not only in public spaces but also inside supermarkets. This is exactly what Veesion has been attempting for several years, claiming that its technology can detect behaviors such as theft. These companies face a major obstacle: it is currently illegal.
This is not only our opinion: multiple sources, including the CNIL, the Conseil d’État, and other authorities, have confirmed that such use is prohibited. In other words, there is no real debate, even though Veesion has never been sanctioned and continues to receive financial support (for more information, click here). Fortunately for them, Paul Midy is attempting to make this technology legal by proposing to incorporate it into the Code of Interior Security.

Where do we stand? So far, Paul Midy’s bill on automated store surveillance has been approved by the Law Commission and debated in public session at the National Assembly on February 2, 2026. It will face a second public debate session on Monday, February 16, 2026 (the day this translation is being written).

As we can see, Parliament and the government are quietly greasing the path toward authoritarianism: more surveillance, more censorship, fewer judges… While this list may seem frightening, it also gives a dizzying sense of how little we can do, both to track these bills and report on the debates, and to try, in any meaningful way, to oppose them. And the inventory is not complete: the Interior Ministry wants to introduce a bill on “daily security” (even though the text seems limited to “illegal street stunts”, it risks becoming a legislative vehicle for other dangerous measures, such as algorithmic video surveillance) and continues to implement Islamophobic measures whenever it can.

This article was translated by volunteers in our Matrix group, thanks to Ismail1071!

1. We did a livestream about this, you can see it here.
February 16, 2026
La Quadrature du Net
CNAF’s discriminatory scoring algorithm: 10 new organisations join the case before the Conseil d’État in France
Just over a year ago, 15 civil society organisations challenged the risk-scoring algorithm used by the family branch of the French welfare system (CNAF). The legal action was brought before the French Conseil d’État on the grounds of personal data protection and the principle of non-discrimination. This algorithm assigns a suspicion score to each beneficiary and selects those to subject to further checks. Every month, the algorithm analyses the personal data of more than 32 million people and calculates more than 13 million scores. Factors that increase a suspicion score include having a low income, being unemployed, and receiving the minimum income benefit or disability benefits. Today, our coalition is proud to welcome 10 new organisations to this litigation. We are now 25 asking for a ban on the CNAF’s scoring algorithm. The diversity of the coalition – bringing together groups of affected people, unions, and French and European fundamental rights NGOs – demonstrates the broad resistance to the CNAF’s algorithm and, more broadly, to discriminatory algorithms targeting vulnerable people. Our legal action, started in October 2024 before the Conseil d’État, targets both the extent of the surveillance in place and the discrimination perpetrated by this algorithm. Fuelled by the personal data of millions of people, it deliberately targets the most disadvantaged. The serious discrimination at the heart of the algorithm was confirmed by the Défenseur des Droits – the French Ombudsperson – in an opinion sent to the court last October. Finally, on 15 January 2026, the CNAF released the source code of its current algorithm. While we welcome the efforts towards transparency — the CNAF had previously refused to disclose the source code of the algorithm in use — transparency alone is not enough. This should not distract from the fact that a 2025 internal CNAF study we obtained recognised the algorithm’s discriminatory effects.
Our coalition included this study in a new brief sent to the court in December. “Our new, expanded coalition brings together a variety of European and French organisations from a range of backgrounds. This shows that the Conseil d’État should refer the case to the Court of Justice of the European Union so that the court can issue a pan-European decision,” says Bastien Le Querrec, legal officer at La Quadrature du Net. The Conseil d’État informed the plaintiffs that the written phase of the litigation will close at the end of this month. We expect the public hearing to take place next spring.

New plaintiffs:
* Confédération Générale du Travail (CGT)
* Union Syndicale Solidaires
* Fédération Syndicale Unitaire Travail Emploi Insertion Organismes Sociaux (FSU TEIOS)
* Data for Good
* European Digital Rights (EDRi)
* AlgorithmWatch
* European Network Against Racism
* Panoptykon Foundation
* Mouvement des mères isolées
* Féministes contre le cyberharcèlement

First plaintiffs:
* La Quadrature du Net (LQDN)
* Association d’Accès aux Droits des Jeunes et d’Accompagnement vers la Majorité (AADJAM)
* Aequitaz
* Amnesty International France
* Association nationale des assistant·e·s de service social (ANAS)
* APF France handicap
* Collectif Changer de Cap
* Fondation pour le Logement des Défavorisés
* Groupe d’information et de soutien des immigré·es (Gisti)
* Le Mouton numérique
* La Ligue des droits de l’Homme (LDH)
* Mouvement national des chômeurs et précaires (MNCP)
* Mouvement Français pour un Revenu de base (MFRB)
* CNDH Romeurope
* Syndicat des avocats de France (SAF)
January 20, 2026
La Quadrature du Net
Predictive Policing in France: Against opacity and discrimination, the need for a ban
As part of a European initiative coordinated by Statewatch, La Quadrature has translated its report on the state of predictive policing in France. In light of the information gathered, and given the dangers these systems carry when they incorporate socio-demographic data as a basis for their recommendations, we call for their ban. After documenting back in 2017 the arrival of so-called predictive policing systems, and then being confronted with the lack of up-to-date information and real public debate, we sought to investigate them in more detail. For this report, we have therefore compiled the data available on several predictive policing software systems formerly or currently in use within French police forces. These include:

* RTM (Risk Terrain Modelling), a “situational prevention” software program used by the Paris Police Prefecture to target intervention zones based on “environmental” data (presence of schools, shops, metro stations, etc.);
* PredVol, a software developed in 2015 within the government agency Etalab, tested in Val d’Oise in 2016 to assess the risk of car thefts, and abandoned in 2017 or 2018;
* PAVED, a software developed from 2017 by the Gendarmerie and trialed from 2018 in various départements to assess the risk of car thefts or burglaries. In 2019, shortly before its planned nationwide rollout, the project was “paused”;
* M-Pulse, previously named Big Data of Public Tranquility, developed by the city of Marseille in partnership with the company Engie Solutions to assess the suitability of municipal police deployments in urban public space;
* Smart Police, an application that includes a “predictive” module, developed by the French startup Edicia which, according to its website, has sold this software suite to over 350 municipal forces.

DANGEROUS TECHNOLOGIES, WITHOUT SUPERVISION OR EVALUATION

Here we summarize the main criticisms of the systems studied, most of which use artificial intelligence techniques.
CORRELATION IS NOT CAUSATION

The first danger associated with these systems, itself amplified by the lack of transparency, is the fact that they extrapolate results from statistical correlations between the different data sources they aggregate. Indeed, out of bad faith or ideological laziness, the developers of these technologies maintain a grave confusion between correlation and causation (or at least refuse to make the distinction between the two). Yet these confusions are reflected in very concrete ways in the design and functionalities of the applications and software used in the field by police officers, as well as in their consequences for residents exposed to increased policing. When using these decision-support systems, the police should therefore at the very least strive to demonstrate the explanatory relevance of using specific socio-demographic variables in their predictive models (i.e. go beyond simple correlations to trace the structural causes of delinquency, which could lead to considering actual remedies rather than mere securitarian policies). This would imply, first and foremost, being transparent about these variables, which is far from being the case. In the case of PAVED, for example, the predictive model is said to use fifteen socio-demographic variables which, according to the developers, are strongly correlated with crime. However, there is no transparency about the nature of these variables, let alone any attempt to demonstrate a true cause-and-effect relationship. The same is generally true of the variables used by Smart Police, Edicia’s software, although in this case we have even less visibility on the exact nature of the variables used by the system.

POTENTIALLY DISCRIMINATORY VARIABLES

It is likely that, just like the algorithms used by the Caisse nationale des allocations familiales (CNAF) that we recently uncovered, some of the socio-demographic variables mobilized are discriminatory.
Indeed, risk scores are possibly correlated with a high rate of unemployment or poverty, or a high rate of people born outside the European Union in the neighborhood under consideration. This is because in a system like PAVED, we know that among the data used for establishing “predictions” are nationality and immigration data, household income and composition, and level of education. All of these variables are likely to lead to the targeting of the most precarious populations and those most exposed to structural racism.

CRIMINOLOGICAL FALSE BELIEFS

Another danger associated with these systems, itself amplified by the lack of transparency, lies in the fact that they entrench discredited criminological doctrines. The promoters of predictive policing refuse to build a general understanding of deviant behavior and illegalisms: they make no mention of policies of exclusion and discrimination, or of the social violence of public policies. But when they do venture to propose explanatory models and attempt to fit these models into their scoring algorithms, developers seem to rely on “knowledge” whose relevance is perfectly dubious. Some doctrinal allusions appear, for example, in research articles by PAVED’s main developer, Gendarmerie Colonel Patrick Perrot. They contain basic assumptions about crime (for example, crime as a “constantly evolving phenomenon”), alluding to “weak signals” and other “warning signs” of delinquency that echo “broken windows” theories, the scientific basis of which is widely questioned. Similarly, in the case of Edicia, the predictive module seems to be based on the idea that delinquency has a geographical spillover effect (or “contagion” effect), and also incorporates postulates “brought up” from the “field” which claim that “petty delinquency leads to major delinquency”.
These inane doctrines serve above all to mask the disastrous consequences of neoliberal policies and to criminalize everyday incivilities, and must be interpreted as a key element in an attempt to criminalize the poor. They are now incorporated into the automated systems that the police grant themselves, making them harder to decipher.

A RISK OF SELF-REINFORCEMENT

The criticism is widely known, but it deserves to be reiterated: predictive policing software carries a major risk of feedback loops and self-reinforcing effects, leading to an intensification of police domination over specific neighborhoods (surveillance, identity checks, use of coercive powers). In fact, their use necessarily leads to the over-representation of geographical areas defined as high-risk in the learning data. As soon as a significant number of patrols are sent to a given area in response to the algorithm’s recommendations, they will be led to observe offenses – even minor ones – and to collect data relating to this area, which will in turn be taken into account by the software and contribute to reinforcing the probability that this same area will be perceived as “at risk”. Predictive policing thus produces a self-fulfilling prophecy by concentrating significant resources in areas already plagued by discrimination and over-policing.

POSSIBLE ABUSES OF POWER

While we have not found any information on the specific instructions given to police officers when patrolling in areas deemed high-risk by predictive systems, one source told us that, thanks to PAVED, the gendarmerie was able to obtain authorization from the public prosecutor for officers on patrol to position themselves in transit areas so as to stop passing vehicles. This involved checking drivers’ license plates and driving licenses, and in some cases carrying out vehicle searches.
If the information proves accurate, it would mean that preventive checks, carried out under authorization from the Public Prosecutor’s Office, were decided on the sole basis of a technology founded on dubious doctrines and whose effectiveness has never been assessed; a situation which, in itself, would amount to a clear disproportion in the liberty-restricting measures imposed on the people subjected to those stops and searches.

TECHNOLOGIES OF DUBIOUS EFFECTIVENESS

Beyond their discriminatory nature, even if these predictive policing systems proved effective from the point of view of police rationality, they would pose significant problems in terms of social justice and respect for human rights. Yet, despite the absence of any official evaluation, available data points to the absence of added value of predictive models in achieving the objectives the police had set themselves. In fact, these tools seem far from having convinced their users. PredVol was no better than simple human deduction. As for PAVED, although it may have prevented a few car thefts, it proved disappointing in terms of predictive capabilities and did not translate into an increased number of flagrant arrests, which remains the standard of efficiency for the police under the reign of the “policy of numbers”. Despite initial plans, PAVED was never rolled out within the Gendarmerie Nationale: following an experimental phase from 2017 to 2019, it was decided to shelve the software. And while M-Pulse has found a new lease of life under the “citizen rebranding” pushed by Marseille’s new center-left municipal majority, its police uses seem relatively marginal. For what reasons?
The opacity surrounding these experiments makes it impossible to say with any certainty, but the most likely hypothesis lies both in the absence of any real added value over the knowledge and beliefs already existing within police forces, and in the organizational and technical complexity associated with the use and maintenance of these systems.

IMPORTANT SHORTCOMINGS IN THE HANDLING OF PERSONAL DATA

For those opposing these systems, the information presented in our report might seem reassuring. But in reality, even if the fad surrounding “predictive policing” seems to have passed, R&D around decision-support systems for policing goes on unabated. In France, substantial sums are being spent to meet the stated ambition of “taking the Ministry of the Interior to the technological frontier”, as envisioned in the 2020 Internal Security White Paper1. In the context of the primacy given to techno-securitarian approaches, PAVED could thus be reactivated or replaced by other systems in the near future. As for Edicia, in recent months the company has been considering incorporating new sources of data from social networks into its predictive module, as envisaged by the designers of M-Pulse at the start of their project. Predictive policing is thus still on the agenda. Questioned via a FOIA request in March 2022 and again in November 2023, the CNIL, the French data protection authority, told us that it had never received or produced any document relating to predictive policing software as part of its prerogatives. This suggests that the agency has never taken any interest in such automated decision-making systems as part of its oversight powers. In and of itself, this raises important questions considering that some of these systems are used by thousands of municipal police officers across France.
Finally, insofar as the administrative police powers exercised in areas deemed “at risk” by predictive systems can be legally considered “individual administrative decisions”, the requirements set out by the French Constitutional Council in its case law on “public” algorithms should be respected2. In particular, these prohibit the use of “sensitive data” and impose the possibility of administrative appeals for data subjects. Added to this are the transparency obligations imposed by law, notably the 2016 law known as the “Digital Republic” law3. These legislative requirements, as well as European case law, do not seem to be met when it comes to predictive policing systems. There is no significant, proactive attempt to inform citizens and other stakeholders about exactly how these systems work, apart from the occasional bits of information opportunistically shared by the police or other governmental agencies. Worse still, the right to freedom of information that we exercised via FOIA requests to learn more about them only delivered very partial information. More often than not, these requests were met with no response at all, particularly from the Ministry of the Interior.

IT’S URGENT TO BAN PREDICTIVE POLICING

Predictive policing systems are hardly in the news anymore. And yet, despite a blatant lack of evaluation, a lack of legislative oversight and poor operational results, the promoters of these technologies continue to entertain the belief that “artificial intelligence” will make the police more “efficient”. From our point of view, what these systems produce is, above all, an automation of social injustice and police violence, and an even greater dehumanization of relations between the police and the population. In this context, it is urgent to put a stop to the use of these technologies and then conduct a rigorous evaluation of their implementation, effects and dangers.
The state of our knowledge leads us to believe that such transparency will prove their ineptitude and dangers, and provide further evidence of the need to ban them.

HELP US

To compensate for the opacity deliberately maintained by the designers of these systems and by the public authorities who use them, if you have at your disposal documents or elements enabling a better understanding of their operation and effects, we invite you to share them on our anonymous document-sharing platform. You can also send them to us by post to the following address: 115 rue de Ménilmontant, 75020 Paris. Finally, please don’t hesitate to point out any factual or analytical errors you may find in our report by writing to us at contact@technopolice.fr. And to support this type of research in the future, please also feel free to make a donation to La Quadrature du Net.

Read the full report

--------------------------------------------------------------------------------

1. The White Paper proposed to devote 1% of GDP to internal security missions by 2030, representing an expected increase of around 30% in the Ministry’s budget over the decade. Ministère de l’intérieur, « Livre blanc de la sécurité intérieure » (Paris : Gouvernement français, 16 novembre 2020), https://www.interieur.gouv.fr/Actualites/L-actu-du-Ministere/Livre-blanc-de-la-securite-interieure.
2. See the decision on the transposition of the GDPR (Decision n° 2018-765 DC of June 12, 2018) and the decision on Parcoursup (Decision n° 2020-834 QPC of April 3, 2020).
3. On the legal obligations of transparency of public algorithms, see: Loup Cellard, « Les demandes citoyennes de transparence au sujet des algorithmes publics », Note de recherche (Paris : Mission Etalab, 1 juillet 2019), http://www.loupcellard.com/wp-content/uploads/2019/07/cellard_note_algo_public.pdf.
Algorithmic videosurveillance is spreading in Europe, let’s fight back together!
Today, La Quadrature du Net publishes the English version of its booklet on algorithmic videosurveillance (“AVS”). This document gathers all our work on this topic, in the hope of making it as accessible as possible. It explains what AVS is, how it works, the economic and political interests at stake in its implementation, and the political project of repression and discrimination that this technology serves. This publication aims to help activists around the world better understand this new type of surveillance in order to fight it. You can download it here. Over the past years, we have tried to shed light on algorithmic videosurveillance software. This type of software analyzes CCTV footage to categorize and classify our bodies, our gait, our movements, our clothes or even our faces in order to detect odd or suspect individuals in the hustle and bustle of the streets. We have been campaigning against it everywhere: in local groups, before courts, and even in Parliament. Now, we want this fight to spread outside of France, as we see the rise and spread of the AVS market in other countries. Since opacity and secrecy are among the biggest obstacles to organizing this struggle, we find it particularly important that information spreads across Europe. We hope the English version of our booklet can help more people dig into this subject. We think this is even more important as we see numerous projects being developed in many countries. In Italy, the “Marvel” and “Protector” projects in Trento were declared illegal, whereas the “Argo” project in Torino is still running. In Germany, AVS is deployed in Mannheim and Hamburg, while in Spain it is implemented in railway stations. Moreover, reports from the French government provide us with valuable information on the state of algorithmic video surveillance in other countries through an international comparison of the use of these technologies (available here and here, in French).
In these documents, we learn for example that software developed by Briefcam, an Israeli company bought by Canon, is used in Belgium and Italy, aside from being deployed in hundreds of French cities. We also learn that, for 18 months, the German Federal Police tested a semi-automatic video analysis software called Investigator, developed by Digivod, which includes physical-characteristics and facial recognition. Our concerns, and our will to broaden the fight to allies outside of France, are also justified by the recurrent statements of French companies bragging about conquering new markets around the world. For example, a start-up called XXII currently aims for Spain, while a software called Veesion, used to analyze behaviors in supermarkets, has now obtained contracts in Portugal. Meanwhile, big French companies such as Thales and Idemia are still promoting software for “safe cities” around the globe. In 2019, we launched our initiative called “Technopolice” to confront the rise of biometric surveillance, sensors and cameras in the cities of France. Numerous local groups were created in Marseille, Montpellier, Lyon, Arles and Paris to fight against digital policing. Now the fight against algorithmic surveillance must become a European and an international struggle. Download, read, print and share our booklet, available here! And continue the fight where you live. If you find information regarding surveillance projects in your cities and countries, make it public and be vocal about it! Let’s fight back together!
April 11, 2025
La Quadrature du Net
All-out mobilization against the French “war-on-drugs” law
In the midst of the media uproar over drug trafficking, a law on “drug trafficking” is passing through Parliament. In reality, this text does not only apply to the sale of narcotics: it heavily reinforces the surveillance capacities of the intelligence services and the judicial police. It is one of the most repressive and dangerous texts of recent years. This law could notably give even more powers to repress activist actions. This bill was adopted unanimously in the Senate, with the support of the Socialists, the Ecologists and the Communists, and will now be discussed in the National Assembly. La Quadrature du Net is calling for urgent mobilization to raise awareness of the dangers of this text and to push left-wing parties to reject it. On this page, you will find a set of resources and tools to help you understand this law and convince your elected representatives to mobilize:

* The so-called “Drug Trafficking” law undermines the protection of encrypted messaging services (such as Signal or WhatsApp) by requiring the implementation of backdoors for the police and intelligence services.
* By modifying the legal regime for organized crime, which applies to other cases as well, this law does not only concern drug trafficking. It can even be used to surveil activists.
* The “safe file”, a provision of the law, keeps secret the documents in a case file detailing the use of surveillance techniques during an investigation. This violates the right to defend oneself and prevents the public from knowing the extent of the surveillance capabilities of the judicial police.
* The text provides for authorizing the police to remotely activate the microphones and cameras of fixed and mobile connected devices (computers, telephones, etc.) to spy on individuals.
* It extends the authorization to use “black boxes”, a technique for analyzing data from all our communications and exchanges on the Internet, to the purpose of “fighting delinquency and organized crime”.
* The police will be able to tighten their censorship of Internet content by extending it to publications related to the use and sale of drugs, amplifying the risks of infringement on freedom of expression.

SUMMARY OF THE LAW IN VIDEO

PIPHONE: CHOOSE WHICH MP TO CALL

Some progressive elected representatives have bought into this narrative and this dangerous security escalation. In the Senate, the Green, Communist and Socialist groups voted in favor of this text. We believe they must be made to face their responsibility by denouncing the dangers of this law. With our tool, you can contact them directly to send them the arguments and resources shared on this page. It is also possible to try to convince the deputies of Together for the Republic (Ensemble pour la République) and the MoDem to vote against the most dangerous measures of this bill. You can call them all week, ideally on Mondays, Thursdays and Fridays, when they are not in the chamber. You will probably reach a parliamentary assistant on the phone, and that’s okay! Feel free to talk to them, then ask them to relay your opinion to their MP. Thank you to everyone who is putting their energy into opposing this umpteenth security crackdown, and well done to those who had the courage to contact their MPs! <3

IN DETAIL: WHAT DOES THE PROPOSED LAW PROVIDE FOR?

– AMENDMENT OF THE ORGANIZED CRIME REGIME

This law significantly strengthens the organized crime regime, which does not only concern drug trafficking. This legal framework was created twenty years ago to target, in theory, mafia networks, by providing specific rules that derogate from ordinary criminal law. In particular, these rules allow the police to use surveillance techniques that are far broader and more intrusive than usual (wiretapping, IMSI-catchers, bugging, data capture, etc.). The scope of the offenses covered by the organized crime regime is defined in a list in the Code of Criminal Procedure, which has grown steadily over the years, affecting ever more people and situations.
In particular, it covers criminal association and a number of offenses and crimes committed by “organized gangs”, classifications that are increasingly used to prosecute activists. This was notably the case during recent protest movements, and in proceedings against an activist fighting against the construction of administrative detention centers. To legitimize the extension of these supposedly exceptional measures, the government and parliamentarians are using sensationalist rhetoric about drug trafficking, an issue that is not new and for which solutions other than the all-repressive approach exist. This is not insignificant: by giving the fight against drug trafficking an exceptional dimension, they seek to justify resorting to extraordinary means that are extremely detrimental to our freedoms. In this, they follow the legislative pattern used in recent years in the fight against terrorism, which allowed very significant derogations from the ordinary functioning of institutions that have since spread far beyond terrorism.

– SECRET POLICE SURVEILLANCE

The law prevents people from knowing how they are being monitored, an unprecedented and very serious attack on the founding principles of the French judicial system: the right to defend oneself and the principle of adversarial proceedings. A measure known as the “safe file” or “separate report” would make it possible to remove from the criminal file the reports related to the implementation of surveillance techniques. These reports would be accessible only to investigators, under the control of the prosecutor or the investigating judge, preventing lawyers and the persons concerned from reading and challenging them, and thus from detecting potential illegalities.
This will also deprive the public of the opportunity to know the extent of the judicial police’s surveillance capabilities, and will facilitate abuses in the use of highly intrusive techniques, such as spyware or the compromising of devices.

– ATTACKS ON ENCRYPTED COMMUNICATIONS

The senators amended the text to break the confidentiality of encrypted messaging services such as Signal or WhatsApp. The law stipulates that communication services are obliged to introduce access – a “backdoor” – for the benefit of the police and intelligence services, under penalty of heavy sanctions. This would create an unprecedented breach in end-to-end encryption, exploitable by states and malicious actors alike. Such a measure is extremely dangerous: as numerous institutions, including ANSSI and the European Data Protection Board, have warned, it would weaken the level of protection of all communications and threaten the confidentiality of all our exchanges. For years, we have been defending the right to encryption; you can read our position from 2017 here.

– REMOTE ACTIVATION OF CONNECTED DEVICES

This law provides for a new escalation in surveillance by continuing the legalization of spyware (such as NSO Group’s Pegasus or Paragon’s tools). It authorizes the police to remotely activate the microphones and cameras of fixed and mobile connected devices, such as computers or telephones, to spy on people. This technique relies on compromising computer systems by exploiting flaws in connected devices. Proposed by Éric Dupond-Moretti in 2023 in a law on judicial reform, this surveillance measure was partially censored by the Constitutional Council. It is reproduced here with slight modifications, whereas the urgent need is to ban this type of surveillance altogether, given the dangers it poses to democratic balances and individual freedoms.

– EXPANSION OF INTELLIGENCE POWERS AND BLACK BOXES

Intelligence services would also see their powers strengthened by this law.
On the one hand, the exchange of information between so-called “second circle” services (those for which intelligence is only part of their mission, particularly within the national police and gendarmerie) is in principle strictly limited. It would be facilitated here by removing the need for prior authorization, well beyond the scope of drug trafficking alone. On the other hand, the law broadens the scope of the “black boxes” to the purpose of “fighting delinquency and organized crime.” This intelligence technique runs algorithms over the data of all our communications and Internet traffic under the pretext of “detecting” new suspects. Since their creation in 2015, these black boxes have been steadily extended far beyond their initial counter-terrorism purpose.

– INTERNET CENSORSHIP

The law will allow the police – via the Pharos service – to censor any online content they consider illegal in connection with a drug-trafficking offense. These are very broad prerogatives that come on top of an already very significant capacity for administrative censorship. This power to demand the removal of publications without the intervention of a judge was initially authorized for child sexual abuse material before being extended to terrorism. This drive to lock down the Internet can only lead to abuse, given the volume of content concerned and the extra-judicial nature of this censorship, without having any real impact on the social problem of drug use, which depends on many other factors.

– AND OTHER MEASURES

So far, we have covered the most worrying measures, but unfortunately this law contains many other security extensions applying to “organized crime”, which, as we have seen, concerns many more situations than drug trafficking: IMSI-catchers in private places, the power of prefects to ban people from “appearing” in a given place, the use of drones in prisons, compulsory cameras in ports, a new broad and ill-defined offense of “participation in a criminal organization”…
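Why does a mandated “backdoor” threaten everyone’s exchanges, and not only those of the people a judge targets? The toy sketch below illustrates the principle with a deliberately simplistic XOR stream cipher (not any real messaging protocol; every name in it is hypothetical): in end-to-end encryption only the two correspondents hold the conversation key, but once a copy of that key is escrowed for the police, anyone who obtains the escrowed copy can read every message.

```python
# Toy illustration of key escrow (NOT a real protocol).
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from a key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR stream cipher: whoever holds `key` can decrypt."""
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

# End-to-end: only the two correspondents share the session key.
session_key = secrets.token_bytes(32)
ciphertext = encrypt(session_key, b"meet at noon")

# A "backdoor" means the provider also escrows a copy of the key.
escrowed_key = session_key  # held by the service for the police
# Whoever obtains that copy -- police, a rogue employee, or an
# attacker who breaches the escrow database -- reads everything:
assert decrypt(escrowed_key, ciphertext) == b"meet at noon"
```

The point of the sketch is that the escrowed copy is mathematically indistinguishable from the legitimate key: there is no such thing as an access door that only “the good guys” can open.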
February 27, 2025
La Quadrature du Net