Source - La Quadrature du Net

Censorship and surveillance: a legislative overload in the French Parliament
Contrary to the main narrative, French parliamentarians are not only talking about the budget. Every year, they also revisit a familiar theme: an authoritarian drift, marked by increased security, surveillance, and censorship. After several months of relative inactivity, and with upcoming municipal elections in which these issues may carry political weight, the number of security-oriented bills under debate is rising. This provides an opportunity to take stock of the issues currently under discussion in the French Parliament. It is no secret that decisions taken in the French Parliament can have a global impact, particularly within the EU, where countries often look to one another for precedent when justifying controversial legislation.

2030 OLYMPIC GAMES: POSTPONING THE APPROVAL OF VIDEO SURVEILLANCE ALGORITHMS

In May 2025, the government presented a bill to the Senate to organize the 2030 Winter Olympics in the French Alps. The first relevant provision in this law: the postponement of the authorization of algorithmic video surveillance until 2027. As a reminder, a 2024 law concerning the Paris Olympic Games authorized live behavioral recognition tools in public spaces. Despite questionable evidence of this technology’s usefulness, the government shows no intention of halting it and is ready to use the 2030 Olympics as a pretext to continue the experimentation (together with the security industry) until 2027. We have a dedicated article on this topic.

This bill also plans to add a new regime of prohibition on appearing in public space during major events: the ministry considers that the current framework (the “MICAS”, administrative control and surveillance measures targeting individuals1) is not coercive enough to fulfill its needs. These new prohibitions would allow state prefects to ban an individual from attending any public event, without judicial approval or any specific criminal justification.
These measures were introduced by a law on narcotics trafficking in June 2025. The 2030 Olympic Games law aims to extend this to “any person for whom there are serious reasons to believe that their behavior poses a particularly serious threat to public security.” It would be difficult to be vaguer or more sweeping.

And where do we stand? This bill was adopted by the Senate in May 2025 and by the National Assembly in January 2026. The text will most likely be submitted to the Constitutional Council, which will assess whether the law violates human rights and liberties. If the Council is not seized, the text will come into force in March 2026.

FACILITATING THE USE OF AUTOMATED SYSTEMS THAT DETECT LICENSE PLATES

Automatic License Plate Recognition systems (ALPR, or LAPI in French) are used by customs and police authorities; this technology mainly helps them detect car license plates. In October 2025, a bill was proposed to relax the use of such systems. These systems have multiplied across French cities over the past decade and are linked to street and toll cameras. Such deployment constitutes mass surveillance: it enables the identification of license plates, and therefore of the car owners themselves, in public space during their daily lives.

The same reasoning is used to justify its extension: the tools are practical but too limited under the current framework. The proposal (most likely influenced by the ministry or security lobbies) seeks to broaden the purposes that can justify the use of such technologies, extend the period during which the data can be kept, and ease the transmission of said data among different authorities.

And where do we stand? The proposal was adopted by the Senate in December 2025 and has been sent to the National Assembly; no voting date has been scheduled yet.
EXTENDING THE POWERS OF MUNICIPAL POLICE AND RURAL GUARDS

A government bill presented by the Interior Ministry plans to extend the powers of municipal police and rural guards. These agents would be granted the use of the aforementioned ALPR technologies, authorization to use surveillance drones and body-worn cameras, and the authority to issue fines for misdemeanors (whose number is on the rise, mostly in substance-use cases). The bill also enables the regions to fund local security equipment (a demand of Valérie Pécresse, a French politician whose “Security Shield” risks being struck down by the courts). The impact of this bill on human rights and liberties is quite massive, as these agents would also gain powers of identity control. We will report on its details shortly. It should be noted that this is part of a continuing trend, particularly since the so-called “Global Security” law: a shift of police powers toward agents who are increasingly less trained and less public: judicial police officers, municipal police, rural wardens, and private security personnel.

And where do we stand? The text was adopted by the Senate in October 2025 and was submitted to the National Assembly on February 11, 2026. The date of its vote has not yet been scheduled.

EXPANDING THE SUPERVISORY POWERS OF SOCIAL ADMINISTRATIONS

Again, this is a bill that was first presented to the Senate. According to the ministry, its goal is to enforce measures to detect “fiscal and social fraud.” We have written extensively on this topic (here): in this context, we criticize the extension of access, by numerous social administration officers, to large-scale data, including files on airline passengers and telephone communications. This bill is yet another example of the unfortunate proliferation, over the past 20 years, of mass surveillance and control in the name of “combating social fraud.”

And where do we stand?
The bill was adopted by the Senate in November 2025 and will be voted on around February 24–27, 2026.

PROHIBITION OF SOCIAL MEDIA

A bill introduced by parliamentarian Laure Miller, with the support of the French government, seeks to ban individuals under the age of 15 from creating or using social media accounts. Additionally, it would require all social media platforms to implement mandatory age verification for every user, ensuring that underage individuals cannot access these services. The Conseil d’État (Council of State) has issued a negative opinion on the text, but the government and President Macron are strongly in favor of these measures.

And where do we stand? The bill was voted on by the French National Assembly on January 26–27, 2026 and is scheduled to be considered by the Senate in the near future. If it is approved, the law is expected to come into effect in September 2026. Among all the new bills currently being considered in France, this one stands out as particularly likely to influence other countries, especially with the rise of hateful content on social media.

AUTOMATED SURVEILLANCE IN SUPERMARKETS

The latest item in this seemingly endless list is automated surveillance in stores. This is a bill proposed by EPR deputy Paul Midy, one of the leading advocates of the “French Tech”, France’s ecosystem of young, innovative digital startups. The idea is simple: legalize algorithmic video surveillance in supermarkets. In reality, this serves the interests of the French security industry, which seeks to deploy its tools not only in public spaces but also inside supermarkets. This is exactly what Veesion has been attempting for several years, claiming that its technology can detect behaviors such as theft. These companies face a major obstacle: it is currently illegal.
This is not only our opinion: the CNIL, the Conseil d’État, and other authorities have all confirmed that such use is prohibited. In other words, there is no real debate, even though Veesion has never been sanctioned and continues to receive financial support (for more information, click here). Fortunately for them, Paul Midy is attempting to make this technology legal by proposing to incorporate it into the Code of Interior Security.

Where do we stand? So far, Paul Midy’s bill on automated store surveillance has been approved by the Law Commission and debated in public session at the National Assembly on February 2, 2026. It will face a second public debate session on Monday, February 16, 2026 (the day this translation is being written).

As we can see, the Parliament and the government are quietly greasing the path toward authoritarianism: more surveillance, more censorship, fewer judges… While this list may seem frightening, it also gives a dizzying sense of how little we can do, both to track these bills and report on the debates, and to try, in any meaningful way, to oppose them. The inventory is not complete, however: the Interior Ministry wants to introduce a bill on “daily security” (even though the text seems limited to “illegal street stunts,” it risks becoming a legislative vehicle for other dangerous measures, such as algorithmic video surveillance) and continues to implement Islamophobic measures whenever it can.

This article was translated by volunteers in our Matrix group, thanks to Ismail1071!

1. We did a livestream about this, you can see it here.
February 16, 2026
La Quadrature du Net
CNAF’s discriminatory scoring algorithm: 10 new organisations join the case before the Conseil d’État in France
Just over a year ago, 15 civil society organisations challenged the risk-scoring algorithm used by the family branch of the French welfare system (CNAF). The legal action was brought before the French Conseil d’État on the grounds of personal data protection and the principle of non-discrimination. This algorithm assigns a suspicion score to each beneficiary and selects those to subject to further checks. Every month, the algorithm analyses the personal data of more than 32 million people and calculates more than 13 million scores. Factors that increase a suspicion score include having a low income, being unemployed, and receiving the minimum income benefit or disability benefits. Today, our coalition is proud to welcome 10 new organisations to this litigation. We are now 25 organisations asking for a ban on the CNAF’s scoring algorithm. The diversity of the coalition – bringing together groups of affected people, unions, as well as French and European fundamental rights NGOs – demonstrates the broad resistance to the CNAF’s algorithm and, more broadly, against discriminatory algorithms targeting vulnerable people. Our legal action, started in October 2024 before the Conseil d’État, targets both the extent of the surveillance in place and the discrimination perpetrated by this algorithm. Fuelled by the personal data of millions of people, it deliberately targets the most disadvantaged. The serious discrimination at the heart of the algorithm has been confirmed by the Défenseur des Droits – the French Ombudsperson – in an opinion sent to the court last October. Finally, on 15 January 2026, the CNAF released the source code of its current algorithm. While we welcome the efforts towards transparency — the CNAF had previously refused to disclose the source code of the algorithm in use — transparency alone is not enough. This should not distract from the fact that a 2025 internal CNAF study we obtained recognised the algorithm’s discriminatory effects.
Our coalition included this study in a new brief sent to the court in December. “Our new, expanded coalition brings together a variety of European and French organisations from a range of backgrounds. This shows that the Conseil d’État should refer the case to the Court of Justice of the European Union so that the court can issue a pan-European decision,” says Bastien Le Querrec, legal officer at La Quadrature du Net. The Conseil d’État informed the plaintiffs that the written phase of the litigation will close at the end of this month. We expect the public hearing to take place next spring.

New plaintiffs:
* Confédération Générale du Travail (CGT)
* Union Syndicale Solidaires
* Fédération Syndicale Unitaire Travail Emploi Insertion Organismes Sociaux (FSU TEIOS)
* Data for Good
* European Digital Rights (EDRi)
* AlgorithmWatch
* European Network Against Racism
* Panoptykon Foundation
* Mouvement des mères isolées
* Féministes contre le cyberharcèlement

First plaintiffs:
* La Quadrature du Net (LQDN)
* Association d’Accès aux Droits des Jeunes et d’Accompagnement vers la Majorité (AADJAM)
* Aequitaz
* Amnesty International France
* Association nationale des assistant·e·s de service social (ANAS)
* APF France handicap
* Collectif Changer de Cap
* Fondation pour le Logement des Défavorisés
* Groupe d’information et de soutien des immigré·es (Gisti)
* Le Mouton numérique
* La Ligue des droits de l’Homme (LDH)
* Mouvement national des chômeurs et précaires (MNCP)
* Mouvement Français pour un Revenu de base (MFRB)
* CNDH Romeurope
* Syndicat des avocats de France (SAF)
January 20, 2026
La Quadrature du Net
Donate to La Quadrature du Net
Today, when the return of fascism is mentioned, no one has the courage to shrug it off. At La Quadrature du Net, we have been speaking about “authoritarianism” for a long time. But what we long took for a series of skirmishes now increasingly resembles a coherent and concerted attack. This is why we have decided to devote our 2026 donation campaign to examining the links between digital technologies and their potential use in a fascistic, totalitarian manner. This overview will take the form of published articles and, for the first time, of livestream content. To fight the authoritarian drift and continue our work for a free digital world, one that is emancipatory and unifying, we need your support!

This year, to add to the articles that have long accompanied our donation campaign, we have decided to start broadcasting livestreams. In this new format, we will address topics around “digital technology and fascisation” with the help of guests. They will help us analyse and provide greater insight into the political and social processes that we face today and must confront for the future. We have scheduled 5 livestreams by the end of December, which will be available to watch for free on our website. To encourage you to help us, here is what we have achieved so far:

* On 21 November, the first livestream addressed “fascism” and “technofascism”, in order to agree on the different words that can be used before addressing the substantive issues. We welcomed Nastasia Hadjadji, co-author of “Apocalypse Nerds: How techno-fascists have taken power.”
* On 26 November, we welcomed Soizic Pénicaud (journalist, teacher and activist) and Clément Pouré (journalist) to speak about the function of journalism and its connection with militant causes, in the context of a far-right shift in the media world.
* On 3 December, with Mathilde Saliou (journalist) and Pablo Rauzy (university professor), we questioned the apparent “neutrality of technology” by analysing how the form given to technological objects is very often designed to serve fascisation, and we discussed different ways of conceiving them in order to open a different, more emancipatory path.
* On 10 December, with Pauline and Paloma from the Human Rights Observers and Romain Lanneau from Statewatch, the main topic was border surveillance and how different technological tools are used at borders.
* On 17 December, we discussed the different methods of repression used by intelligence agencies against Muslim people, starting with house arrests and the use of “white notes”, documents used in court to criminalise someone before any judgement is given.

You can donate here!

WHY DONATE TO US THIS YEAR?

To secure our annual budget, this year we need €250,000 in donations, including existing monthly subscriptions and new monthly or one-off donations.

WHAT PURPOSE DOES YOUR DONATION ACTUALLY HAVE?

The association has a fantastic group of volunteers, but it also needs a team of salaried staff. The main purpose of the donations is to pay the salaries of the association’s permanent members (around 78% of our expenses). The other costs to cover are the rent and maintenance of our premises, business trips in France and abroad (mostly by train), expenses related to campaigns and events, as well as various material costs for activism (posters, stickers, paper, printer, t-shirts, etc.). To give you an idea, here are our 2025 expenses (including salaries), broken down across our campaigns according to the time each person spent on the different topics of our struggles:

WHAT ARE THE SOURCES OF OUR FUNDING?

The association does not accept any public funding, but receives strong support from several philanthropic foundations, which accounted for 50% of our annual budget last year.
We are currently supported by the “Fondation pour le progrès de l’Homme”, the “Un monde par tous” foundation, the “Limelight Foundation”, the “Digital Freedom Fund” as well as “Civitates”. We also receive occasional support from the European network EDRi. The rest of our funding comes from donations. So if you can, please help us! Be aware that, as mentioned in our Q&A, donations made to La Quadrature are not tax-deductible, as the tax authorities have refused us this possibility on two occasions.

HOW TO DONATE?

You can make a donation by credit card, check, or bank transfer. If you can make a monthly donation, even a very small one, do not hesitate, as these are our favorites: they allow us to work with more confidence throughout the year by providing a steady income. Additionally, the total of your donations entitles you to rewards (bag, t-shirt, sweatshirt). Please note, delivery is not automatic: you need to log in and request them on your personal donor page. And if the rewards take a little while to arrive, which isn’t uncommon, it’s because we’re overwhelmed, waiting for restocks in certain sizes, and also because we make everything ourselves by hand. But they always end up arriving! Thank you again for your generosity, and thank you for your patience <3

--------------------------------------------------------------------------------

This article was translated by our volunteer group. Warm thanks to them all!
November 21, 2025
La Quadrature du Net
In France, the eternal return of facial recognition
Last May, the French government launched a working group aimed at legalizing real-time facial recognition. Far from being a surprise, this announcement is part of a series of proposals put forward by state officials, along with industrial and scientific players. We publish this op-ed by Félix Tréguer, adapted from a text originally published on AOC, in which he argues that facial recognition is incompatible with democratic forms of life. In May 2025, Gérald Darmanin was getting restless. Overtaken on the right by his successor at the Ministry of the Interior, Bruno Retailleau, the new Minister of Justice apparently found it difficult to hang up his apron as “France’s top cop.” So he pulled out of his hat a seemingly novel proposal: the legalization of real-time facial recognition. The ink on the War on Drugs law, with its array of new police surveillance measures, was not yet dry (the Constitutional Council would rule on it a few days later), but the minister was already on to his next move. After his announcement, his office would confirm to AFP that a working group was about to be launched to “create a legal framework” so as to “introduce this measure into our legislation.” According to the minister, who in 2022 said he opposed facial recognition, “using technology and facial recognition are the solutions to drastically combat insecurity.” A few days later, his rival Retailleau would follow suit, calling for “highly regulated” use of real-time facial recognition in the context of criminal investigations.

A POLITICAL PROJECT

The context of heightened political competition on the right could lead one to believe that this is yet another trial balloon with no future. In fact, facial recognition is a political project that has long been embraced by Emmanuel Macron’s governments, but has been repeatedly postponed for fear of provoking an outcry among the population.
When my colleagues at La Quadrature du Net and other collectives across the country launched the Technopolice campaign in 2019 to document new police surveillance technologies and unite local resistance, facial recognition was already on everyone’s lips. At the time, the Parliamentary Office for Scientific and Technological Assessment, then spearheaded by a Macronist parliamentarian, was already calling for an experimental law to authorize its use in real time. A few weeks later, Secretary of State for Digital Affairs Cédric O gave an interview to Le Monde newspaper on the subject. In this very first government statement on the topic, O considered it necessary “to experiment with facial recognition so that our manufacturers can make progress.” The economic stakes were thus expressed candidly. It is true that since the early 2010s, facial recognition and other techniques combining artificial intelligence and video surveillance (a spectrum of applications grouped under the term algorithmic video surveillance, or AVS) have been the subject of huge public spending. Through public research policies led by the European Commission or the French National Research Agency, but also via tax mechanisms such as the Research Tax Credit, startups and large multinationals such as Idemia and Thales have a significant portion of their R&D financed by taxpayers. Bpifrance (the French public investment bank) and the Caisse des Dépôts et Consignations have also taken action to help French industry structure itself to gain a foothold in these promising markets: the global facial recognition market grew by 16% last year and is expected to reach $12 billion by 2028, while the market for other applications of AVS was worth $5.6 billion in 2023 and could reach $16.3 billion in 2028. In 2019, Cédric O therefore proposed legalizing facial recognition on an experimental basis for the 2024 Olympic Games.
But on the eve of the 2022 French presidential election, before retiring from politics to become the chief lobbyist for the tech industry, he publicly acknowledged that the political conditions for using this technology were not ripe: “Priority has been given to other issues […] given the context and the sensitivity of the subject,” he explained at the time, while also denouncing the “libertarian NGOs” that, in his view, had fueled a climate of “psychosis.” The government then fell back on AVS applications deemed less sensitive. They were legalized on an experimental and temporary basis under the 2023 Olympic Games Act and implemented in recent months: these include detecting people or vehicles driving in the wrong direction, falls to the ground, crowd movements, fires, etc. The experiment had inconclusive results, and yet the government is now seeking to extend it as part of the next law relating to the 2030 Olympic Games. But at the end of 2022, during parliamentary debates, Gérald Darmanin, then Minister of the Interior, and his colleague in charge of sports, Amélie Oudéa-Castéra, were pretty clear: facial recognition was a no-go. “The system does not in any way provide for […] the creation of a biometric identification system,” the minister of sports stated in the chamber: “The government does not want anything like that, either now or in the future.”

IN THE AI ACT, EXEMPTIONS FOR THE POLICE

But these reassurances were part of a double game. At the same moment, in Brussels, the French government was leading the negotiations on the AI Act. As demonstrated by an investigation by the media outlet Disclose published this winter, even though the European Commission’s political marketing around this text was based in part on the promise of a ban on real-time biometric surveillance, France was actually putting all its weight behind pressuring other European Union member states to spare law enforcement agencies from overly restrictive regulations.
This strategy paid off. In the version of the “AI Act” that was ultimately adopted, the principle of banning real-time facial recognition was immediately undermined by a number of exemptions. For example, it is authorized to prevent “a genuine and foreseeable threat of a terrorist attack,” but also in the context of criminal investigations to find suspects for a range of offenses punishable by more than four years’ imprisonment, including sabotage. Activist activities, particularly those associated with the environmental movement, could easily be impacted. Another concession obtained by France, a particularly chilling one, allows police forces to use algorithmic video surveillance systems that “deduce or infer [people’s] race, political opinions, trade union membership, religious or philosophical beliefs, sex life or sexual orientation.” This means not only the detection of insignia or clothing denoting political orientation, but also the resurgence of naturalizing theories and pseudosciences purporting to reveal “race” or sexual orientation based on morphological characteristics or facial features, now incorporated into powerful automated systems designed to enact state violence. Gérald Darmanin’s announcement of the launch of a “working group” to legalize facial recognition was therefore no surprise. It was a logical follow-up to the many steps taken at the highest levels of government, in conjunction with industrial and scientific actors, to prepare the French population and minimize as much as possible the political cost of legalizing real-time facial recognition.
The Minister of Justice has claimed that facial recognition is essential to ensure the safety of the population, while trying to minimize the issues at stake: “People say that at [the main Paris airport] Roissy, it takes 1.5 hours to get through; in Dubai it takes 10 minutes; yes, but in Dubai, they have facial recognition,” he explained in May, touting the added convenience that people would be entitled to expect from the widespread use of this technology. One might be tempted to remind the minister: “Yes, but in Dubai, human rights defenders are imprisoned, the regime resorts to torture and is effectively a dictatorship. Is this really a model to follow?” Darmanin conveniently forgets that facial recognition is already a reality in some French airports and train stations. It helps him better defend his fool’s bargain: privacy and freedoms in exchange for greater convenience for those who move around in the world of flows. The minister sees this as a good deal, one that will convince “the average person” to put aside their reservations and what he sees as a widespread “paranoia about technology, civil liberties, and the issue of databases.”

PERMANENT, GENERAL, AND INVISIBLE IDENTITY CHECKS

In high places, however, the paradigm shift brought about by real-time facial recognition is well understood. In 2019, when a colleague from La Quadrature and I were invited to give our opinion on the “social acceptability of facial recognition” before an assembly of police officials, prefects, scientists, and industrialists at the General Directorate of the National Gendarmerie, a colonel in the Gendarmerie came to present a memo he had just published on the subject.
In this document, he offered a rather lucid analysis of the place of facial recognition in the history of state identification techniques: “The advantage of this technology is that it systematically and automatically performs the basic tasks of law enforcement, which are to identify, track, and search for individuals, making this control invisible. Provided the algorithms are unbiased, it could put an end to years of controversy over racial profiling, as identity checks would be permanent and universal [emphasis added].” “Invisible, permanent, and universal” identity checks? Michel Foucault was undoubtedly right when, in Discipline and Punish (1975), he alluded to the fantasy of a police force that had become “the instrument of permanent, exhaustive, omnipresent surveillance, capable of making everything visible, but on the condition that it makes itself invisible.” If a law were to be passed to authorize real-time facial recognition, things could move very quickly: even if, for the time being, its use by the police in France is only legally possible after the fact, in the context of judicial investigations and only against the “Criminal Record Processing” database (the TAJ, which contains nearly 10 million photographs of faces), the technical infrastructure enabling real-time use is already in place. First, the sensors: around 90,000 video surveillance cameras placed on public streets and roads across the country, forming as many geolocated points dedicated to the collection of facial images. Next, centralized databases of ID photos linked to civil status data: in addition to the TAJ database, most immigration-related files now include photographs of faces that can be processed by algorithms.
This is also the case for the “Secure Electronic Documents” (TES) file created in 2016 by the Ministry of the Interior, which collects facial prints from all holders of identity cards and passports. And finally, the last piece of the puzzle: facial recognition algorithms that compare images to files, provided by private service providers such as the multinational company Idemia, whose reliability rates have greatly improved in recent years.

REFUSING FACIAL RECOGNITION

With facial recognition, our faces become index terms in police files, which means that our personal data (surname, first name, place of birth, place of residence, etc.) can be automatically revealed. If its use in real time were authorized, going out in public with your face uncovered would be like carrying a forgery-proof ID card that could be read by the government at any time. Anonymity would be made virtually impossible. And, in the course of this process, our faces, which reflect our emotions, attitudes, and ways of being, would be reduced to mere faces: eyes, a nose, a mouth, ears, and other anatomical features whose measurements, shapes, or colors could be automatically classified. As showcases of our subjectivity, they would become new objects of power, through which the state could control us. Facial recognition is also one of the means by which fascism could take hold and endure. On that day in September 2019 at the General Directorate of the National Gendarmerie, we reminded all its promoters present in the audience why we believe it is unacceptable. Thinking that we could strike a chord with some in the audience, we told them of our conviction that if our grandmothers and grandfathers had had to live in the early 1940s in a world saturated with these technologies, they would not have been able to survive for long in hiding, and therefore to organize resistance networks capable of standing up to the Nazi regime.
This counterfactual hypothesis illustrates why facial recognition is simply incompatible with the defense of democratic ways of life. In this day and age, it is not to be taken lightly. Félix Tréguer is a researcher, member of La Quadrature du Net and author of Technopolice, la surveillance policière à l’ère de l’intelligence artificielle (Divergences, 2024).
September 4, 2025
La Quadrature du Net
French Administrative Supreme Court illegitimately buries the debate over internet censorship law
In November 2023, La Quadrature du Net, Access Now, ARTICLE 19, the European Center for Not-for-Profit Law (ECNL), European Digital Rights (EDRi) and Wikimedia France filed a complaint against the French decree implementing the European Union’s (EU) Regulation on addressing the dissemination of terrorist content online (TCO, also known as “TERREG”). The goal was to obtain the annulment of this dangerous regulation by the Court of Justice of the European Union (CJEU) for its incompatibility with the EU Charter of Fundamental Rights. Unfortunately, in a decision released on Monday, the French supreme administrative court, the Conseil d’État, rejected the organisations’ arguments and their request to refer the case to the CJEU. This is an extremely disappointing outcome for two main reasons. First, the French court illegitimately appropriated the legal debate over the TCO Regulation’s compatibility with EU primary law. This matter should be addressed at the EU level: according to the EU Treaties, the CJEU is the primary jurisdiction responsible for ruling on the legality of EU acts – which was the organisations’ main request. By carrying out its own legality assessment, the French court is de facto preventing the CJEU from exercising its exclusive competences. Second, the decision also means that law enforcement authorities across the EU can continue to use the excessive censorship powers granted by the TCO Regulation for the foreseeable future. Since the proposal was first published in 2018, the organisations that challenged the TCO Regulation have consistently voiced concerns about potential violations of fundamental rights due to its inadequate safeguards. Looking at the data available on the regulation’s implementation, there are concerning indications that some Member States may be using TERREG as a political tool to suppress certain types of online expression.
For example, of the 349 removal orders issued in the EU between June 2022 and April 2024, 249 were issued by German authorities following the events of October 7th in Israel. This is highly alarming given the increasing crackdown in Germany on freedom of expression and on freedom of assembly and association, targeting those speaking up for Palestinian rights (including protest bans, cancellations of events, suppression of student-led initiatives, etc.). The organisations insist on the urgent need to take TERREG’s disproportionate censorship powers out of police authorities’ hands and to protect people’s ability to freely express themselves online, especially in a context of shrinking civic space across the whole continent. They commit to seeking other litigation opportunities to obtain the CJEU’s review of the legality of the TCO Regulation. -------------------------------------------------------------------------------- La Quadrature du Net (LQDN) promotes and defends fundamental freedoms in the digital world. Through its advocacy and litigation activities, it fights against censorship and surveillance, questions how the digital world and society influence each other, and works for a free, decentralised and empowering Internet. The European Center for Not-for-Profit Law (ECNL) is a non-governmental organisation working to create legal and policy environments that enable individuals, movements and organisations to exercise and protect their civic freedoms. Access Now defends and extends the digital rights of people and communities at risk. It defends a vision of technology that is compatible with fundamental rights, including freedom of expression online. European Digital Rights (EDRi) is the largest European network of NGOs, experts, advocates and academics working to defend and advance human rights in the digital era across the continent.
ARTICLE 19 works for a world where all people everywhere can freely express themselves and actively engage in public life without fear of discrimination, by working on two interlocking freedoms: the Freedom to Speak, and the Freedom to Know. Wikimedia France is the French branch of the Wikimedia movement. It promotes the free sharing of knowledge, in particular through Wikimedia projects, such as the online encyclopedia Wikipedia, and helps to defend freedom of expression, particularly online.
Predictive Policing in France: Against opacity and discrimination, the need for a ban
As part of a European initiative coordinated by Statewatch, La Quadrature has translated its report on the state of predictive policing in France. In light of the information gathered, and given the dangers these systems carry when they incorporate socio-demographic data as a basis for their recommendations, we call for their ban. After documenting back in 2017 the arrival of so-called predictive policing systems, and then being confronted with the lack of up-to-date information and real public debate, we sought to investigate them in more detail. For this report, we have therefore compiled the data available on several predictive policing software systems formerly or currently in use within French police forces. These include:

* RTM (Risk Terrain Modelling), a “situational prevention” software program used by the Paris Police Prefecture to target intervention zones based on “environmental” data (presence of schools, shops, metro stations, etc.);
* PredVol, a software program developed in 2015 within the government agency Etalab, tested in Val d’Oise in 2016 to assess the risk of car thefts, and abandoned in 2017 or 2018;
* PAVED, a software program developed from 2017 by the Gendarmerie and trialed from 2018 in various départements to assess the risk of car thefts or burglaries. In 2019, shortly before its planned nationwide rollout, the project was “paused”;
* M-Pulse, previously named Big Data of Public Tranquility, developed by the city of Marseille in partnership with the company Engie Solutions to assess the suitability of municipal police deployments in urban public space;
* Smart Police, an application that includes a “predictive” module, developed by the French startup Edicia which, according to its website, has sold this software suite to over 350 municipal forces.

DANGEROUS TECHNOLOGIES, WITHOUT SUPERVISION OR EVALUATION

Here we summarize the main criticisms of the systems studied, most of which use artificial intelligence techniques.
CORRELATION IS NOT CAUSATION

The first danger associated with these systems, itself amplified by the lack of transparency, is that they extrapolate results from statistical correlations between the different data sources they aggregate. Whether out of bad faith or intellectual laziness, the developers of these technologies maintain a grave confusion between correlation and causation (or at least refuse to make the distinction between the two). These confusions are reflected in very concrete ways in the design and functionalities of the applications and software used in the field by police officers, but also in their consequences for the residents exposed to increased policing. When using these decision-support systems, the police should therefore at the very least strive to demonstrate the explanatory relevance of the socio-demographic variables used in their predictive models (i.e. go beyond simple correlations to trace the structural causes of delinquency, which could lead to considering actual remedies rather than mere securitarian policies). This would imply, first and foremost, being transparent about these variables, which is far from being the case. In the case of PAVED, for example, the predictive model is said to use fifteen socio-demographic variables which, according to the developers, are strongly correlated with crime. However, there is no transparency about the nature of these variables, let alone any attempt to demonstrate a true cause-and-effect relationship. The same is generally true of the variables used by Smart Police, Edicia’s software, although in this case we have even less visibility on the exact nature of the variables the system uses.

POTENTIALLY DISCRIMINATORY VARIABLES

It is likely that, just like the algorithms used by the Caisse nationale des allocations familiales (CNAF), which we recently uncovered, some of the socio-demographic variables used are discriminatory.
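The correlation/causation confusion described above can be made concrete with a deliberately simplified simulation. The sketch below is our own toy construction, not any vendor's actual model: recorded crime depends on how many patrols are present to observe it, and patrols are deployed where unemployment is high. A model trained on the recorded data then "discovers" that unemployment predicts crime, a correlation manufactured by the deployment policy rather than a causal link.

```python
import random

random.seed(0)

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# 200 fictitious neighborhoods; the underlying offense rate is identical
# everywhere, only the unemployment rate differs.
unemployment = [random.uniform(0.02, 0.20) for _ in range(200)]

# Patrols are allocated in proportion to unemployment (a policy choice)...
patrols = [100 * u + random.gauss(0, 1) for u in unemployment]

# ...and recorded offenses grow with the number of observers present,
# not with any real difference in offending.
recorded = [max(0.0, 2 * p + random.gauss(0, 2)) for p in patrols]

# The unemployment variable now looks strongly "predictive" of crime.
print(round(pearson(unemployment, recorded), 2))
```

A scoring model fitted to `recorded` would assign heavy weight to the unemployment variable even though, by construction, it plays no causal role in offending: the deployment policy alone creates the correlation.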
Indeed, risk scores may well be correlated with a high rate of unemployment or poverty, or a high rate of people born outside the European Union, in the neighborhood under consideration. In a system like PAVED, we know that the data used to establish “predictions” include nationality and immigration data, household income and composition, and level of education. All of these variables are likely to lead to the targeting of the most precarious populations and those most exposed to structural racism.

CRIMINOLOGICAL FALSE BELIEFS

Another danger associated with these systems, itself amplified by the lack of transparency, lies in the fact that they entrench discredited criminological doctrines. The promoters of predictive policing refuse to build a general understanding of deviant behavior and illegalisms: they make no mention of policies of exclusion, discrimination, or the social violence of public policies. But when they do venture to propose explanatory models and attempt to fit these models into their scoring algorithms, developers seem to rely on “knowledge” whose relevance is highly dubious. Some doctrinal allusions appear, for example, in research articles by PAVED’s main developer, Gendarmerie Colonel Patrick Perrot. They contain basic assumptions about crime (for example, crime as a “constantly evolving phenomenon”), alluding to “weak signals” and other “warning signs” of delinquency that echo “broken windows” theories, whose scientific basis is widely questioned. Similarly, in the case of Edicia, the predictive module seems to be based on the idea that delinquency has a geographical spillover (or “contagion”) effect, and also incorporates postulates “brought up” from the “field” claiming that “petty delinquency leads to major delinquency”.
These inane doctrines serve above all to mask the disastrous consequences of neoliberal policies and to criminalize everyday incivilities; they must be interpreted as a key element in an attempt to criminalize the poor. They are now incorporated into the automated systems the police grants itself, making them harder to decipher.

A RISK OF SELF-REINFORCEMENT

The criticism is widely known, but it deserves to be reiterated: predictive policing software raises a major risk of feedback loops and self-reinforcing effects, leading to an intensification of police domination over specific neighborhoods (surveillance, identity checks, uses of coercive powers). In fact, their use necessarily leads to the over-representation, in the training data, of the geographical areas defined as high-risk. As soon as a significant number of patrols are sent to a given area in response to the algorithm’s recommendations, they will be led to observe offenses – even minor ones – and to collect data relating to this area, which will in turn be taken into account by the software and reinforce the probability that this same area will be perceived as “at risk”. Predictive policing thus produces a self-fulfilling prophecy by concentrating significant resources in areas already plagued by discrimination and over-policing.

POSSIBLE ABUSES OF POWER

While we have not found any information on the specific instructions given to police officers when patrolling areas deemed high-risk by predictive systems, one source told us that, thanks to PAVED, the gendarmerie was able to obtain authorization from the public prosecutor for officers on patrol to position themselves in transit areas so as to stop passing vehicles. This involved checking drivers’ license plates and driving licenses, and in some cases carrying out vehicle searches.
If this information proves accurate, it would mean that preventive checks, carried out under authorization from the Public Prosecutor’s Office, were decided on the sole basis of a technology founded on dubious doctrines and whose effectiveness has never been assessed. Such a situation would, in itself, amount to a clearly disproportionate restriction of the liberties of the people subjected to those stops and searches.

TECHNOLOGIES OF DUBIOUS EFFECTIVENESS

Given their discriminatory nature, even if these predictive policing systems proved effective from the point of view of police rationality, they would still pose significant problems in terms of social justice and respect for human rights. Yet, despite the absence of any official evaluation, available data point to the absence of added value of predictive models in achieving the objectives the police had set themselves. In fact, these tools seem far from having convinced their users. PredVol performed no better than simple human deduction. As for PAVED, although it may have prevented a few car thefts, it proved disappointing in terms of predictive capabilities and did not translate into an increased number of flagrant arrests, which remains the standard of efficiency for the police under the reign of the “policy of numbers”. Despite initial plans, PAVED was never rolled out across the Gendarmerie Nationale: following an experimental phase from 2017 to 2019, it was decided to shelve the software. And while M-Pulse has found a new lease of life under the “citizen rebranding” pushed by Marseille’s new center-left municipal majority, its police uses seem relatively marginal. For what reasons?
The opacity surrounding these experiments makes it impossible to say with any certainty, but the most likely hypothesis lies both in the absence of any real added value relative to existing knowledge and beliefs within police forces, and in the organizational and technical complexity associated with the use and maintenance of these systems.

IMPORTANT SHORTCOMINGS IN THE HANDLING OF PERSONAL DATA

For those opposing these systems, the information presented in our report might seem reassuring. But in reality, even if the fad surrounding “predictive policing” seems to have passed, R&D around decision-support systems for policing continues unabated. In France, substantial sums of money are being spent to meet the stated ambition of “taking the Ministry of the Interior to the technological frontier”, as envisioned in the 2020 Internal Security White Paper1. In a context where techno-securitarian approaches are given primacy, PAVED could thus be reactivated or replaced by other systems in the near future. As for Edicia, in recent months the company has been considering incorporating new sources of data from social networks into its predictive module, as the designers of M-Pulse had envisaged at the start of their project. Predictive policing is thus still on the agenda. Questioned via a FOIA request in March 2022 and again in November 2023, the CNIL, the French data protection authority, told us that it had never received or produced any document relating to predictive policing software as part of its prerogatives. This suggests that the agency has never taken any interest in such automated decision-making systems as part of its oversight powers. In and of itself, this raises important questions, given that some of these systems are used by thousands of municipal police officers across France.
Finally, insofar as the administrative police powers exercised in areas deemed “at risk” by predictive systems can be legally considered “individual administrative decisions”, the requirements set out by the French Constitutional Council in its case law on “public” algorithms should be respected2. In particular, these prohibit the use of “sensitive data” and impose the possibility of administrative appeals for data subjects. Added to this are the transparency obligations imposed by law, notably the 2016 law known as the “Digital Republic” law3. These legislative requirements, as well as European case law, do not seem to be met when it comes to predictive policing systems. There is no significant, proactive attempt to inform citizens and other stakeholders about exactly how these systems work, apart from occasional bits of information opportunistically shared by the police or other governmental agencies. Worse still, the right to information that we exercised via our FOIA requests delivered only very partial information. More often than not, these requests were met with silence, particularly from the Ministry of the Interior.

IT’S URGENT TO BAN PREDICTIVE POLICING

Predictive policing systems are hardly in the news anymore. And yet, despite a blatant lack of evaluation and legislative oversight, and despite poor operational results, the promoters of these technologies continue to entertain the belief that “artificial intelligence” will make the police more “efficient”. From our point of view, what these systems produce is, above all, an automation of social injustice and police violence, and an even greater dehumanization of relations between the police and the population. In this context, it is urgent to put a stop to the use of these technologies and then to conduct a rigorous evaluation of their implementation, effects and dangers.
The state of our knowledge leads us to believe that such an evaluation will prove their ineptitude and their dangers, and provide further evidence of the need to ban them.

HELP US

To compensate for the opacity deliberately maintained by the designers of these systems and by the public authorities who use them, if you have at your disposal documents or elements enabling a better understanding of their operation and effects, we invite you to share them on our anonymous document-sharing platform. You can also send them to us by post at the following address: 115 rue de Ménilmontant, 75020 Paris. Finally, please don’t hesitate to point out any factual or analytical errors you may find in our report by writing to us at contact@technopolice.fr. And to support this type of research in the future, please also feel free to make a donation to La Quadrature du Net. Read the full report --------------------------------------------------------------------------------

1. The White Paper proposed to devote 1% of GDP to internal security missions by 2030, representing an expected increase of around 30% in the Ministry’s budget over the decade. Ministère de l’intérieur, « Livre blanc de la sécurité intérieure » (Paris : Gouvernement français, 16 novembre 2020), https://www.interieur.gouv.fr/Actualites/L-actu-du-Ministere/Livre-blanc-de-la-securite-interieure.
2. See the decision on the transposition of the GDPR (Decision n° 2018-765 DC of June 12, 2018) and the decision on Parcoursup (Decision n° 2020-834 QPC of April 3, 2020).
3. On the legal obligations of transparency of public algorithms, see: Loup Cellard, « Les demandes citoyennes de transparence au sujet des algorithmes publics », Note de recherche (Paris : Mission Etalab, 1 juillet 2019), http://www.loupcellard.com/wp-content/uploads/2019/07/cellard_note_algo_public.pdf.
Algorithmic videosurveillance is spreading in Europe, let’s fight back together!
Today, La Quadrature du Net publishes the English version of its booklet on algorithmic videosurveillance (“AVS”). This document gathers all our work on this topic, in the hope of making it as accessible as possible. It explains what AVS is, how it works, the economic and political interests at stake in its implementation, and the political project of repression and discrimination that this technology serves. This publication aims to help activists around the world better understand this new type of surveillance in order to fight it. You can download it here. Over the past years, we have tried to shed light on algorithmic videosurveillance software. This type of software analyzes CCTV footage to categorize and classify our bodies, our gait, our movements, our clothes or even our faces, so as to detect odd or suspicious individuals in the hustle and bustle of the streets. We have been campaigning against it everywhere: in local groups, before courts, and even in Parliament. Now, we want this fight to spread outside of France, as we see the rise and spread of the AVS market in other countries. Since opacity and secrecy are among the biggest obstacles to organizing this struggle, we find it particularly important that information spreads across Europe. We hope the English version of our booklet can help more people dig into this subject. We think this is all the more important as we see numerous projects being developed in many countries. In Italy, the “Marvel” and “Protector” projects in Trento were declared illegal, whereas the “Argo” project in Torino is still running. In Germany, AVS is deployed in Mannheim and Hamburg, while in Spain it is implemented in railway stations. Moreover, reports from the French government provide us with valuable information on the state of algorithmic videosurveillance in other countries through an international comparison of the use of these technologies (available here and here, in French).
In these documents, we learn for example that software developed by Briefcam, an Israeli company bought by Canon, is used in Belgium and Italy, aside from being deployed in hundreds of French cities. We also learn that, for 18 months, the German Federal Police tested a semi-automatic video analysis software called Investigator, developed by Digivod, which includes physical-characteristics and facial recognition. Our concerns and our will to broaden the fight to allies outside of France are also justified by the recurrent statements of French companies bragging about conquering new markets around the world. For example, a start-up called XXII is currently aiming for Spain, while a software program called Veesion, used to analyze behaviors in supermarkets, has now obtained contracts in Portugal. Meanwhile, big French companies such as Thales and Idemia keep promoting software for “safe cities” around the globe. In 2019, we launched our initiative “Technopolice” to confront the rise of biometric surveillance, sensors and cameras in the cities of France. Numerous local groups were created in Marseille, Montpellier, Lyon, Arles or Paris to fight against digital policing. Now the fight against algorithmic surveillance must become a European and international struggle. Download, read, print and share our booklet, available here! And continue the fight where you live. If you find information about surveillance projects in your cities and countries, make it public and be vocal about it! Let’s fight back together!
April 11, 2025
La Quadrature du Net
‘Simplification’ law: stop the data center boom!
Today sees the start of the plenary debate on the draft law on the simplification of economic life at the French National Assembly. A seemingly technical law with obscure stakes, it marks a new show of force in the service of industry, to the detriment of human rights and the environment. Article 15 of the bill, in line with the promises made by Emmanuel Macron to investors at the AI summit last February, aims to accelerate the construction of huge data centers by allowing the government to impose them on local authorities and the population. Against this power grab, aimed at building these infrastructures for the benefit of the tech giants at the cost of a monopolization of land, electricity and water resources, a broad segment of civil society organizations is calling for the deletion of this article and the establishment of a moratorium on the construction of large data centers.

SIMPLIFICATION, THE TROJAN HORSE OF DEREGULATION

The examination of the “simplification” bill begins today in the National Assembly, amid a certain indifference. However, the stakes are high. As France Nature Environnement points out in its recently published report, which takes stock of 20 years of “simplification” laws, these laws have in fact proved to be “a Trojan horse of deregulation, an insidious and dishonest process that weakens the rule of law and environmental justice, jeopardizing the protection of ecosystems and the construction of a livable world”. Article 15 of the bill, relating to data centers, fits perfectly into this dark history: it authorizes the government to grant large data centers, which have an extremely high environmental impact, a status derived from the 2023 law on green industry that fast-tracks their construction: the “major national interest project” (PINM) label. According to the government, this status could be granted to data centers with a surface area of between 30 and 50 hectares (the equivalent of up to 71 football fields)!
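The football-field comparison above can be checked with back-of-the-envelope arithmetic. The sketch below assumes a standard full-size 105 m × 68 m pitch (7,140 m²); depending on the exact pitch size used, 50 hectares comes out at roughly 70 pitches, consistent with the order of magnitude quoted.

```python
# Rough check of the "football fields" comparison.
# Pitch size is an assumption: a standard 105 m x 68 m pitch (7,140 m^2).
PITCH_M2 = 105 * 68
SQM_PER_HECTARE = 10_000

for hectares in (30, 50):
    pitches = hectares * SQM_PER_HECTARE / PITCH_M2
    print(f"{hectares} ha ≈ {pitches:.0f} football pitches")
```

With a slightly smaller pitch assumption (e.g. 100 m × 70 m), the 50-hectare figure rises to about 71, which is likely how the quoted number was obtained.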
With this PINM status, the government would empower the tech multinationals, and the investment funds backing them with tens of billions of euros, to impose data centers on municipalities: the national government would take over the powers of local authorities relating to urban and regional planning, itself rewriting local urban plans in order to adapt them to a given data center project. Public consultation procedures would be further weakened. Finally, the government could grant exemptions from environmental regulations, particularly those relating to protected species. In other words, the state could bypass existing rules in the name of “simplification” and “innovation” so as to impose the construction of resource-intensive and polluting data centers on local communities.

MORATORIUM!

For several weeks, the collective “Le Nuage était sous nos pieds”, which organized in Marseille to resist data centers, and the Hiatus coalition (launched in February to “resist AI and its world”) have been calling for two things: on the one hand, the repeal of Article 15, and on the other hand, the adoption of a moratorium on the construction of large server warehouses. Several amendments, resulting from contacts established at the political level, were aimed specifically at relaying these demands. Amendments to delete Article 15 were thus tabled by the Socialist Party and La France insoumise. This is our urgent and minimal demand: to defend it, contact your representatives and convince them to adopt these amendments; on this page you will find arguments in support of these positions. Other amendments had been tabled with a view to a more ambitious objective: to introduce a moratorium on the construction of large data centers until a citizens’ convention can lay the foundations for a debate on the appropriate framework for the development of digital infrastructures.
Sadly, although these moratorium amendments had been tabled and examined in committee in March without any problem, this time, for the debate in plenary session, the services of the National Assembly deemed them inadmissible on the grounds that they contravened Article 40 of the Constitution. Apparently, such a moratorium, or the organization of a citizens’ convention, would contribute to “worsening a [fiscal] burden” or “reducing public resources.”

REALITY CHECK

This is all the more concerning given that the principle of a moratorium — a way of laying the foundations for democratic control of data centers, and of countering the government’s desire to accelerate ever further in defiance of rights and democracy — is supported by a wide range of actors. A broad front of civil society, including researchers, activists and political representatives, is thus calling, in an op-ed in Libération, for the introduction of such a moratorium. In Marseille, the public inquiry commissioner in charge of the SEGRO logistics warehouse and data center case has also just called, in his recommendations to the regional authorities, to “take a break so as to bring the players around the table, imposing a moratorium”. The same goes for Ireland, where the strain data centers put on the electricity grid has led to a de facto moratorium until at least 2028. In short, this policy option is not only possible, but also realistic and necessary to begin putting digital technology back in its place, at a time when the rise of AI is leading to a speculative boom in the development of these infrastructures. So let’s keep pushing! Visit our campaign page to urge MPs to vote to remove Article 15 of the “simplification” law, and to demand that the government introduce a moratorium on large-scale data centers! You can also support us in this fight by making a donation to La Quadrature!
“Simplification” Bill: a denial of democracy to impose giant data centers in France
The National Assembly has begun examining the bill on the simplification of economic life. Through its Article 15, this “simplification” bill (or PLS) aims to speed up the construction of huge data centers in France by allowing the government to impose them on the territories concerned and by multiplying exemptions from municipal planning regulations, environmental law, and the principle of public participation. Against this new denial of democracy, imposed to serve the interests of the tech industry, La Quadrature du Net and the collective “Le Nuage était sous nos pieds”, together with the other members of the “Hiatus” coalition, are calling for the deletion of Article 15 and for a two-year moratorium on the construction of large data centers, so as to provide the time to establish the conditions for democratic control of these digital infrastructures. In early February, at the Paris summit on AI, Emmanuel Macron once again donned his suit as the great leader of the Startup Nation. The result was announcements of funding from all sides: while the French Parliament had just adopted the most austerity-driven budget of the 21st century, billions were pouring in, particularly to finance a boom in data centers in France. Data centers are industrial production plants: huge warehouses where thousands of servers owned or used by tech multinationals are piled up. In the age of AI, we are witnessing a real boom in the construction of these data computing and storage infrastructures. This boom is amplifying the harms of computing, not only from an ecological point of view, but also in terms of surveillance, exploitation of labor, and the destruction of public services, as denounced by the Hiatus coalition in its founding manifesto. Because France has nuclear energy that can lower the “carbon footprint” of tech multinationals, and because it is ideally placed on the international map of submarine cables, Macron the salesman presents it as a promised land for investors.
To attract them, the French president has made a promise: to simplify and deregulate, so as to avoid protests and ensure the swift construction of these resource-intensive infrastructures. The law on the simplification of the economy, already passed by the French Senate and currently being examined by the National Assembly, aims to translate this promise into action.

WHAT THE SIMPLIFICATION BILL (PLS) SAYS

In its Article 15, the “simplification” bill – in fact a deregulation bill – authorizes the government to grant construction projects for very large data centers a label derived from the 2023 law on “green industry”: the “project of major national interest” (PINM) label. According to the government, this label is intended to be reserved for data centers with a surface area of at least 40 hectares, or more than 50 football fields! With this “project of major national interest” status, tech industrialists would see the government work with them to impose data centers on municipalities: the government would take over the powers of local authorities relating to town planning and regional development, rewriting local urban plans so as to adapt them to these data center projects. Public consultation procedures would be further streamlined, and the government would also be able to decide that these infrastructures may infringe environmental regulations, particularly those relating to protected species or the non-artificialization of soil. Finally, in its Article 15 bis, the simplification bill enshrines into law the 50% reduction enjoyed by data centers that consume more than 1 gigawatt in a year – a provision currently set out in a joint decree of the Minister for Energy and the Minister for Industry.
Thus, by encouraging the explosion of ever more gigantic and resource-hungry data centers, the “simplification” law accelerates the ecocidal impact of the tech industry, all to enable France and Europe to remain in an illusory “AI race”.

WHY THIS DEREGULATION OF DATA CENTERS IS UNACCEPTABLE

This attempt to “accelerate” is all the more unwelcome as the proliferation of data centers in France is already the subject of citizen protests across the country, due to the conflicts of use they generate. As documented by the collective “Le Nuage était sous nos pieds”, in which La Quadrature participates, their establishment in the port area of Marseille has, for example, led to the monopolization of waterfront land. It has also led to the postponement of the electrification of the quays where cruise ships dock, as evidenced by documents from RTE. The ships thus continue to spew their toxic fumes into the Saint-Antoine district, causing various illnesses among the inhabitants. Finally, to cool servers running at full capacity, data centers also require huge amounts of water, monopolizing a resource that is essential for ecosystems and for agriculture. Added to this are the regular release of fluorinated gases with a high greenhouse effect and almost constant noise pollution.

In view of these problems, the current legal and democratic vacuum surrounding data centers is particularly shocking. Local elected representatives and citizens’ groups agree on the need to rethink the regulatory framework around data centers. As for the National Commission for Public Debate (CNDP), it has asked to be consulted during the construction of these warehouses, but has come up against the government’s will to exclude the body from a growing number of industrial projects, through a recent draft decree. The situation is therefore highly problematic.
But with the “simplification” law, the government is proposing to deregulate even further, aggravating this denial of democracy. The aim is to roll out the red carpet for tech speculators, enabling them to turn France into a kind of “digital colony” stamped “low carbon”.

FOR A MORATORIUM ON THE CONSTRUCTION OF NEW DATA CENTERS

We refuse to allow our towns, villages and neighborhoods to be taken over by the tech giants in this way. We refuse to see our territories and natural resources sold to the highest bidders, undermining the few mechanisms of regulation and collective control that exist today. We do not want to “accelerate” the ecocidal headlong rush of tech, as Emmanuel Macron invites us to do. We want to put a stop to it!

That is why we call on members of Parliament to reject Article 15 of the “simplification” bill and to support a two-year moratorium on the construction of large data centers in France, until a public debate can be held on how to regulate them. The two-year moratorium would apply to data centers larger than 2,000 m² or with an installed power of more than 2 megawatts. According to the typology established by Cécile Diguet and Fanny Lopez in their research report for ADEME, a moratorium on facilities larger than 2,000 m² preserves the possibility of building medium-sized data centers and does not hinder any projects that the State or local authorities may wish to undertake for public use.

The public debate we are calling for could take the form of a citizens’ convention. It should address both the democratic control of digital infrastructures such as data centers and the artificial intelligence systems now deployed in all sectors of society. It should also address the question of the uses of digital services, seeking to break the dependence on the toxic models of the major multinationals in the sector.
Against the industrial deregulation granted to tech, and against the concentration of power and the amplification of social injustice that artificial intelligence reinforces, it is urgent to put digital technology back in its place and to devise a model for the development of its infrastructures that is compatible with ecological limits as well as with human and social rights.

NEXT STEPS

The members of the special committee responsible for examining the text tabled their amendments on Thursday, March 20. Several of them aim to delete Article 15. An amendment calling for a moratorium has also been tabled by Green MPs Hendrik Davi and Lisa Belluco. These amendments will be examined by the special committee from Monday, March 24 to Thursday, March 27. The examination in public session will then take place from April 8 to 11, 2025.

We will soon be launching a campaign page to make it easier for everyone to participate! In the meantime, spread the word and get ready for the battle against this umpteenth piece of shitty legislation! <3
March 21, 2025
La Quadrature du Net
All-out mobilization against the French “war-on-drugs” law
In the midst of the media uproar over drug trafficking, a law on “drug trafficking” is making its way through Parliament. In reality, this text does not only apply to the sale of narcotics: it heavily reinforces the surveillance capacities of the intelligence services and the judicial police. It is one of the most repressive and dangerous texts of recent years, and could notably give the authorities even more powers to repress militant actions. The bill was adopted unanimously in the Senate, with the support of the Socialists, the Ecologists and the Communists, and will now be discussed in the National Assembly. La Quadrature du Net is calling for urgent mobilization to raise awareness of the dangers of this text and to push left-wing parties to reject it. On this page, you will find a set of resources and tools to help you understand this law and convince your elected representatives to mobilize:

* The so-called “drug trafficking” law undermines the protection of encrypted messaging services (such as Signal or WhatsApp) by requiring the implementation of backdoors for the police and intelligence services.
* By modifying the legal regime for organized crime, which applies well beyond drug offenses, this law does not only concern drug trafficking. It can even be used to surveil activists.
* The “safe file”, a provision of the law, makes secret the documents in a case file that detail the use of surveillance techniques during an investigation. This violates the right to defend oneself and prevents the public from knowing the extent of the surveillance capabilities of the judicial police.
* The text authorizes the police to remotely activate the microphones and cameras of fixed and mobile connected devices (computers, telephones, etc.) to spy on individuals.
* It extends the authorization to use “black boxes”, a technique for analyzing data from all our communications and exchanges on the Internet, to the purpose of “fighting delinquency and organized crime”.
* The police will be able to tighten their censorship of Internet content by extending it to publications related to the use and sale of drugs. The risks of infringing on freedom of expression are therefore amplified.

SUMMARY OF THE LAW IN VIDEO

PIPHONE: CHOOSE WHICH MP TO CALL

Some progressive elected representatives have bought into this narrative and this dangerous security escalation. In the Senate, the Green, Communist and Socialist groups voted in favor of this text. We believe that they must be made to face their responsibility by denouncing the dangers of this law. With our tool, you can contact them directly to send them the arguments and resources shared on this page. It is also possible to try to convince the deputies of Together for the Republic and MoDem to vote against the most dangerous measures of this bill. You can call them all week, and if possible on Mondays, Thursdays and Fridays, when they are not in the chamber. You will probably get a parliamentary assistant on the phone, and that’s okay! Feel free to talk to them, and then ask them to relay your opinion to their MP. Thank you to everyone who is putting their energy into opposing this umpteenth security crackdown, and well done to those who had the courage to contact their MPs! <3

IN DETAIL: WHAT DOES THE PROPOSED LAW PROVIDE FOR?

– AMENDMENT OF THE ORGANIZED CRIME REGIME

This law significantly strengthens the organized crime regime, which does not only concern drug trafficking. This legal framework was created twenty years ago to target, in theory, mafia networks, by providing specific rules that derogate from common law. In particular, these rules allow the police to use surveillance techniques that are much broader and more intrusive than normal (wiretapping, IMSI-catchers, bugging, data capture, etc.). The scope of the offenses covered by the organized crime regime is defined in a list in the Code of Criminal Procedure, which has grown steadily over the years, affecting ever more people and situations.
In particular, it covers criminal association and a number of offenses and crimes committed by “organized gangs”, classifications that are increasingly used to prosecute activists. This was notably the case in the prosecution of an activist fighting against the construction of administrative detention centers.

To legitimize the extension of these supposedly limited measures, the government and parliamentarians are using a sensationalist discourse on drug trafficking, a problem that is not new and for which solutions other than the “all-repressive” approach exist. This is not insignificant. By giving the fight against drug trafficking an exceptional dimension, they are trying to justify the need to resort to extraordinary means that are extremely detrimental to our freedoms. In this respect, they are following the legislative pattern used in recent years in the fight against terrorism, which has allowed for the establishment of very significant derogations from the normal functioning of institutions, derogations that have spread far beyond terrorism itself.

– SECRET POLICE SURVEILLANCE

The law prevents people from knowing how they are being monitored, an unprecedented and very serious attack on founding principles of the French judicial system: the right to defend oneself and the principle of adversarial proceedings. A measure known as the “safe file” or “separate report” would make it possible to remove from the criminal file the reports related to the implementation of surveillance techniques. These reports would only be accessible to investigators, under the control of the prosecutor or the investigating judge, preventing lawyers and the persons concerned from reading and discussing them, and thus from detecting potential illegalities.
This will also deprive the public of the opportunity to know the extent of the surveillance capabilities of the judicial police, and will facilitate abuses in the use of highly intrusive techniques, such as spyware or the compromising of devices.

– BACKDOORS IN ENCRYPTED MESSAGING

The senators amended the text to undermine the confidentiality of encrypted messaging services such as Signal or WhatsApp. The law stipulates that communication services are obliged to introduce access – a “backdoor” – for the benefit of the police and intelligence services, under penalty of heavy sanctions. This would create an unprecedented breach in end-to-end encryption technology, exploitable by both states and malicious actors. Such a measure is extremely dangerous. As numerous institutions, including ANSSI and the European Data Protection Board, have warned, this would weaken the level of protection of all communications and threaten the confidentiality of all our exchanges. For years, we have been defending the right to encryption; you can read our position from 2017 here.

– REMOTE ACTIVATION OF CONNECTED OBJECTS

This law provides for a new escalation in surveillance by continuing the legalization of spyware (such as NSO’s Pegasus or Paragon). It authorizes the police to remotely activate the microphones and cameras of fixed and mobile connected devices, such as computers or telephones, to spy on people. This technique relies on compromising computer systems by exploiting flaws in connected devices. Proposed by Éric Dupond-Moretti in 2023 in a law on judicial reform, this surveillance measure was partially censored by the Constitutional Council. It is reproduced here with slight modifications, while the urgent need would be to ban this type of surveillance altogether, given the dangers it poses to democratic balances and individual freedoms.

– EXPANSION OF INTELLIGENCE POWERS AND BLACK BOXES

Intelligence services would also see their powers strengthened by this law.
On the one hand, the exchange of information between so-called “second circle” services (for which intelligence is only part of their mission, particularly within the national police and gendarmerie) is in principle very limited. It would be facilitated here by removing the need for prior authorization, well beyond the sole perimeter of drug trafficking. On the other hand, the law broadens the scope of application of the “black boxes”, created in 2015, to the purpose of “fighting delinquency and organized crime”. This intelligence technique uses algorithms to analyze the data of all our communications and exchanges retrieved from the Internet, under the pretext of “detecting” new suspects.

– INTERNET CENSORSHIP

The law will allow the police – via the Pharos service – to censor any content on the Internet that they consider illegal in connection with an offense relating to drug trafficking. These are very broad prerogatives, which add to an already very significant capacity for administrative censorship. This possibility of requiring the removal of publications without the intervention of a judge was initially authorized for child sexual abuse content, before being extended to terrorism. This desire to lock down the Internet can only lead to abuse, in view of the volume of content concerned and the extra-judicial framework of this censorship, without really having an impact on the social problem of drug use, which depends on many other factors.

– AND OTHER MEASURES

So far, we have mentioned the most worrying measures, but unfortunately this law contains many other security extensions applying to “organized crime”, which, as we know, concerns many more situations than drug trafficking: IMSI-catchers in private places, the power of prefects to ban people from “appearing” in a place, the use of drones in prisons, compulsory cameras in ports, a new broad and poorly defined offense of “participation in a criminal organization”…
February 27, 2025
La Quadrature du Net