Archived: An EU Law Could Let US Prosecutors Scan Phones for Abortion Texts

This is a simplified archive of the page at https://www.wired.com/story/eu-law-scan-phones-abortion-texts/

Tools used to find child abuse material could be wielded in the US to protect “unborn children” if the EU bans end-to-end encryption.


The drive to protect children online will soon collide with an equal and opposing political force: the criminalization of abortion. In a country where many states will soon treat fetuses as children, the surveillance tools targeted at protecting kids will be exploited to target abortion. And one of the biggest threats to reproductive freedom will unintentionally come from its staunch defenders in the European Union.

Last week the EU unveiled draft regulations that would effectively ban end-to-end encryption and force internet firms to scan for abusive materials. Regulators would not only require the makers of chat apps to scan every message for child sexual abuse material (CSAM), a controversial practice that firms like Meta already follow with Facebook Messenger, but they would also require platforms to scan every sentence of every message to look for illegal activity. Such rules would affect anyone using a chat app made by a company that does business within the EU. Virtually every American user would be subject to these scans.

Regulators, companies, and even stalwart surveillance opponents on both sides of the Atlantic have framed CSAM as a unique threat. And while many of us might sign up for a future in which algorithms magically detect harm to children, even the EU admits that scanning would require “human oversight and review.” The EU fails to address the mathematical reality of encryption: If we allow a surveillance tool to target one set of content, it can easily be aimed at another. This is how such algorithms can be trained to target religious content, political messages, or information about abortion. It’s the exact same technology.

Earlier child protection technologies provide us with a cautionary tale. In 2000, the Children’s Internet Protection Act (CIPA) mandated that federally funded schools and libraries block content that is “harmful to children.” More than 20 years later, school districts from Texas to progressive Arlington, Virginia, have exploited this legislation to block sites for Planned Parenthood and other abortion providers, as well as a broad spectrum of progressive, anti-racist, and LGBTQ content. Congress never said medically accurate information about abortion is “harmful material,” but that is the claim of some states today, even with Roe still on the books.

Post-Roe, many states won't just treat abortion as child abuse; several will likely treat it as murder, prosecuted to the full extent of the law. European regulators and tech companies are not prepared for the coming civil rights catastrophe. No matter what companies say about pro-choice values, they will behave very differently when faced with an anti-choice court order and the threat of jail. An effective ban on end-to-end encryption would allow American courts to force Apple, Meta, Google, and others to search for abortion-related content on their platforms, and if they refuse, they'd be held in contempt.

Even with abortion still constitutionally protected, police already prosecute pregnant people with all the surveillance tools of modern life. As Cynthia Conti-Cook of the Ford Foundation and Kate Bertash of the Digital Defense Fund wrote in a Washington Post op-ed last year, “The use of digital forensic tools to investigate pregnancy outcomes … presents an insidious threat to our fundamental freedoms.” Police use search histories and text messages to charge pregnant people with murder following stillbirth. This isn’t just an invasive technique, but highly error-prone, easily miscasting medical questions as evidence of criminal intent. For years, we’ve seen digital payment and purchase records, even PayPal history, used to arrest people for buying and selling abortifacients like mifepristone.

Pregnant people don’t only have to worry about the companies that currently have their data, but about everyone else those companies could sell it to. According to a 2019 lawsuit I helped bring against the data broker and news service Thomson Reuters, the company sells information on millions of Americans’ abortion histories to police, private companies, and even US Immigration and Customs Enforcement (ICE). Even some state regulators are raising the alarm, like a recent “consumer alert” from New York State Attorney General Letitia James, warning how period tracking apps, text messages, and other data can be used to target pregnant people.

We must reevaluate every surveillance tool (public and private) with an eye to the pregnant people who will soon be targeted. For tech companies, this includes revisiting what it means to promise their customers privacy. Apple long garnered praise for how it protected user data, particularly when it went to federal court in 2016 to oppose government demands that it hack into a suspect’s iPhone. Its hardline privacy stance was especially evident because the court order came as part of a terrorism investigation.

But the firm has been far less willing to take on the same fight when it comes to CSAM. Last summer, Apple proposed embedding CSAM surveillance in every iPhone and iPad, scanning for content on its billion+ devices. The Cupertino behemoth quickly conceded to what the National Center for Missing and Exploited Children first called “the screeching voices of the minority,” but it never gave up the effort completely, recently announcing CSAM scanning for UK users. Apple is hardly alone, joining firms like Meta, which not only actively scans the content of unencrypted messages on the Facebook platform, but also circumvents claims of “end-to-end encryption” to monitor messages on the WhatsApp platform by accessing copies decrypted and flagged by users. Google similarly embeds CSAM detection in many of its platforms, making hundreds of thousands of reports to authorities each year.

Regulators and companies want both an open internet and a surveillance state. This is impossible. The same encryption that hides CSAM will soon be a lifeline for abortion seekers and political dissidents. The same moderation tools that can target child abuse will soon be commandeered to protect “unborn children.” And the same police that partner with platforms against CSAM will soon be arresting doctors and pregnant people.

Tech companies can’t change the law, but they can decide to make platforms that put privacy and safety first. And while some may think anti-CSAM surveillance is more important than protecting pregnant people, the uncomfortable truth is that anti-CSAM surveillance doesn’t work. Even as widespread surveillance undermines almost every aspect of internet safety, the amount of CSAM has only gone up.

But tech companies won’t even have the chance to do the right thing if EU regulators go through the labyrinthine process required to turn their draft rules into a reality and force member states to implement laws. While the Dobbs decision will arrive before these EU requirements go into effect, the time to act is now: EU officials have shown that they are eager to take on Big Tech. Beyond that, they are far from the only officials to contemplate such measures. Anti-encryption advocates have pushed bills in Congress like the EARN IT Act, which would impose similar obligations, breaking end-to-end encryption. But while American attacks on internet privacy have foundered, European efforts appear to be gaining momentum.

Officials are rightfully skeptical of tech firms’ ability to police themselves, but it remains to be seen whether they are willing to empower anti-choice police in the process. EU legislators may be trying to help children, but instead they are creating a digital version of The Handmaid’s Tale. They must reverse course and reaffirm encryption as a fundamental right before pregnant people pay the price.