One Year After Dobbs: Assessing the Fight for Reproductive Privacy

Alexandra Reeve Givens / Jun 23, 2023

Alexandra Reeve Givens is President & CEO of the Center for Democracy & Technology.

Protestors gather at the U.S. Supreme Court, May 2022. Drew Petrimoulx/Shutterstock

A year has passed since the U.S. Supreme Court overturned Roe v. Wade. When the decision came down, the Center for Democracy & Technology (CDT) and other advocates quickly recognized how the decision would impact data privacy, heighten surveillance of personal activities, and endanger access to information about abortion. Prosecutors and even private citizens have been empowered to pursue evidence against people seeking and providing reproductive care.

One year later, where do we stand at the intersection of tech policy and reproductive rights? While a number of states moved immediately to further restrict and criminalize abortion for millions of Americans, a few have pushed back, instituting “shield laws” to protect the privacy of abortion providers and patients who travel to another state to receive care. The Biden Administration has taken steps to protect patients’ medical records and to pursue companies that have unfair and deceptive data practices. Some companies have announced (or quietly pursued) new steps to protect people’s private health data.

But there is still a long way to go. This article lays out three priority areas where tech policy impacts reproductive rights. It takes stock of the wins to date, and the work ahead for those who wish to fight for reproductive privacy and access to reliable online information at this critical time.

Priority 1: Reforming commercial data practices to protect reproductive privacy

Many kinds of data reveal personal information about someone’s medical history and healthcare choices. Emails, texts, search history, online purchases, and location data stored on a person’s phone could indicate whether someone is, or was, pregnant. The companies that capture this data aren’t necessarily health companies, but the information they collect is some of the most sensitive data about people’s health choices. That’s why companies across all sectors and of all sizes must be responsible for carefully analyzing and limiting the personal information they collect, store, and share.

To help the private sector better understand how to manage sensitive personal data, CDT recently published a best practices guide urging a wide range of companies to closely review the types of individual user data they have access to, and to minimize the collection of personally revealing information. We cautioned that, if companies must collect personal data, they should retain it only for as long as necessary to perform the task for which it was collected, and then delete it. They should also consider ways to protect the information by encrypting it, and should prevent the sharing or selling of any such sensitive personal data.
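As an illustration only, the retention practice described above (keep personal data no longer than the task requires, then delete it) might be sketched as follows. The `RecordStore` class, its field names, and the 30-day window are hypothetical examples, not drawn from CDT's guide.

```python
import time

# Example 30-day retention window; a real policy would set this
# per data type, based on the task the data was collected for.
RETENTION_SECONDS = 30 * 24 * 3600

class RecordStore:
    """Hypothetical in-memory store illustrating data minimization:
    each record carries its creation time, and anything older than
    the retention window is deleted outright rather than archived."""

    def __init__(self):
        self._records = {}  # record_id -> (created_at, data)

    def add(self, record_id, data, created_at=None):
        # Store only what the task needs, along with when it was collected.
        self._records[record_id] = (created_at or time.time(), data)

    def purge_expired(self, now=None):
        # Delete (not archive) every record past the retention window.
        now = now or time.time()
        expired = [rid for rid, (ts, _) in self._records.items()
                   if now - ts > RETENTION_SECONDS]
        for rid in expired:
            del self._records[rid]
        return len(expired)

    def __len__(self):
        return len(self._records)
```

A production system following the guide would go further, for example encrypting records at rest and blocking any sharing or sale of the data; this sketch shows only the retention-and-deletion step.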

We’ve seen some companies take these kinds of actions, often not pointing directly to the Dobbs decision but to a broader desire to protect users’ information and earn customer trust. In May, Apple released a new whitepaper highlighting the ways in which it protects health data privacy. Google announced that it would delete sensitive places (including abortion providers and domestic violence shelters) from its Location History timelines, a step CDT applauded while flagging its limitations. Period-tracking app Flo adopted a new “anonymous mode” after concerns about user privacy.

Meanwhile, even before the Dobbs ruling, Meta announced plans to move toward end-to-end encryption as the default setting for Messenger and Instagram. The significance of this development was made clear when a mother in Nebraska was prosecuted on the basis of private but unencrypted Facebook messages with her daughter. At RightsCon this month, providers of encrypted messaging services – including Signal, WhatsApp, Element, and OpenMLS – reaffirmed their commitment to strong encryption at a panel hosted by CDT Chief Technologist Mallory Knodel. Calls for companies to encrypt their users’ communications have only grown stronger, such as in the civil society-led campaign MakeDMsSafe.com.

This year, health providers also received a wake-up call about their data practices. Investigative reporting by The Markup and other journalists revealed that many health providers are inadvertently sharing user data through marketing pixels on their sites. In March, U.S. mental health startup Cerebral admitted it shared the private health information of more than 3 million users with Facebook, Google, TikTok, and other ad giants via tracking pixels. The FTC and HHS quickly issued guidance on pixels and are getting the word out to the private sector. Clearly, there is a lot more work to be done to educate companies about protecting data and to provide them with tools to mitigate these harms.

The FTC has also taken action against private companies for a variety of health-data-related harms. Most notable is an enforcement action against the data broker Kochava for selling geolocation data that tracked people at reproductive health clinics, places of worship, and other sensitive locations (while the case was initially dismissed, the FTC has filed an amended complaint). The FTC has also used its other authorities to protect health data, such as its enforcement against Premom and GoodRx for violations of the Health Breach Notification Rule. In the case of GoodRx, the company failed to notify its customers of the unauthorized selling of their personal health information to Facebook, Google, and other companies. That said, the GoodRx fine was a mere $1.5 million for a company with a market capitalization of $2.27 billion – raising concerns about whether companies will change their practices in response.

Even as the FTC took the first steps in its proposed rulemaking on commercial data practices, the need for comprehensive federal privacy legislation became ever clearer. New research showed again how data brokers are aggregating and selling people’s mental health information, including data that can be tied to individual consumer identities, such as whether someone has depression, insomnia, or ADHD. Reports have surfaced about the sensitive location information collected and shared by cars. Texas recently became the tenth U.S. state to pass a comprehensive privacy law, but the effectiveness of these state bills in protecting consumer rights varies widely. The need for comprehensive legislation that meaningfully protects all Americans has never been more palpable.

With or without legislation, it’s also clear that companies can and must improve their data practices, particularly what data they collect and how they protect it. CDT reiterated this call at SXSW, before the International Association of Privacy Professionals, in Ms. Magazine, and elsewhere, and many other advocates are urging the same. As consumers, we need to be smart about the companies we entrust with our data – and demand more from companies to help protect our rights.

Priority 2: Limiting law enforcement access to private health information

Another priority after Dobbs is for companies to responsibly handle law enforcement requests for people’s data. In this work, there’s a role for state and federal policymakers, as well as companies of all sizes.

This area saw the most legislative progress over the past year, as California, Washington, and New York enacted shield laws that prevent in-state electronic communication companies from complying with any warrants (including those from out-of-state) in connection with abortion investigations. Despite their similar goals, the laws have notable differences: California and Washington’s shield laws apply to a range of legal demands and require all demands to be accompanied by an attestation that the investigation is not connected to abortion (a key safeguard, as abortion bans are often enforced through facially neutral laws like child endangerment). In contrast, New York’s law only applies to warrants and is not backed up by a mandatory attestation. In addition, the Washington law protects information relating to gender-affirming care, not just reproductive health services, an important addition at a time when gender-affirming care is being banned in numerous states. As more pro-choice states consider data shield laws, these and other differences are important – and may impact how the laws withstand legal challenges. With the help of Yale Law School’s Media Law Clinic, CDT published a guide for state legislators considering such provisions in the months and years ahead.

In 2023, the Department of Health & Human Services (HHS) also stepped in to protect people’s private medical records from law enforcement investigations. In April, HHS’s Office for Civil Rights proposed a promising modification to the HIPAA Privacy Rule that would prohibit HIPAA-covered entities from disclosing people’s reproductive health records for criminal, civil, or administrative investigations when the health care is lawful in the circumstances in which it was provided. CDT, reproductive healthcare providers, and other advocates warmly welcomed this action. But in comments, CDT noted that the proposal should be strengthened by using a broad definition of “reproductive health data” and, critically, by extending its protections to include the provision of gender-affirming care. CDT also recommended that, in the limited instances when sharing patient data with law enforcement is permitted, the Privacy Rule should require that any law enforcement request is narrowly tailored and explicitly states the specific elements within a health record that are necessary to the investigation.

Despite the Biden Administration’s commendable work on these issues, there’s still more the Administration can do. Last December, CDT led a coalition of over 50 civil society organizations in calling on the Administration to ensure that federal taxpayer dollars are not used to support state and local law enforcement’s abortion-related investigations. There are ways to do this without jeopardizing information- and resource-sharing on important national priorities such as anti-terrorism and violent crime.

In a series of blog posts, CDT explained how federal resources are used in state and local investigations, and how the Administration could amend existing memorandums of understanding and other structures to ensure federal dollars aren’t used to prosecute reproductive healthcare cases that remain lawful in many parts of the country. This work has motivated Members of Congress, with a group of over 30 Senators and Representatives issuing a letter to the Biden Administration calling for it to block federal aid for abortion-related investigations.

Priority 3: Ensuring people’s access to reliable information online

In the year since Roe v. Wade was overturned, reproductive rights advocates have raised concerns that both states and tech companies could restrict speech about access to abortion care. This includes targeting websites and online resources that connect people with care in another state, as well as putting barriers in place for people who want to connect and coordinate in defense of reproductive rights.

Several states have introduced bills that would block access to online information about abortion services and resources, notably in South Carolina and Texas. Although these efforts should fail based on precedent and First Amendment protections, it is difficult to predict how the U.S. Supreme Court might rule on future cases involving state attempts to limit speech about abortion.

There are also general social media laws that threaten access to accurate reproductive health care information. The Texas social media law adopted in 2021, H.B. 20, would expose social media companies to lawsuits if they block content “based on a user’s viewpoint” – a provision that would likely limit online services’ ability to moderate mis- or disinformation about reproductive care. The law has been stayed from going into effect while it is litigated through the courts, and it is widely expected to be considered by the Supreme Court next term (see CDT’s amicus brief and many other amicus filings from last fall urging the Court to prevent the law from going into effect).

CDT and other advocates have also highlighted the risks to secure, trustworthy provision of information about reproductive health care posed by legislative proposals currently pending before Congress. In particular, we’ve cautioned that the EARN IT Act and the Kids Online Safety Act (KOSA) – while well-intentioned – could expose users’ communications to greater surveillance and incentivize online services to block or restrict access to information about sexual health and well-being.

Tech companies themselves should address abortion-related mis- and disinformation on their platforms. For example, when a person searches for an abortion clinic, the results should actually show abortion providers rather than misleading sites that may shame, confuse, or even dox the person seeking care, or steer them to services that do not in fact provide medical care. The FTC has cautioned that such misleading sites may face FTC investigation, but there are steps social media platforms can and should take to help users report them, and to ensure responsible moderation of ads and search results.

At a time when abortion advocates have raised concerns about social media sites blocking or suppressing their posts, platforms must also enforce their content moderation policies transparently, and with a nuanced understanding of the issues at stake. Meta’s Oversight Board is considering a set of cases on abortion-related speech on Facebook at this very moment, examining whether three different posts expressing pro- and anti-abortion views using violent language violated the site’s policies against incitement.

At this active time, there’s work to be done to build a rights-respecting and transparent framework that involves abortion care providers and other reproductive rights advocates in efforts to ensure access to reliable information online.

Looking back at a year of action and disappointing inaction, it’s clear there is progress to celebrate as well as considerable work ahead. One thing is evident: tech policy is having a direct impact on the exercise of reproductive rights and freedoms. For allies in the tech policy community wondering if they have a role to play in the fight for reproductive justice, the answer is yes. At this critical moment, it’s past time for companies, policymakers, and a broader coalition of advocates to act.

You can learn more about CDT’s work on reproductive privacy & access to information here.


Alexandra Reeve Givens is the President and CEO of the Center for Democracy & Technology (CDT), a nonpartisan nonprofit that advocates for protecting democracy and human rights in the digital age.