Internet Privacy Is A Disability Rights Issue

Ariana Aboulafia / Jan 19, 2024

Former CDT intern Sydney Brinker contributed to the research for this article.

Jamillah Knowles & Reset.Tech Australia / Better Images of AI / People with phones / CC-BY 4.0

Imagine being forced to disclose sensitive data related to your health or identity to access basic activities and services needed to fully participate in society. Then, imagine disclosing that information repeatedly, as new services you’d like to take advantage of come online. Unfortunately, this scenario is the lived experience of many people with disabilities. Beyond the fact that online data practices are essentially impossible for anyone to fully understand, disabled people in particular are often unable to protect information about their health status, as they are frequently forced to choose between accessing necessary services and technologies (like standardized tests, rideshares, or assistive apps) and keeping their information private. This “choice,” of course, is hardly a choice at all.

People with disabilities may affirmatively reveal sensitive and private health information to receive an accommodation, including in rideshares and standardized testing. They may also incidentally (and largely unintentionally) share their disability status simply by using assistive or adaptive technology. The “choice” for a person with a disability is to divulge that information, to receive an inaccessible version of a service, or not to use that service at all – which impacts that person’s ability to live a full and independent life. Not only do disabled individuals have little control over whether or not to share this information in the first place, they also may have little knowledge as to how that data is processed, stored, and shared after its initial disclosure. For these reasons, advocates for disability rights and disability justice should prioritize the protection of personal and digital privacy as a central issue, and organize towards solutions.

There are ways to protect the data and personal privacy of individuals with disabilities, along with everyone else. Data minimization (the idea that companies should collect only the data that is necessary to provide a service’s essential functions), and purpose limitation (the idea that companies should use data only for the original purpose for which it was collected), would allow disabled people to receive the accommodations and services they need, and to which they are entitled, while better protecting their privacy. The inclusion of these protections is one of many reasons my organization, the Center for Democracy and Technology (CDT), supported the 2022 American Data Privacy and Protection Act (ADPPA), a comprehensive federal privacy bill that would protect the privacy of “health information,” including disability information.

There are a variety of scenarios where disabled people must disclose health or disability-related information in order to enjoy certain rights or benefits. For example, students with disabilities may have to disclose their disability status – and potentially other sensitive information – to third parties when requesting accommodations on tests such as the ACT, SAT, and AP exams. Some students could, in theory, choose not to request accommodations on these high-stakes exams, thus risking their educational futures. But they shouldn’t have to, and this is not an option for many disabled students. Instead, disabled students are forced to disclose their disability status, both to schools and to separate testing companies, but are not empowered to determine what happens to that information afterwards; and, too often, this information is used in discriminatory ways. In fact, a group of affected students sued the ACT in 2019, alleging that the company disclosed students’ disability status to universities they applied to, and sold colleges “personally identifiable data about students, including detailed student disability data,” leading to the exclusion of prospective students with disabilities from college recruitment efforts. The ACT settled the case without admitting fault in 2020, but paid affected individuals $16 million and agreed to halt the practice of flagging a student’s accommodations on score reports to colleges.

People with disabilities also face pressure to disclose their disability status when attempting to access transportation. Uber, for example, allows users with disabilities to submit complaints if a driver refuses a ride because of a service animal or an assistive device, and provides wheelchair-accessible vehicles through its “WAV” feature upon request (where available). Further, Uber allows disabled people to receive refunds for longer wait times related to disability, and to permanently request waivers of wait-time fees via a “certification” of disability. These are welcome policies and features that make the app more accessible to users, but questions remain about how the data disclosed through use of these features is used. Uber states that, for certifications, it does not use that information for any purpose except to provide fee waivers. As for other information, including who uses the WAV service, however, the privacy policy does not clearly indicate what happens. The policies advise users that Uber collects information related to some users’ declared and inferred demographic data, transaction data, communications data, and more, and it is unclear which category disability-related data would fall under. Disability data could, for example, be part of inferred demographic data based on use of the WAV service; requests for accessible transportation could also fall under transaction data (as a wheelchair-accessible vehicle request could be considered a “type of service requested”) or communications data (particularly if a user is communicating with a driver or with Uber support about a disability-related request or accommodation).

If disability-related data is being collected by Uber, it would be subject to Uber’s privacy policy. This states, generally, that Uber shares and discloses some data (depending partially on which category the data falls under) with several third parties, “including social media advertising services, advertising networks, third-party data providers, and other service providers” to “reach or better understand our users and measure advertising effectiveness.” Uber also shares some data with “social media companies, including Facebook and TikTok, in connection with Uber’s use of their tools in Uber’s apps and websites.” The data shared with these platforms through Uber would then be subject to those companies’ privacy policies, potentially allowing for further sharing of whatever data they collect to facilitate sales or advertising, as an example. Because of the vagueness of the policy, there is no way for a disabled person to know if their sensitive data is being shared more widely than that individual originally intended, or even used to facilitate third-party advertising – which could pose serious privacy concerns for people with disabilities.

People with disabilities also face privacy concerns related to the disclosure of their disability status and other disability-related data when accessing assistive technology. Here, rather than affirmatively revealing information to receive an accessible version of a service, the mere use of the technology itself can unintentionally disclose a user’s disability status and related data.

For example, “smart” medical wearables like hearing aids can pose significant privacy threats for disabled users. One study found that wearers are not provided “information about the full range of data collection, transfer, storage, and potential purposes” from their hearing aids, and that some data, including wearers’ geolocation information, is shared with third parties. The study noted that protections are limited for users of these devices, as the companies that create them are often not covered by existing privacy statutes like the Health Insurance Portability and Accountability Act (HIPAA). Similarly, apps for blind or low-vision individuals like Be My Eyes (which allows users to send video and audio recordings to sighted users for identification) advise users that they collect and share users’ personal information, use some of it “to provide and inform targeted advertising,” and store it “as part of the normal functioning” of the app, with no mention of automatic or standardized deletion. Be My Eyes recommends that users avoid sending personal information via the app, but it is unclear whether the company imposes any other safeguards to protect that information when users inevitably disclose it, including whether that information is stored or deleted.

While there may be some role for allowing disabled individuals to control and protect their own data, the primary onus should not be placed solely on any individual to keep their information private. Instead, the companies that collect, monetize, and sell this data should be charged with protecting the privacy of that data, and legislators and companies should commit to protecting disability-related data in the first instance as a way to help protect individuals online, particularly those in marginalized communities. Companies and legislators should also specifically be thinking about the harms that disabled people face when their sensitive information is disclosed, and prioritizing specific federal privacy legislation that safeguards these individuals, including but not limited to bills that have strong data minimization and purpose limitation provisions. These sorts of solutions should also be prioritized by disability rights and justice advocates.

No disabled person should have to choose between accessing technologies that help them lead fulfilling, self-directed lives and protecting their personal information. And, people with disabilities should be able to benefit from technology without worrying that their health-related data will be used for nefarious or unknown purposes. It is vital that leaders in the disability community embrace internet privacy as a disability rights and justice issue, and support changes in law, regulation, and industry policies that will help protect the data of all individuals, including those with disabilities.


Ariana Aboulafia
Ariana Aboulafia is an attorney with expertise in the intersection between disability and technology, as well as criminal law and the First Amendment. She holds an affiliate role at the Center for Information, Technology, and Public Life (CITAP).