Neurodata – the New Epicenter of Data Protection

Bojana Bellamy, Eduardo Bertoni / Sep 17, 2024

Plato observed that "when the mind is thinking, it is talking to itself." That observation is increasingly out of date: these days, the brain is talking to itself and to anyone who may be listening. Advances in neurotechnology and artificial intelligence (AI) systems show just how imminent mind-reading technology is. In a 2023 study, researchers at The University of Texas at Austin successfully decoded human brain activity and captured the meaning of people’s thoughts. For now, neurotechnology applications remain largely confined to medical and health fields and may not enter mainstream consumer markets for some time. Yet, there are at least 30 non-invasive neurotechnology consumer products available to the public today, promising to improve cognitive abilities; monitor productivity, focus, and fatigue; and counter sleeping problems, depression, and other conditions.

Developments in the fields of neurotechnology and AI have spurred widespread interest from civil, public, and private sector stakeholders in the legal and ethical frameworks that might govern these technologies. Most countries in the world have adopted data protection laws governing the use of personal data (with the notable exception of the United States). It appears that many of the requirements set forth in these laws will apply to any personal data collected by consumer neurotechnology products and services. Yet, it will be important to assess how data protection laws will practically apply to neurodata and to consider if data protection rights are sufficient to protect against the potential risks and harms of neurotechnology, or if new rights to specifically address neurodata are needed.

Amidst these ongoing debates, it remains imperative for organizations to apply existing best practices for data privacy and governance to ensure the responsible development and deployment of consumer neurotechnology. However, a study published in April 2024 assessed the policy documents of 30 companies that provide consumer-facing neurotechnology devices and found that 60% “provide no information for consumers about how their neural data is handled and what rights they have in relation to it.” Many of the companies assessed were based in countries with comprehensive data protection laws that require organizations to provide individuals with information about the use of their personal data and the ability to exercise their rights regarding the collection and use of their personal data.

Despite the projected growth of the neurotechnology market, the value of which is expected to exceed $50 billion by 2033, legal and regulatory guidance governing this sector is lacking. Publications issued by the European Data Protection Supervisor (EDPS), the Spanish Data Protection Agency (AEPD), and the United Kingdom Information Commissioner’s Office (ICO) squarely place neurodata collection and processing within the data protection domain but raise key challenges that still need to be addressed.

Neurotechnology refers to external or internal computer devices (e.g., headsets, ear pods, or computer chips implanted directly in the brain) that can capture, interpret, and potentially change human nervous system activity, including signals from the brain and spinal cord. The neurotechnology field promises remarkable scientific and medical benefits, such as helping people with quadriplegia regain some agency through mind-controlled computer use. However, the collection and use of neurodata without proper protections and ethical guidelines could lead to disproportionate risks to individual privacy and other human rights.

Neurodata, or neural data, is data collected from a person’s nervous system. Whether existing data protection frameworks apply to neurotechnology largely hinges on whether neurodata is deemed to be personal data. To the extent neurodata is personally identifiable, it is subject to existing privacy laws and protections. Neurodata could also be considered biometric data under laws whose definition of “personal data” includes biometric data. Additionally, neurodata can disclose an individual’s medical and health information and be considered sensitive personal data, which typically requires heightened protections. For example, Europe’s General Data Protection Regulation (GDPR) prohibits the processing of health data unless one of ten permissible purposes applies, such as explicit consent or the use of data for the provision of health services. Finally, even when neurodata is de-identified, there remains a risk that it can be re-identified. Arguably, sensitive data protections should apply to neurodata by default, regardless of whether the data is processed for identifying purposes.

Given that data protections will apply to personally identifiable neurodata, organizations developing and deploying neurotechnology will have to comply with the relevant requirements, including fair and lawful processing, a legal basis for processing and sharing, purpose limitation, transparency, data quality, data security, accountability, and restrictions on data flows to other countries. They must also uphold key individual rights, such as the rights of access, correction, deletion, and objection, as well as any rights related to automated decision-making. Data protection laws would also ensure proper oversight and enforcement by data protection supervisory authorities.

While many existing laws may address issues raised by neurotechnology in some capacity, a thorough scoping and adaptation of existing privacy and human rights laws will still be necessary to understand how data protection laws apply in practice and to identify the gaps and areas of neurotechnology use that may require additional protections.

Some jurisdictions have already moved to affirmatively protect mental privacy and extend privacy protections to neurodata. Most recently, the California legislature passed an amendment to the California Consumer Privacy Act adding neural data as an element of “sensitive personal information”; the bill now awaits the governor’s signature. In 2021, Chile approved a constitutional amendment enshrining neurorights, which essentially captures the rights to mental privacy, integrity, and liberty, and has adopted an agenda to regulate neurotechnology to safeguard brain activity and neurodata from potential infringements. In the US, the state of Colorado passed a law earlier this year classifying “neural data” as sensitive data under its comprehensive privacy law, meaning that heightened protections for the collection and processing of sensitive data will extend to neural data. Argentina’s pending criminal law bill, Article 19.I of France’s recent bioethics law, and Article XXVI.2 of Spain’s Charter of Digital Rights also introduce neurorights protections. Intergovernmental soft law instruments, such as the OECD Recommendation on Responsible Innovation in Neurotechnology and its companion Toolkit, call for the protection of mental privacy and cognitive liberty, and UNESCO is actively drafting recommendations on the ethics of neurotechnology. A balance of soft and hard law efforts must be found to encourage neurotechnology innovation and beneficial use cases, while enforcing accountability and the responsible development and use of the technology.

Given the current trend toward demonstrable accountability in data protection law, organizations that develop and deploy neurotechnology should implement current best practices for data governance, comply with data protection requirements, and uphold individuals’ rights where possible, regardless of whether they operate in a jurisdiction with a comprehensive data protection legal framework. After all, responsible data management is a key business enabler in the rapidly evolving digital economy. These best practices include developing data privacy management programs that extend to neurotechnology and neurodata; embedding privacy, security, and ethics in the design and use of neurotechnology, as well as in the sharing and sale of neurodata; and deploying privacy-enhancing and privacy-preserving technologies. Existing accountability and data governance models, such as the Centre for Information Policy Leadership’s organizational accountability framework, can guide neurotechnology developers and deployers in adopting demonstrable and effective data governance best practices.

Policymakers and lawmakers should also consider emerging lessons from the regulation of biometric data and biometric systems when evaluating the legal and regulatory frameworks that govern neurodata and neurotechnology:

  • First, they should avoid defining neurodata in a manner that is either too narrow or too broad, and consider allowing regulators to amend the definition based on technological advancements and industry consensus.
  • Second, they should avoid creating legal uncertainty by using different definitions for key technical terms.
  • Perhaps most importantly, policymakers and lawmakers should regulate in a risk-based manner, focusing on use cases and applications rather than the technology or data alone.
  • Public and private sector stakeholders should also prioritize open data initiatives to ensure that neurodata is ethically and freely accessible for responsible use by researchers worldwide and across borders. This can help democratize neuroscience research and data and increase inclusivity, reproducibility, openness, and fairness.
  • Finally, though existing data privacy laws and protections appear to be largely sufficient, the particularly sensitive nature of some types and uses of neurodata makes it important to assess whether additional legislation, regulatory guidance, amendments to existing laws, or enforceable standards and certifications are needed to establish additional obligations and rights covering neurodata.

In the meantime, existing human rights frameworks and data protection laws, coupled with organizational best practices for data governance and accountability, offer a useful foundation for responsible and trustworthy consumer neurotechnology development. This can help foster trust between society and the emerging neurotechnology field, which in turn will allow for the wide-scale adoption of safe and beneficial neurotechnologies.

Addressing consumer neurotechnology should be a top priority in data protection, so that the mind can continue to privately talk to itself.

Authors

Bojana Bellamy
Bojana Bellamy is the President of Hunton Andrews Kurth’s Centre for Information Policy Leadership (CIPL), a preeminent global privacy and data policy think tank in London, Washington, DC, and Brussels. Bojana works with global business and technology leaders, regulators, policy and law makers to sh...
Eduardo Bertoni
Professor Eduardo Bertoni (PhD, Buenos Aires University) is currently the Director of the Center for Human Rights and Humanitarian Law at the American University Washington College of Law. Representative of the Regional Office for South America of the Inter American Institute of Human Rights until ...