Perspective

Control for Whom? Keeping an Eye on the Dark Side of America’s New Wearables Campaign

Nada Salem, Theodora Skeadas / Oct 2, 2025

Fanny Maurel & Digit, Ambient Scribes, Licensed by CC-BY 4.0.

We are currently observing numerous parallel efforts to integrate data sharing between the US government and different parts of the tech ecosystem.

In July 2025, the White House announced a new plan to empower Americans to “take control of their own health,” which involves giving Big Tech unprecedented access to Americans’ health data. According to reporting from PBS News, this new “private health tracking system…will make it easier for patients to access their health records and monitor their wellness across health care systems and technologies.” According to the Administration and the companies involved, collaboration between the federal government and Big Tech would enable patients to more efficiently manage and exchange their health data with physicians, hospital systems, and digital applications. The collaboration, however, raises a range of privacy questions.

The new initiative falls under the purview of the Centers for Medicare & Medicaid Services (CMS) as part of a new Health Technology Ecosystem Initiative, a data exchange program involving some of the country’s largest health and tech companies, including Amazon, Anthropic, OpenAI, and Microsoft AI. Notably, the providers of some of the most popular wearables and remote health monitoring apps (such as Google’s Fitbit, the Apple Watch, and the Oura Ring) have already pledged to join the Health Technology Ecosystem Initiative as early adopters, though it remains to be seen whether data collected by these everyday wearables will feed into the data exchange planned under the CMS ecosystem.

Around the same time, Health and Human Services Secretary Robert F. Kennedy Jr. announced that the Department of Health and Human Services (HHS) would be running the biggest advertising campaign in its history to put a wearable on every American. These two developments are part of a larger effort that is being branded as the “Make America Healthy Again” agenda.

In an ideal world, promoting the use of health technology can be a powerful proactive approach to healthcare. Real-time health monitoring can alert patients and physicians to risks before they become serious, make chronic disease management easier, and allow people to make data-informed decisions about their health. On a larger scale, this data can provide population-level insights to strengthen public health approaches to disease prevention. However, data is power, and without the right protections in place, the accumulation of vast amounts of personal health data can work against individuals’ interests.

Notably, this is not the first attempt to mainstream wearable technology in healthcare. Insurance companies have been doing this for years, offering lower premiums to policyholders who agree to share data from their fitness trackers, a practice framed as incentivizing healthy habits but one that also allows providers to cherry-pick “healthy” users. While fitness trackers primarily collect basic health metrics like sleep patterns and heart rate, secondary analysis can reveal far more sensitive information, including chronic health conditions, substance use behaviors, and other personal lifestyle factors users may never have intended to share.

As the federal government partners with private technology companies on what could be the largest health data exchange to date, the line between protected medical data and unprotected wellness data is likely to blur. Beyond publicly stated goals, it remains unclear how much control patients will retain once their data becomes intertwined with broader commercial and surveillance interests, or whether current privacy regulations are robust enough to govern this massive convergence of public health and private tech.

Here are a few things the public should keep in mind before opting into the new CMS program:

Chilling effect on provision of CMS services to targeted communities. Centralizing health and wellness data can be risky within the context of recent federal data grabs. In July 2025, reporting from WIRED revealed that ICE is using Medicaid and Medicare data to identify and locate immigrants. “Per the agreement, ICE officials will get login credentials for a Centers for Medicare and Medicaid Services (CMS) database containing sensitive medical information, including detailed records about diagnoses and procedures. Language in the agreement says it will allow ICE to access personal information such as home addresses, phone numbers, IP addresses, banking data, and social security numbers.”

As the WIRED article further notes, this collusion between the Department of Homeland Security and CMS may create a chilling effect in which people are less likely to utilize emergency care or state health benefits, fearing the information they share with healthcare providers could make them vulnerable to immigration enforcement.

Lack of HIPAA protection for private app data. Health data collected by consumer wellness apps and tech companies is not protected under HIPAA. In many cases, this data can be shared, traded, or sold to advertisers and insurance companies. Given the lack of comprehensive data protection laws at both the federal and state levels, and continued opposition by tech companies to stronger privacy protections, the exploitation of health data remains a growing concern.

The issue extends beyond health apps and wearables. Schools around the US are rapidly adopting AI chatbots for student learning, even as anthropomorphization and other design features of these technologies encourage users to share personal information. Without proper data protections, students may inadvertently disclose sensitive health information while interacting with chatbots. As private health data increasingly enters consumer tech spaces, stronger protections are needed for health data beyond traditional clinical settings.

Lack of transparency in data collection and secondary insights. Wearables often collect more data than users may realize, and the list of measurable health metrics is only growing. For example, some headphones and earbuds now integrate EEG sensors to enable neural signal monitoring, a feature that could soon appear in future generations of AirPods, following Apple’s recent patent. Secondary analysis of wearable data can also reveal insights about substance use and location history, among other lifestyle factors. Yet, transparency requirements surrounding data collection remain inadequate. Stronger transparency and disclosure requirements are needed for companies that sell wearable technology to Americans.

Empowering patients with personal health data could be a positive step toward preventative healthcare, but the new CMS initiative leaves us with many unanswered questions about health data privacy. Ultimately, White House-driven efforts to encourage the wider sharing of Americans’ health data collected by private technology companies raise significant concerns about how this data might be accessed and used by those companies and by federal agencies. In the current climate, where the Department of Homeland Security is increasingly able to access government data to target US residents for immigration enforcement, increasing the flow of private health information without adequate safeguards threatens to erode public trust in beneficial technologies and could chill access to public healthcare services.

Authors

Nada Salem
Nada Salem is a bioethicist and science writer examining the ethical, societal and public policy impacts of emerging biomedical technologies, including neurotechnology, medical AI and genomics. She holds a Master of Science in bioethics from Harvard Medical School, where she focused on the consent g...
Theodora Skeadas
Theodora Skeadas is a public policy strategist and thought leader at the forefront of technology ethics, platform governance, and responsible AI. Theodora is Chief of Staff at Humane Intelligence, a nonprofit committed to collaborative red teaming and improving the safety of generative AI systems. S...

Related

Perspective
States Are Fighting Back To Defend Medical Privacy and Safeguard Democracy (July 16, 2025)
