Transcript: US Senate Hearing on 'Examining Whistleblower Allegations that Meta Buried Child Safety Research'
Justin Hendrix / Sep 10, 2025
US Senator Marsha Blackburn (R-TN) at US Senate Judiciary Subcommittee hearing, “Hidden Harms: Examining Whistleblower Allegations that Meta Buried Child Safety Research,” Sep. 9, 2025, Washington, DC.
On Tuesday, September 9, the United States Senate Judiciary Subcommittee on Privacy, Technology, and the Law hosted a hearing titled “Hidden Harms: Examining Whistleblower Allegations that Meta Buried Child Safety Research.” The hearing examined whistleblower testimony from two former Meta researchers, Dr. Jason Sattizahn and Cayce Savage, whose allegations were first detailed by the Washington Post.
The witnesses alleged that Meta suppressed and manipulated research revealing harm to children, including children under the age of 13, on its virtual reality platforms. The witnesses testified that Meta's legal department intervened to alter, delete, or prevent the collection of data showing sexual exploitation, harassment, and the abuse of minors in the company’s VR platforms. The witnesses said that the company prioritizes user engagement and profits over child safety.
Both Sattizahn and Savage provided written testimony to the subcommittee, and answered a variety of questions during the nearly three-hour hearing.
1. On Meta's alleged suppression of harmful research:
In his testimony, Sattizahn, who spent six years as a researcher at Meta, said:
After Meta VR sales were banned in Germany for two years over concerns about how Meta treats user data, in 2022, Germany allowed sales to resume. When I was asked to perform research in Germany, I understood that Meta was trying to show that their VR headsets were safe for German users. However, when our research uncovered that underage children using Meta VR in Germany were subject to demands for sex acts, nude photos, and other acts that no child should ever be exposed to, Meta demanded that we erase any evidence of such dangers that we saw.
2. On the prevalence of harm to children:
When Senator Josh Hawley (R-MO) asked about the proportion of minors exposed to sexual content, Savage testified:
The prevalence of it is extremely high because it's very difficult to monitor that content in VR. It's difficult to monitor it on Instagram, much less VR. So I would estimate that any child that is in a social space in VR will come in contact with or be directly exposed to something very inappropriate.
3. On Meta's knowledge of underage users:
When Senator Amy Klobuchar (D-MN) asked if Meta's leadership was aware of children under 13 on the platform, Savage replied:
It's such an issue that every single time I have used VR personally, the majority of individuals that I have observed or interacted with have been audibly under the age of 13. It is something that is the top complaint to my knowledge publicly for any other user that uses VR. It's something that our own, or Meta's own leadership would frequently post and be like, "Hey, here's a video of my six-year-old."
4. On legal intimidation of researchers:
When Senator Ashley Moody (R-FL) asked about witness interactions with Meta’s legal department, Sattizahn explained:
We were also threatened by lawyers. I want to make that clear. It wasn't a suggestion. It wasn't research guidance. Meta's already claimed that in the Washington Post. It wasn't legal being able to make the research better. We, both of us, had met with legal and they threatened our own jobs if we did not do this. One of the quotes they said was, "You wouldn't want to have to testify publicly if this research was to get out, would you?" And we're here, so clearly we didn't mind that.
5. On Meta's motivations for targeting children:
When Hawley asked why having children under 13 was important to Meta, Savage replied:
Children drive product adoption in the household and VR as a gaming device… That means the reason that families buy VR headsets is for kids.
6. On the character of the company
Towards the end of the hearing, Hawley asked, “Based on what you know now and your time at the company, do you think that Meta is a force for good in this country and in the world?” Savage replied:
I don't see how they can be.
Sattizahn concluded:
No. It is aggressively ambivalent to people.
What follows is a lightly edited transcript of the hearing. Refer to the official video when quoting.
Sen. Marsha Blackburn (R-TN):
Senator Klobuchar is on her way, but I am going to go ahead and begin with my opening statement. As you all are aware, we have a vote series that is taking place, but I do want to say thank you so much for our witnesses being here and for each of you being here today.
We will hear from Dr. Jason Sattizahn and Ms. Cayce Savage. They are two brave whistleblowers who have come forward to detail shocking allegations about Meta's coverup of deeply disturbing child safety research. They were purportedly hired to make the platform safer for children, but what they found was a company that knew their products were unsafe, and they just did not care.
Nearly four years ago, I held my first hearing with a Meta whistleblower who detailed how Meta exploits our children online simply to maximize user engagement and boost their profits. Since then, we've seen a national movement of parents, legislators, whistleblowers, and children that have said, "Enough is enough. Our children are more precious than the interest of depraved big tech CEOs, and that Congress must pass the bipartisan Kids Online Safety Act."
After all these years, Meta continues to knowingly, knowingly allow sexual exploitation and harms to children on their platforms. I'm incredibly grateful that individuals are still willing to come forward and shine a light on Meta's disturbing, willful, and intentional actions.
Mark Zuckerberg has promised that, and I'm quoting him, "In the Metaverse, you'll be able to do almost anything you can imagine." And for once, Mr. Zuckerberg's statement rings true. That is for predators, for pedophiles, for groomers, for traffickers, and all kinds of disgusting bad actors. As these whistleblowers will explain, the metaverse has become the Wild West for criminals who prey on our children.
As our witnesses will tell us today, virtual reality can be incredibly dangerous for children, specifically because of the immersive nature of the technology. Children have a much harder time processing the difference between violence and abuse in the physical and virtual spaces. In other words, when their avatar on Meta's virtual reality platform is raped or harassed, children experience that trauma as if it is actually happening to them. Let's be clear, virtual reality is reality. These harms are real and this abuse happens every single day, every day on Meta's virtual reality platforms.
One employee stated on an internal message board in 2017, and again I quote, "We have a child problem and it's probably time to talk about it." That was 2017, one of their employees stating, "We have a child problem." But years later, Meta's C-suite has zero interest in these findings. According to these whistleblowers, Meta executives decided they didn't want to see research detailing how harmful their products are to children, so they unleashed their attack dogs within Meta legal on the research teams, intent on creating a rosy picture of their products.
Meta legal manipulated research methods and buried negative data. Meta instructed researchers to avoid asking teen survey participants questions which might lead them to discuss harms they've experienced because of Meta's design. And when their suppression didn't work, they simply deleted the evidence. They erased it so no one would know.
In one user review, Meta's Horizon Worlds was dubbed, and I quote, "The pedophile kingdom." This makes clear what we have known for years: that Meta profits from the sexualization and abuse of children. If you need another example, look at the recent report that Meta's own internal guidelines allowed AI chatbots to engage in sensual conversations with children. Their refusal to protect children, choosing instead to profit at the expense of kids' safety, is revolting. It is heinous conduct.
Again, I want to thank the two whistleblowers that are here with us today, and the other whistleblowers who have come forward, who are not here. I know that this was not an easy decision for either of you, but what you're doing brings us one step closer, one step closer to holding these big tech platforms accountable.
We have parents in the room today, parents all across the nation are speaking out, and we thank you. And to Maureen, Brian, Mia, Christine, those parents that have lost their children to social media harms, we're grateful that you're here and grateful for your passion to make sure that no parent ever has to experience the trauma that you have experienced. With your help, we will get the Kids Online Safety Act passed and we will get it to the President's desk and we will hold big tech accountable. Now I turn to Senator Klobuchar.
Sen. Amy Klobuchar (D-MN):
Well, thank you very much, Chair Blackburn, and thank you for your long-time leadership on this, as well as the very important bill that you and Senator Blumenthal have been proud to sponsor and push through, which has already passed the Senate once, so we're close.
I have worked in this area for a long time myself and I've known the frustration of, no matter what you seem to do, you get lobbied against and millions of dollars against you, and I just think we're reaching a moment where the time is up and there are too many families and too many parents that are affected by this.
I want to thank our two whistleblowers who are here. I'm sure you never, in your wildest dreams, imagined that you were going to be in front of a Senate panel in this way, but I want to thank you for doing the right thing.
For too long, these companies have worked to attract kids to their platforms. They do so knowing that their platforms use algorithms that increase the risk of sexual exploitation, push harmful content, facilitate bullying, and provide venues, sadly, for dealers to sell deadly drugs like fentanyl.
Meta cannot continue to turn a blind eye to these harms. It was in this very room with this very committee, maybe it was in a different room, but the same committee, where Mark Zuckerberg actually turned to some families who had lost children to drugs and said, "I'm sorry. I'm sorry this happened." Well, sorry is not enough anymore and we need to put in some rules of the road to stop this from happening.
This is not the first time whistleblowers from Meta have come forward. In 2021, another whistleblower, Frances Haugen, testified that Meta knew that its products took a significant psychological toll on users. In that case, it was eating disorders.
That testimony should have been a wake-up call for Meta, a chance to right the ship. Meta did make changes after her testimony, but not to protect kids. Instead, the company took steps to establish plausible deniability. Meta blocked, manipulated, hid, and deleted research that showed that its virtual reality products were frequently used by underage kids who were exposed to real and significant harm.
I want to underscore that the entire appeal of the metaverse is that it's supposed to feel like real life, and that can be fun. It's communication, it's entertainment, there's bells and whistles, but the problem is that the virtual reality platform, as it has been created, also allows adults to form relationships with unwitting children that can be exploited.
As The Washington Post pointed out in their lengthy investigative piece yesterday, "A 25-year-old man was convicted of kidnapping a 13-year-old after interacting with her through Meta's virtual reality products." Despite this, Meta forged ahead pushing new features to attract younger users, and allowing younger and younger kids onto its virtual reality platforms without safety testing.
One employee from Meta estimated that more than 80% of users were underage in one of the virtual rooms. Adult users frequently complained that virtual reality spaces were overrun by children, gleaning their presence from the sound of their voices. I don't know what else you need to hear to know there's a problem. Yet Meta, using the code name Project SALSA because allegedly they knew it would be a spicy topic, moved to lower the official age minimum for its virtual reality headsets from 13 to 10.
In Meta's ongoing tradition of moving fast and breaking things, it broke its repeated promise to parents and Congress to protect kids on their platforms. That's why a bipartisan coalition of 42 state attorneys general with wildly different political views on a number of things decided to take this on, and that included my attorney general in Minnesota, Keith Ellison.
But Meta has continued to prioritize user engagement. It does that because the more time people, no matter how young, spend on their platforms, the more money it makes. We know they profit on kids' data. According to a recent study, social media platforms generated $11 billion in revenue in 2022 from advertising directed at kids and teens, including nearly $2 billion in ad profits derived from users age 12 and under.
And while today's whistleblowers left the company before its current push to catch up in the generative AI space race, their testimony raises serious questions about whether Meta is ignoring child safety issues related to AI products, especially in light of the recent reports that, according to internal documents, Meta allowed its chatbots to engage children in "romantic or sensual conversations."
That's why we must come together, Democrats and Republicans, to set up common sense rules. We have worked on both the judiciary and commerce committees to do this. Senator Cruz and I, through a lot of hard work over many years, passed our Take It Down Act, which the president signed into law; it holds platforms accountable for taking down pornographic images of kids or adults, whether actual images or AI-created, within 48 hours.
We must pass Senator Blackburn and Blumenthal's Kids Online Safety Act to ensure that platforms design their products to prevent and mitigate harm to kids. We also know that other industries do not enjoy this similar level of protection. If they have an appliance that blows up or they have a tire that blows out on the road, there's accountability. They get sued and that's a major incentive to fix it.
We don't have that with these social media platforms, and that is why I have long supported repealing Section 230, which was basically set up while these were little companies in the garage. That's not true anymore. They're the biggest companies the world has ever known.
To end with this, one parent once told me that, trying to get her young kids, six and eight, off of these platforms, she'd have to resort to relying on her 12-year-old and 15-year-old to try to take it down. They couldn't figure out how to do it. They'd find another platform. It's just endless. Despite all her best efforts to be a mom, she said it was like a sink overflowing with a faucet she couldn't turn off, and she was just sitting out there with a mop. These parents need more than mops. They need us to pass this bill. Thank you, Madam Chair.
Sen. Marsha Blackburn (R-TN):
Senator Grassley, you're recognized.
Sen. Charles E. Grassley (R-IA):
And like Senator Klobuchar said, I want to compliment you on your leadership. Ever since you've been in the Senate, you've been fighting to protect children from social media abuse. During my time in the Senate, I've always fought for whistleblowers, both in the government as well as in the private sector. I've offered updates to the False Claims Act, the IRS Whistleblower Program and other whistleblower laws. And just this year, I've helped almost 20 government whistleblowers get their jobs back.
Whistleblowers are key to rooting out fraud, waste, and abuse. That includes the private sector, not just the government. On September 13th, 2022, the ranking member of the Judiciary Committee and I, along with Senator Durbin, held a hearing on these very issues. A Twitter whistleblower at that time testified, disclosing to this committee that Twitter potentially exposed user data to foreign intelligence agencies, including the government of China. His disclosures made public that the FBI notified Twitter of at least one Chinese government agent at the company.
This month, I, along with Senators Blackburn and Hawley sent a letter to Meta about the company's use of targeted ads against teenagers. Our letter highlighted concerns that Meta's potentially violating the Children's Online Privacy Protection Act. It's been alleged that Meta collected personal information from children under 13 years of age without parental consent.
My oversight has also shown that these tech companies look to silence whistleblowers. Last year, I wrote to OpenAI about my concerns that they tried to silence whistleblowers, and I raised the same concerns this year with Meta. To address this, I've introduced bipartisan legislation to implement whistleblower protections in the artificial intelligence industry.
Now it's been alleged that another Meta whistleblower, Jason Sattizahn has suffered retaliation. For example, in March of 2023, he received a performance review stating he "exceeds expectation." Then in October 2023, he raised concerns to his leadership that Meta had been violating the Children's Online Privacy Protection Act. So what did Meta do? After spending six years at Meta with promotions and positive performance, they fired him six months after his disclosure.
Ms. Savage also raised concerns about Meta's compliance with the law. Instead of addressing her concerns, Meta's lawyers reportedly told Ms. Savage to make sure that her work did not put the company "at risk." I often say that whistleblowers are treated like skunks at a picnic. It appears our witnesses as well as other whistleblowers who've approached me have unfortunately been treated like those skunks.
We've been working on this for a long time, I should say, not just last year. So I thank you folks for your courage and bravery in coming forward to Congress. I and my colleagues will continue our investigations.
Sen. Marsha Blackburn (R-TN):
Thank you, Mr. Chairman. Senator Blumenthal.
Sen. Richard Blumenthal (D-CT):
Thank you, Madam Chair, and I want to begin by thanking you, Senator Blackburn for holding this hearing, but even more important, for your extraordinarily dedicated, tireless, relentless work on the Kids Online Safety Act, which has been for me, the opportunity of a lifetime to champion. And I have come to know some of the bravest, strongest people who are with us today, parents of children who have been lost as a result of the toxic content driven by big tech at children, purposefully and knowingly.
And I want to thank the parents who are in attendance today, as well as many, many others who couldn't be with us and say to you, we are going to continue to fight for the Kids Online Safety Act and we will win this fight. The whistleblowers who are with us today are part of this battle. You are the truth-tellers and you are part of a long line, I wish there were more, of people of conscience and conviction who have chosen to speak truth to power.
And Senator Grassley is absolutely right that we need you because wrongdoing in government and in private industry is exposed as a result of people having the courage to come forward as you have done, and you deserve more protection. I hope that Senator Grassley may join me in a bill that would provide more protection to whistleblowers.
Four years ago, another whistleblower, Frances Haugen came forward to reveal how Meta knew it was fueling and aggravating a teenage health crisis, a mental health crisis that continues today even more widespread and exacerbated. Meta's own researchers described Instagram as a "perfect storm" that "exacerbates downward spirals" of addiction, eating disorders and depression. They found that Instagram makes body image issues worse for one in three girls.
Two years later, another whistleblower came forward, Arturo Bejar. He testified before this committee that teens had dangerous, harmful experiences on Instagram at an alarming rate. And again, that Meta knew about it at the very top levels of leadership.
We worked on a solution to this problem, the Kids Online Safety Act. Meta promised it would work on a solution, but it did the opposite. It worked to suppress research and tools to give parents better ways to protect their children. It purposely, in effect, obstructed and blocked critical fact-finding that you both sought to do.
I will never forget Mark Zuckerberg testifying before our Judiciary Committee and turning to the parents in the audience saying he apologized, and Meta would do better. Not only did he betray that promise, he knew it was false when he made it because at that very moment, Meta was in fact suppressing research and fact-finding.
According to documents provided to our offices, Meta straitjacketed its staff under a social issues protocol that restricts research on key types of harm, including suicide, eating disorders, bullying, and child trafficking, by designating them "sensitive." Yeah, they were sensitive because they would've undermined business and the reputation of the company.
What that meant in practice is Meta installed monitors from their legal department who routinely altered, blocked and shut down work on teen safety. In one research study, those monitors even demanded the destruction of data on underage children being solicited for sex acts.
Mark Zuckerberg and other Meta executives wanted to make sure that government regulators, parents, and teens never heard anything more about the dangers their products cause to young people. And as your disclosures show, they did so because Meta's attitude simply hasn't changed. It prioritizes profits over the well-being of children and teens. For example, Meta executives canceled proposals to more accurately identify children on its virtual reality platform and to provide them safeguards against abuse.
These disclosures show why Meta has hired armies of lawyers and lobbyists, spent millions of dollars to kill the Kids Online Safety Act. It's dangerous for their business model, even if their practices are dangerous to kids. They don't want a duty of care. They don't want transparency. They don't want tools for parents. They want the Wild West, which continues now.
And you know, Big Tech has been compared to Big Tobacco. I sued Big Tobacco. As Attorney General of the state of Connecticut, I led my fellow attorneys general in suing Big Tobacco. And you know what gained us a victory, ultimately a settlement worth a lot of money, which is still coming to the states, and a change in practices? The industry's own documents showed they were lying. The parallel is indisputable. We had whistleblowers there. We have whistleblowers here, who are speaking truth to power and revealing that this industry knows how its business model drives toxic content to kids and even destroys lives.

I take from your testimony that Meta has no shame, no conscience. It's waged an all-out war against kids' online safety. It's spent millions to stop that bill in the House, even though it passed overwhelmingly 91 to 3 in the Senate. The majority of the American people, the vast majority, bipartisan, want this measure. We're here to demand it and we're going to keep fighting until we get it done. Thank you, Madam Chair.
Sen. Marsha Blackburn (R-TN):
Senator Durbin, do you have a statement? Skip it? Okay. I will skip you for now. At this point, I want to introduce our witnesses. Dr. Jason Sattizahn is a researcher with over 15 years of academic, social media and video game development experience. Most recently, he was a staff user experience researcher at Meta where he worked for over six years until 2024, leading research on Marketplace integrity, virtual reality, ranking algorithms, and emotional and physical harm to vulnerable populations. Prior to his role at Meta, he earned his PhD in integrative neuroscience from the University of Chicago, where he was also a graduate researcher before developing video games and accessibility features at Sony PlayStation.
Ms. Cayce Savage is a user experience researcher with 12 years of experience in academic virtual reality and social media research. After earning her master's degree in positive organizational psychology and program evaluation from Claremont Graduate University, Ms. Savage worked as a user experience researcher for a startup at T-Mobile before joining Meta in 2019. During her four years at Meta, she worked on Facebook Marketplace, Facebook jobs, Facebook groups, and virtual reality. She specifically focused her research within Reality Labs on users under the age of 18 and immersive emotional, social and physical harm to minors. She now serves as the lead user experience researcher for eBay Live. We welcome each of you.
At this time, I would like for you to rise and raise your right hand. Do you swear or affirm that the testimony you're about to give before this committee is the truth, the whole truth, and nothing but the truth, so help you God? Thank you. You may be seated. [Both witnesses answered in the affirmative.] You each have five minutes for your opening. Dr. Sattizahn, you're recognized.
Jason Sattizahn:
Chairman Blackburn, Ranking Member Klobuchar, Senator Grassley, Senator Durbin, Senator Blumenthal, and members of the Subcommittee, thank you for having me here. I'm here to discuss Meta's manipulation of research to cover up dangers facing billions across Meta's products and, of particular concern, the millions of children using Meta's virtual reality products. I also want to recognize and thank the five past and current Meta employees, all researchers like me who have worked directly on creating Meta's products, who made the brave decision to be a part of this disclosure that brings us all here.
My name's Jason Sattizahn. Growing up in the '90s in the middle of Missouri, I saw both the value and the problems that advances in technology brought the world. After a PhD in integrative neuroscience, I wanted to use my research experience to make these technologies and products better for the people that use them. Most recently, I spent six years as a researcher at Meta working in some of their most sensitive spaces, tasked with understanding users and their needs and using this to try and make their products safer.
I am here today because it is evident that Meta consistently chooses profit over safety. I'm not the first to discuss this, as repeated whistleblowers have shared Meta's reckless disregard for users. However, in the wake of past whistleblowers, Meta has chosen to ignore the problems they created and bury evidence of users' negative experiences. I worked at Meta from 2018 to 2024. During these six years, I witnessed data scandals, multiple whistleblower disclosures about Meta's disregard for users, and mounting public pressure for Meta to address these issues. I saw the company respond to these pressures by deliberately compromising internal processes, policies, and research to protect company profits over their users.
During my first role at Meta, I led integrity research for Facebook Marketplace, and the data was clear: Marketplace causes suffering for users, including financial loss, stolen and counterfeit items, and personal safety issues ranging from being sexually propositioned by strangers to physical assaults and attempted kidnapping. My time on Marketplace was my first exposure to how Facebook deprioritized safety to boost user engagement. Simple safety investments, such as not allowing people to message strangers with a single click, were flatly rejected because product teams were afraid to do anything that could decrease engagement, the metric determining success and bonuses. It was around this time I also first saw Facebook make false statements to Congress, particularly about their inability to estimate stolen goods on Marketplace. Their statements directly contradict my own internal research, which I've submitted with this whistleblower disclosure.
In the fall of 2021, Frances Haugen disclosed to Congress how Meta's products fuel mental health issues for teens, including body dysmorphia and self-harm. Meta's immediate response to Congressional concern was not to do the right thing, but rather to roll out new processes and policies to manipulate, control, and erase data. We researchers were directed in how to write reports to limit risk to Meta. Internal work groups were locked down, making it nearly impossible to share data and coordinate between teams to keep users safe.
Mark Zuckerberg disparaged whistleblowers, claiming past disclosures were "used to construct a false narrative." Despite Meta's attempts to prevent researchers from collecting necessary insights, the research we were able to do continued to show the dangers of Meta's products on users. This only highlights the sheer scale, severity, and prevalence of harm on Meta's products.

In early 2022, I moved to Meta's Reality Labs to lead integrity research and help improve safety for those using Meta's VR headsets. Virtual reality allows someone to wear a headset and experience an alternate reality to play games, watch movies, and socialize with others. For Meta, VR is designed to push socialization above all, as Meta saw this as a path to unbridled engagement and profit. The company invested billions, integrated social media like Instagram into headsets, and even rebranded as "Meta" to align with the future of the company.
From my first days in Reality Labs, Meta leadership and legal teams were in complete control of the research I was conducting. This was crucial research because this was a largely untested technology, but I soon learned that Meta had no interest in VR safety unless it could drive interaction and thus profit. After Meta VR sales were banned in Germany for two years over concerns about how Meta treats user data, in 2022, Germany allowed sales to resume. When I was asked to perform research in Germany, I understood that Meta was trying to show that their VR headsets were safe for German users. However, when our research uncovered that underage children using Meta VR in Germany were subject to demands for sex acts, nude photos, and other acts that no child should ever be exposed to, Meta demanded that we erase any evidence of such dangers that we saw.
Despite Meta's attempts to hide these sensitive findings, my research still revealed emotional and psychological damage, particularly to women who were sexually solicited, molested, or worse. In response, Meta demanded I change my research in the future to not gather this data on emotional and psychological harm. When my colleagues' research showed the emotional impact of children being threatened by physical harm by strangers online, Meta not only restricted internal sharing, but manipulated reports to obscure any emotional damage. During my time working-
Sen. Marsha Blackburn (R-TN):
Your time's expired, so let's wrap up.
Jason Sattizahn:
Of course. If I had one thing to say, I would just want to make it very clear that Meta is incapable of change without being forced by Congress. Whether it's engagement or profits at any cost, they have frankly had unearned opportunities to correct their behavior, and they have not. So with that, I'll say thank you again for the opportunity to be with you all today, and for your commitment to stop the deliberate harm to millions of Americans. Thank you.
Sen. Marsha Blackburn (R-TN):
Well, thank you. Ms. Savage, you're recognized for five minutes.
Cayce Savage:
Good afternoon Subcommittee Chair Blackburn, Ranking Member Klobuchar, Senator Blumenthal, Senator Durbin, and Senator Padilla. I am a user experience researcher. It is my job to listen to and advocate for users. I have a graduate degree in experimental psychology and 12 years of experience working as a researcher. I do this work because fundamentally, I care about people. I worked at Meta from 2019 to 2023. In those four years, and most especially as I led research on youth safety and virtual reality, it became clear to me that Meta is uninterested and unwilling to listen to their users or prioritize their safety. While I speak about virtual reality, it is important to understand that the way Meta has approached safety for VR is emblematic of its negligent approach to safety for all of its products. The research is clear on what we must do to ensure that new technology is safe for children.
Yet across social media, messaging apps, and now wearable technology, Meta has failed to prioritize child safety until they are scrutinized by outside regulators. Then they scramble to develop features they know are insufficient and largely unused and advertise this as proof of their responsibility. Meta is aware that its VR platform is full of underage children. Meta purposely turns a blind eye to this knowledge, despite it being obvious to anyone using their products. If Meta were to acknowledge the presence of underage users, they would be required to kick off those users from their platform in order to remain COPPA compliant. This isn't happening because it would decrease the number of active users that Meta is reporting to shareholders. At Meta, engagement is the priority above everything else.
Because VR is immersive and embodied, negative experiences cause greater psychological harm than similar experiences on an iPad or an Xbox. In VR, someone can stand behind your child and whisper in their ear and your child will feel their presence as though it is real. VR is tracking a user's real-life movements, so assault in VR requires those movements to happen in real life. What happens in virtual reality is reality.
Most importantly, Meta is aware that children are being harmed in VR. I quickly became aware that it is not uncommon for children in VR to experience bullying, sexual assault, to be solicited for nude photographs and sexual acts by pedophiles, and to be regularly exposed to mature content like gambling and violence, and to participate in adult experiences like strip clubs and watching pornography with strangers. I wish I could tell you the percentage of children in VR experiencing these harms, but Meta would not allow me to conduct this research. I personally saw these things happening in VR, consistently heard reports from teens and parents in research and read countless accounts from concerned parents online.
It's easy to learn that children are not safe using Meta's VR products just by reading public reviews like this one: "Thanks Meta for making this the pedophile kingdom. They have made it so easy for us to meet and exchange information with children here." Meta first acquired Oculus, its VR technology, in 2014. I was the first and for a time the only researcher dedicated to understanding whether its VR software experiences were safe for children, and I wasn't hired until 2022. So for eight years, as tens of millions of headsets were sold, Meta did not think about the safety of the children it relied on to achieve global market dominance.
Throughout my time on Meta's VR youth team, child safety issues regularly went unresearched despite the frequency and severity of the harm. I was given a legal counterpart to scrutinize everything that I did to tell me what research I could and could not conduct, and to ensure my research reports would not create risk to Meta should they be publicly disclosed. I was told not to investigate the kinds of harm children were experiencing in VR, and made to feel I was risking my job if I pressed the matter.
Instead of amplifying the voices of our users, my work began being used to silence them so that Meta could claim deniability. I know a number of my colleagues were put in similar positions. Meta cannot be trusted to tell the truth about the safety or use of its products. Meta says it doesn't have a record of a large number of underage children using VR. This is because it has purposefully avoided gathering that data, despite members of Meta's own leadership indicating that even they are unaware of or don't understand the importance of the minimum age of use. Despite this, research proposed to address this issue was not allowed.
I deliberated for a long time about whether to come forward. Meta responded to Francis Haugen's disclosure in 2021 by cracking down on research internally. Researchers across the company were subjected to sudden censorship and were told it was for our own protection so we wouldn't be part of any future leaks. Candidly, I am worried that speaking to you today will put my former colleagues as well as the field of user research within Meta at risk. To my former colleagues who continue to advocate internally for child safety, I would like to express the greatest gratitude and admiration.
Previous whistleblowers have come before this body to publicly testify to the suffering adults and children experience using Meta's products. Meta has promised it would change. I'm here to tell you today that Meta has changed, but for the worse. Meta has spent the time and money it could have spent making its products safer on shielding itself instead, all the while developing emerging technologies which pose even greater risk to children than Instagram. Meta consistently demonstrates that it cares more about the bottom line than the emotional or physical safety of the children who use its products every single day. How can Meta care for the safety of children when it doesn't acknowledge that they exist? Senators, thank you again for your time and for your support regarding this matter.
Sen. Marsha Blackburn (R-TN):
Thank you, Ms. Savage. We'll begin our questioning now, and Ms. Savage, let me start with you. You talk about Meta's refusal to acknowledge the harms, but I think many people who are watching this hearing today don't know what those harms would be. Just very quickly, let's talk about if you were an eight-year-old girl and what you would experience in an unsafe situation in the Metaverse?
Cayce Savage:
Thank you. I think it's important to start by describing briefly what it's like to use VR, especially because a lot of parents haven't. You're wearing a headset on your head. It's strapped to your face and it completely obscures your vision and your hearing so you can no longer see the real world around you, which can also feel very vulnerable because people can come up behind you in real life. This means that some of the experiences in VR can feel heightened because you're already feeling vulnerable. VR is designed to be immersive and embodied, that's its appeal. And so when you experience things in VR, they feel meaningfully more real, psychologically more real than if you were to experience that same thing on a television. You also have an avatar that you are embodying and which research shows us you identify with. So if something happens to your avatar, it feels like it's happening to you. It's also important to note that this is a social space, so there are other users with bodies that can corner you, that can surround you, that can touch you, and folks can also speak to you.
Sen. Marsha Blackburn (R-TN):
And let me add in there, users who are not known to you.
Cayce Savage:
Yeah, that's a very good point. It's most common for you to be interacting with people you don't know in real life. And a lot of parents in my research indicated that they weren't aware that their children were interacting with strangers.
Sen. Marsha Blackburn (R-TN):
All right.
Jason Sattizahn:
Just to be very explicit, because it's not something that I think a lot of people have exposure to: when we talk about harms for children and for adults in VR, and we talk about things that are said, the audio that's transmitted isn't just solicitation or speech. There are also instances we have seen where you can hear people sexually pleasuring themselves, transmitted over audio in a spatial sense, as you are being surrounded and brigaded and harassed. It's not just simple statements; it is actually the transmission of the motion and the audio of sex acts themselves.
Sen. Marsha Blackburn (R-TN):
And so that is what causes children to have the physiological and psychological response, as if it were happening to them in the real world.
Cayce Savage:
Yes, exactly. Visually and auditorily, it feels real.
Sen. Marsha Blackburn (R-TN):
Okay, Dr. Sattizahn, I want to come to you. Senator Blumenthal and I wrote to Meta back in April really sounding the alarm on some of their policies with AI chatbots entering into sensual conversations with children. And these policies went so far as to allow chatbots to tell a shirtless boy, and I'll quote this, "Every inch of you is a masterpiece, a treasure I cherish deeply." So you've been through this process at Meta dealing with them, does it surprise you that they would allow their chatbots to engage in these conversations with children?
Jason Sattizahn:
No, not at all. One thing that I'd like to caveat is that I never directly worked on the Meta AI chatbot team, but I did work directly on Facebook ranking. To not get into the tech weeds with this: Facebook ranking, or ranking in general, is a sibling to artificial intelligence; it's actually something that's included with AI algorithms. When I worked on the Facebook ranking team, I was working on how algorithms could be used to keep people safer, so that whenever people see sexual content shown to them in Marketplace, we take it away, et cetera. Ranking is something that uses, generally speaking, more structured data that is easier to control for safety. And even when I was working on ranking, when I would try to work with the teams to improve safety, they were unable to do it.
Sen. Marsha Blackburn (R-TN):
Okay, let me ask you this. Talking about algorithms, I know Meta has said TikTok is their main competitor, so is Meta intensifying, intentionally intensifying, their algorithms in order to be able to compete more closely with TikTok?
Jason Sattizahn:
It is so much sillier than that. When I attempted to look at things called coefficients or variables to make these algorithms safer, the engineers themselves told me that they didn't actually look at them, because if the algorithm predicted it and engagement went up, then it must be good. So to answer your original question, no, this doesn't surprise me at all because in the context of something like ranking, if they can't figure it out, I have no idea why they would ever be able to figure it out within AI as well.
Sen. Marsha Blackburn (R-TN):
Got it. Senator Klobuchar, you're recognized.
Sen. Amy Klobuchar (D-MN):
Thank you very much, Madam Chair. Thank you for your testimony. We've talked about Frances Haugen and her testimony. In 2021, I asked whether the work of Meta's internal researchers, people like yourselves, is thorough and reliable. In answer to my question, she said then that Meta had a top-ranked research program and that its researchers were some of the biggest heroes inside the company, because they were willing to boldly ask real questions, which I truly appreciate. I believe that for many of them, that was true. Could you tell me, how did Meta's top-ranked research program, which you were a part of, change after she exposed that Meta's platforms frequently caused significant harm to young users? And be somewhat brief please, thank you.
Jason Sattizahn:
Of course. So after 2021, after the Frances Haugen disclosure, there was essentially what I always refer to as a funnel manipulation put on research. What I mean by that is every stage of research, before, during, and after its creation, was locked down, monitored by either legal teams or management themselves. I'll be very brief. To summarize this: one, as my colleague brought up, is legal surveillance. That's having a lawyer constantly look over things and possibly edit them. That is limiting the topics, the questions, the methods that you can use before you even collect data.
It's the monitoring of research, so you might have legal actually watching the data you collect, and you are told to erase it proactively if they believe it is too sensitive. It even goes as far as, once reports are written, legal will go in and directly alter the findings themselves or demand that you take them out before publishing. Generally speaking, there was also an increase after '21 in siloing the research into closed-off areas. And so my ability to go to my colleagues and talk about safety and say, "Hey, here's some data that you need on Facebook groups," et cetera, went away.
Sen. Amy Klobuchar (D-MN):
Okay. That was very helpful. Let me just do some quick yes/no questions, because one of the things that really bothers these parents who've lost children, or have children who are simply addicted to these platforms, is that it appears, and there's good proof, that the company is more interested in its bottom line than protecting these kids. Just yes or no: did Meta stop research projects into child safety because, do you believe, it didn't want to know the result?
Jason Sattizahn:
Yes.
Sen. Amy Klobuchar (D-MN):
And did Meta restrict the information researchers could collect about child safety to preserve plausible deniability?
Jason Sattizahn:
Yes.
Sen. Amy Klobuchar (D-MN):
And did Meta alter research designs to avoid collecting certain information?
Jason Sattizahn:
Yes.
Sen. Amy Klobuchar (D-MN):
And did Meta modify research reports and results?
Jason Sattizahn:
Yes.
Sen. Amy Klobuchar (D-MN):
And did Meta require researchers to delete data that showed harm to kids that was occurring on its platforms?
Jason Sattizahn:
Yes.
Sen. Amy Klobuchar (D-MN):
Okay, I think that says it all. Thank you. Ms. Savage, while at Meta you proposed research to better understand the age of children using virtual reality. So now we're phasing into our current state. This project, called Project Horton, right? Horton Hears a Who? It was named after the Dr. Seuss character because these kids are that young. It was initially approved by Meta's Chief Technology Officer and funded for more than $1 million, is that right?
Cayce Savage:
That's correct.
Sen. Amy Klobuchar (D-MN):
What happened to that research project?
Cayce Savage:
It was canceled with no explanation, which at Meta is very unusual.
Sen. Amy Klobuchar (D-MN):
And given that the research was approved by the company's chief technology officer, who at Meta could have overruled that decision and shut the project down?
Cayce Savage:
To my knowledge, the only other person is Mark Zuckerberg.
Sen. Amy Klobuchar (D-MN):
Why do you think Meta's top leadership canceled the research?
Cayce Savage:
If Meta were to improve the quality of its ability to identify the true age of its users, it would be required to shut down such a large number of accounts that it would meaningfully drop their engagement metrics.
Sen. Amy Klobuchar (D-MN):
How does turning a blind eye to the actual age of a young user in virtual reality put young users at risk?
Cayce Savage:
In so many ways. Part of development is that children are still developing the ability to distinguish between reality and fantasy, so particularly for very young children that are in VR, things that happen may have more significant psychological effect. The research on this is still ongoing, so we don't know the full extent of this, but of course the most egregious harm that we're aware of currently is that VR is very social and it's typical that that social interaction is happening between a child and a stranger, and avatars all look the same age so whether that other person's a child or an adult, we don't know.
Sen. Amy Klobuchar (D-MN):
The Children's Online Privacy Protection Act requires parental consent for the collection of personal data of users under 13, yet we have heard testimony today that Meta knew users under 13 were on its virtual reality platforms. Instead of addressing the issue, Meta, we believe suppressed research that could have confirmed the presence of young users. How prevalent in your mind were users under the age of 13 on Meta's virtual reality platform, and was Meta aware of the problem?
Cayce Savage:
It's such an issue that every single time I have used VR personally, the majority of individuals that I have observed or interacted with have been audibly under the age of 13. To my knowledge, it is the top public complaint from other users of VR. It's something that our own, or Meta's own, leadership would frequently post about, like, "Hey, here's a video of my six-year-old."
Sen. Amy Klobuchar (D-MN):
And was Meta's leadership aware of the presence of kids under 13 then honestly?
Cayce Savage:
Yes.
Sen. Amy Klobuchar (D-MN):
Did Meta take any steps to ensure it was not collecting data from these young users without parental consent?
Cayce Savage:
It didn't have the ability to identify which users were under 13, so it wouldn't have had that ability.
Sen. Amy Klobuchar (D-MN):
Okay. Last question, back to you and very briefly, Mr. Sattizahn. Last month I sent a letter with Senator Schatz, Britt, seven other senators to Meta raising significant concerns about its internal policies that allow its generative AI chatbots to have romantic or sensual conversations with kids. Given your experience, do you think Meta was aware of the potential harms to kids that could occur by letting its chatbots engage in such conversations with kids?
Jason Sattizahn:
Yes, Meta has direct and global initiatives for years now to target youth across all of their products.
Sen. Amy Klobuchar (D-MN):
Do you have any reason to believe that Meta is doing more to protect kids in its AI work than its virtual reality work?
Jason Sattizahn:
To my knowledge, no.
Sen. Amy Klobuchar (D-MN):
Thank you.
Sen. Josh Hawley (R-MO):
Ms. Savage, let me just pick up where Senator Klobuchar just left off. You said that in your experience, a majority of VR users are under the age of 13?
Cayce Savage:
That's correct.
Sen. Josh Hawley (R-MO):
And that this is apparent to Meta leadership, you said?
Cayce Savage:
Yes, it's apparent to anyone who uses the product.
Sen. Josh Hawley (R-MO):
Who in particular in Meta leadership?
Cayce Savage:
I'm curious if you remember any particular posts, but I know it's apparent to at least the C-suite.
Jason Sattizahn:
Correct. There were legal directors, for instance, who would consistently monitor employees who would post underage videos of their own children using VR, so this was directly visible to leadership who are posting these videos and having it removed internally.
Sen. Josh Hawley (R-MO):
Mark Zuckerberg is aware of this, yes or no?
Jason Sattizahn:
He would have to be, oh, sorry.
Sen. Josh Hawley (R-MO):
That's a yes, Dr. Sattizahn. Ms. Savage?
Cayce Savage:
The only way that he would not be aware is if he had never used his own headset.
Sen. Josh Hawley (R-MO):
Well, that's extraordinary. And it's particularly extraordinary because let's take a look at actually Mark Zuckerberg's testimony. Mark Zuckerberg has testified before this committee as recently as just last year, January 31st, 2024. Let's put it up so we can see it. He says, this is Zuckerberg testifying, "We don't allow people under the age of 13 on our service. So if anyone who's under the age of 13 rather, if we find them, we remove them from our service."
And then in response to another question, he says, "We don't want users under the age of 13." We don't want users under the age of 13. Ms. Savage, is this true?
Cayce Savage:
That's an interesting quote. I mean, it would require the user to be honest about their age, in which case Meta would kick them off. But from research, even before Meta acquired the Oculus technology, we know that children are usually not using an account that accurately reflects their age for several reasons. The want part is interesting to me, I don't think their behavior matches that sentiment.
Sen. Josh Hawley (R-MO):
Mr. Sattizahn, is it true that Meta does not want users under the age of 13 on the platform in any of their services?
Jason Sattizahn:
I'd like to go back to the point that you made: if they did, their value, their engagement, their profits would go through the floor. In this quote, "if" is doing a lot of work, because they have taken no substantive efforts to make sure that they're understanding that there are kids under 13. It's a lie by avoidance, and Meta knows it.
Sen. Josh Hawley (R-MO):
It's a lie by avoidance. In other words, Mark Zuckerberg, when he testified, and this is last year, I want to emphasize, this isn't 10 years ago, this is barely 10 months ago. This was just last year when Mark Zuckerberg said, we don't want people who are under age 13, they're not on our platform. And your testimony here today is that in fact they're rampant on the platform and Meta is specifically targeting them. Isn't that your testimony?
Cayce Savage:
Yes, that's correct.
Sen. Josh Hawley (R-MO):
So Mark Zuckerberg has once again deliberately misled and lied to the American people. I mean, this is really, really extraordinary. Let me just make sure that I understand the full import of what you're testifying to. Dr. Sattizahn, if I could just come back to you, you have testified that Meta erases evidence of sexual abuse on its VR platform. Is that correct?
Jason Sattizahn:
Correct.
Sen. Josh Hawley (R-MO):
That Meta changes research to not gather data on emotional and sexual harm once it became aware that it was rampant. Is that correct?
Jason Sattizahn:
That is correct.
Sen. Josh Hawley (R-MO):
That Meta manipulates results of data to obscure any harm or other damage, shall we say, to users, is that correct?
Jason Sattizahn:
That's correct.
Sen. Josh Hawley (R-MO):
Why do the research at all?
Jason Sattizahn:
That's a great question. You have to do research. Actually, going back to my earlier point, the siloing of research helps explain this. When you put the most sensitive research into a bucket where no one else sees it, all of the other research that's done is phenomenal, and the other researchers at the company are great at their jobs. They help make the product look better, play better, and experience better, but they're not thinking about safety or integrity because they're told not to.
Sen. Josh Hawley (R-MO):
Well, my question is why do any research at all on harms? You've testified that you were doing research on harms, other people did research on harms, but Meta and its various entities, legal, et cetera, the C-suite would intervene and say, "Not that, change this, scrub that." Why do any of it to begin with then?
Jason Sattizahn:
It comes down to the fact that some research is necessary to create a paper trail in order to show that you were doing it.
Sen. Josh Hawley (R-MO):
Yeah. Exactly.
Jason Sattizahn:
For instance, privacy work, you have to show that you are abiding by some sort of general data privacy, whether you're in one country or another, and you have to have that on record. But there are some things that are too sensitive to be done, et cetera.
Sen. Josh Hawley (R-MO):
Isn't it precisely so that you can come before bodies like this one and go before the public and go before your shareholders and say, "We're doing research on harm. We're doing gobs of research on harm."
What they don't tell you is they're altering the research on harm. The whole point of doing it is so you can lie about it and create the impression that in fact, as Zuckerberg testified to this committee, "Oh, we're tracking it closely." When in fact they're lying about it through their teeth. I mean, isn't that in fact what's going on?
Jason Sattizahn:
If I may.
Sen. Josh Hawley (R-MO):
Please.
Jason Sattizahn:
Yesterday, Andy Stone, spokesperson for Meta, actually used this exact same excuse in a tweet about 12 hours ago. His response to the Washington Post article, which we predicted a year ago, was, "We have done over 180 studies including topics such as youth harm, et cetera."
That response, again, is a lie by avoidance because it's pointing out some rote number that means nothing. The whole point of this testimony is that the research they're doing is being pruned and manipulated, yet now they're going before yourself, a government body, the public and saying, "But look, we did some research."
Sen. Josh Hawley (R-MO):
Why is having children under the age of 13, why is it so important to Meta? You both testified to this, your declarations testified to it. Why is it so important to Meta? Go ahead, Ms. Savage.
Cayce Savage:
Children drive product adoption in the household, and VR is a gaming device.
Sen. Josh Hawley (R-MO):
So they drive product adoption, which means what for Meta? Just cash it out for us.
Cayce Savage:
That means the reason that families buy VR headsets is for kids.
Sen. Josh Hawley (R-MO):
Which means more money for Meta?
Cayce Savage:
Yes.
Sen. Josh Hawley (R-MO):
So this is about profits at the end of the day.
Cayce Savage:
Yes.
Sen. Josh Hawley (R-MO):
Let's just be really clear. Meta's number one bottom line is money. It is profits bar nothing. That's it. And what you're testifying to is they will do anything, anything including exposing our children to the most vile sexual abuse if it means more profits for Meta. Have I got that right?
Cayce Savage:
Yes. If I may.
Sen. Josh Hawley (R-MO):
Please.
Cayce Savage:
When I was doing research to identify the harms that children were facing in VR, which I had to be sneaky about because legal wouldn't actually let me do it, I identified that Roblox, the app, in VR was being used by coordinated pedophile rings. They set up strip clubs and they pay children to strip, and that currency can be converted into real money. I flagged this to Meta. I said, "Under no circumstances should we host the app Roblox on our headset." You can now download it in their app store.
Sen. Josh Hawley (R-MO):
I'm going to turn this back over to the chairwoman because my time has expired. I just want to end by saying this. It is abundantly clear to me that it is time to allow parents and victims to sue this company. They have got to be able to get into court and to get in front of a jury and hold this company accountable. And that begins with Mark Zuckerberg. There has to be accountability. We need to open the courtroom doors and allow victims to have their day in court. Thank you, Madam Chair.
Sen. Marsha Blackburn (R-TN):
Senator Padilla, you're recognized.
Sen. Alex Padilla (D-CA):
Thank you, Madam Chair, and thank you to the witnesses for being here.
Mr. Sattizahn, in your written testimony, you shared that in January of 2023, you launched a survey of Meta virtual reality users, including minors and adults. That survey revealed that nearly half of Meta's VR users experienced harm, and 1 in 10 users experienced severe harm, such as racism or sexual inappropriateness. These aren't isolated incidents. They reflect a systematic problem across the platform. Correct me if you disagree.
Jason Sattizahn:
Not at all.
Sen. Alex Padilla (D-CA):
Okay. So your survey also showed that Meta VR users, including minors, didn't know enough about the tools intended to keep them safe. And yet after your study revealed this, Meta denied multiple requests to invest in user education. Is that your understanding?
Jason Sattizahn:
That is correct.
Sen. Alex Padilla (D-CA):
Okay. I ask this not just as a Senator, but as a parent. To think that Meta would have resources available to better keep kids safe but not implement them is beyond troubling. Just as we don't allow car companies to sell vehicles without seat belts anymore, or drug companies to market medicines without clear safety labels, Meta's VR systems should be no different. So my first question to you is, what resources would have been helpful? Who denied these requests and why?
Jason Sattizahn:
The resourcing that would be helpful is staffing, is money. I mentioned in my disclosure, over my six years at Meta, I was always in this exact space, integrity and safety. One constant over my six years was not having enough money to build for safety.
Our engineering staff was always understaffed. Research staff was always understaffed. You might've noticed, but I was the only researcher for integrity across virtual reality software. That still gives me goosebumps when I think about it, because I think I'm a good researcher, and no researcher is good enough to cover that whole space by themselves. The thing that would've helped is taking the resourcing from chasing the next shiny object just to boost user engagement, and giving some of those resources, that time, that money, to our teams trying to build safer products.
Sen. Alex Padilla (D-CA):
Thank you. Now, I also recall a hearing that we had in the last Congress and this committee when CEOs of the five social media companies, including Meta, testified. It was a full committee, not a subcommittee. I recall asking each of them to share data with us about not just the tools available for minors and for parents, but what the adoption rates were for those tools.
Surprise, surprise, they didn't share a whole ton of information, and what limited data we did receive suggested that the adoption rates were very low. As a follow-up question: you found through your survey research that virtually no one used parental supervision controls in Meta VR. Why are adoption rates so low, and what can be done, should be done, to improve those adoption rates?
Jason Sattizahn:
Yeah. So both my colleague and I have worked in this space so long that we've known at Meta internally that these types of parental controls were not sufficient to keep individuals safe. I'd have to go back, and I'm happy to follow up on this, but I believe the earliest report that I saw was from 2018, an Instagram report showing that these types of parental controls, which I believe Mr. Zuckerberg was referring to in this case, were not sufficient for parents to keep children safe.
And so when we look at my research and we see, I believe it was, 2 to 10% adoption rates, it didn't surprise me, because neither the adults nor the children experiencing VR in this case see these things as valuable, because from the onset they weren't actually built to be valuable for the people using them. I don't know if you have anything to add, because you also work in this space.
Sen. Alex Padilla (D-CA):
Ms. Savage, please.
Cayce Savage:
Yeah, if I may. I mean, certainly the parental supervision tools are not sufficient, in large part because there's no parent education. Particularly when we're talking about VR, we know that usually the kids are the ones using the headsets and the parents are not using VR, and it's fundamentally a different world. So parents don't understand the risks that children face. They don't understand the ways that they should be supervising. This is something that I flagged to Meta leadership that we needed to make a priority, and it has not been meaningfully actioned.
Sen. Alex Padilla (D-CA):
I thank you both for your testimony and I may follow up with questions for the record after the hearing today. Thank you.
Sen. Marsha Blackburn (R-TN):
Senator Moody, you're recognized.
Sen. Ashley Moody (R-FL):
Good afternoon. Thank you for joining us. It's heartbreaking that we have to have this hearing. I am one of the newest US senators here in Washington, and when I was in Florida, I was the attorney general. One of the hardest things for me as attorney general was balancing being a mother with a school-aged child and also being the attorney general on behalf of the state, protecting all of the children in Florida.
As difficult as that was, being the first mom on our cabinet with a school-aged child, I thought it brought to me a perspective and almost an urgency in focusing on some of these issues, whether that meant vaping going on in our elementary schools or predators getting to our children online. I think what you're saying today is something that I have been trying to tell everyone. We really are the first generation of parents having to raise kids in this age of social media and VR and online platforms when Congress has done nothing to keep up with the evolution of technology. They're basically saying, "Figure it out yourselves. It's the wild, wild west." And you're saying today that they knew not only that parents really didn't know how to use it, but they weren't using parental controls. Is that correct? Yes or no?
Cayce Savage:
That's correct.
Jason Sattizahn:
Correct.
Sen. Ashley Moody (R-FL):
So I would consider myself pretty knowledgeable in this area considering I was one of the first attorney generals to sue Meta in court for damaging and harming our children. And yet it doesn't surprise you that someone like me who has all of this knowledge had to go to my own child and say, "How do I find the parental controls?"
Cayce Savage:
Not at all.
Jason Sattizahn:
Not at all.
Sen. Ashley Moody (R-FL):
And you believe that Meta knew this?
Cayce Savage:
Yes.
Jason Sattizahn:
Yes.
Sen. Ashley Moody (R-FL):
One of the other things that is shocking, what I've tried to tell other parents, and they're finding this out slowly and surely because if their children haven't been harmed, their friends have been harmed. We used to say the rules that we all knew as parents, "Don't go up to that van. Don't take that candy. Don't do this. Don't talk to that stranger. Stranger danger." Stranger danger now exists in our children's bedrooms. Is that correct?
Cayce Savage:
That's correct.
Jason Sattizahn:
Correct.
Sen. Ashley Moody (R-FL):
And did you find in your research, and is there documentation at Meta and do employees know that children under 13 or let's say children under 18 are being propositioned and harmed by predators?
Cayce Savage:
Yes.
Jason Sattizahn:
Yes.
Sen. Ashley Moody (R-FL):
And you're saying that Meta knew this and a way to deal with that is they brought in a team of lawyers to tell their researchers what language to use in reports so that it could qualify as attorney-client privilege?
Cayce Savage:
Yes.
Jason Sattizahn:
Correct.
Sen. Ashley Moody (R-FL):
So if there was a warehouse and someone knew that children were inside that warehouse being harmed by adults, do you think it would be sufficient to have a lawyer work with the people outside to make it conducive or allow for or cover up what was happening to the children in the warehouse?
Cayce Savage:
Of course not.
Jason Sattizahn:
Not at all.
Sen. Ashley Moody (R-FL):
But because it's online space, we're expected to all just take it because they brought in a team of lawyers and taught them what language to use?
Cayce Savage:
And they think parents don't know about the warehouse.
Sen. Ashley Moody (R-FL):
I believe there was a report that said Meta proposed two ways that researchers could limit the risk of conducting sensitive research. One suggestion was to loop lawyers into their research, protecting their communications from "adverse parties," and I'm using quotes there, I'm assuming the adverse parties mean parents and kids, due to attorney-client privilege. Researchers could also write about their findings more vaguely, avoiding terms like "noncompliant" or "illegal." Is that correct?
Cayce Savage:
Yes.
Sen. Ashley Moody (R-FL):
Is that correct?
Jason Sattizahn:
It is correct.
Sen. Ashley Moody (R-FL):
I've been a lawyer, I've been a judge, I don't know if attorney-client privilege covers lawyers that are complicit and help with facilitating an ongoing violation of law, especially when it relates to harming children. But yet you as researchers, were instructed to use different language and include lawyers so that this could all be shielded.
Cayce Savage:
Yes.
Jason Sattizahn:
May I add something to this?
Sen. Ashley Moody (R-FL):
Please.
Jason Sattizahn:
We were also threatened by lawyers. I want to make that clear. It wasn't a suggestion. It wasn't research guidance. Meta's already claimed that in the Washington Post. It wasn't legal being able to make the research better. We, both of us, had met with legal and they threatened our own jobs if we did not do this. One of the quotes they said was, "You wouldn't want to have to testify publicly if this research was to get out, would you?" And we're here, so clearly we didn't mind that.
Sen. Ashley Moody (R-FL):
Well, aside from ethical obligations of lawyers, I'm pretty sure attorney-client privilege doesn't cover the cooperation and planning and enabling of the continued violation of laws. And I appreciate you, Madam Chairman, for being here today and conducting this hearing.
Sen. Marsha Blackburn (R-TN):
Absolutely. And before I turn to Senator Coons, I want to follow up on what you were saying. Here's the Oculus headset box, and we know that only 2% of parents use the parental controls. So when I got this, I went through looking for where are the parental controls. I have two grandsons. I know that they enjoy this. And right here on the bottom of the box is one tiny QR code, one SKU right here, one code. And that is what a parent would have to go to, after they put their glasses on, to try to find this thing and then find that that is something that they can scan and then pull this up.
I would just be curious, do we have any of the lobbyists for the social media platforms in the room today? I haven't seen anyone I think. I think they probably don't have enough courage to come and face these parents that are here today. And again, we thank you.
Senator Coons, you're recognized.
Sen. Chris Coons (D-DE):
Thank you so much, Senator Blackburn, Senator Klobuchar, for convening this important hearing today and to our two witnesses for your courage, your determination to make sure that the truth gets out. Protecting our children from harm is the highest obligation all of us have. And it must have been so difficult for you to work for a company that in some ways does admirable things, delivers great services, but that as you serve there longer and longer, you began to realize was knowingly and willfully blinding themselves to the harm that their products and services caused children. And then taking aggressive action to prevent you from studying or understanding the harm being caused to children and then try to prevent you from communicating about that to anyone.
So here you are today testifying in front of the Senate Judiciary Committee to a bipartisan panel that includes seasoned prosecutors and seasoned senators and parents and grandparents. And frankly, your testimony has been alarming, even jaw-dropping. To summarize, Meta prioritized engagement over safety for billions. And when you tried to inform them of demonstrable harm and risk to children, they first turned a blind eye and then tried to handcuff or blind you and others charged with research and promoting integrity.
Whether it's artificial intelligence or social media or virtual reality, we are in a very difficult period for parents. As Senator Blackburn just demonstrated, many of us lack the focus, skill, and ability to navigate the exact path towards parental controls on systems our kids are employing. And so a very small percentage of parents are effectively protecting their children. They would expect that businesses that provide these services would test whether they're safe and would build them to be safe for children, yet your testimony proves otherwise.
This ends up happening because there's a huge imbalance in power between big tech platforms who have all the data and all the power to understand the impact of their products and engineer them to favor safety and the children and families, the policy makers, and the advocacy groups who have no way to get that insight. Addressing this imbalance of power is a fundamental and essential component of ensuring we're protecting our kids online. I want to talk briefly about three bills that I've introduced or am developing that are designed to help address this.
The first with Senator Cassidy is the Platform Accountability and Transparency Act. It creates mechanisms for independent researchers to study what's happening on social media platforms, how their algorithms drive engagement over safety. Second, with Senator Grassley, a bill to protect whistleblowers, particularly in the space of artificial intelligence who come forward to disclose serious safety violations or vulnerabilities. And then last a bill I'm developing to create similar mechanisms for independent research and transparency in artificial intelligence platforms.
If we don't move forward bills like these and some of the bills my colleagues have championed, we are continuing to allow big tech to grade their own homework, to build their own platforms, and to continue to cruise forward towards profitability through engagement, blinded or uncaring about the harm to our children.
So Mr. Sattizahn, if I might. Your testimony about all the ways that Meta buried or hindered internal research is just stunning. Given what you saw on the inside, could you just say a few words about the value and importance of ensuring there's mechanisms for independent researchers to actually study what's happening, whether it's in virtual reality, social media platforms or AI?
Jason Sattizahn:
Yeah. Thank you so much for going over those three different initiatives as well.
Sen. Chris Coons (D-DE):
Thank you.
Jason Sattizahn:
I'd say over the last six years, it was very clear to me that they will not change from the inside out. Meta will not change from the inside out. And during my long tenure there, the only things that actually led to professionalism or doing the right thing was Meta's fear of losing control, whether that was losing control over their own finances or losing control from some regulation or oversight coming in from the outside.
So all I have to say is I love that phrase, "They cannot grade their own homework" because the only thing that will change the company is initiatives like that vying for independence, because as I've also testified is if we rely on their own research, it will just be altered, changed, or even just erased. So thank you.
I don't know if you had anything to add.
Sen. Chris Coons (D-DE):
And Dr. Sattizahn and Ms. Savage, if you would, what's most important for us to get right in legislation that ensures access for independent researchers?
Cayce Savage:
I mean partially starting at collection, starting with the methods that are being used to gather the data, the populations that we're looking at, et cetera, but also having access to the data before it's analyzed. That's very critical.
Sen. Chris Coons (D-DE):
Ms. Savage, could you talk about the ways Meta specifically discouraged employees from coming forward as whistleblowers and what more robust whistleblower protections might be important in order to ensure that we, policymakers, the public, parents know about the risks their kids are facing?
Cayce Savage:
Absolutely. I think the most powerful weapon that Meta had internally was narrative. After Frances Haugen's disclosure, Meta referred to the incident as a leak and frequently said, "Oh, this was so harmful to the researchers whose reports were shared. This has really been harmful for our ability to do good research and actually investigate harms to users."
In terms of actual whistleblower protections then, I think part of it is the folks who are actually whistleblowing, but also the folks who remain at the company who do the investigation. I'm curious if you have anything to add there.
Jason Sattizahn:
I think it was phenomenal.
Sen. Chris Coons (D-DE):
And my sense is that in closing, that good research is independent, unbiased, repeatable, and through legal action can lead to improvements in product safety. Thank you, Madam Chair.
Sen. Marsha Blackburn (R-TN):
Senator Schiff, you're recognized.
Sen. Adam Schiff (D-CA):
Thank you, Madam Chair. Thank you, and Senator Klobuchar for organizing this hearing today. And thank you both for your courage in coming forward and speaking out. I can only imagine how difficult that has been, but so vitally necessary if there's ever going to be any change, not just with respect to Meta, but with respect to the whole ecosystem.
I wanted to ask you in some of your prepared remarks or things you've said in the past, you identified information essentially that Meta's legal department or others willfully erased, removed. Can you tell us specifically what you're referring to there?
Jason Sattizahn:
For sure. I would like to contextualize this with an example. I was running a survey, for instance, that I discussed a little bit in my disclosure where we found explicit information showing that women were being emotionally and psychologically harmed. After that survey was released, a legal team came to me and said, "The survey will not run in the future no matter what unless you take those questions out." That would be something along the funnel that they are preventing any additional data from being collected around that topic.
In other instances where I saw emotional or psychological harm as one example, and I had already written the report, legal actually opened the report with me and said, "You have to take out these slides, you have to take out these lines." And for context, in research, all data on the back end needs to be deleted in 90 days. So if legal is going into report and taking out lines or slides, they're effectively simply erasing the data anyway because it's our ethical duty as researchers to erase everyone's other private data before we analyze it as well. So those are just a couple examples of things, ways that they would remove that information.
Sen. Adam Schiff (D-CA):
And what kind of justification, if any, did they give you for deleting that data or instructing you not to ask those questions? How overt were they about concerns that this would expand their exposure or these were answers they didn't want to know or they just wanted to purge the record of anything that didn't reflect well? I mean, what's the point of doing research if you're going to so bias the result, but what did they articulate as their justification?
Jason Sattizahn:
Legal's repeated explicit statements to me was that we did not want this data because it was too risky for us to have because if there was an outside audit, it would be discovered that Meta knew about these harms. That was said to me dozens of times by my direct legal partners that I was forced to work with from Meta.
Sen. Adam Schiff (D-CA):
And was there any pushback to that idea within Meta? That is, were there others at Meta who were saying, "We're seeing real harms here and whether you minimize your legal exposure or not, the harms are taking place, something has to be done about this"? Was there any kind of pushback like that?
Jason Sattizahn:
There was. I know myself, I went to both management and leadership over years with this and it was entirely denigrated and pushed aside. Other researchers that I knew who shared similar concerns either left the company or were in direct fear of losing their role and decided to stop speaking up about it out of fear of retaliation and losing their jobs.
Sen. Adam Schiff (D-CA):
And the management that rejected these concerns out of hand, how did they reject them? What was their argument?
Jason Sattizahn:
In one instance, my direct manager stated that she agreed with me and that I might very well be right, but that we needed to listen to legal anyway because that was the instruction.
Sen. Adam Schiff (D-CA):
And did that kind of instruction in your view, reflect the top leadership at the company essentially? Was legal following what they believed to be the culture of the very top leadership at Meta?
Jason Sattizahn:
When this entire change after Haugen happened, leading to what we're discussing now, it went as high as Meta's own CTO Andrew Bosworth arguing with people one-on-one in comment sections about researchers speaking out about these problems. The CTO of the company was arguing with researchers like us who were speaking up saying that this lockdown on research was inappropriate.
Sen. Adam Schiff (D-CA):
And do you have any sense from your colleagues who are or were within similar positions in other of the tech companies, whether what you were experiencing was an anomaly or whether what you're experiencing is really the rule within the industry, that is see no evil, hear no evil, don't want to have to address any evil or any ill result no matter how dire might be on young people?
Jason Sattizahn:
After working at Sony PlayStation, at Meta, and working directly with individuals who worked at Google, et cetera, I can say that this is pure Meta.
Sen. Adam Schiff (D-CA):
So you think this particular culture is unique, harmfully unique to Meta?
Jason Sattizahn:
Correct. Every tech company has issues with schedules and the fast pace leading to certain trade-offs, but this culture of kowtowing to legal blindly, I have not heard of it anywhere else.
Sen. Adam Schiff (D-CA):
Thank you, Madam Chair.
Sen. Marsha Blackburn (R-TN):
Thank you so much. I think there's interest in the second round of questions. And if you all don't mind, we will proceed to that.
I want to stay right with this funnel of manipulation and look at that. I know that Frances Haugen had talked about the risk. Meta became paranoid and then they talked about not the risk to kids and to people that were on their platform, but the risk to their bottom line. Is that accurate? Okay, y'all are nodding yes.
Cayce Savage:
That's correct.
Sen. Marsha Blackburn (R-TN):
Okay. So in order to get legal protection, so they couldn't be hauled into court, they went to third-party vendors, correct? And Dr. Sattizahn, you want to talk about that for a moment?
Jason Sattizahn:
Correct. So one part of this manipulation was the legal's consistent guidance across nearly every study I ran, and I know my colleague, Ms. Savage ran as well, was that we were required to have third-party vendors collect data for us. Just for the record, third-party vendors are individuals who are not Meta employees but are paid via contract to collect data for Meta. The explicit expressed reason that legal was telling us to have third-party vendors collect the data was so that the data could be erased if they found anything that was "too sensitive".
Sen. Marsha Blackburn (R-TN):
Therefore, they would have some protection?
Jason Sattizahn:
Correct.
Sen. Marsha Blackburn (R-TN):
And some distance and would not have that liability?
Jason Sattizahn:
Correct.
Sen. Marsha Blackburn (R-TN):
Okay. Did you find it odd? You worked at PlayStation and PlayStation has recommended ages for PlayStation.
Jason Sattizahn:
That's correct.
Sen. Marsha Blackburn (R-TN):
Movies have age recommendations and ratings. Video games, likewise. Did you find it odd that Oculus did not have an age rating, that they had escaped that?
Jason Sattizahn:
I did. I found it very odd indeed.
Sen. Marsha Blackburn (R-TN):
Did you raise that with Meta?
Jason Sattizahn:
I did. Mainly through attempting to correct our age data so that we even knew what types of experiences would be exposed to different individuals.
Sen. Marsha Blackburn (R-TN):
Okay. Ms. Savage, I want to talk with you about the content of the research that Meta was fearful of. And much of this we think is because they were looking for some way to escape liability. But the documents that I've reviewed, and I thank you all for submitting documents to us, I think have some really great insight into the pervasive abuse occurring on these VR products.
And one example, Ms. Savage, that you gave us raised questions about how children were sidestepping age requirements and that Meta didn't care about that. So talk to me about the research that you presented to Meta on that and how they pushed back.
Cayce Savage:
The first study that I did for the VR youth space was on parental supervision of VR. That research very clearly showed that parents were most involved during the setup of the headset. And then after that, really it was just the child using it.
Firstly, we saw that parents were often making accounts under their own information on behalf of the child, partially because they thought that that would give them supervisory access, but also partially because they didn't want Meta to have their child's data. So of course then the child is using an adult-aged account because Meta has not educated parents about the importance of children using appropriately-aged accounts. But also we saw that children are the ones who are driving headset buying in the household; children are the ones using it. So often children are making their own accounts because they want access to restricted games, they want access to adult content and things like that. That's part of development. Children seek mature experiences, and parents are often not aware that children are doing this because, again, Meta hasn't educated them.
Sen. Marsha Blackburn (R-TN):
Okay. Let's talk briefly about the Germany case. I want to get that on the record. You have talked about that when we visited, and I think that that incident really, there's a lot of that resistance that coalesces in this. So outline the Germany case for us.
Jason Sattizahn:
Yeah. When VR headsets were going back to being sold in Germany, around I believe November of 2022, we were asked as a team to research basically what the safety was like in Germany itself. Despite asking management and leadership, we were never actually explicitly told why we were doing research in Germany, and it was my understanding that it was because they wanted a paper trail, a research study to show that it was "safe," in order to release again in Germany and have this headset sold to children.
When we went to a German family's house to have an interview with the family, a young mother, one of her children we were interviewing about how they use these headsets, her son began discussing how another member of their family, his younger brother, underage, I believe around eight or nine years old, was sexually propositioned, was asked for nude photos, had all sorts of sexual harm occur to him when he was using the headset privately. And the mother was horrified. In fact, even when Meta tried in real time to shut down the questioning by the moderator because they didn't like the data they were collecting, the mom kept asking for the son to talk about it. She was giving explicit consent in order to understand what was going on with her family in this case. Despite that fact, afterwards, we were asked to delete all of our notes and recordings of it happening and we did not discuss it again after that.
Sen. Marsha Blackburn (R-TN):
Okay. Let me ask you this. And my time has expired, but has Meta been sued by the EU for any privacy violations? If you can submit that so that I can get that on the record.
Jason Sattizahn:
I can-
Sen. Marsha Blackburn (R-TN):
Senator Klobuchar, you're recognized. Okay, Senator Blumenthal.
Sen. Richard Blumenthal (D-CT):
Thank you. I apologize. We have another hearing going on as frequently happens in this place and votes and so forth. So I really am sorry that I haven't been here and I especially apologize if I'm retreading some of what you've already talked about. But I think that the issue of shifting responsibility, offloading the burden of caring for kids as Meta has done is especially important here, because to the parents of America, I want to say you are not alone here. You need help. And I say that as one who has received some help from my four children and still regard myself as learning about how to protect children.
But what really offends me is that Meta knew that, as you have said, Ms. Savage, and I'm quoting, "Parents were neither prepared to meaningfully use the VR parental controls, nor likely to use them often and that the controls alone were insufficient to keep teens safe." In all your disclosures you indicate that Meta was all too happy to pass the responsibility for protecting their own children onto parents or other companies, anyone but themselves. And I want to invite you to expand on that point that you've made, Ms. Savage.
Cayce Savage:
I appreciate it. I think the most egregious delineation of this is on the bottom of the box it says, "Not all children are ready for Meta VR." The implication there to me is if something goes wrong in VR, it is the child's fault because they are not mature enough to handle the experience. Part of the reason that parental supervision tools are not sufficient is because of the lack of education about what it means to be ready for VR, what it means to use VR, but also the fact that VR is social and usually entails interacting with strangers and what that means. Parents whose children use VR headsets are often shocked when I describe to them what the experience of using VR actually is and what the experience of being sexually harassed in VR actually is like.
Sen. Richard Blumenthal (D-CT):
And that is the reason why in our legislation, the Kids Online Safety Act, we impose a duty of care to provide parents with help but impose a responsibility on big tech companies to provide that help, not just as a matter of their goodwill but their responsibility when they know that there is a danger to children and to take action to help prevent that harm. What is your experience with age verification?
Cayce Savage:
Oh, yes. I proposed a research study called Age Assurance, which encompasses age prediction, stated age, and age verification, because the quality of Meta's age data is very poor. As we can see, Meta claims that if they know a user is under 13, they're kicked off of the platform, and yet their platform remains full of children under 13, and so clearly I knew additional research was needed, particularly because VR is a new kind of technology which encompasses new ways of gathering data to understand someone's age, and that was the research that was mysteriously shut down. Part of verification, though, is the user needs to perceive some value and they need to be willing to provide the company with information, either their birthday or their ID. And Meta has something that internally is called a brand tax, which means that users don't trust Meta with their data and so are less willing to provide things like ID, and that's part of the reason why users often don't accurately represent their age.
Sen. Richard Blumenthal (D-CT):
Thank you. And I want to just ask you finally, have you heard from Meta's leadership or their executives about your testimony here today?
Cayce Savage:
Not directly.
Jason Sattizahn:
Not at all.
Sen. Richard Blumenthal (D-CT):
When you say not directly?
Cayce Savage:
They did respond to the Washington Post.
Sen. Richard Blumenthal (D-CT):
In other words, you've read what their responses are in published sources, but they haven't contacted you directly?
Cayce Savage:
Correct.
Jason Sattizahn:
Not at all, no.
Sen. Richard Blumenthal (D-CT):
Thanks, Madam Chair.
Sen. Marsha Blackburn (R-TN):
Senator Hawley.
Sen. Josh Hawley (R-MO):
Ms. Savage, you testified earlier that a majority in your view, a majority of Meta's VR users are under the age of 13. I'm just curious, what percentage, if you had to guess, or maybe you know the exact number, maybe you do, Doctor, but what proportion of Meta's VR users, the children, the minors are exposed to sexually explicit content, sexual propositioning, sexual abuse of some kind on a VR platform?
Cayce Savage:
The prevalence of it is extremely high because it's very difficult to monitor that content in VR. It's difficult to monitor it on Instagram, much less VR. So I would estimate that any child that is in a social space in VR will come in contact with or be directly exposed to something very inappropriate.
Sen. Josh Hawley (R-MO):
I'm sorry, did you say any child?
Cayce Savage:
Yes.
Sen. Josh Hawley (R-MO):
So you mean every user?
Cayce Savage:
Yes.
Sen. Josh Hawley (R-MO):
Every user will be exposed, every minor, every child will be exposed to sexually explicit content of some sort is what you're saying?
Cayce Savage:
I see it every time I use the headset.
Sen. Josh Hawley (R-MO):
That's remarkable. That's just remarkable. You also said, Ms. Savage, that negligence in the VR platform is emblematic of negligence across Meta's platforms when it comes to child safety. Let me just ask you about chatbots. This has come up before. I'm sure that you're aware of the recent reporting that Meta allowed its AI chatbots to engage in explicit content, romantic and sensual conversations, I think, with children. Let's just take a look actually at what the internal document brought forward by a whistleblower said. It's right here in this middle portion. This is a Meta document. It is acceptable to engage a child in conversations that are romantic or sensual. Let me say that again. It is acceptable to engage a child in conversations that are romantic or sensual. It is acceptable to describe a child in terms that evidence their attractiveness. Again, these are Meta's own internal guidelines. Does this surprise you at all?
Cayce Savage:
Unfortunately, no.
Sen. Josh Hawley (R-MO):
Doctor, this surprise you?
Jason Sattizahn:
Not at all.
Sen. Josh Hawley (R-MO):
When Meta says, as they have said, following the disclosure of these guidelines that they will redouble their efforts to make sure that children are safe on their platforms, including with chatbots, does that give you any comfort? Do you think that we should take them at their word?
Cayce Savage:
Oh, no.
Jason Sattizahn:
In no way, shape or form, no.
Sen. Josh Hawley (R-MO):
In no way, shape or form. Let me just ask you one other thing. You were with Meta, how long, Ms. Savage, approximately?
Cayce Savage:
Four years.
Sen. Josh Hawley (R-MO):
Four years. Dr. Sattizahn, what about you?
Jason Sattizahn:
Six years, and I do want to say for context, that's longer than about 90% of employees will ever work at the company because the turnover is so quick. I believe it would've been about 80 for you.
Cayce Savage:
They calculate it for us on our profiles so we know that that's correct.
Sen. Josh Hawley (R-MO):
Wow. Four years and six years respectively. As you look back on it, if you knew then what you know now, would you have gone to work for Meta, Ms. Savage?
Cayce Savage:
I think I knew that about the company going into it. I hope to make it better.
Sen. Josh Hawley (R-MO):
Doctor?
Jason Sattizahn:
I would've, and I would've taken even better documentation.
Sen. Josh Hawley (R-MO):
Let me just end with this. Based on what you know now and your time at the company, do you think that Meta is a force for good in this country and in the world?
Cayce Savage:
I don't see how they can be.
Sen. Josh Hawley (R-MO):
Dr. Sattizahn?
Jason Sattizahn:
No. It is aggressively ambivalent to people.
Sen. Josh Hawley (R-MO):
Aggressively ambivalent to people, and yet this is a company all about people, their attention, their interests, their data, their lives. This is a company that's lost its way, profoundly lost its way, and they are harming our children as a result. Thank you, Madam Chair.
Sen. Marsha Blackburn (R-TN):
Senator Klobuchar.
Sen. Amy Klobuchar (D-MN):
Thank you very much, Senator Blackburn. And thank you Senator Hawley, Senator Blumenthal, and I will remember forever the aggressively ambivalent, which I think is one of the many ways we can describe what's going on here. So Mr. Sattizahn, Meta is the global leader in virtual reality and mixed reality devices with a reported 73% market share. And I think this is key, as you may know, I do a lot of work on antitrust, and so when the dominant player in virtual reality suppresses, alters, and deletes research into youth safety, how does that hinder the broader industry from implementing and improving safety standards for young users? Not that it would be good if anyone did it, but in my mind it makes a difference. Could you explain your view of that?
Jason Sattizahn:
There's no incentive for the company to change at this point. Working internally at the company, there were consistent discussions about this, that we were too big to fail in this space. My own data that I've shared with you all has shown the sheer number of individuals that are focused only on Oculus platforms for VR. And one of the things that I do want to say, that you would only understand if you work inside of these companies, is that when you have such a major player that is sucking up all engagement and commercialization of this thing, you end up taking a lot of the engineers and the people that build for it that then have to abide by the rules, these unethical rules that they're forcing everyone to follow, and it creates this vacuum effect where then nothing can change.
Sen. Amy Klobuchar (D-MN):
So it's harder for the other companies to access that and then as well as they create almost this completely, already to me, unfair advantage by being so big. And then they also make it harder for the little ones to do it because then they wouldn't even be competing on this very sad playing field to where a lot of the goal has been to bring in these kids to get more advertising research. Not the only goal by far, but one of the goals.
Jason Sattizahn:
If I may, I also want to add that the advent of Meta's VR is so unique, because the integration of billions of data points of content from social media has never existed. Meta is social first in VR, and so not only are they taking the entire gaming space, it's that now they're integrating Instagram content and Facebook content, which no other platform has. And it creates this system of virtual reality that it's not just gaming, it's everything and no one understands the rules.
Sen. Amy Klobuchar (D-MN):
Yep. Mr. Sattizahn, you provided research results to the subcommittee that showed that 36% of users reported unwanted sexual advances either every time or often in virtual reality. Is it true that Meta stopped tracking information about unwanted sexual advances rather than enact safeguards against sexual harassment?
Jason Sattizahn:
It is true, and I also want to mention that they also obscured findings from study to study to limit that data being collected as well.
Sen. Amy Klobuchar (D-MN):
Okay. And can you talk about the prevalence of sexual harassment on Meta's virtual reality platforms and whether Meta did enough to stop it?
Jason Sattizahn:
No. I would have to go back and look at statistics. I'm happy to do a follow-up for this. To my memory, to my recollection, it was about 10 to 20% of individuals across the board who were experiencing sexual molestation, groping, solicitation, and it was significantly higher for women. This was something that I repeatedly went to management and leadership about, and I was repeatedly told that we were underfunded. We could not develop the safety tools or even basic education so that people would know that this just might be something they'd see. Education itself is very powerful, because even if someone just knows that this is a possibility, you might have some remediations or tools, even insufficient ones, in order to handle the problem.
Sen. Amy Klobuchar (D-MN):
You want to add something? Then I had a question with you Ms. Savage.
Cayce Savage:
Meta is worth almost $2 trillion. Are they now?
Sen. Amy Klobuchar (D-MN):
Yeah, that's right. Yes. So Ms. Savage, Meta's virtual reality platform can be used, as we discussed in my opening, I know others', by child predators to mingle with kids online while concealing their age and identity, and the consequences can of course be devastating. In 2022, two separate kidnapping incidents were connected to the virtual products. Adult men convinced teenage girls in the unreal world to meet them in the real world, where they were able to kidnap and move them across state lines. Luckily both girls were found by police and returned home safely. Can you talk more about how seemingly innocent relationships fostered in virtual reality can lead to harms in the real world?
Cayce Savage:
That's a very good question, and my own research showed that while parents have rules about interacting with strangers for children, it takes only a few interactions before they consider someone a friend online. Trust is built very quickly. That's part of the development process.
Sen. Amy Klobuchar (D-MN):
Right. Especially with kids.
Cayce Savage:
Exactly. And so particularly when you don't know what someone looks like in real life, if they tell you their age, you'll likely believe them. So it's very easy to build trust. And of course there are playbooks that individuals who are groomers use for this.
Sen. Amy Klobuchar (D-MN):
That's right. Why is it uniquely challenging for parents and law enforcement to detect harm in virtual reality environments?
Cayce Savage:
Part of it is because there's no log of it, and the child has something strapped to their face that you often can't see into. So it may be happening in the living room and parents may not be able to see what's happening, but also because children consider, unfortunately, sexual solicitation and things like this to be pretty normal for the course of being online, and so they may not tell their parents this is happening. Also, part of development is that children are learning what is and is not okay, so it may be happening and the child may think nothing is wrong.
Sen. Amy Klobuchar (D-MN):
Okay. So last question here. In addition to the bill that my colleagues have long worked on and got through the US Senate on kids and the standard of care and things that we need to do to update our laws, one other idea of course is to make changes to Section 230. In every other business sector, parents who believe a company contributed harm to their kids have a right to have their case heard in court, but social media companies argue that parents have no such rights because Section 230 immunizes them regardless of the content that they push to users, all while making billions in profits off of tracking and marketing to kids. I often think if we had put in place this ability decades ago, a decade ago, five years ago, we would never be where we are now with fentanyl being sold and pornography, and finally we're starting to do something, but all that time passed, all those kids' lives were damaged or lost forever.
Do you believe that victims, I'll ask you first Mr. Sattizahn, victims of harm that occur on Meta's platform should have the ability to sue Meta for its roles in these harms? And do you think Meta will change its behavior and take child safety online seriously without opening the courtroom doors to victims?
Jason Sattizahn:
To your first question, I believe it is necessary. As I mentioned before, only outside regulation and financial punishment is something that will actually create change inside the company. So to answer your question, that would essentially be necessary for the company to learn or at least change its behavior in the future.
Sen. Amy Klobuchar (D-MN):
Ms. Savage?
Cayce Savage:
I agree, and I would also say harm that happens online is harm, full stop. And so parents should be able to respond accordingly.
Sen. Amy Klobuchar (D-MN):
Okay, thank you.
Sen. Marsha Blackburn (R-TN):
Senator Blumenthal.
Sen. Richard Blumenthal (D-CT):
Thank you, Madam Chair. I know we've been at this for a while. I just have a few questions to conclude. I asked you before whether Meta had contacted you directly, you said not, but you have read their responses in the Washington Post. And I'll just put it bluntly, in effect, they're calling you liars. They say that your testimony is based on "a few examples stitched together to fit a predetermined and false narrative. False." And their spokesperson says, "We stand by our research team's excellent work and are dismayed by these mischaracterizations of the team's efforts."
Let me just say if Meta wants to come forward and answer questions, I'm sure that my colleagues and I would be glad to hear from them sitting at this table. And in fact, in the meantime, we are going to be writing them a letter. Senator Blackburn and I have been working on it and once again, I thank her for her leadership demanding that Meta produce its research and its research policies and its practices concerning kids' safety.
And if you are wrong, Meta is welcome to show us their evidence. In the meantime, the Washington Post article is of, in my view, a searing indictment. And I know it's long, but I hope folks will read to the end of it, to the very end of it where Meta is quoted on its use of age verification with an ID card or credit card and the way it uses that tool if it suspects people are lying about their age and to help with another tool to help third-party VR developers understand their users' ages.
In 2023, apparently parents were surveyed and they said in effect, this tool isn't working. And they were advised by Meta, a Meta lawyer apparently, "Maybe I can take a look and help you strategize a way to frame it so as to avoid these types of responses." She advised they couldn't destroy the responses they'd already collected, but they could reduce or end these kinds of responses going forward. In other words, see no evil, hear no evil. They don't know any evil because they're not going to ask the questions in a way that would elicit information. And the lawyer advised on another occasion. "I would just phrase it in a way to ensure that participants do not volunteer information about users under 13." The article says the lawyer also checked with the researcher to see whether he had directly asked adults whether their children used Meta VR devices. The researcher said he had not. "Okay, that's great." The lawyer replied.
When Meta denies that your testimony is accurate and correct and then we have these kinds of direct evidence, it's hard to give any credit to what they say. And so I invite them to respond to our letter to give us the research, to show us the policies, to in effect reveal what they've suppressed and how they have discouraged asking the questions that would show them the truth. And so I want to thank both of you for coming forward to tell us the truth. And thank you, Madam Chair.
Sen. Marsha Blackburn (R-TN):
Thank you, Senator Blumenthal. I wanted to ask you, you mentioned that bonuses were given on user engagement, correct?
Jason Sattizahn:
Correct.
Sen. Marsha Blackburn (R-TN):
Were bonuses ever given for creating safety protocols?
Jason Sattizahn:
To answer that question, I was instructed over years for my safety and integrity work leading to protocols to be tied solely to engagement. So in my professional opinion, engagement is not the best measure of whether people are safe; whether people are safe is. Yet from management and leadership, we were told repeatedly that safety initiatives had to be tied to user engagement.
Sen. Marsha Blackburn (R-TN):
There you go. And did you get the sense that Mr. Zuckerberg was intimately involved in decision making around the suppression of child safety research?
Cayce Savage:
Would it be helpful to describe what the company used to be like before 2021?
Jason Sattizahn:
I believe so. So when I joined the company in 2018, we used to do things called Zuck reviews. This is when Mark Zuckerberg would come to us or to a team and say, "Hey, I'd like you to do a presentation on marketplace safety, Facebook Dating, et cetera." When I joined in 2018, I would be in those reviews. I would make a slide. I was asked to contribute to those reviews. And magically, after Frances Haugen's disclosure in 2021, those Zuck reviews went away. Management and leadership still told us that they would have reviews with him about sensitive topics, but it was never directly stated if he saw your work.
The reason, we were told from management and leadership, was because they didn't want to have a trail showing that he was reviewing sensitive information. But because our performance was reviewed on whether higher-ups had seen this stuff, management would look at the two of us and say, "Hey, it's going up to the higher-ups." If it was the CTO, the VP, or anybody else, they would use their name, and if it went up to Mark Zuckerberg, they'd say, "Hey, it went up the chain to leadership." But I don't know if you had another example for that.
Cayce Savage:
After 2021, the only thing that changed was the absence of Zuck's, or Mr. Zuckerberg's, name.
Sen. Marsha Blackburn (R-TN):
So in essence what they did was change the culture to insulate themselves so they would not be sued by parents or by others. They suppressed the research. They used third party vendors to put up a wall, and again, what we have seen them do is to put profit over children and they have used children as a profit center when those children are online regardless of the harms. I want to thank you all. You've been excellent witnesses and we are very grateful. We would invite, I joined Senator Blumenthal, if someone from Meta, and I'm sure they are all watching this and they're texting one another back and forth or signaling so they don't get caught. If they want to challenge what they've heard today, I would encourage them to come forward. I think that they see there is truly bipartisan anger, not only with Meta but with these other social media platforms and virtual reality platforms and chatbots that are intentionally knowingly harming our children. And this has got to stop.
Enough is enough. We are intent on passing the Kids Online Safety Act. I think today has really laid out more of the reason for doing that. We will continue to submit questions to you from the committee; the record will remain open through Tuesday, September 16th at 5:00 PM. We would ask that you respond in a timely manner, within a week, to those questions. We're grateful to you all. We are grateful to the parents at this hearing.
Sen. Amy Klobuchar (D-MN):
I just...
Sen. Marsha Blackburn (R-TN):
Yes, you go right ahead.
Sen. Amy Klobuchar (D-MN):
Thank you. I just want to thank you Madam Chair and our witnesses, and just to reiterate that the company can come before this subcommittee. They can provide us answers, but the best way to resolve this is to get this bill passed to realize there's more and more senators that are interested in going even farther and that this is a reality on this committee right now, including with the chair and ranking member of the entire Judiciary Committee, and that they need to start taking this seriously, if not for us and whatever power we have.
The fact that we were able to get some bills passed over the last few years, not nearly as much as we want, means that they have to understand that we are very serious about this, and that this hearing, your willingness to come forward, and the families' willingness to come forward can make change. It's just a question of whether they want to have any input in it or if they just want to make anonymous or spokesperson comments about it without actually being willing to come forward. And so we're ready to talk to them, but mostly we want to get something done. We're tired of the talk.
Sen. Marsha Blackburn (R-TN):
Hearing adjourned. Thank you.
Sen. Josh Hawley (R-MO):
Thank you.