What Carrie Goldberg Has Learned from Suing Big Tech
Justin Hendrix / Feb 8, 2026
The Tech Policy Press podcast is available via your favorite podcast service.
A wave of lawsuits in the United States is targeting tech firms for their product design decisions. Lawyer Carrie Goldberg, founder of C.A. Goldberg, PLLC, has played a role in establishing the product liability theory that underlies them. In 2017, her firm brought a lawsuit that sought to apply product liability theory to a tech platform, Herrick v. Grindr, arguing that a dangerous app design, not just user behavior, was the source of harm.
In 2022, Goldberg was appointed to the Plaintiffs’ Steering Committee in the federal social media multidistrict litigation. She’s led cases against Amazon, Meta, and Omegle, has testified before the Senate Judiciary Committee on child safety issues, and is the author of Nobody's Victim: Fighting Psychos, Stalkers, Pervs, and Trolls. I spoke to her from her offices in Brooklyn about what she's learned over the last decade, and about some ongoing litigation that remains in dispute.
What follows is a lightly edited transcript of the discussion.

WASHINGTON — February 19, 2025: Carrie Goldberg testifies at a Senate Judiciary Committee Hearing titled: "Children's Safety in the Digital Era: Strengthening Protections and Addressing Legal Gaps." (Photo by Joshua Sukoff/Medill News Service/Sipa via AP Images)
Carrie Goldberg:
My name is Carrie Goldberg. I am the owner of C.A. Goldberg, PLLC, which is a law firm that's based in Brooklyn that sues psychos, pervs, trolls, and toxic tech.
Justin Hendrix:
Carrie, we are neighbors in Brooklyn, though I don't think we've ever met in New York City. I want to ask you a little bit today about your career, your current work, and where you see things headed when it comes to pushing for greater accountability and addressing many of the harms. You already mentioned the target of your law firm's work.
I think you first came to my attention maybe around 2018, 2019, around the time that your book came out. You were being profiled in the media as the "revenge porn lawyer." Your book made some waves, and its title sounds similar to how you describe the firm's targets now. It was called Nobody's Victim: Fighting Psychos, Stalkers, Pervs, and Trolls. How did you come to this work of focusing on tech? Where did it start for you?
Carrie Goldberg:
Well, thank you, Justin. I'm really happy to be here. When I first started my firm 12 years ago in January of 2014, I had just had something really traumatic happen to me involving a guy that I had met on a dating app who, when I broke up with him after this brief four month relationship, he just went on an absolute rampage against me. Showing up at my home, texting, calling me nonstop. Going onto all of social media, saying that I was a whore and a drug addict, and saying he'd hacked into my work computer. And then sending me nude videos and pictures of me that he said he was blind-copying to all these people in my life.
And then he was also contacting all these people in my life. And it was just this unrelenting, really traumatic barrage. And I was already a lawyer at the time, but I was working with elderly people. And I had this involuntary education into stalking and realized very quickly that New York State had no protections when it came to intimate images being sent around.
Even after I got my order of protection, the judge told me my offender had First Amendment rights and didn't have to stop sending naked pictures of me. I think that was the scariest thing, was, like, "Oh my gosh. He can continue to harass me." And at the time, there were all these revenge porn websites that were searchable by name and state. And if you typed one of the victim's names into Google, it would just populate page after page of Google results with links to the revenge porn.
That's what we called it back then. And so I was just kind of a transformed person after this breakup because I suddenly knew what fear really was. It went on and on. But I was so transformed after I finally got my order of protection. And he also had me arrested on false charges, and I finally got all of that stuff removed.
And I just kind of assessed my life and none of it made sense anymore. And I needed to be working on this issue that I was still, really, so traumatized by. And just in this moment of courage and chaos and stupidity I quit my job at the Vera Institute of Justice and started my law firm when I was still really, really freaking out about all this. And I created a little website. I had no staff, no money. Just this tiny, windowless office in Dumbo. But I said that I was an internet privacy attorney.
And then suddenly that caught on. I started getting calls to be on panels, and little profiles about me, and people calling me the revenge porn lawyer. And I was spending most of my time just trying to get legislation passed in New York State, and working with other national advocates. And I didn't have a client and was totally having imposter syndrome, being called the revenge porn lawyer of the country. But then I looked around and I was like, "Well, I guess no one else is doing this. So even me, clientless, I guess is." And I just started to get clients. I represented most people for free that first year, just trying to get their nude images off the internet, and learning how to do that.
And in the meantime, I'd also figured out that the websites that were hosting them were immune from liability because of Section 230. And so that's when that law finally got on my radar. And it was the end of the year, November 2014, when I got this young client whose nudes had circulated all over her eighth grade school and then went across the borough. She lived in Brooklyn. And it was through an app that she trusted would disappear her images. And I was like, "My god. I don't know anything about product liability, but you relied on that product to disappear your images and instead, people were able to screenshot them and circulate them." And that's illegal child pornography.
Justin Hendrix:
So that's the first time the phone rang with a direct client. Is that right?
Carrie Goldberg:
I'd had other clients needing orders of protection, or who just needed DMCA content removal. But this was the first heavy, hearty case where I was like, "This child has a real problem, and I need to figure out how to resolve it." And so I sent this demand letter for an ungodly amount of money to the CEO and all the angel investors of this company the Wednesday before Thanksgiving. And I was terrified because I was just this one-woman operation, and it was just me and my eighth grade client against this multi, multi-billion dollar company. And two days later they had hired a legendary Silicon Valley lawyer to contact me, and we took it from there, and the case resolved very quickly. And I was able to use that momentum to grow my law firm.
Justin Hendrix:
So I now understand you're representing, is it over 1,000 victims at this point?
Carrie Goldberg:
Over the years, yeah, it's been over 1,000 victims. And it's everyone from people whose nude images are online without their permission, where we're just doing content removal, to people who have been sexually assaulted at their school and are suing the school. And maybe nude images have been involved in those cases. And then huge cases against big corporations like Amazon for selling suicide kits to kids.
Justin Hendrix:
And I know we want to talk a little bit more about that Amazon case in addition to some others, but of course I have to ask you about the most immediate news, which ties in very much... I think you could see it as a point in the succession of cases that you've been involved in. Two weeks ago, there was this lawsuit filed by the mother of one of Elon Musk's sons, suing over Grok generating explicit images involving her. Of course, we've talked about this Grok controversy on this podcast in just the last couple of weeks. It's been, since the turn of the year, one of the big stories in tech. This type of thing seems to be exactly what you'd been warning about in your op-eds and testimony over the last couple of years. What were you thinking when the phone rang this time? When it was this particular client.
Carrie Goldberg:
Wow. I spent about five years trying to shrug off this persona as the revenge porn lawyer, and I was like, "No. I've been doing all these very intense product liability cases against apps, and a lot of them involve wrongful death cases." And I'm like, "I'm not just this revenge porn lawyer." Over the last six months we've seen more and more situations where artificial intelligence is undressing victims. Innocent people whose images are being created and manufactured by AI and then posted online.
And we have a bunch of cases involving kids doing it to kids and stuff. But in the last couple months this technology was implemented into one of the world's biggest social media companies, X. And so suddenly there was this ability for the public to generate sexually-explicit images that are digital forgeries and then, basically, publish them at scale. So I'd been talking a lot to the press about the incidents with Grok and how we'd never really seen this weaponizing of AI, and certainly not published at this scale.
And that's when I got connected with Ashley. Nobody had really borne the brunt of these deepfakes more than Ashley. She's a particular target on X because so many of Elon Musk's 233 million followers also despise her. He has publicly said some very nasty things about her, and it has motivated his followers to villainize her as well. So as soon as this new technology was rolled out, a lot of people were experimenting with it using her body.
So they would request that Grok undress her, put her into sexually-explicit poses, on all fours with her ass-cheeks spread. They would put a semen-like substance on her. And Grok would just manufacture these images and publish them on Grok's own handle on X. So our first thought was she needs a temporary restraining order to get Grok to stop producing and distributing these explicit images of her.
And this was actually similar to how I began my first big tech case, against Grindr, where a Grindr user was maliciously pretending to be my client and then sending all these anonymous people to his home to sexually assault him. And so in both these cases I tried to get a temporary restraining order against the platform to stop this heinous use. So, actually, you and I had to postpone this interview once, and then again, because I had a hearing on it today. But basically, this case is the culmination of everything that I've been doing for the last 12 years, which is representing people who've been victims of image-based sexual abuse. Also representing people against platforms that have released defective and dangerous products into the stream of commerce. This is where it's all suddenly blending together.
Justin Hendrix:
Let me ask you about some of the claims in this suit. Various causes of action. You're talking about design defects, manufacturing defects, failure to warn, deceptive practices. A range of different claims of wrongdoing against the firm. You go on to make claims around unjust enrichment and infliction of emotional distress. And effectively you're seeking damages here. I think one of the things I want to understand is how your theory of product liability around digital products has come together. These are the types of claims that I feel like I read all the time now in claims against tech firms. Have things changed? These product liability ideas, are they working in court? Are you seeing more success in the law than you did in the past?
Carrie Goldberg:
The first case ever to use these types of claims was our case against Grindr, where Matthew Herrick's ex was sending all these strangers to his home to sexually assault him. And we've used almost identical claims in this case, and over the last eight or so years, to those we used in Herrick v. Grindr, which is just basically saying that whatever tech product we're suing is a defective product, and because of that defect, it caused our client harm. The first time, it was decided by Judge Valerie Caproni in the Grindr case. There was no precedent. There was no other case where the theory had been used. And she just sort of adopted this really, really broad approach to Section 230, saying, "Well, okay, but everything stems from user content. If not for this malicious user, none of this would've happened, and therefore everything you're suing about is the content of a third party. Section 230 immunizes Grindr totally."
The truth was that we were very careful in that complaint, explaining that we were not suing Grindr for any of its publication functions and that our claims were treating it as a product, not a service. The theory crashed and burned in that case. That case, though, has been very influential across the country. It's the exact same set of theories that succeeded in our case against Omegle over in the District of Oregon, which ultimately resulted in a settlement agreement. Omegle is a product that matches strangers for streaming, but it was used primarily by children and adults for sex streaming, and our 11-year-old client in that case got matched with a child predator who made her his sex slave for three years. These theories have now been successful in the multi-district litigation against Snap, TikTok, Google, and Meta relating to child addiction.
It's been successful for us in our cases against Snap involving dealers selling counterfeit fentanyl-laced pills to children. It's sort of sweeping the country at this point. And we're now using it, of course, against X over this dangerous and unsafe product, Grok, which X knew was undressing people and creating sexually-explicit images. So it's an old, old theory that typically is used in tangible product cases, and it lends itself perfectly to tech companies, which also are products, despite what they try to claim in court.
Justin Hendrix:
I just want to press a little bit more here about the kind of standard of proof you need to overcome Section 230 immunity, and what you've found. What is it that tends to push things over the line, get you past that hurdle?
Carrie Goldberg:
Well, it's different for every judge, and there's really no uniformity around the country because the Supreme Court has not ruled on Section 230 in a substantive way yet. It depends somewhat on where you're filing the case and whose desk it lands on. And I've found it to be very arbitrary, which cases survive a motion to dismiss and which don't. But my rule of thumb is to always figure out what specific features contributed to or caused the harm that injured my client. So I really, really drill down into the technology.
So for instance, we have a case against Match Group because a serial predator was on Hinge and Tinder, drugging and raping women throughout the Denver metropolitan area. And Hinge told its users who'd reported the guy that they had removed him, and they hadn't. They continued to let this guy use their product for two more years, which created God knows how many victims. We represent nine. There are another 15 who have sued. But in that case, we sued over the algorithm that was matching these women with him. And we sued for the fact that this technology was recommending this abusive user. There were all these gamification strategies on the app. The app failed to warn users about this known predator, and about the fact that Hinge and Tinder can't actually remove known predators. And we really got down into the specifics, even with the particular algorithm that Hinge uses for its recommendations.
Justin Hendrix:
And we could go through a kind of laundry list of some of the other specific cases and questions around accountability. And you've already mentioned many. The Snapchat cases, the fentanyl cases, Grindr. You mentioned Amazon. There's BandLab. There's other attempts at Meta, at Google. You call some of these big tech firms your "favorite adversaries." At this point, do you think of them as all alike? Are you able to spot distinctions amongst these adversaries, differences in the behavior of some firms versus others? Or do you still see them as following somewhat similar patterns across the board?
Carrie Goldberg:
Well, it's interesting because there are kind of two things that I think differentiate them. There's the, what did the company do? And then, what is their legal strategy? There are some companies that severely injured a client, but then they were remorseful and jumped to help my client. And they weren't all about abdicating responsibility and trying to compel arbitration, or these bullshit claims that I see all the time for forum non conveniens. Or right now I'm contending with X claiming that its robot's output is free speech. So it's sort of like, the company that did harm, are they making product changes now that they know how they harmed? And are they recognizing and apologizing and making my client whole, or are they not?
So Amazon, for instance, is in my top tier of horrific companies in terms of both these things. It was selling a suicide chemical to the general public. I had one client at the time, about four years ago. I contacted Amazon's chief legal officer. I said, "You are selling a chemical that is being promoted on all these suicide forums, and there are links to Amazon. And there's no other household use for this product. And you're selling it for $20. You don't want to be doing this. You don't want to be involved in suicide and death."
And I thought that particular case was just, I was going to send this letter, I was going to show my client, this grieving woman who lost her only child, that Amazon had complied, that it removed this product, and we'd be done. It was just going to be a pro bono case. Instead, they hired a law firm that told me that there was no world where Amazon could ever be responsible for the intentional misuse of one of their products, and to please update them if there's a change in the law.
After getting that letter, I ultimately filed the lawsuit in February of, oh my god, 2022. And then I started getting case after case after case, because all these other parents who thought that this had only happened to their child realized that there were all these other families that this had happened to. Amazon went 10 more months after I filed that case continuing to sell the product. So for a total of 18 months from when I notified them, they continued to sell the product. The majority of my clients' kids who died, died after I had already personally been in contact with Amazon's lawyers. They continued to knowingly sell a product to the general public and deliver it to households that had no use besides suicide. To me that is murder.
Justin Hendrix:
The worst type of corporate behavior.
Carrie Goldberg:
It's the worst type. And then they take no responsibility in court. And it's been years of us winning every motion to dismiss and then them appealing it and appealing it and just... Their lawyers are just at the rodeo, having a ball.
Justin Hendrix:
Is there possibly another example that's more egregious than what you just described?
Carrie Goldberg:
I'd say that another example I have is sort of on par, where a humongous tech company invited seven of my grieving clients, who all had lost young children because of harm, egregious harm, on this platform. It invited them all to basically come to their corporate office. And this was in the midst of all these settlement negotiations. And so I took all my clients to their corporate office with the understanding and belief that this was in furtherance of our settlement negotiations. Our clients then poured their hearts out to these corporate officers. The chief legal officer, a room full of their staff members, many of whom were crying. The company's own employees were crying, hearing from our clients. And then, a week or two later, we heard back: their lawyer told us that they'd decided that their counteroffer to our clients' offer was $0.00.
Justin Hendrix:
So you're in the trenches with these victims. You're representing their interests. You're also now trying to change the overall system, the circumstances in which you operate. You're doing more legislative advocacy. You think Section 230 should be amended. I think a lot of folks in the tech policy space, lots of folks who are maybe more free speech maximalists or concerned about the possible implications of reform to Section 230 on speech, have criticized you over the years. I don't want to hash out every single one of those critiques, but I thought I might ask you the question of what, perhaps, you've learned from your critics over the years. How have your views changed on these issues?
Carrie Goldberg:
I would say that the most helpful people in terms of sharpening my arguments and my lawsuits are my critics. I learn a lot from Techdirt and Eric Goldman about what they hate about my claims. And so it's always sharpening. I learn a lot from people on Twitter who criticize my claims. The best jury imaginable are the ones who are rooting against me, because how else am I going to learn to argue around their objections? But I would say that I've always kind of been a Section 230 abolitionist. Section 230 has been interpreted by judges way more broadly than the black letter of the law.
Justin Hendrix:
A lot of folks think things shifted a lot in the last couple of years. A lot of folks do believe that intermediary liability protections are valuable in terms of preserving user speech and protecting companies from frivolous lawsuits and all those types of things. Are we perhaps within view of an appropriate middle place? Based on some of the successes you've had, are we closer to getting to a place where, even without additional reform, Section 230 may not be quite the impenetrable wall that it once was?
Carrie Goldberg:
I'm in favor of reform through litigation. I don't particularly hate Section 230 as it's written. As written, it should really apply to defamation and speech-based claims. And that was what existed back in 1995 when it was first introduced, after cases against Prodigy and CompuServe where people were being defamatory toward other people on those services. I don't have an issue with that kind of use of Section 230. But the problem is that the cases I bring, where there are really significant harms, sometimes death, have to do not with user content but with the platform's own misconduct. The infrastructure of the product. The features of the product that the company has knowingly introduced. These are decisions that are made not within content moderation but in a boardroom, and companies should be responsible for those kinds of things.
And I think that there has been so much progress. And I have spent these last 12 years with nothing more important in my life than handling these types of cases because I think that they are influential to the entire world, because everyone is using these products. I think the scary thing for me, as a lawyer who still has clients come to me all the time, is that I still don't know if a case is going to survive a Section 230 motion to dismiss. I can never tell.
Some of my strongest cases have been dismissed, even recently. And some of the cases where I think that there is a better argument for the platform to be claiming that it was content-based, those have gone through. And so it's hard to deal with a client and not be able to tell them whether their case is going to be thrown out of court in a preliminary motion, or if we can actually try their case in front of a jury. So we may be getting closer to a middle ground in terms of the cases that do actually succeed because those cases really are just succeeding past a motion to dismiss.
They're succeeding on the right ground, where we're looking at the defects of the product, the failure to warn, the infrastructure. So that's great. But we're still in a terrain where there's a lot of unpredictability at the moment we file the case. And the tech companies relish that. They will gladly take their chances on a Section 230 motion rather than settle with a client.
Justin Hendrix:
I mentioned the idea that you are involved in advocating for various legislative reforms to Section 230. One of the most recent is the Algorithmic Accountability Act. You mentioned Techdirt earlier; Mike Masnick has been quite critical of this bill too, in particular making a couple of arguments about it that are probably somewhat similar to arguments you've seen against Section 230 reforms from Mike and from others over the years. On the one hand, even if you were to reform Section 230 in this way, there's still the First Amendment right behind it, which is something we hear a lot around Section 230 reforms. But then there's the argument taking on the language from the two senators who put forward this particular bill, John Curtis (R-UT) and Mark Kelly (D-AZ): the idea that they are making these comparisons to physical products and, effectively, critics of this bill are saying, "In fact, this would do perhaps the opposite of what these folks want to accomplish."
I don't know. When you look at the criticisms of this type of bill, this type of reform, how do you contend with those types of arguments?
Carrie Goldberg:
Well, I think that the whole issue of whether something is a product or a service, I feel like most judges don't want to engage with that. There's the Restatement of Torts, which basically says that if something is product-like, analogous to a product, then you can still use product liability claims in those cases. So I think the Algorithmic Accountability Act is great in terms of clarifying that and making it unambiguous, but it's probably not the most essential thing. What we always need is unambiguous guidance for judges to know that a cause of action can proceed.
Justin Hendrix:
One of the things I find myself thinking about a little bit lately, and I'll just put this to you and see if it resonates in any way, is the extent to which, especially these Silicon Valley firms and some of the large ones that we've talked about here, are seen as more or less not caring about the consequences of their products. Not, certainly, acting fast enough or putting enough resources into trust and safety. And of course we've seen even more profound rollbacks this year. The Grok case, I don't even know where to start. There's no way to look at that and just not question how the adults who earn their paychecks there can sleep at night, at least from my perspective. But the other thing I wonder about sometimes, especially when it comes to the kind of free speech question, is the extent to which these types of abuses are also feeding into the hands of authoritarians and despots around the world who want more severe surveillance systems.
And I often wonder about that, the extent to which, because Silicon Valley firms have not addressed some of the liability issues that you're aiming at, or because they've been shielded from them in many cases, it has created a situation where in fact the real censors are able to make an easy case: these are dangerous products, dangerous platforms run by executives who can't be trusted.
Carrie Goldberg:
I think that it's kind of twofold. I think what you're saying is absolutely right, that we already have countries banning X. And I think our country protects the tech industry. Our courts protect the tech industry. Legislation protects them. And so it's easy for the rest of the world to say, "My god. The United States just lets these companies run amok."
Justin Hendrix:
And now is willing to tariff us to make sure that-
Carrie Goldberg:
Right. And everything that these tech companies are doing to American people they're also doing to people in these other countries. There are treaties that require litigation be in the United States, or that countries have the equivalent of Section 230, in order to trade with the United States. And these American companies are doing so much surveillance, so it would be terrifying. Just like the United States didn't want a Chinese TikTok, how can these other countries be comfortable with these American products that have so much power and are abusing people and, in addition to that, are creating this mounting case law to protect themselves?
I feel that there is no product bad enough for lawyers to not defend it. Amazon had one of the most prestigious law firms, and still has it, defending its right to continue to sell a household product with no use besides suicide. And because of the advice that firm gave to its client, hordes more people died. The lawyers these defendants choose are not infusing any morality into the situation.
Justin Hendrix:
This is hard work. You mentioned that it started with a trauma for you, and you are perpetually looking into the minute details of many people's trauma. How do you keep yourself going? How do you sustain yourself in this work? I realize you say it's a vocation, and you're clearly very animated on behalf of your clients, but it must have some corrosive effect.
Carrie Goldberg:
A lot of times it's just, a lot of exercise and adorable dogs keep me going. But things can get really dark, particularly when you're in front of a judge who isn't sympathetic to a client's plight. And so, when we're dealing with really horrific cases like child rape and child abduction and suicide, it gets really hard on the days where we're not only representing the clients but also dealing with a loss in court. It got really hard when Amazon continued to sell this product and I was getting client after client during this interval of time. And I was going insane, a little bit, because no one cared. No one would write about it.
Mainstream media twice started developing news stories and then killed them because editors higher up said that they didn't want to run them. Meanwhile, Amazon was continuing to sell the product. So it got really hard, and it gets really hard sometimes, but I don't know. I come to work every day. I love this job. I'm doing my life's work. I do have really good coping strategies. Nothing in this job is ever going to be as hard as my first job out of college, which I did for five and a half years, working with Holocaust survivors every day. So that always puts things in perspective.
And also getting to take a client's trauma and alchemize it into a way to get them justice. It's very relieving for the client. And so I'm not just constantly immersed in a client's pain. I'm like, "How do we transform this to make it so that this doesn't happen to another family?" And that's usually what my clients' goal is. We sue for money, but money's never going to satisfy them. I could get them a $1 billion judgment and they will go home miserable because they won't have their child, who died because of that harm. It won't bring back the tragic loss that they're dealing with.
But what they do care about is making sure that this doesn't happen to another family. So the fact that Amazon finally, in October of 2022, removed sodium nitrite, and I've gotten almost no cases of dead kids from that interval on, shows that the work that we did with our clients really mattered. This suicide chemical that killed thousands of people during this period of time between 2017 and 2022 is almost obsolete now. And that's because of these cases and my clients being brave enough to sue the biggest corporation in the history of the universe.
Justin Hendrix:
For you, it's more about them reclaiming their power.
Carrie Goldberg:
It is. And it's really rewarding. It's exhilarating. There've been times when I get fueled by my hatred for the enemy, and I have to really remind myself. I do a bunch of different exercises and rituals, and we do them as a group in my office, to make sure that the engine that's fueling us is our clients' love for their deceased children. And as long as I'm on that side of the line, where we are fueled by our clients' love and not hate toward the enemy, I can keep going. And it's weird because it's just a perceptual twist, but it makes it a lot... It makes it joyful, because it's like we're doing this in the name of their child. And I get to know these kids.
Justin Hendrix:
Carrie Goldberg, thank you very much.
Carrie Goldberg:
Thank you, Justin.