Through to Thriving: Honoring Our Elders with Dr. Timnit Gebru

Anika Collier Navaroli / Jun 22, 2025

For a special series of episodes that will air throughout the year, Tech Policy Press fellow Anika Collier Navaroli is hosting a series of discussions designed to help envision possible futures—for tech and tech policy, for democracy, and society—beyond the present moment, dubbed Through to Thriving. Audio of this conversation is available via your favorite podcast service.

Welcome to another episode in this special series of podcasts called Through to Thriving, where we are talking to tech policy folks about the things that we can put our focus into right now that will help us arrive at a better technological future.

For this third episode, I spoke with Dr. Timnit Gebru. Timnit is the founder and executive director of the Distributed Artificial Intelligence Research Institute (DAIR). Last year, Timnit wrote a New York Times opinion piece that asked, “Who Is Tech Really For?” In the piece, she also asked a question that I had never heard anyone ask before: “what would an internet that served my elders look like?”

This year, DAIR has continued to ask these questions by hosting an event and a blog called Possible Futures that imagines “what the world can look like when we design and deploy technology that centers the needs of our communities.” In one of these pieces, Timnit, along with her colleagues Asmelash Teka Hadgu and Dr. Alex Hanna, describes “An Internet for Our Elders.”

I recently spoke with Timnit about building technology that honors our elders. We discussed the elders that inspire our work, how investing in thinking about possible futures has inspired concrete research projects, the imaginations of science fiction, and the not-so-inevitable future tech landscape.

Timnit explained that it’s important to understand that our technological realities are the creation of someone else’s imagination:

A few people who have lots of money imagine what could be done, what they wanna do, how they wanna live, what kind of world they wanna live in. And then they get money and they get power to execute on that imagination.

She also described how that imagination guides the way technology is advanced and deployed in society:

We learn, especially at least in the western world, about science and tech as being from no one's point of view. It's like, it's just the truth out there or the next evolution, the next inevitable evolution. Like obviously there were horses and then steam engines, and then cars, and then self-driving cars and then flying cars, you know? This is the natural progression of things. And first we have to really understand that there is no natural progression of things. Literally all of it is about who is getting the resources to execute on what imagination.

Timnit noted that living in someone else’s imagination has distinct consequences for those of us who work in tech policy:

We're sitting around being reactive. Why shouldn't we live in our own imagination? If we're living in someone else's imagination and if we're always being reactive, we're not giving ourselves the chance to think about that and execute on that. Instead, we're always gonna be in reactive mode. We're not gonna be able to forge what we think should be forged.

We also discussed the assumptions baked into tech, especially regarding elders:

The idea is that these things are not for them, it's like they first have to upgrade themselves in some way. That's really an unspoken sort of rule in tech. The idea is that people first have to change themselves in some way. And then they can use the tech for something. It's never about every single human being as they are in this world should have tools that improve their lives in some way. It's not that. That's not the philosophy. The philosophy is that there's this thing that'll improve your life in some way, but you first have to change yourself to be able to use that too.

What follows is a lightly edited transcript of the discussion.

Anika Collier Navaroli:

Welcome y'all, and thank you for joining us again for this special series of podcast conversations called Through to Thriving. My name is Anika Collier Navaroli, and I am your host and a fellow here at Tech Policy Press, and I am talking to some amazing folks about what we can put our focus and our energy into now so that we can get through this moment and to a place of a thriving and better future. So today I am going to be talking about building tech that serves and honors our elders with the amazing Dr. Timnit Gebru. Timnit, welcome to the podcast, and thank you so much for joining us.

Dr. Timnit Gebru:

Thank you for having me. Good to see you.

Anika Collier Navaroli:

Good to see you here and good to talk to you too. For any folks who may be listening to a Tech Policy Press podcast and for some reason have no idea who you are, would you introduce yourself to us?

Dr. Timnit Gebru:

So my name is Timnit, Timnit Gebru. Timnit means wish in Tigrinya. I think that's relevant for our conversation today. That's my mother tongue. It's the first language I learned how to speak. Tigrinya is a language spoken in Eritrea and also the Northern Ethiopian region of Tigray. I mean, my job is that I'm the founder and Executive Director of DAIR, and DAIR stands for the Distributed Artificial Intelligence Research Institute.

Anika Collier Navaroli:

Well, thank you again for joining us. I want to talk a little bit about the first time that I actually spoke with you. I don't know if you remember this conversation, but I promise it's going to be relevant and it's going to get us into talking about our elders.

And the first time that we had finally connected, I remember I was visiting one of my best friends and their mom lives with them and I had said to their mom, "I've got to go get on a work call. I'll be right back. I'm going to go take this call, do what I got to do." And so I got on the phone with you and we're starting to have this conversation and a couple of minutes into it, their mom busts into the room and is looking at me and soon enough sits down on the bed, takes their hand out, stretches it to me, and there's a phone, like an iPhone.

Dr. Timnit Gebru:

Oh yeah, I remember this.

Anika Collier Navaroli:

Yes. Yes in it. And I was on the phone with you.

Dr. Timnit Gebru:

Okay.

Anika Collier Navaroli:

I was like, yeah. I was like, "I think that I've got to go right now." And I'm like, "I'm not quite sure exactly what's happening." And it turns out she was closing on a house, she's in her 70s and she couldn't figure out how to use DocuSign.

And so I ended up spending the rest of my day sitting there and on the phone trying to be like, can we get it sent to this email address? Doing electronic signatures and doing all of these things. And I literally remember, no joke, I talked to my therapist about this because I was like, what? It was a moment in time in my life where I was sitting around thinking about what am I prioritizing? What are the things that are important to me, and that I'm making a lot of time and space for? And it was one of the first times that I was saying, family is really important to me. Elders are really important to me. And so in that moment, having to make that decision, I was like, "I've got to go." And I was like, I really hope that Timnit understands and doesn't think that I am just the biggest flake on the planet.

Dr. Timnit Gebru:

I'm glad you went. Yeah.

Anika Collier Navaroli:

The house is closed, everything went well.

Dr. Timnit Gebru:

Everything's good.

Anika Collier Navaroli:

Everything is good. And I remember having that sort of feeling and saying, "I didn't know you that well." And I was like, I don't know if you'll understand that when the elder comes to me and asks for help.

Dr. Timnit Gebru:

I remember it was a few years ago. It was like 2022 or '23.

Anika Collier Navaroli:

It was a while. I don't remember.

Dr. Timnit Gebru:

Yeah, I don't remember exactly, but I was like, I think it was during the shutdown maybe 2021 or '22. I don't remember. Yeah.

Anika Collier Navaroli:

I don't remember either. But it was one of those moments. And I think having gone through that, I'm so happy to sit down and be able to have this conversation with you more about technology and elders and what it means and what it can look like to build technology that serves and honors our ancestors. I want to talk first about a New York Times op-ed that you wrote. I should have looked at the exact date, but it was this year, correct? Am I making that up?

Dr. Timnit Gebru:

Well, I think it was released this year or no, no, it was the end of last year because it was like a reflection of 2024, I think. So it was released kind of maybe around November or December of '24.

Anika Collier Navaroli:

Okay. So you asked this question, you asked a couple questions in there that I want to talk a little bit about, but you asked this question of who is tech really for, historically?

Dr. Timnit Gebru:

Yeah.

Anika Collier Navaroli:

Who is that?

Dr. Timnit Gebru:

Yeah, I mean exactly. Because I was thinking about how these people make me forget that I am a technologist. I studied electrical engineering, I liked math and physics. That's what I got into tech to do. And here I am constantly having to just say, "Don't do this, don't do that." And I was just like, no, that's not what I want my career to be. But the thing is, there's all of the activities that I think are fun versus the stuff that those activities lead to. I think math and physics are fun. I think that building tech is fun. I think debugging is fun, the way you have to think in logical sequences. All of the activities in the space that I studied were interesting to me and fun.

However, the things that I was building using those activities, I just never had the opportunity to think about what they are and who they serve. Because the norm is you go to school and you are so busy trying to figure out the technical details that you're not thinking about the big picture. You don't have time to do that. You have all of these classes you have to take, you're drowning, and then people are discriminating against you and you're fighting that and all of that. And then you go to the next thing. And then you can't ever take a step back and say, "Wait a minute, what am I building and what am I contributing to?" And in my engineering education, I also did not get exposed to people who are asking that. There are some whole fields of people and scholars asking this question.

So then of course now it's very clear who tech is for. I mean, I took it for granted that, for example, in all of my classes, I remember that ... It had something to do with the military, but it was just sort of taken as a fact. It's not something to critique or think about, or to think that it could be something different. Oh, the C programming language, oh, that was developed because the military ... Or the internet, it was ... Yeah, the military ... And it's just, oh, interesting. It seems like the military has a lot to do with this. Thank God that the military gives money to at least ... So you don't get a chance to think maybe there's a different way. It doesn't have to be this way.

So when we look at who tech is for, especially now with AI and what OpenAI and all these people are doing, we are living in someone else's imagination. A few people who have lots of money imagine what could be done, what they want to do, how they want to live, what kind of world they want to live in. And then they get money and they get power to execute on that imagination. Or empires want to make sure they keep their grip and execute on that imagination. So if we take a step back and we follow the scholars who have given us some things to think about, then we arrive at this conclusion, and then we can see that we could do something differently.

Anika Collier Navaroli:

So much in there to unpack. There's this thing that you said about living in someone else's imagination. I tell folks that one of my favorite things that I used to do when I worked inside of industry was, when we had all-hands meetings, we got to see inside of execs' houses and look at their bookshelves and see what books they had on their bookshelves. And inevitably it was some science fiction book written by some white guy who was-

Dr. Timnit Gebru:

Not Octavia Butler.

Anika Collier Navaroli:

Exactly. Exactly. Right. And so I made it a habit to start reading those books and to see what are these futures that these folks are actually trying to build? And asking the question, does it include me? Because here's kind of like the blueprint that's sitting on the bookshelf.

Dr. Timnit Gebru:

Yeah.

Anika Collier Navaroli:

Am I included in that? And I think that was something that absolutely continued to ... It was one of my favorite things to do when I was sitting inside of companies because it really is that sort of-

Dr. Timnit Gebru:

Same.

Anika Collier Navaroli:

These imaginations that are so small. And I think you've also said something about how, in the roles that so many of us are placed in, and I feel this so deeply in the policy world, what we get to do is sit around and say, "Don't do this. Please don't do that." We spend our time so much in this sort of reactive stance that we don't get to have our own imaginations. We don't get to have our own sort of ability to look into the future and think, what could this look like? So in this piece that you wrote for the New York Times, you inserted this question that was, "What would an internet that served my elders look like?" When I read that, I literally remember I pulled out my phone and messaged you because it felt like a gut punch. It was just like, yes, that is a question that I had never heard anybody ask before. What made you ask that question?

Dr. Timnit Gebru:

It was based on, similar to your podcast, we talked about this before. We started a Possible Futures series. And that's kind of how-

Anika Collier Navaroli:

Shout out to Dylan.

Dr. Timnit Gebru:

Yeah, Dylan the connector.

Anika Collier Navaroli:

Right.

Dr. Timnit Gebru:

And the reason we started this series was exactly what you said before. We're sitting around being reactive. Why shouldn't we live in our own imagination? If we're living in someone else's imagination and if we're always being reactive, we're not giving ourselves the chance to think about that and execute on that. Instead, we're always going to be in reactive mode. We're not going to be able to forge what we think should be forged. So this Possible Futures series was a way for our own team and maybe some others, a few other people, to have a structured way of thinking about ... To have space for us to imagine and think about how we can execute on that imagination. So we had to have a little bit of structure. It's not so speculative that it's too far out. It's kind of like we have a little bit of a structure in there and guidance.

So in this one, I was thinking ... So it was me, Alex Hanna, and Asmelash Teka, who's our fellow, who wrote it. And I was thinking about how whenever we talk about tech, I mean, I have this kind of conversation with people a lot. They talk about digital literacy. The idea is that people first have to change themselves in some way, and then they can use the tech for something. It's never about, every single human being as they are in this world should have tools that improve their lives in some way. That's not the philosophy. The philosophy is that there's this thing that'll improve your life in some way, but you first have to change yourself to be able to use that too. And so I've been thinking about this because it's kind of like a paradigm in the tech industry. And I remember, I think I was reading a Paris Marx book, The Road to Nowhere, and there was an example of some Princeton students talking about how ... So of course the future should have self-driving cars. That's just a given for these people. They self-drive. But there's a lot of problems.

Anika Collier Navaroli:

The Jetsons.

Dr. Timnit Gebru:

Yeah, there's a lot of problems. The future should have self-driving cars, but there's a lot of obstacles to making that happen. One of them is obviously people walking around.

Anika Collier Navaroli:

Obviously.

Dr. Timnit Gebru:

Walking around the street as we do, as one does, preventing the self-driving car from just driving around. So how do we prevent that obstacle? And it's like, oh, maybe every person, every human can wear this tag. I don't know if it's ... I don't remember what it was based on, but some tag that then communicates with the car and each of these tags would cost $2,000 or something. And so this was the proposal and the so-called best minds of America are pontificating about how to make sure that each person wears some $2,000 thing because that's the way to potentially make self-driving cars exist. That's just kind of like the logical conclusion of the paradigm we're in, which is the assumption is this thing has to exist-

Anika Collier Navaroli:

It must exist.

Dr. Timnit Gebru:

And then how do people change themselves? And I was thinking about it also, we had a similar Possible Futures workshop like a few weeks ago.

Anika Collier Navaroli:

Oh yeah, I was there.

Dr. Timnit Gebru:

Oh, yay. So we were talking about this and I was thinking about, again, you go back to that philosophy. So for someone to go on the train or for someone to go on the bus, I think the assumption isn't that they need to change something about themselves. It shouldn't be. If you're a wheelchair user, you use a wheelchair, you go on the bus. If you're an elder, you have your cane, you go on the bus or you go on the train, that's it. It's there. There's systems. If it's broken, there's ways in which to figure that out. But a car is different. It's like we all have to be little pilots. Imagine all of us. It doesn't make any sense to me because imagine if every single person could fly a plane and all these planes flying around and all of these rules, it just makes no sense. And I imagine back to when people imagined cars as a dominant mode of transportation.

And first we all have to learn how to drive cars, and then everybody has to know the rules and not drive fast, but not kill. I mean, it's meant for you to drive fast, but you can't drive fast because you're going to kill people. And all of society has to be remade to accommodate the existence of cars. And I never thought, even back to when I was thinking about so-called responsible AI or responsible data model practices, et cetera, I remember even from 2017, I had a talk, a TED Talk, and I gave this example of how it took a long time for seat belts to be a norm. People resisted them, car companies resisted them, et cetera. But if I were to do that talk again, that's not even what I would talk about. I would talk about how cars-

Anika Collier Navaroli:

What would you talk about?

Dr. Timnit Gebru:

I would talk about how cars came to be the dominant mode of transportation, whose imagination was that and how it doesn't make sense. That's kind of what I realized when I read Paris's book. So now going back to the whole elders thing, again, public transportation, trains, buses. That is an assumption that you don't have to change how you are. You should have this mode of transportation available to you without changing who you are. So whereas the idea of cars, it's oh, individual freedom. I don't have to be around other people. I don't have to see the masses. I can just take off whenever.

Anika Collier Navaroli:

Very American. Very American.

Dr. Timnit Gebru:

And then you have to plant that imagination into people and you have to fight all of these politicians and get in their pockets to make sure it happens. You have to destroy Black neighborhoods to build highways. And it's just like all of society's remade and suburbs are born. And so when we think about the internet, it's kind of the same. It's like there is this idea, when you think about it or tech in general, who's the quintessential tech user? Oh, are you tech-savvy or are you not? It's not, oh, is there tech that works for anyone as they are? Is that what we're building? And so now we make fun of older people when they can't use the TV well, or when they can't use the phone or when they can't figure out how to do something on social media. But we're not making fun of ourselves for building something that doesn't work for them.

Anika Collier Navaroli:

That isn't user-friendly. Especially thinking about who our users are, who users of technology are. We're talking a little bit about elders of course. And I would love to ask if you wouldn't mind talking a little bit about some of your elders that inspired you to write and think about a lot of these things.

Dr. Timnit Gebru:

So in this one piece about our internet for our elders, I was thinking about, for example, I mentioned earlier my mother tongue is Tigrinya, and the assumption is that, oh, no one should care about people speaking this language or only this language because it's only a certain portion of languages. Why should I care about them when there's all these other people who speak all these other languages? And you probably would relate to this, I mean given where you worked before, but that attitude is one of the reasons we have such a toxic online experience for so many groups of people around the world, including a genocide that's fueled by social media, language that spreads like fire on social media. Like when we talk about the Rohingya or the genocide of Tigrayans, that's kind of when I really started thinking about it, because I've read a lot about these kinds of issues and I never really cared about social media. I'm not a social media researcher.

But when you start seeing the toxicity, the level of toxicity in your own language, then you really understand what we're dealing with. When I read about the Rohingya, it doesn't hit as close to home as when I see it in my context and my languages and I try to report it and then what happens? And it's literally, "Well, these people don't really bring in ad revenue. How can we dedicate this many resources to them when there's 8 billion people in the world, that's not a very widely spoken language and all of that?" And then we end up with basically the deadliest genocide of the 21st century thus far, where over a million people have been killed. Imagine 1 million out of 7 million, at least 300,000 victims of rape as a weapon of war, et cetera.

And whenever I tried to talk about this and get people to care, it was like, "Well, again, why should I care about that one little tiny small group of people?" And so we started working on language technology. Even to do this analysis of social media platforms, et cetera, we had to do this work on language tech, and I had to learn how to justify why I'm working on that specific language tech out of all the other languages and everything. And then that led me to think about why shouldn't I center specific communities that I know, that I'm from, why should they not have what other people have?

And then that got me thinking. I'm thinking about my grandmother. My grandmother not only spoke only Tigrinya, but she didn't read or write. And now supposedly with the whole digital literacy thing, if someone doesn't read or write, it's "Oh, they're a lost cause, they shouldn't even have tech that ..." If they're illiterate, first they have to be literate and then they can use tech, kind of thing. But there's all these people, oral cultures are very rich cultures, so why shouldn't she have tech that met her needs as she was? And so that's kind of how I started thinking about these things and my work, where I started thinking about the philosophy that I see that's widespread, that's taken as a given, and how it really causes harm to communities that I know, that I'm from.

Anika Collier Navaroli:

Yeah. Yeah. Thank you for talking a little bit about that. It does really resonate with me. I think about my own grandparents. My grandfather also couldn't read or write. And so I think about, he's now an ancestor, but what his experience of technology would have been like, which he wouldn't have been able to participate in so many of ... The pieces of this. I also think about my grandmother who's also passed, but you mentioned wheelchairs, canes, these systems that were actually technology systems that were created to meet people where they were at. My grandmother was blind and in a wheelchair and had all of these systems of technology that were helpful for her in so many ways. But I think about the internet, social media, it's not for them. These are not the same thing.

Dr. Timnit Gebru:

Yeah. The idea is that these things are not for them, it's like they first have to upgrade themselves in some way and then they can have ... That's a really unspoken sort of rule in tech. And then if you want to work on tech that works for specific groups of people, then you have to specialize in things like accessibility. It's not the norm, it's not the foundation. Oh, I do accessibility. So that's the part that gets to think about disabled people, of all the different kinds of disabilities. That's basically a majority of the world. I mean, if you really think about it.

And so sure, if you are the quintessential internet user, you basically are the person who has hegemonic views. You have all the money to build tech. So white dudes, Silicon Valley or whatever, who get all the money. Again, it's their imagination that's being implemented. So they'll build something that kind of follows whatever ideology that they want to follow. And then the idea is that that has to work for everybody around the world. And if we don't meet that, we're not like that in some way, then oh, then first we have to change ourselves and then we can ...

Anika Collier Navaroli:

This piece on changing yourself or having to upgrade yourself in a certain way in order to be able to engage with technology, I think is really interesting. Especially with this phrase that you used in the piece that you wrote in the Possible Futures series along with your colleagues, referring to grandmothers as technologies of memory themselves. What did you mean when you said that? And can you explain a little bit more about how calling someone a technology and being a technology while also experiencing technologies can be these different things?

Dr. Timnit Gebru:

Yeah, I never really thought about this. It's like you think about things when they're scarce, I think. So now any elder I have, like my uncles or something, I try to spend as much time as I can with them because they're my connection to whatever villages my grandparents are from, all of the information about these things and all the knowledge, all the hyper local knowledge that they have. And the fact that we have oral cultures means that you only get that knowledge when you're with them and you don't have access to that knowledge when you're not with them. It's very interesting, like a gatekeeping mechanism too. It's like data protection.

Only the people who have access to these people have that knowledge. And so they tell you all of these stories about the past, about the present, about your family members and how you're related to everybody. I mean, I don't understand how this is possible. They have a social graph that goes back generations. I've never seen anything like it. My brain doesn't work that way. I guess I lost that part of it. I don't know. Any person on the street, they'll be like, "The way we're related to that person is." And then 20 minutes later you're like, oh my God, what? I need to record this. I can't remember.

Anika Collier Navaroli:

Yeah. Ancestry, yeah.

Dr. Timnit Gebru:

Yeah. It's very interesting. And a lot of history is also recorded through poetry, through there's all these sayings, different phrases, and so you only get access to that knowledge from your elders. And it's something very unique. And I only kind of think more about it as I'm growing older too, and I know I only have limited time with these elders of mine.

Anika Collier Navaroli:

Yeah, I feel the ticking time clock as your folks start to age and you can see that happening and you want to be able to capture the memories that they hold. You talked a little bit about how much of our cultures are oral cultures and how much of our histories are stored through poetry. You asked a series of questions in your piece, and I'm going to ask you these questions because I want to know what you think about them too. But you said, "How can the internet bring us closer to family and enable the transfer of wisdom, kindness, and community held by our grandmothers?" One of the things I was thinking about was, of course, oral history projects. What would that look like if that were the foundation of our internet?

Dr. Timnit Gebru:

Yeah. What's interesting, I would say that one thing I like about this Possible Futures series just for myself, is that it put me in a space of thinking about this. And then based on this piece, we spun out a bunch of research projects.

Anika Collier Navaroli:

Yeah, really, tell me more.

Dr. Timnit Gebru:

Yeah. For real. I don't know if it would've happened if we didn't just put ourselves in this position of thinking about these things. So one of them is a collaboration with ... Do you know Whose Knowledge? There's an organization called Whose Knowledge.

Anika Collier Navaroli:

Tell our listeners who might not know who that is.

Dr. Timnit Gebru:

It's a great organization. And then they just basically interrogate whose knowledge is represented online, on the internet, et cetera. And so we are collaborating currently with them on this research project, and it's called Sovereign Language Technologies for Online Knowledge Repositories. So you're talking about an oral history project, what does it look like to have an online knowledge repository? So for example, Wikipedia is an example of an online knowledge repository. So initially we started saying Wikipedia is really not useful in many languages. It's full of misinformation, it doesn't have any locally relevant information, et cetera. So we did some research looking into why, and some of the Wikipedia editors in some of these languages said that they don't have appropriate language tech tools. They don't even have spell checkers, they don't have keyboards. They would like to have speech to text transcription tools that allow them to speak and transcribe to text, et cetera.

And then we were like, "Okay, let's work on that." And then we were like, "Wait a minute. Wikipedia might not even be the right form for some of these cultures. What if there's an oral kind of Wikipedia? What about an oral archive? Maybe someone can edit the archive orally or something." And so then we went backwards and we're like, "Okay, what should the form of the online knowledge repository look like? And then what are language technologies that we can have to support people in having those forms?" So that's kind of what we're doing research in right now.

Anika Collier Navaroli:

That's fascinating that these questions have been able to spin out into research and that's so much of what I'm trying and hoping to do with this podcast and these conversations that we're having. Policy is deep and sometimes philosophical, and we take ourselves so seriously. And what if we take a step back sometimes and think about things like joy or community or elders and what space can that open up to create things like research projects or new policy ideas?

Dr. Timnit Gebru:

I can tell you another research project.

Anika Collier Navaroli:

Oh, please go. Please go.

Dr. Timnit Gebru:

The other one is the idea that, so I mentioned that I kind of have to justify why put resources in language technology for Tigrinya? Because oh, those are just like 10 million people, whatever. What about this other language? What about this other language?

Anika Collier Navaroli:

Just 10 million people?

Dr. Timnit Gebru:

Exactly. That's it. In the span of ... Not exactly. In the grand scheme of things, who cares? And so that's always going to be the attitude for most groups of people. So most groups of people are collections of groups of a certain number of people, whether it is 10 million or 1 million, whatever. And then that collection expands to the world. Most groups of people are not just lumped into 1 billion people who do X. And so I was thinking, and I was speaking with a number of our fellows and other people about, if you wanted to create tools that serve as many people as possible, you're not going to do it by becoming a monopoly and doing everything, building language tech for every single language in the world. I will be the person to do that. That's what the OpenAIs and the Facebooks and the Metas and the Googles and stuff claim. We have one thing that'll serve all of these different people.

Anika Collier Navaroli:

One size fits all, yeah.

Dr. Timnit Gebru:

One model for everyone. One size fits all. And so what's the opposite of that? It's, okay, many sizes for many people. Just similarly to how I had to justify why I'm working on this language or some other language that I know, if there is one small company or one small organization ... One such organization is Lesan, and the founder of Lesan, Asme, is one of the authors of that piece, and he's a fellow at DAIR. So he focuses on Ethiopian and Eritrean languages, just a few of them, not even all of them. There's 100 of them. And he at least started focusing on the languages he knows. And because he knows the context in the community, he created language technologies that really outperform any big tech language technology, because A, he cares about those people. And B, he knows the context, so he'll create something better and he cares about the quality because he cares about those people. Whereas for Google, it's a group of 10 million people; they don't bring in any money.

So if we want to replicate that for multiple groups of people, we want to have multiple Lesans, where each of them is not trying to become a monopoly, but they're trying to build products for a small group of people, or a relatively small group of people, that they know, not really small. And so we are working on this federation of small language technology startups and orgs, to start in the African context, because one third of the world's 7,000 languages are spoken on the African continent at least. So we're working on this federation to have a bunch of small language tech startups that each focus on a small number of languages so that they don't become monopolies. And that matters because if you try to increase your linguistic coverage, you're going to decrease the quality of each language.

But if you're just a single company, you can't really survive as a small business because some other company is going to come and say, "Well, I need coverage in these 10 languages that you don't have." And "Oh, sorry. Facebook has it," even though it's bad quality.

Anika Collier Navaroli:

It's not good. Yeah.

Dr. Timnit Gebru:

Exactly. So this is another kind of thing that we're working on with a number of other partners of ours.

Anika Collier Navaroli:

Amazing. Thank you so much for sharing a little bit about the research that is really very tangible research that has come out of just the process of sitting back and giving ourselves the ability to no longer just be in that reactive space, but to imagine and think about what we could potentially create. You mentioned misinformation, and we've got to talk about the elders and misinformation.

Dr. Timnit Gebru:

I know.

Anika Collier Navaroli:

And the WhatsApp aunties and the conversations that we're having. I'll say, I was talking to my same friend's mom, who is an immigrant to the United States, and she was telling me about this video that she had seen, and it was saying that these immigrants aren't welcome in these cities, whatever. And I finally was like, "Let me see this." And I look at it and it's completely AI generated. You know what I mean?

Dr. Timnit Gebru:

Oh my God.

Anika Collier Navaroli:

Voice is like robot voice. I check out the YouTube channel and it's just video after video targeting immigrant communities specifically, trying to stir this up. And so I sit down and have this conversation and I'm like, "The companies don't have protections for folks that are at the margins, and so therefore these things are being beta tested." And then of course, we have to go back to the WhatsApp group to say, "We looked it up and this isn't true." You know what I mean?

Dr. Timnit Gebru:

Yeah.

Anika Collier Navaroli:

We were dispelling misinformation. And I think that it is something that is hard and something that we don't necessarily talk about enough.

Dr. Timnit Gebru:

This is, to me, exactly another instance of what we talked about, where, oh, why should we care about this one community? And therefore ... So it's like, oh, this is just one specific diaspora group. And so I'm not going to spend my time talking about misinformation in this one language because it only affects this one community. But then actually the same thing in all these different forums is happening to many different communities, which collectively create a big group of people.

Anika Collier Navaroli:

Yes.

Dr. Timnit Gebru:

So it is happening a lot in so many ways. It's hard because you can't ... You have to find all of these different ways of alerting other people about it and why they should care. "Hey, this is happening." And like you said, this is happening with one group of people and they're a testing ground, and so if you don't care about this one group of people, this is what's coming to you. That happened when I was working at Google, too. I remember one of the people who worked under us was Mahdi, who was a Moroccan journalist and computer scientist. So he got his PhD in computer science, but he was a journalist before that. And he was telling us that the most popular YouTube and Facebook channels were arms of the intelligence unit.

Anika Collier Navaroli:

Oh, wow.

Dr. Timnit Gebru:

And they were taking people's private information and publishing it. They were doing all sorts of things. When we tried to alert ... I mean even when I was at Google, YouTube was ... Just everything that we sent there, we never heard back. They would be like, "Sorry, we're focused on the U.S. election right now." Sure. The U.S. election is very important. I care. I'm in the U.S. I care about the U.S. election, but you are operating in a lot of different countries around the world. So this teammate of ours, his friend Omar Radi is a journalist who was in prison, and he was one of the first victims of NSO Group's Pegasus, like the spyware. But you know what I mean? Pegasus was used all over the world, and this person was one of the first victims, the government, et cetera. And I remember Salvadoran journalists talking about this too. "Oh, now you guys get to see what we've been saying about our leader, now that people in the U.S. are being sent to El Salvador."

So it's the same story over and over again. This one group of people, we shouldn't care about them because they're a minority. We have to care about the majority and we can't care about this one specific context, et cetera, et cetera. And in so many different ways, we see the fact that this one specific context, A, we should care about it because they're people, and we should care about all people as they are, because there's no hierarchy of human life. And B, this one context is always a testing ground. If you allow this to happen, that means you're living in a world where this one thing is happening. So this example that you're talking about of specifically older immigrant communities, diasporic communities, it's something-

Anika Collier Navaroli:

It's a beta test. I was like, "Oh, this is coming."

Dr. Timnit Gebru:

I see this all the time over and over again in so many different ways. And so again, I was thinking similar to the federation of language tech startups that we were working on. I mean, I've been talking to other journalist friends of mine, et cetera. I'm like, maybe a bunch of different communities should form an alliance and have a bloc that is more powerful so that they can, whether it's lobbying the companies or lobbying the government. Because if you go to the ... I mean, which government? I don't even ... Well, I'm just saying, because if I go to someone on behalf of ... Tigrinya-speaking elders in the U.S., they'll be like, "Why should I care about this one little marginal group of people?" But if we all, so many different groups like that, form some sort of alliance, then we can kind of talk about common issues that we have.

Anika Collier Navaroli:

Yeah, I think that piece on not enough people for anyone to care also goes into this piece that we were talking about at the beginning of our conversation around whose imagination are we living in? Because if it's your world and it's your family and it's you and your people, you're going to be concerned. But if it's those folks over there, who cares? Who cares if this thing is being beta tested and eventually is going to come and impact our communities?

Another question I have for you, and we'll wrap it up soon here, is talking to elders about technology, talking to elders about AI. I would love any recommendations you have for us about this. I will say again, I love this lady so much. Same person. I didn't realize-

Dr. Timnit Gebru:

Does she know, she's like your running example?

Anika Collier Navaroli:

I told her, I was like, "I just want you to know I'm going to talk about you on a podcast." She was like, "What?" But yeah, love her to death. Shout out to her. She watches soap operas during the day, like so many human beings. And I was sitting down watching them once and realized that there was this whole storyline about AI deepfake revenge porn that was happening inside of The Bold and The Beautiful. And I was sitting there, oh my gosh, I think that this is how this individual is being introduced to the concept of AI.

Dr. Timnit Gebru:

To AI.

Anika Collier Navaroli:

And I said something to her that was like, "This is AI." And she was like, "AI who?" And I'm like, "Exactly." But it is that sort of like, how do we have these conversations? How do we explain these things that are completely outside of their realm, or the sort of knowledge and information that they're getting is coming from really weird sources?

Dr. Timnit Gebru:

Yeah, I mean it's really hard because there's a lot of disinformation. So I've been very frustrated, for example, with 60 Minutes, CBS 60 Minutes, a lot of older people watch CBS 60 Minutes. I was on there once and I barely had any airtime. Most of the airtime was for a Microsoft CEO. And so that's, to me, press release masquerading as journalism. So if the press is going to be kind of just doing lip service for the CEOs who have a vested interest in a specific narrative, then that is the narrative that, forget elders, not just elders but most people are hearing. So what we need is accurate information to be seeded into many of these spaces that people ... So I know that Tamara Kneese from Data & Society had an article in Teen Vogue about climate change and AI because that is where teenagers are going to-

Anika Collier Navaroli:

That's where people are.

Dr. Timnit Gebru:

Or TikTok. There's some academics who are really good at all the social media channels. I don't know. I think Casey ... I don't know how to say her last name properly, Fiesler. I'm not sure, but I see her on all the social media channels, having explainers, doing all this stuff because that's where people are. So accurate information needs to be seeded wherever any group of people get, wherever they get their information. And it's very difficult to fight the disinformation, misinformation because you know who is resourced and who isn't.

Anika Collier Navaroli:

Misinformation works. People wouldn't spend so much money on disinformation if it didn't work, and if it didn't work tremendously well. I think that's something that we have to keep in mind, especially thinking about our elders and how they're being targeted specifically by so much of this. I know that you are writing a book, and I'd love to end our conversation talking a little bit about this question: we're building these technological futures, so how do we build a technological future that serves our communities, our elders, and our grandmothers?

Dr. Timnit Gebru:

Well, I don't know if the book title is going to stay the same, who knows? But the current one is The View From Somewhere, and it's kind of a callback to Donna Haraway and the feminist critique of "the view from nowhere," the way in which we learn, especially at least in the Western world, about science and tech as being from no one's point of view. It's just the truth out there or the next evolution, the next inevitable evolution. Obviously there were horses and then steam engines and then cars, and then self-driving cars and then flying cars. This is the natural progression of things.

And first we have to really understand that there is no natural progression of things. All of it is about who is getting the resources to execute on what imagination. A bunch of people created a Discord server for people interested in this Possible Futures thing, our Possible Futures community. And I was telling them that I would love to create an alternative curriculum for computer science students or engineering students. If my education was different, how would I want it to be different? In what way? Well, I would've wanted to be indoctrinated, not in this is the way it is, but more, hey, remember that this is someone else's imagination and this is what this imagination is driven by. You were reading the sci-fi books and I had to do the same kind of thing, be like, why are people obsessed with this AGI thing? And a philosopher, Émile Torres, and I wrote this paper where we coined the term the TESCREAL Bundle. It was like, wow, eugenics. This is what they're motivated by.

So it's about understanding that. Even though I hate having to spend time reading what these people are writing, and I feel like, my God, this is not what I want to spend my time doing, understanding that foundation at least allows you to be like, wow, these are the principles they're instituting, and what are the principles that I'm interested in?

The first thing has to be understanding the fact that we're living in someone else's imagination and how that is so. Then we can continue to ask, okay, so what are principles that we care about? So for me, Alex Hanna and Tina Park wrote this paper against scale when Alex was actually on our team at Google, and it was like, the whole Silicon Valley scale thing is this one-size-fits-all thing.

Anika Collier Navaroli:

Yeah.

Dr. Timnit Gebru:

And whereas, for example, the federation that we're working on is the complete opposite of that. It's like you achieve so-called scale, but that's not by having one thing for everyone. That's by having many things for many people.

Anika Collier Navaroli:

Many people, yeah.

Dr. Timnit Gebru:

Yeah. And when you think about movements too and how they spread, it's not by one person, I don't know, leading a movement for the whole world. It's one person starting something in one place and that spreads and it gets adopted by somebody else in another place. I mean, that's how things change.

Anika Collier Navaroli:

Yeah.

Dr. Timnit Gebru:

So that's what I would think about. It's first thinking about whose imagination we're living in. Secondly, giving ourselves space to figure out what principles we care about and how we can all use our skills to build on those principles. And three, resourcing. I mean, it's just money. Okay, money. We're always talking up there and okay, let's get back down to Earth. Resourcing and funding the people whose imaginations we agree with, which we think should be instituted. And so Sam Altman builds something bad and fails, but when he fails, he takes the whole world with him, and he has billions and trillions and whatever. The rest of us should be able to get to prototype things even if they fail, even if it was a mistake or whatever. But we should be able to experiment and prototype on the things we're imagining at a small scale, and we should be able to resource people to do that. For instance, Rudy Fraser, he's working on BlackSky.

Anika Collier Navaroli:

Yeah.

Dr. Timnit Gebru:

He's been doing this stuff unpaid, he told me. Literally, he's spending money out of his savings doing that. I'm like, that's not acceptable. It's not okay. We should resource people like him. There are so many examples like that. And so let's not just resource people to be reactive and talk about the issues. Let's resource them in executing on their imagination.

Anika Collier Navaroli:

Yeah. Well, thank you so much Timnit for joining us and for giving us those three really actionable items that we can do to continue to build a future that would honor and serve our elders and our ancestors. It was really wonderful speaking with you today.

Dr. Timnit Gebru:

Thank you for having me.
