'Seeing Like a Platform' — A Conversation with Petter Törnberg
Justin Hendrix / Aug 31, 2025
Audio of this conversation is available via your favorite podcast service.
Today’s guest is Petter Törnberg, who with Justus Uitermark is one of the authors of a new book, titled Seeing Like a Platform: An Inquiry into the Condition of Digital Modernity, that sets out to address the “entanglement of epistemology, technology, and politics in digital modernity,” and what studying that entanglement can tell us about the workings of power. The book is part of a series of research monographs intended to encourage social scientists to embrace a “complex systems approach to studying the social world.”
What follows is a lightly edited transcript of the discussion.
Justin Hendrix:
Petter, can you tell me a little more about your research interests and how you came to the point of writing this book? What would you characterize as your intellectual project?
Petter Törnberg:
So I've always been interested in the intersection of the social and the digital world, in part because I grew up as a computer nerd, so I've always been interested in programming and always existed in that world, but then gradually became more interested in the social sciences. What faces you when you make that transition, or at least what faced me, was the realization that there's not just a difference in topics, but that there are fundamental differences in how we understand these systems.
I really like the epistemological questions. My PhD was in complex systems, which is very much the application of methods from physics and the computational sciences to the social world. And my brother is a sociologist, so at family dinners we would live at this interface between the social world and the digital and computational world, and notice how we thought differently about things. So that's always been my broader project: using computational tools. And running alongside that, as I've been growing up and going into research, has been the transformation of society, with society becoming increasingly digitalized and with the social sciences adapting to that fact, and to a certain degree going through an epistemic shift of their own in how we understand what society is. And so, a lot of my research has been tracing those shifts and tracing how the digital is transforming the social world, how it's transforming our politics and the way we understand ourselves.
Justin Hendrix:
I'll point out, you've been on this podcast before, in particular to talk about social media and polarization. If I recall the paper that informed that conversation, one of the things you often do is create simulations, essentially; you do mathematics to try to understand various dynamics. Can you talk a little bit about that, about how you approach looking at these complex systems, what types of experiments you get up to?
Petter Törnberg:
Yeah. So I do all kinds of work empirically, I've used digital data from the platforms, I've even done the occasional actual interview, actually engaging with humans. But a lot of my work is focused on simulations, so building artificial models of the social world and letting them play out. In particular there's a technique called agent-based modeling, which has existed since the '60s and '70s, so it's been around a long time, and what it allows you to do is basically build models that are as simple as possible, but that capture the phenomenon you're trying to study. It very much comes from a physics or natural science understanding of the social world, in the sense that we're trying to model the social world from the bottom up: we're trying to capture humans and what humans do, and then look at their interactions and how those can lead to often unanticipated outcomes at a higher level.
So it's a little bit like a murmuration of starlings, a flock of birds. If you approached that the way a social scientist would, you would have a survey and some variables and then try to find correlations, and obviously that wouldn't capture the complex phenomenon that is a flock of birds, because the flock just is the interaction between many birds. What agent-based modeling allows you to do, in a very simple way, is to create a little model where the birds are rule followers, and they interact, and that interaction leads to the flowing, dynamic pattern that is the flock of birds.
And so, it comes with a certain understanding of what society is and how we should understand it, which is this notion that society is a complex system. That understanding, the sense that society is not just a complicated system but a complex one, has been making its way into social science over the last 30 years or so and has really become quite a dominant understanding in recent years. So a lot of my work is applying those methods, but also, in doing so and in watching these ideas of society as a complex system spread, being critical and trying to understand what that shift actually means and what assumptions come with it.
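[Editor's note: to make the agent-based modeling idea concrete, here is a minimal, hypothetical sketch of a flocking model of the kind Törnberg describes. It is not a model from the book or his papers; the three local rules (alignment, cohesion, separation) and all parameter values are illustrative assumptions. The point is the recipe: simple rule-following agents whose interactions produce an emergent macro-level pattern.]

```python
# A minimal, illustrative boids-style agent-based model (hypothetical example,
# not from the book): each "bird" follows three local rules -- align with
# neighbors, drift toward the local center, keep a little distance -- and a
# flock-level pattern emerges with no top-down control.
import numpy as np

N, STEPS, RADIUS = 100, 200, 10.0
rng = np.random.default_rng(0)
pos = rng.uniform(0, 100, (N, 2))                  # positions in a 100x100 world
vel = rng.normal(size=(N, 2))
vel /= np.linalg.norm(vel, axis=1, keepdims=True)  # unit-speed headings

for _ in range(STEPS):
    diff = pos[:, None, :] - pos[None, :, :]       # diff[i, j] points from bird j to bird i
    dist = np.linalg.norm(diff, axis=2)
    near = (dist < RADIUS) & (dist > 0)            # each bird's local neighborhood
    for i in range(N):
        nb = near[i]
        if not nb.any():
            continue
        align = vel[nb].mean(axis=0)               # rule 1: match neighbors' headings
        cohere = pos[nb].mean(axis=0) - pos[i]     # rule 2: drift toward local center
        separate = (diff[i][nb] / dist[i][nb][:, None] ** 2).sum(axis=0)  # rule 3: avoid crowding
        vel[i] += 0.05 * align + 0.005 * cohere + 0.05 * separate
        vel[i] /= np.linalg.norm(vel[i])           # keep unit speed
    pos = (pos + vel) % 100                        # step forward, wrap at the edges

# The macro-level order shows up in the average heading: individually "stupid"
# rule-followers end up moving as a coordinated flock.
print("mean heading:", vel.mean(axis=0),
      "| coherence:", np.linalg.norm(vel.mean(axis=0)))
```

Nothing in the update rules mentions a flock, which is the point: the higher-level order is an emergent property of interaction, exactly the thing a survey of individual birds could not capture.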
Justin Hendrix:
So you say in the introduction, of course, that you derive the title for the book from James Scott's Seeing Like a State, which many folks are introduced to in studying political science or the social sciences generally. You also write that the proposition at the heart of Seeing Like a Platform is that the digital era into which we are now entering signifies the rise of a new epistemology of power. I want to get into what you mean by that. But just to start, stepping back to Seeing Like a State, why was that an operative metaphor for you in going to do this work?
Petter Törnberg:
So first, it's an amazing book, and if the listener hasn't read it already, they definitely should. What Scott sets out to do is analyze the state as it tries to pursue some large project, to improve the world somehow. The book was written in the '90s and looks back on the state associated with industrial modernity, so roughly from the '40s to the '70s and '80s, and it is a critique of the ambitions of that state. It's often been read as a critique of the state in general, but I reread the book a few years ago and realized that it's not a critique of state power as such, or of something innate to the state, but of a particular way of seeing, a particular way that the state of that era understood and represented society, the metaphors it used to understand society. And so, I think it's useful to think about what that metaphor was in order to see how we've moved away from it in recent years, at least that's what we argue.
And so, the society that Scott looks at is the society of industrial modernity, or Fordist society. It's a society that, in many ways, understood itself through the metaphor of industrial production. Many sociologists have written about this: the mass production and Fordist production that were central to this society became an arch-metaphor for understanding society itself. The way the machines were structured became the way we structured our production and our factories, and that in turn was exported and slipped into other parts of society. Schools, for instance, began to be structured as machines and factories, and so did large parts of our lives and our institutions. So there's this way of understanding society as a machine that can be designed, that can be controlled from the top down.
Maybe here it's useful to come back to this notion of complexity that I mentioned already, because I think it's a useful starting point to actually explain what we mean by it. When we talk about complexity, we draw a line between complex systems on one side and complicated systems on the other. A complicated system would be something like a car or a spaceship, a machine: a system that we can take apart, where we can look at the components and see what they do. The components have their own little functions and mechanisms, and they interact in quite simple ways. So by studying the pieces of the machine, we can understand the whole, and we can design them, as a top-down way of structuring a system. That is the epistemology, the way of understanding, that was very characteristic of industrial modernity.
But on the other side of that, we have complex systems, like an ant colony, a flock of birds, or a school of fish. What characterizes them is that you really can't take them apart to understand them. If you took an ant from an ant colony and looked at what it does, it would just walk around in circles; its behavior wouldn't really tell you anything, and on its own it's in some ways very stupid and dysfunctional, it doesn't really fulfill a function. But when you have thousands or millions of them, suddenly the colony becomes a very intelligent organism on the macro level. You have very simple components but very complex interaction, and that leads to types of order and types of function that can be very hard to predict from the underlying behavior of the components.
Scott, and the broader critique he was part of, basically looked at this epistemology of industrial modernity, this idea that society is a complicated system, that we can build it like a machine and control what outcomes it has, and argued that this misrepresents society. When we see society this way, we miss a lot of aspects of it; we have to do some reduction in order to be able to control society. Power and organization and control always require a simplification, to make society, systems, and nature bureaucratically digestible, to make them possible to parse, to be able to build a map of the system. His critique focused on how this idea that society is a complicated system misses a lot of things, and those things that we miss come back to haunt the system and lead to often unanticipated and very problematic outcomes.
And what we're basically arguing is that, since the critiques of Scott and others, we have gone through a transition over time in which metaphors of complexity, like self-organization and these organic ideas, have spread and become a dominant way in which society understands itself and in which society is being organized. We've basically seen a shift from the metaphors that Scott critiqued to the metaphors that, in some ways, Scott promoted, the metaphors of complexity. But we also see that those metaphors and that epistemology come with their own limitations: they, too, leave out aspects of society that we have to bracket, and those aspects, in the same way, come back to haunt us.
Justin Hendrix:
So we've talked about ideas that are, in some ways, congruent with this on this podcast in the past, and I'm reminded in particular of a conversation I had with Joe Bak-Coleman and Carl Bergstrom and other authors of a paper on collective behavior and the ways in which technology is changing how humans interact with one another, raising the question of whether we really understand how digital technology has changed human organization fundamentally, down to the level of how we interact individually and then, of course, in groups. So I'm thinking about your ants, I'm thinking about fish, I'm thinking about birds and all of those complex systems. And then also, of course, thinking about humans now as a species of many billions that attempts to organize itself in different ways to accomplish collective goals.
You talk about the idea that the digital creates potential for alternative and more equal forms of organization, but at the same time it also affords new forms of control and inequality through the monopolization of data power. So I understand your goal here is thinking about the complex system, and then thinking about how power is accrued in it, and thus how decisions get made. You regard technology as Janus-faced: you say it could be employed to undermine democratic power, weaken public services, promote labor precarity, violate privacy, and destabilize the world's democracies, but its political possibilities could also potentially enable new forms of democratic governance. I assume your project is hopefully in furtherance of that latter possibility. Where do you think we're at at the moment on this question? What is the balance at present in terms of how digital technology is being deployed in the world? How far off from that more pluralistic or democratic goal do you think we are?
Petter Törnberg:
I think it's quite clear, looking historically at when digital technology and social media and the internet first arrived and became influential, how we were talking about it, but also how it was actually employed. We saw the rise of early... I guess they weren't really called platforms back then, but things like Wikipedia, Couchsurfing, even early Airbnb, which was a beautiful place, in some ways. We had this idea of the sharing economy, the idea that the digital could basically function as a way of creating flexible institutions that wouldn't require top-down control, that wouldn't necessarily require monetary exchange, and that could produce new forms of conviviality and other forms of non-monetary, non-controlled exchange.
And so, there was all of this optimism, and of course, it was also expressed in the Arab Spring and the political movements, in Barack Obama's 2008 campaign, which very much employed this grassroots model, and so on. In that period especially, there was a lot of enthusiasm on the left for these notions of complexity and self-organization, which became ways of describing how we could organize society from the bottom up, so there was a great deal of optimism there. There was also this sense among theorists and sociologists that digital technology would be inherently incompatible with capitalism, because the digital is not scarce and scarcity is a precondition for capitalism. And so, we had the pirate movement, the free software movement, the idea that information wants to be free, all of these ideas entangled in a large shift. And what we then saw, gradually, was an inflow of financial capital into this area.
But then financial capital realized, from the position financial capital occupies, that these platforms don't just reduce intermediation, don't just allow people to connect directly; in some ways, they actually mediate a lot of interaction. When we exchange messages through Facebook instead of in person, there is an intermediary party that actually has a lot of power in that relationship. And so, you had a shift where data started becoming valuable, and capital started to recognize that there were real forms of control in this intermediation: you could accumulate data, you could predict user behavior, and you could also shape user behavior, in important ways and in ways that are largely invisible to the users, where the outcomes are very hard to predict and it's hard to even notice that you're being controlled and that your behavior is being shaped.
And so, with that, we've seen a growing influence of these platforms. They basically use this monopoly, this position of control over the flow of information and over the fundamental social and political infrastructures of our society, as a way of extracting rent, and that has become characteristic of a new form of capitalism that is very much based on social and political forms of control. So I would say it's very clear that we've moved in the direction of much more centralized forms of power, where financial capital is very much coming to take control and shape our society.
And a lot of these consequences, like in the work of Bak-Coleman and others, which I think is really amazing, follow from these companies coming in, taking control, and reshaping the very fundamental rules of the information ecosystem that shapes how we as a society act, how we take decisions and how we solve our problems. That reshapes our society in really unexpected, unpredictable, and really dangerous ways. And it's often, in some ways, orthogonal to what the companies are trying to do, because what they're trying to do is basically maximize the data they can extract from us, maximize how much attention they can extract from us, and the downstream consequence, the effect that reshaping our information ecosystem has on our politics and our social life, is really something external that they don't necessarily care about.
Justin Hendrix:
You look at various phenomena across the book, things like Anonymous, and you look at Wikipedia in particular. I want to pause on Wikipedia. This phrase "democratic potential" comes up multiple times in the book, and you use Wikipedia as a case study to ask how power concentration works there, how this self-organization, or the bureaucratization of self-organization, works on Wikipedia. Why is Wikipedia such a useful case study for getting at these ideas?
Petter Törnberg:
So Wikipedia is, and really remains, the leading example of how self-organization can go right, of how we can actually create systems that are non-monetized and non-controlled and really amazing and beneficial for society. And so, we wanted to look closer at this case and see whether it lives up to this description of being a decentralized, self-organized system, and how it actually works. That's the aim when we look at it: it is a very successful example, and everyone keeps using it as a metaphor for how we can organize our society differently, but how is it actually organized? We talk about self-organization, but what does that mean?
And I think what we find, and what I find really interesting, is that the reason it works is maybe not so much that it's decentralized, that it lacks control and institutions, that it's bottom-up, but precisely that it actually has quite a lot of bureaucracy. It has quite a lot of institutions, a lot of rules and a lot of control, and that is what makes it democratic. So it's not that it's a completely self-organizing system without any control; there's actually a lot of institutional, top-down control taking place, and that is precisely what enables it to be democratic and inclusive.
And in some ways, we can think of it like this: there's a classic anthropology book from the '90s by Julian Orr called Talking About Machines. It's very much in the spirit of Seeing Like a State, because it's a critique of this notion of industrial society as something that can be planned from the top down, this idea of society as a machine. What Orr does, in these anthropological studies, is hang out with Xerox repairmen. They have this big book that describes everything they're supposed to do depending on what's wrong with the printer, these exact rules that they're supposed to follow, a top-down institution. And what he finds is that no one is actually following this book, no one is following these rules. The actual reason it works, the reason these repairmen are successful, is that they have this horizontal exchange: they exchange stories and narratives, and they build internal norms, and that is, in many ways, a bottom-up institution.
That's just one example among many studies showing the failures of industrial modernity's self-image: the way industrial modernity actually functioned depended on the presence of a lot of bottom-up structures. And to a certain degree, that critique has become part of the emergence of the digital society we live in today, which is very much organized around ideas of complexity and self-organization. You can think of this chapter as the opposite intervention: what enables these complex, bottom-up systems to work is that they're not only bottom-up and self-organized, but that they also have top-down institutions, things that prevent them from going in a bad direction and that enable them to be democratic and inclusive.
Justin Hendrix:
I want to ask you a question about larger social media platforms and how you diagnose the problems we see in them at the moment. You talk about the idea of political discourse on social media as self-presentation; you relate it to this idea that we see the public world as projected onto our selfhood in social media, that information, thoughts, and stories are all seen and valued as ways of expressing who we are. I tie this in my head back to your earlier work on polarization that we talked about. You say, "The political discourse that is afforded by social media is one based on the underlying mechanisms of identity and group belonging." Why does that matter? Where does that leave us stuck? What are the implications for democracy?
Petter Törnberg:
So the central argument in that chapter is really about what happens to media when it's reorganized around the interests of data extraction and attention optimization, because what the social media companies profit from is selling data about who we are, about our personalities, to companies, as part of an economy that is basically oriented around creating demand. That's very different, if we look historically, from how media was organized during Fordism, during industrial modernity. That was very much a mass media that saw its purpose as catering to a mass audience, and it was very much trying to produce markets for very homogeneous products, hence the old slogan that you can buy a Ford in whatever color you want, as long as it's black.
So that was the economic structure that defined media, and that therefore, to a certain degree, defined our culture: the need to control a certain market, often a relatively small one, like a city. The media were basically trying to achieve local monopolies, which created incentives to cater to all audiences, to be able to communicate to a diverse audience and, in terms of politics, to be able to communicate to voters of all parties.
That in turn led to the need to produce a separation between what was seen as objective, as fact and non-debatable, the shared space of what we all agree on, and the space of things we can disagree about, things we can have different opinions on, and then to define the things that were outside the realm of polite society, the things you couldn't debate at all. So that distinction between those different spaces, and this idea that there are shared facts in our society, was very much a product of that mode of production, of how industries produced products and how markets were created for those products.
Fast-forward to the society we have today, a highly postmodern society, where the limitation is not production but the need to create demand and identify certain consumer niches. That in turn creates the value of data as a commodity, as a form of capital in contemporary society, because data is understood to be the product that allows us to produce consumer demand, to produce identities and consumer groups. And that demand on media creates a media environment in which platforms and systems are designed for the purpose of extracting information about who we are, and there's a lot of research showing what you need to do if you want to maximize data extraction. One thing you want to do is make people talk about themselves and who they are, because that's how you find out who they are, so you can sell that information to advertisers.
And then, you want to maximize how much they're looking at the screen, how much they're engaging and how much they're writing. And it turns out, just because of human nature, that the way you do that is by making people angry and upset: you threaten their identity somehow and you make them talk about themselves. And basically, what you realize when you see what that optimizes for is that it's precisely what you would optimize for if you wanted to maximize affective polarization, conflict, and sectarianism in society. So that produces the conditions that drive societal conflict, that drive fragmentation and a breakdown of social cohesion.
Justin Hendrix:
You point to our political discourse essentially boiling down to a form of self-expression, highlighting difference, fueling identity conflict; you've just spoken to that phenomenon. And you write, "As the boundary between opinion and fact dissolves and both become drawn into the logic of difference, the question of truth becomes submerged into the larger process of cultural fragmentation and difference. This means that the emerging epistemology of digital capitalism treats truth as a question of identity. Information is evaluated based not on common standards of evidence applied to commonly accepted facts, but on its alignment with our social identity." I want to ask you to explain the cash value of that idea, maybe relate it to issues we face right now: inequality, climate change, other things we need to address collectively. How does this hold us back?
Petter Törnberg:
So basically, you're moving into a society where you don't have a shared set of facts, where you don't have this value around rationality and reasoning, and where you don't have a space with the shared sense that here, specifically, we can disagree and argue, but on the basis of a shared set of facts. When you no longer have that, and when opinions and actions are valued basically as identity markers that situate you somehow, in the same way that we buy certain t-shirts to express who we are, to say something about our identity, then politics becomes drawn into that space of identity, and that reduces or erases any space for rational disagreement where we can have a conversation and arrive at some compromise.
Because there is no purpose to compromise. If ideas and opinions are markers of your identity, that leaves no space for arriving at a compromise or at a shared set of ideas, because that's not the point of those ideas. You're not taking on those opinions for some kind of rational reason; you're doing so because they link you to a certain identity, to a certain social group, and they're just expressions, superstructures, of fundamental group identity and group differences. And it becomes very hard to see how such a society can continue to function in the long term.
Justin Hendrix:
This book is dense; there's a complicated set of ideas here. But coming back to the ants and ant colonies and the idea of resistance, of refusal to participate in the incentives and goals of the larger colony, there is a set of concerns here, a preoccupation with the idea of resistance: how to resist complex power, how to resist the phenomena you're describing, how platforms are shaping our behaviors, et cetera. But you write a little later in the book that attempts at resistance in this world of complex power, of digital capitalism, are inherently co-opted, and that creates a conundrum, I think, for those who might want to address the fundamental power concerns. I just want to ask you about that word, resistance. What does this book tell us about resistance against these forms of complex power or digital capitalism that you describe?
Petter Törnberg:
Yeah. I think a really central point that we arrive at through this analysis is to question a really deep association that we've developed, I would say, over recent decades, where we associate decentralized, bottom-up, self-organized systems with democracy, and we associate top-down systems, institutions, bureaucracies, hierarchy, with totalitarianism. That separation has really come to characterize, I would say, especially the left: this fear of institutions, this fear of structure, and that makes it very difficult to organize resistance, basically.
And what you often end up with, a finding we arrive at again and again, is that self-organized, bottom-up systems don't necessarily lead to positive outcomes; they don't necessarily lead to inclusion or equality. In fact, they have a really strong tendency to become highly centralized, with some people ending up in power. They're also very sensitive to manipulation, very sensitive to control. And so, I think a main takeaway for us is that we shouldn't be so afraid of building institutions, and we shouldn't conflate decentralization with egalitarianism or with democracy, because it's a false association and it makes resistance very challenging.
Justin Hendrix:
Towards the end of this book, you start to ask questions about artificial intelligence and how it may change the phenomena you've described here. You pose a set of questions in the book; they're almost, I suppose, like a challenge to other researchers, other people thinking about these things. Questions like: how does AI perpetuate inequalities? Who owns the AI state? What are AI minorities? What is politics in the era of AI? Let me pause on a question that, I think, underlies a lot of these, which is that you seem to be concerned about the combination of state power and corporate power, and the extent to which AI blurs that line. How should we be thinking about artificial intelligence as we think about this platform economy that you're describing?
Petter Törnberg:
It's an interesting thing, because a lot of the focus, especially when these new AI systems first emerged, was on this fear of a Terminator scenario, that they will destroy the world in pursuit of paper clips. I'm not saying that fear is entirely unwarranted, but it does ignore a much more obvious and direct fear, which is the fact that these systems, which are clearly extremely powerful, are being implemented and controlled by precisely the same platforms that managed to turn couch surfing into the city-destroying system that is Airbnb, that managed to create things like Facebook and Twitter, which are destroying our social fabric and our politics. Those companies are now in control of these extremely powerful AI systems.
I think that is something we should be very concerned about, because these are companies whose entire platform business model is founded on leveraging sociopolitical control to extract rent, on basically trying to achieve monopoly markets by creating artificial forms of scarcity, by having monopoly control. I find that profoundly disturbing, along with the way those systems are now, just as social media did, coming to seep into our institutions, as people in government and in companies start to use them for everything, more or less condoned by leaders. That gives an immense amount of control to these companies, and I think we should probably be more concerned about it than about any Terminator scenario.
Justin Hendrix:
You write, "By reducing political decisions into optimization problems, algorithmic systems erase the inherent intractability and plura-dimensionality of politics," I think that's the stuff of democracy, "and foreclose the possibility of other ways of understanding the world and other ways of being." What are you looking at out there in the world that would suggest to you we have some way around this? You say that these things aren't preordained, they are, of course, the result of decisions that people are taking, that leaders are taking, that entrepreneurs are taking, that all of us are taking in our engagement with these ideas. What gives you hope?
Petter Törnberg:
I would say that we are, to a certain degree, at a moment of shift, a moment of change, because we see these large language models and AI undermining the way we've organized things; they really seem, in many ways, incompatible with how we structure things. I think that's very clear, and what we're pointing to are the risks of that, the ways it can really threaten what remains of our democratic politics. But those moments are also moments of opportunity. Things are changing, and if we're capable of organizing, capable of responding, those moments of change can also lead to something positive.
For instance, we see social media as a model being threatened; we see the media and advertising models that have shaped the internet being basically undermined by AI. I'm not saying that looks very positive at the moment, but I do think there are clear signs that this is a moment of change, and moments of change are also moments of opportunity that, if we're able to organize and find new models, can lead to something positive.
Justin Hendrix:
This book is called Seeing Like a Platform: An Inquiry into the Condition of Digital Modernity, it's from Routledge, and I should say it's available open access, so folks can simply go and download the PDF on the internet. I'll make sure to link to it in the show notes. Petter Törnberg, thank you so much for taking the time to speak to me.
Petter Törnberg:
Thank you, it was great.