From Recommendation Engines to Rewilding: Reclaiming Our Agency Online

Tessa Brown / Jun 28, 2024

Unicoi Turnpike Trail at Tellico Plains, Tennessee (National Trails Office)

When it debuted in 2010, Instagram quickly went viral as the place to share photos with friends. The company was snapped up by Facebook just two years later, keeping the behemoth relevant against new competition for young people’s valuable attention. Yet ambivalence toward the platform grew as people missed simpler and more direct ways to keep up with friends.

In 2022, as Instagram pushed users toward Reels, its TikTok-like vertical video feed, dissatisfaction was so widespread that even the Kardashians expressed disappointment with one of their core tools as global influencers. Both Kim and Kylie reposted a viral meme that pleaded for the platform to “MAKE INSTAGRAM INSTAGRAM AGAIN. (stop trying to be tiktok I just want to see cute photos of my friends.)” In her post, Kylie added, “PLEASEEEEEEE.”

Still, Instagram head Adam Mosseri insisted that users loved Reels, given that video viewership kept rising. As an Instagram user myself, however, I kept noticing the dark patterns in user experience design that increasingly stranded me in Reels. For example, before Reels, users could mute or unmute a short video in their feeds by tapping anywhere on it. Post-Reels, mute became a small button in the corner of a video, and tapping the video anywhere else opened up a full-screen Reels feed.

Like a casino with no clocks on the wall, in this full-screen view, all menus and notifications were obscured, one’s curated feed was abandoned, and videos played automatically to algorithmically capture our attention. I found myself watching video after video with an ill feeling, until I’d finally be able to wrest myself away.

In a recent essay, technologists Maria Farrell and Robin Berjon use a concept from ecology to critique the fragility that has emerged from consolidating internet service providers. They invoke “rewilding,” the practice of re-introducing ecological diversity to over-domesticated land, a conservation strategy that “restores healthy ecosystems by creating wild, biodiverse spaces.” Applying this concept to an internet and web dominated by a few global monopolies and duopolies, the authors advocate for a diversity of providers, protocols, and services whose emergent interactions create resiliency and optionality for the people and organizations who rely on the internet to communicate.

Farrell and Berjon’s article is about internet services broadly, covering the whole tech stack from cables and servers to protocols, browsers, and search, but I was drawn to their comments about the monoculture of social media and its effect on those who have grown up with it. This consolidation, they suggest, has created digital social experiences characterized by passivity: "the internet has become something that is done to us, not something we collectively remake every day."

As ad-supported platforms converge into a feature singularity optimized for addiction, our culture has narrowed as well. Friends don’t stop by to say hello, and a phone call needs an appointment (yes, please text me first). Instead of exchanging contacts so we can talk, we follow each other on Instagram or LinkedIn, becoming content creators in each other’s feeds. Dissuaded by design from direct interaction, we pirouette through the parasocial panopticon: always watching, never reaching out. As content and friends are algorithmically delivered to us, passivity becomes the default posture.

Protecting Our Children ‘From the Internet’

In 2022, we already knew that recommendation algorithms from Meta to YouTube to TikTok favored content that was extremist, politically polarizing, addictive, and agitating—perfect for entertainment platforms that make their money by serving ads. But by last summer, Instagram’s harms had escalated so significantly that a Wall Street Journal headline read simply: “Instagram connects vast pedophile network.” In a world of hyper-targeting and addiction by design, sexual predators had turned out to be whales.

This past May, a New York Times investigation found that even products advertised for girls, like a kids’ jewelry line, were predominantly being shown to adult males who were interested not in the jewelry, but in child models. And just last week, another Wall Street Journal investigation reported on a young teenage dance influencer who, despite her mother’s best efforts to protect her on Instagram, is reaching an audience of 92% adult men.

In the context of rising distress among youth, children’s safety has emerged as the site for national discussions about the harms of social media design and information delivery. While new proposed legislation like KOSA (the Kids Online Safety Act) enjoys bipartisan support, civil society organizations, including the EFF, the ACLU, and some LGBTQ+ advocacy groups, have cautioned that this legislation uses a parochial understanding of youth safety to forward pro-censorship legislation that state attorneys general can weaponize to censor content along politicized lines. As Stanford technology policy researcher Riana Pfefferkorn told me, both KOSA and a range of pending state bills “frame the internet as something children must be protected from,” rather than recognizing a more nuanced and research-backed reality that the internet “can be a vital tool for belonging and growth.” (Note: Pfefferkorn is an advisor to my startup, Germ Network.)

KOSA, the advancing TikTok ban, and too many state-level bills reflect a national mindset that seeks to protect kids by censoring information and limiting access to private spaces that aren’t subject to surveillance and data analysis. As Pfefferkorn puts it, “this is the infantilization of the internet for everyone, relabeled as a gesture solely at youth.” Efforts to legislate safety by dictating what people can find online are the digital correlates of book bans—and progressive legislators should take heed when cooperating with colleagues who also support literal book bans.

Yet it’s true that many social media products are dangerous for teens—and for all of us. But they’re not dangerous because they sometimes give young people privacy, as pending legislation in Nevada against Meta’s end-to-end encrypted messages suggests, or because teens can find new information online. They’re dangerous because they make vulnerability to addictive content and predatory individuals the cost of connecting with friends and trying to learn new things.

An Alternative: Building Agency and Rewilding Ourselves

Harvard psychologist Emily Weinstein and sociologist Carrie James warn parents and teachers that what keeps teens and young people safe online, as in the real world, is never heightened surveillance and control. Instead, teens need to develop agency: a grounded sense of their own control over their environments and the outcomes of their actions, so that they learn to make healthy choices for themselves, because adults can’t always make decisions for them.

Weinstein and James remind us that agency is core to psychological well-being: “Psychologists have long recognized that individuals fare better when we believe that our actions can influence what happens and when we can shape an outcome through our behavior… Conversely, routinely feeling out of control can threaten our well-being.” Environments that foreclose choices, boundaries, and consent simply aren’t healthy.

Weinstein and James point to three kinds of agency that empower young people and build resilience and well-being: “personal agency” to make individual choices over how they interact; “collective agency” to work together as a community; and “proxy agency” to bring in adults or other authority figures when needed. Yet today’s social media ecosystems systematically deny this agency to teens and their families.

For the teen dance influencer on Instagram featured in the aforementioned Wall Street Journal investigation, her mother found that letting her daughter build her following demanded letting hordes of adult men follow the account and even engaging with their creepy comments. As the mother explained, “If you want to be an influencer and work with brands and get paid, you have to work with the algorithm, and it all works with how many people like and engage with your post.”

As Meta slashes the content moderation teams that protect its users from rising scams and predation, its rollout of end-to-end encryption in Messenger should indeed be scrutinized—not because privacy is intrinsically dangerous, but because it’s being deployed on a platform that systematically introduces dangerous people to their marks. When we drive tech users into addictive feeds, introduce them to people they don’t want to talk to, and fire the teams tasked with keeping them safe, we’re creating docile populations marked for harm on a world-historical scale.

Rewilding offers a vision to build toward as we revitalize digital social interactions to be healthier and more self-sustaining. For Farrell and Berjon, “rewilders build resilience by restoring autonomous natural processes.” We humans are creatures, after all. Healthy humans are empowered to make choices, form secure connections, and set boundaries without having to navigate recommendation systems that degrade our defenses with every new video and introduction. Teens, families, and all people are desperate for tools that help us discover, connect, share, and disconnect with balance and choice.

What if we built tools that helped teens—and all of us—make our own decisions about who can talk to us and what kinds of information we want to engage with on a given day? What if we developed online systems that encouraged agency and decision-making instead of passivity and vulnerability? What if we used developments in machine learning and cryptography to empower us in our relationships with each other, with information, and with our time?

Safety is narrower than health, which rewilding teaches us comes from complexity, diversity, and choice. Individuals, businesses, and policymakers need to prepare for a new internet that doesn’t force us upon each other but empowers us to find our own ways. Let’s build a digital world with less stalking and more talking. To rewild the internet, we’ll need to rewild ourselves.


Tessa Brown
Tessa Brown, PhD, is the co-founder and CEO of Germ Network, building the DMs of the internet so you can share what you want to, when you need to. She was previously a lecturer of writing and communication at Stanford University.