The Octopus Is Back: The Imperial History of An AI Meme

Antoinette Burton / Mar 25, 2024

Antoinette Burton is Professor of History and Director of the Humanities Research Institute at the University of Illinois, Urbana-Champaign. She is the author of The Trouble with Empire and a Public Voices Fellow with The OpEd Project.

A Shoggoth by Tatsuya Nemoto (Nottsuo). Nottsuo's artwork inspired by HP Lovecraft's short novel At the Mountains of Madness. Wikimedia/CC by 3.0

Artificial Intelligence has an entertaining meme: Shoggy, a creature that circulates in octopus-like form.

Shoggy works well as the AI standard bearer, but not because the octopus is a benign underwater being. It has taken off because the octopus is both the embodiment of an otherworldly intelligence and a stealthy, fast-moving creature powered by tentacles that allow it to master the vast territories of the ocean floor.

In the tech world, the popularity of the octopus form derives from “shoggoth,” a mystical creature who appears in the 1930s fiction of H.P. Lovecraft. If Lovecraft’s weird invention was not exactly an octopus, it came pretty close: tentacles, blobby shape, black goo and all.

A shoggy meme, shared on the social media platform X. Know Your Meme

Shoggy is comic because it’s a goofy image that plays on insider tech jokes about how large language models, of the kind that animate ChatGPT, actually work. And it’s serious because insiders know that the playful creature hides the “scary parts” of AI, the ones that are known and those yet to be revealed.

The AI octopus has become so popular that you can even buy Shoggy merch on Etsy.

What’s striking is how little many in the tech world know about the deep, dark history of imperialism. If they were familiar with that history, they’d know that in one form or another, the octopus has been the symbol of world domination for at least 150 years.

Thanks to the popularity of 19th century satirist Fred W. Rose’s war maps, for example, readers of newspapers in the Victorian era could see the all-encompassing destructive power of British military ambition embodied by the octopus in living color. Rose’s octopi advertised the clear and present dangers of imperial expansion through the baggy, bloated, and always sinister reach of the many-legged creature.

Beyond Rose’s popular illustrations, the octopus was represented as the “devil-fish” in cartoons of the 1880s. It was depicted with the head of that most recognizable Englishman, John Bull, and each of its legs carried the name of a possession or colony (India, Egypt, Jamaica, Malta) firmly in the grip of the smiling warlord.

Rose called his images “serio-comic” allegories. That was undoubtedly part of their appeal for Victorian readers who were both fascinated by and uneasy about the accelerated speed with which the British imperial war machine seemed to be conquering the globe.

By 1900, Rose was depicting John Bull and the octopus as one and the same, enabling readers to visualize Britain’s colonial possessions as helpless in the face of the pitiless creature who tossed them across the globe at his own grim pleasure.

These precursors to Shoggy were picked up by German and Japanese media as depictions of all stripes of imperial power and ambition in the years leading up to World War I. The predator octopus as an analogy for imperialism is so well documented that a century later a teacher developed a lesson plan called “teaching with tentacles.”

If the imperial octopus tradition remains invisible in current invocations of Shoggy, perhaps it is because, like many of the creators of AI themselves, many tech writers promote a narrative that evokes the mystic, even soulful, aspects of this most recent technological revolution rather than visualizing its dangers.

But AI has a dark side, including the perpetuation of racial bias, with consequences for the criminal justice system, healthcare and medical diagnostics, and educational outcomes, among other domains. Wherever machine learning and its predictive capacities involve risk at scale, that risk is unequally distributed, typically to the disadvantage of communities of color.

Safiya Noble’s 2018 book Algorithms of Oppression was one of the first studies to document how negative biases against women of color are embedded in search engine results.

A 2023 research study on bias in AI radiology applications shows that algorithms referred fewer Black patients for further care than white patients with similar disease indicators. The consequences of such systemic bias are potentially fatal for patients subject to models which depend on “the race signal” to predict or direct clinical outcomes.

Of course, AI is transforming the world. And of course it prompts consideration of the meaning of human and nonhuman systems. It is undeniably revolutionary, and may in fact change how people live and work forever.

But as a financial-technological complex, AI is also the consequence of a wide-ranging set of imperial histories. Empires are the original racial bias machines, and the world continues to be shaped by the structures they put into place – politically, economically, culturally, and technologically.

Consumers, policy makers, leaders of organizations, and tech workers need to acknowledge the origins and legacies of this phenomenal technopoly. We need to be as vigilant about the systemic dangers of AI as we are seduced by the wonders of the technology.

Otherwise, Shoggy the octopus will amuse rather than expose the exploitative consequences of AI. Understanding the rigorous histories that illuminate where Shoggy came from helps everyone determine where AI should go, and where it shouldn’t.
