As I say in my TED Talk about Vilém Flusser, the most pressing cultural question is: “why are things so weird?” Or as Anna Shechtman describes it:
“that feeling—floating somewhere between mania and motion sickness—that everything has changed.”
It seems like everyone really fucking wants the answer to be “The Algorithm.”
The New Yorker internet and culture columnist Kyle Chayka gives them that answer in his new book Filterworld: How Algorithms Flattened Culture.
I’ve spent years articulating why this is a bad answer. “The Algorithm” is the answer that Susan Wojcicki and Mark Zuckerberg desperately want us to give. It feels like critique, but in fact it reifies the premises and business models of the tech platforms: it implies that the platforms are in some computer-genius fashion holding the reins of culture and brainwashing their users. Advertisers, famously, would love to hold the reins of culture and brainwash potential customers.
And Senator, Facebook sells ads.
This is an ideological explanation for why “The Algorithm” is a bad answer. Ideological explanations are red meat for the kind of people who read Substacks, tweets, and The New Yorker, which is why I led with that. But the problem is more fundamental.
To answer the question of “why does everything feel so weird?”, it’s helpful to investigate why everyone really fucking wants the answer to be “The Algorithm.”
I tackle this topic in today’s article in Mother Jones:
“The Algorithm” does not exist. Wide use of the phrase implies a false hope that there is a human who understands our dizzying information system. If it was only the algorithm on YouTube radicalizing us, or the algorithm on Facebook weaponizing misinformation, then we would know how to fix these things. We would just need regulators to pressure Mark Zuckerberg into fiddling with the parameters of some code, and things would go back to normal.
Anxieties about “The Algorithm” reveal how our lives are already governed by systems we don’t understand and can’t control. We are living with technology moving at an inhuman speed, operating at scales simultaneously smaller than we can detect and larger than anyone can comprehend.
This inversion is a powerful analytical tool. Consider: anxieties about LLMs taking the place of humans primarily reveal how society has already made humans replaceable. Only human communication which has already become routinized and rationalized is at risk of being replaced—but unfortunately, that’s most of it.
So, what is the answer? Demography doesn’t help — the generation vibe-gap has never been larger, and the institutions that should be helping us make sense of the world are still steeped in Boomer Realism. But the Sage of São Paulo has the beginnings of the answer. Rather than talk about “The Algorithm,”
media theorist Vilém Flusser proposed we use “The Apparatus,” arguing that the emergence of new media has caused a mutation in how humans relate to each other—and to their environment. From the fullness of our physical being we are reduced to mere “operators,” experiencing primarily through the apparatus, which “programs” both the producers and consumers of media.
When we communicate via social media, we are not communicating with other people. We are communicating with The Apparatus. But it doesn’t listen. It responds—and trains us to respond back. As we accept the content presented to us, we react as if it were the product of humans rather than human-accounts; we accept our role as operators engaged in what Flusser calls “unconscious functioning.”
The content we produce and consume doesn’t mean anything because it’s not supposed to mean anything; it’s supposed to function, to cause the desired response.
The Apparatus reaches far beyond the screens through which we interact with the (powerful! nigh-ubiquitous!) algorithms that route inputs and outputs through online systems like social media. And though Flusser’s critique was developed in the context of television, social media is an intensification of the trends he identified. Social media is much more satisfactory as a proximate cause of the weirdness.
The most important technological component of social media is quantified audience feedback. Indeed, I think this is the correct definition of “social media.” It’s not a binary—media is more social the more the audience is present, the more that the media object facing the consumer is co-created by the original author and their audience. American Idol’s call-in voting thus made it more social than previous television. Twitch livestreaming chat is perhaps the most social media, conducive to connective effervescence.
The effects of quantified audience feedback have been identified in various industries. Art critic Ben Davis notes how we live in an era of “quantitative aesthetics.” Media mogul Ben Smith describes his efforts to accelerate the destruction of online news media through better audience measurement. And theorist Ben Jamin1 allows us to see how the quantification of attention, accreted to digital objects, creates unique value in the eye of the viewer — the modern aura not through uniqueness but through ubiquity.
The “social media” frame allows us to see how The Apparatus transforms existing communication technologies. Yelp makes restaurants more social, sure. But higher education is more social thanks to the efforts of the U.S. News & World Report. And books are more social thanks to Goodreads, sure, but this is simply an intensification of a trend that includes the New York Times Bestseller List.
Kyle Chayka’s book is still printed on dead trees. But, like all books today, it is still more social media than books in 1800 were. The audience surrounds you even if you walk down Prince St and pick up a hard copy off the McNally Jackson Bestseller table (itself a form of social media). And one thing that operators have come to understand is that the audience loves to talk about “The Algorithm.”
The reviews of Chayka’s book clearly understand this:
Algorithms rule everything around you
Can We Free Ourselves From Algorithms?
The tyranny of the algorithm: why every coffee shop looks the same
'Filterworld' explores how social media algorithms 'flatten' our culture
How to Take Back Your Life From Algorithms
Have we all become slaves to algorithms?
This is what Flusser describes as the “circular progress” of The Apparatus. Unlike linear historical progress, which is going somewhere, this progress means an intensification of what already exists. It is the circularity of a whirlpool, tossing us around and dragging us down.
Not all of the reviews take the bait. Michelle Santiago Cortés identifies the precise point where the book goes off the rails: Chayka interviews the anthropologist Nick Seaver, who tells him that ‘the algorithm is metonymic for companies as a whole…The Facebook algorithm doesn’t exist; Facebook exists. The algorithm is a way of talking about Facebook’s decisions.’ Cortés:
However, just one sentence after Seaver’s quote, Chayka loses focus. He adds that the technology itself ‘is not at issue’, and that the cultural flattening he bemoans in the book’s title is because we’ve outgrown these algorithmic recommendations and are now ‘alienated by them.’
The pull of The Apparatus is strong; I otherwise don’t see how you can fuck it up this bad. Nick Seaver is exactly the right person to talk to, an anthropologist who has been studying the role of algorithmic recommendation for over a decade. His article on “recommender systems as traps” is a delight (seriously, read it, it’s quite accessible for an academic paper), and it identifies the crux of the problem: the switch away from metrics of recommendation quality in favor of ‘captivation metrics’ like the famous “time spent on site.” Which is the audience metric that maximizes profits.
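To make Seaver’s point concrete, here is a toy sketch of my own (Python, with invented item names and numbers; emphatically not Seaver’s code or any platform’s actual ranking logic) showing how swapping the objective from a quality proxy to a captivation proxy changes what the same recommender surfaces:

```python
# A toy illustration (mine, not Seaver's, with made-up items and numbers) of
# how the choice of objective changes what a recommender surfaces.
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    predicted_rating: float           # proxy for "recommendation quality"
    predicted_minutes_on_site: float  # proxy for a "captivation metric"

catalog = [
    Item("thoughtful documentary", 4.6, 12.0),
    Item("outrage compilation",    2.1, 47.0),
    Item("autoplay slime videos",  3.0, 63.0),
]

# Objective 1: surface what the user is predicted to like best.
by_quality = sorted(catalog, key=lambda i: i.predicted_rating, reverse=True)

# Objective 2: surface what is predicted to keep the user on the site longest.
by_captivation = sorted(catalog, key=lambda i: i.predicted_minutes_on_site, reverse=True)

print("quality ranking:    ", [i.title for i in by_quality])
print("captivation ranking:", [i.title for i in by_captivation])
```

Same catalog, same trivial “algorithm” (a sort), completely different ranking: the objective, not the math, is where the business model lives.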
Another review, by the incisive Anna Shechtman quoted above, nails the tone in the Yale Review:
It’s said to be quite powerful. I won’t pretend to know how it works—my understanding is that they don’t know either, which is a clever alibi. I hear that it’s a specter haunting our world. In fact, I hear that it knows that specters haunt worlds (but only at the start of an essay), which could be another way of saying that it has an uncanny grasp of cliché. It’s something like a meta-specter, really, haunting our hauntings…The Algorithm. Never has something—some agent—had such a determining force on human actions and desires, at least since the discovery of the libido or the invention of the printing press or the belief in God.
So, you get the sense that this isn’t going to go well for Chayka, that he will prove to be out of his depth, Leibniz-wise. That sense is correct. This article rules. And her conclusion nails Flusser’s conception of circular progress, though not explicitly, instead calling it “Algorithmic Culture: updating American norms, phobias, incentives, and risks—and staying the same.”
But it’s telling that even Shechtman (or her editors) chose Life in the Algorithm as the title of the review.
Cortés notes that “Chayka relies on the metonymic algorithm to step in on behalf of deeper explorations into the myriad actors, motivations and incentives.”
Or: “The Algorithm” is a metonym for The Apparatus. And it’s not an innocuous metonym, if there’s such a thing. This is the kind of linguistic sloppiness which enables analytical mistakes to propagate.
The irony of this critique is that in the book itself, Chayka’s thesis is clearly consistent with Seaver’s point that “The Algorithm” is today used to disguise the larger systems in which social media are embedded. It’s as if it has become impossible to actually read the book, to follow the through-line of the argument, to use the tools of linear conceptual reason that this media technology requires. Instead, everyone already knows what it’s about.
This isn’t a book, in the way we were raised to expect. To an ever-intensifying degree, even books are produced by and for The Apparatus. The current case makes this especially clear: Chayka wrote an essay for The Verge in 2016 about “Airspace” and the sterility of modern aesthetics. It went viral — The Apparatus demanded more. And so this book was produced.
It’s also not a book on the terms it presents itself: deeply researched, historically informed technology criticism. There is no list of citations at the back, no footnotes, no endnotes. Perhaps I’m being snooty, but I have a PhD and I’m a professor who studies exactly this topic, so I’ve got to believe that all the extra work we put in on those dimensions is important. And it is! Books are an incredibly robust medium for storing and cataloguing thought—but they need a technology for pointing outside of themselves, something that has been refined over centuries. Filterworld doesn’t need that technology — because it’s more natural to use the smartphone camera instead. As Natasha Stagg discussed on New Models, even book writing is now meant to be screenshotted and to circulate on social media.
To be fair, Chayka’s website describes the book as a “reported critique,” and I guess you don’t have to cite any sources for those. But it certainly has scholarly pretensions. Chapter 1 starts with like ten pages of the history of algorithms, from Euclid to Ada Lovelace. None of this is “reported,” obviously — except in the sense of a “book report,” a homework assignment done just to check the boxes. By a clever college student, who knows how to use Wikipedia without technically plagiarizing anything.
The Wikipedia article on Algorithm is an incredible resource, both in the content and the way the knowledge is networked: from the past, with citations, and to the present, through hyperlinks to other pages. And there’s no information (that I could find) in the first pages of Chapter 1 of Filterworld that can’t be found within one click of the Wikipedia article on Algorithm.2
That’s fine! Again, this is an incredibly comprehensive resource; it’d be hard to actually find something relevant and interesting for your popular nonfiction book about “The Algorithm” that wasn’t already here. But then why have this in the book at all?
Nobody gives a fuck about Robert of Chester, bro. This isn’t written to be read by humans; it’s demanded by The Apparatus, as some kind of “proof of work.” This is the kind of content that gets you from a viral essay to a viral book. But it doesn’t make any sense; it doesn’t mean anything.
Take the first section, “Early Algorithms.” Two dense pages of proper nouns that the reader has no context for, concluded abruptly by a connection to the book’s ostensible thesis: “The long arc of algorithm’s etymology shows that calculations are a product of human art and labor as much as repeatable scientific law.”
Does it show that? What if instead of the word “algorithm,” the book were about the word “recommender system”—probably a better term for the actual technology of interest here, certainly the one that Seaver uses. Wikipedia tells us that “Elaine Rich created the first recommender system in 1979, called Grundy.”
What does the short arc of recommender systems’ etymology show? That calculations are not a product of human art and labor as much as repeatable scientific law?
Whatever. There’s no meaning here. The effect of two pages on the etymology of “algorithm” is to reify the concept, to insist that we should keep using this term, which is indeed the message of the book, as well as the message of the reviews of the book. It’s the message everyone already wanted to hear.
I’ve got further beef with how Chayka interprets my hero Stafford Beer. As encouraging as it is that the cultural tides are turning so that Beer merits almost two pages in a book like this, to summarize Beer’s critique of the misuse of computers in human organization as “As with the Mechanical Turk, the human persists within the machine” is just insulting. And the summary of the academic debate over the existence of “filter bubbles” is facile. But that’s all missing the point.
So here’s the answer for why “The Algorithm” is a popular answer now—it was a popular answer before, and the gyre of The Apparatus continues. How did this start? The Mother Jones piece explains my reasoning (please go read the whole thing!), but here’s the kicker:
The fact that we see agency in the algorithm reveals our fundamental need for interpersonal connection. It lets us imagine someone like Zuck in control of “the algorithm,” instead of adrift in “the apparatus” like the rest of us. We wish Big Brother was watching. But we might just be alone with our phones.
The Algorithm is why we’re lonely, and “The Algorithm” is because we’re lonely.
Restated, riffing on the Baudrillard quote about The Matrix:
“The Algorithm” is the only critique of “The Algorithm” that “The Algorithm” can produce.
But I think that Flusser’s answer would be less psychological and more media-theoretic, more communicological. The answer to why we feel the weirdness, that feeling somewhere between mania and motion sickness, is mirrored in the answer to why we want the answer to be “The Algorithm”:
The Apparatus has turned us into algorithms.
Writing for real outlets is nice but they’re never gonna let me make jokes like this, so…it’s a tradeoff.
I recognized the beats because I had just scrolled through that article after the editors at Mother Jones asked for a nugget of historical context.
Ben Jamin hahah lesssgoooo