
Everyone wants to talk about THE ALGORITHM as a mechanism of whatever effect of social media they're interested in. Literally everyone wants to do this. Including me— but under two conditions:
1) We're talking about TikTok, whose algorithm-first design makes it the first post-network social media platform;
2) We talk about the effect of THE ALGORITHM on PRODUCERS of content, not CONSUMERS. Producers are always (more than) half the story.
(Programming note: I’m helping to host an online reading group for Flusser’s Communicology this summer, details at the end of the post.)
TikTok has gotten worse. I don't use the platform directly — it's too powerful, I fear what it would do to me — but I stay up to date through content recirculating on YouTube, Discord and Reddit. And the initial burst of creative energy unleashed by TikTok seems to have leveled off. At the recent Metascience '23 conference held in the stunning National Academy of Sciences (on which much more later), Stuart Buck made my vague unease concrete: everyone on TikTok has become a caricature of themselves.
The trajectory of the ideal-type aspiring TikToker goes something like this. They begin by trying to imitate existing “meme formats” of TikTok content. If they are “a baddie,” they might dance along to popular songs or make seductive faces directly into the camera; if they are homely, they might attempt unlikely physical feats, upload clips of copyrighted television or movies, or perform some kind of comedy routine. Most of these accounts stay within a generic yet narrow range, with the occasional trend or challenge thrown in for good measure.
But sometimes, the creator has a breakthrough: a tiktok goes viral, a few tens of thousands of views. The creator is inundated with social feedback. This feedback is the entire point of posting, a “public measure” of their value. Public measures, remember, are currency in the social media status game—and because they are encoded into the architecture of social media, the audience is always with us. As the Web2 Millennial joke from the early 2010s goes, “They say that going viral feels better than sex. I wouldn't know, I’ve never done either.”
The internet pop-psychologists have a name for what TikTok is doing to this creator. TikTok is “lovebombing” this creator: at the very beginning of their relationship, after having coyly dismissed them with a few dozen likes at best, TikTok has decided that IT LOVES YOU THE MOST, that it will overwhelm you with attention and affection.
In the eyes of internet pop-psychologists like the sages at Cosmo, lovebombing is a form of abuse. The premise of this facially absurd claim is that it is now normal for young people to be so desperate for affection, so alienated, that courtship 101 amounts to a form of psychological manipulation powerful enough to constitute abuse.
My Gen X+ readers, I’m sure, are rolling their eyes: “you’re telling me teenagers are insecure?! What’s next, the pope shits in the woods?” I think that quantified feedback does cause insecurity in young people, and in all people — but bracketing the general point, the ABSOLUTE SCALE of TikTok’s lovebombing is beyond comparison and indeed beyond belief. We’re talking tens of thousands of views. And again, perhaps even more importantly, we’re talking public proof that you can show your irl friends. It’s like having sex and everyone else knows about it. (Recall that this is supposed to be appealing to teens, dear reader, and not your dignified self. The metaphor was also clearly coined by a cishet guy.)
But so this is a game-breaking amount of feedback. The creator has learned what THE ALGORITHM values, and after the initial high wears off, they’re immediately off chasing another hit. They have reached a local optimum, and any step in another direction means a step downhill. After my impression of Norm Macdonald goes mega-viral, I can rely on my newly-created audience of Norm-heads to come through with some percentage of the initial views for each new Norm post. But if I go back to trying out new filters and new dance moves, my audience craters.
Algorithms don’t have feelings. The designer of the algorithm can prevent it from getting stuck in a local optimum by building in some pressure to keep exploring: an exploration bonus, or a penalty for over-exploiting. This is the well-known explore-exploit tradeoff, a close relative of the foundational bias-variance tradeoff. Platforms seem to turn the dial towards “explore” when they’re in a growth phase, and then back towards “exploit” (that is, be conservative with your recommendations) when they’re in the “lying to Congress” phase. So part of what I’m describing with TikTok is likely the adjustment to safer recommendations.
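To make the “dial” concrete, here is a minimal epsilon-greedy sketch. This is a toy bandit, not TikTok’s actual system; the items, rewards, and epsilon values are invented for illustration. A high epsilon keeps trying new things; a low epsilon keeps serving whatever already scores well.

```python
import random
from collections import defaultdict

class EpsilonGreedyRecommender:
    """Toy epsilon-greedy bandit: epsilon is the 'explore' dial."""

    def __init__(self, items, epsilon=0.1):
        self.items = list(items)
        self.epsilon = epsilon           # high = growth phase, low = conservative
        self.value = defaultdict(float)  # running estimate of each item's reward
        self.count = defaultdict(int)

    def recommend(self):
        if random.random() < self.epsilon:
            return random.choice(self.items)                 # explore: try anything
        return max(self.items, key=lambda i: self.value[i])  # exploit: the safest bet

    def update(self, item, reward):
        # incremental mean of the observed reward (e.g., watch time)
        self.count[item] += 1
        self.value[item] += (reward - self.value[item]) / self.count[item]

# The two phases, caricatured: growth vs. "lying to Congress"
growth = EpsilonGreedyRecommender(["dance", "norm_impression", "podcast_clip"], epsilon=0.3)
mature = EpsilonGreedyRecommender(["dance", "norm_impression", "podcast_clip"], epsilon=0.02)
```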
But human creators DO have feelings, much to rationalists’ chagrin, and so they are likely to stick to their breakthrough content strategy well past the point where everyone has become tired of it — themselves most of all. Any experimentation on their part inevitably produces worse results in the form of fewer views. Partly, this is because the new content is likely to be “objectively” worse than their trusty formula (for you sickos who believe in objective quality). But mostly it is because the audience that the algorithm’s private metrics — the internal weights generated through collaborative filtering or some fresh uninterpretable hell — estimate for that creator is unlikely to share a taste for the novel content, having been defined only by its taste for the original content.
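For a crude sense of what those private metrics could look like, here is a matrix-factorization sketch of collaborative filtering. The vectors, dimensions, and scoring rule are assumptions for illustration, not anything TikTok has disclosed: each creator gets a latent vector fitted to their past hits, and their estimated audience is simply whichever users’ vectors point the same way.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy latent factors, standing in for weights learned from past engagement.
n_users, n_creators, k = 1000, 50, 8
user_vecs = rng.normal(size=(n_users, k))
creator_vecs = rng.normal(size=(n_creators, k))

def estimated_audience(creator_id: int, top_n: int = 100) -> np.ndarray:
    """Users whose latent taste best matches this creator's learned vector."""
    scores = user_vecs @ creator_vecs[creator_id]   # dot-product affinity
    return np.argsort(scores)[::-1][:top_n]

# creator_vecs[7] was fit to the original breakthrough content, so the audience
# it retrieves has no particular reason to like anything new the creator tries.
norm_guy_fans = estimated_audience(creator_id=7)
```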
Except. On TikTok, consumers are also being trained. Here, they’re being trained by both the platform and the algorithm: they’ve been behaviorally conditioned to a degree that would make BF Skinner salivate, scrolling through content with no commitment and no memory of the details. The re-appearance of Norm Macdonald Guy sparks a moment of recognition: this content has faint echoes of the original content that made you chuckle all those aeons ago, and perhaps your scrolling thumb pauses just long enough for THE ALGORITHM to register that this creator-consumer connection still has some juice in it.
THE ALGORITHM’s job gets easier the more it pacifies both the creators and consumers! At the beginning of TikTok, we saw a creative outpouring with few analogues in history. The democratization of high-quality video content (props legitimately due to TikTok's impressive suite of video filters and sound library) created an outlet for untapped human creative potential.
But THE ALGORITHM doesn’t care about quality or creativity; it can’t even detect them. It just wants to match consumers and videos. If anything, diversity of content and of taste makes this task more difficult. THE ALGORITHM is happiest when every viewer is watching the exact same video on repeat: Total Entertainment Forever.
So here is where TikTok’s algorithm finds itself. It is performing marvelously on its internally-defined metrics. But it can’t tell that everyone — on both sides of the creator-consumer market — is finding things increasingly stale. Am I really on this app to watch the 25th iteration of “Average Redditor does X”? Am I really on this app to CREATE the 25th iteration of “Guy with a Podcast”?
The problem is that data science engages with a simulation of reality within which it performs a simulation of social science. My Penn State colleague, the philosopher of data science Fred Fonseca, applies Baudrillard’s concept of hyperreality to understand what data science is. It is literally a “science of data” — that is, science which takes data as its phenomena.
This is incredibly appealing, from a philosophy of science perspective. Reality is too large, too messy — the complexities are endless. But data science doesn’t have to deal with reality. It only deals with a simulation of reality, the simulation defined by the data. These simulations, created by humans, are dramatically less complex than reality and are thus far more tractable for social science.
TikTok, as a platform, is composed of relational databases hooked up to machine learning models that output video media to apps on billions of smartphones and encode data from those apps and smartphones. The TikTok scientist — and the algorithms they deputize — is trying to get a high score within the simulated, simplified reality created by those data and those Human/App interactions.
The standard description of the task of the recommendation algorithm is to match up videos with the viewers who want to watch those videos—to make good recommendations. But the algorithm doesn’t know what any of that means. The “objective function” is maximized within the simulation by receiving the best possible inputs from the apps. The data that construct the simulation, from the perspective of the algorithm, are things like watch time, reactions and overall time on the app.
The specific objective function has to be defined by the business-side people in the company. This decision is based on a subjective evaluation of the mapping between the data in the simulation and the data in one of the larger simulations in which we all participate — capitalism.
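To sketch what “defining the objective function” amounts to in practice: the business decision is picking which app-generated signals count, and by how much. The signals and weights below are hypothetical, chosen only to show the shape of the thing.

```python
from dataclasses import dataclass

@dataclass
class Interaction:
    watch_seconds: float
    liked: bool
    shared: bool
    session_minutes: float

# Hypothetical weights: the real ones are a business judgment, not a law of nature.
WEIGHTS = {"watch_seconds": 1.0, "liked": 30.0, "shared": 60.0, "session_minutes": 5.0}

def proxy_reward(x: Interaction) -> float:
    """What a 'good recommendation' means inside the simulation: a weighted sum of app signals."""
    return (WEIGHTS["watch_seconds"] * x.watch_seconds
            + WEIGHTS["liked"] * x.liked
            + WEIGHTS["shared"] * x.shared
            + WEIGHTS["session_minutes"] * x.session_minutes)

# Nothing in this number knows whether the video was creative or the 25th "Guy with a Podcast".
print(proxy_reward(Interaction(watch_seconds=14.0, liked=True, shared=False, session_minutes=42.0)))
```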
With the objective function defined, the data scientist’s aim is simply to fine-tune the performance of the algorithm to get that high score. What if the meaning of the data changes, though? This is what data scientists call “concept drift,” and it is treated as a nuisance: when reality re-asserts itself within the simulation, the illusion is revealed and the high score becomes meaningless. The hyperreal “world” the data scientist had been studying and optimizing ceases to exist.
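Here is one minimal way a data scientist might watch for that nuisance; the windows, threshold, and simple mean-shift test are my assumptions, and real monitoring would be fancier. The idea is just to compare the recent distribution of the proxy reward against a reference window and flag when it moves.

```python
import numpy as np

def drift_score(reference: np.ndarray, recent: np.ndarray) -> float:
    """Crude concept-drift check: how far the recent reward distribution has
    moved from the reference window, measured in reference standard deviations."""
    return abs(recent.mean() - reference.mean()) / (reference.std() + 1e-9)

# Reference window: rewards from when the simulation still matched behavior.
reference = np.random.default_rng(1).normal(loc=50.0, scale=10.0, size=10_000)
# Recent window: users are bored; the same recommendations earn less watch time.
recent = np.random.default_rng(2).normal(loc=38.0, scale=12.0, size=2_000)

if drift_score(reference, recent) > 0.5:   # arbitrary threshold, for illustration
    print("Concept drift: the high score no longer means what it used to.")
```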
The human actors within this hyperreality, the TikTok “users,” can likewise re-assert themselves by changing their relationship to the metrics and recommendations that the platform shows them — yes, exactly like Neo from The Matrix.
The Wachowskis, who directed the film, were explicitly inspired by Baudrillard’s theories of hyperreality. But as the hat above shows, Baudrillard hated the film. The problem is phenomenological: using the standard cinematic conceit, the characters in The Matrix are shown experiencing an exact copy of reality. Their sensory experience, their eyes and ears, are indistinguishable from our actual embodied experience. This simulation is indeed all-powerful, as it is in the plot of the film, but that’s very much not the situation that Baudrillard describes.
There are real simulations in which we are all embedded today, but they’re still pretty far from eclipsing all of our direct, embodied experience of reality. We interface with these simulations through non-totalizing media objects like TikTok. And the two-dimensional smartphone screen, tinny speakers and quantified audience measures that comprise the sensory world of this simulation are not the world.
I think that TikTok has “overfished” its reservoir of creativity, that the rapid growth it achieved by pacifying audiences and Skinner-boxing creators disguises a fundamental weakness. TikTok’s Matrix is starting to glitch: the colors are becoming less saturated, fading to a dull gray. As powerful a force as THE ALGORITHM is, it can only act within this tiny simulation that relies on human attention. Agent Smith is scary, but he’s trapped in an app on your smartphone that you can delete in less than five seconds.
As promised: Flusser book club! Here’s a short-to-medium-length video introducing the theory he advances in Communicology.
If you’d like to join, head over to New Models’ page and subscribe at the “Community” tier, mentioning in your application that you’re interested in the Flusser reading group. New Models is one of the best things on the net, in my opinion, and I’m hoping that more people opt out of the platform simulations and into genuinely helpful communities like this one. If you want a taste, here’s a podcast I did with them last year.