Algorithmic Addiction
Imagine an algorithm where the sole purpose is to find you something you like, want, or need. Now ask yourself: How different is that from the algorithms you encounter on a daily basis?
The answer is more uncomfortable than you might expect. The algorithm you imagined bears little resemblance to today’s content recommendation engines, for a straightforward but unsettling reason: you’re not meant to feel satisfied.
If a user “gets it” and feels there’s nothing more to learn, they tend to leave the site. But that’s not the normal experience with today’s endless timelines; people spend hours and hours looking at the next piece of content. At most, we are supposed to feel a myopic catharsis that prevents us from investigating further.
If what you found satisfied you, the session would end. Understanding creates closure, which ends engagement (and engagement is when people are served ads, generate data, and make purchasing decisions). So, platforms are incentivized to prevent resolution. This creates a situation in which nothing is supposed to be genuinely explained.
And don’t get me wrong, there is content that fully explains things, at least as much as a person can in one go. However, again: once users have seen that content, they typically leave the site or app because they feel satisfied. This behavior signals the algorithm to stop recommending that content.
Today’s algorithms are not geared towards giving a user what they want; they are geared towards the user’s continued use.
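The difference between those two goals can be made concrete. Below is a minimal toy sketch, not any real platform's code; every name, field, and number is invented for illustration. It contrasts a ranker that optimizes for the user's predicted satisfaction with one that optimizes for predicted continued use:

```python
# Toy sketch (all names and numbers are hypothetical illustrations).
# Two ranking objectives applied to the same candidate items.

def rank_for_satisfaction(items):
    # Prefer the item most likely to resolve the user's need,
    # even though resolution ends the session.
    return max(items, key=lambda item: item["p_satisfied"])

def rank_for_engagement(items):
    # Prefer the item predicted to generate the most continued use,
    # regardless of whether it resolves anything.
    return max(items, key=lambda item: item["expected_minutes_on_site"])

items = [
    {"id": "complete-answer", "p_satisfied": 0.9, "expected_minutes_on_site": 2},
    {"id": "teaser-thread",   "p_satisfied": 0.2, "expected_minutes_on_site": 45},
]

print(rank_for_satisfaction(items)["id"])  # complete-answer
print(rank_for_engagement(items)["id"])    # teaser-thread
```

Same inputs, opposite winners: the engagement objective systematically surfaces the item that leaves the question open.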
What does that sound like to you? Cigarettes? Booze? Drugs?
To me, it sounds like the logic of addiction, and there is a wealth of behavioral research (and legal action) that supports this. To be clear, substance addiction is more serious, and a far more immediate and fundamental threat to a person’s well-being, but the similarities should not be ignored.
The same pattern that drives compulsive behavior in gambling or substance use appears here: unpredictable rewards, intermittent novelty, the constant possibility that the next thing will be the one that satisfies you. The algorithm doesn’t serve a meal; it serves hunger.
This isn’t a mystery of human weakness; it’s a business model. The same logic that drives casinos and free-to-play games drives social platforms. Most users provide data, but a smaller class of “whales” is cultivated whose behavior funds the entire ecosystem; in free-to-play games, roughly 1-2% of users can account for 50-70% of revenue.
The same principle applies to everything built on attention. A handful of extreme users, creators, and outrage engines generate most of the engagement that keeps platforms profitable. Algorithms quietly capture and train these whales, purely because their compulsive behavior is more valuable.
To sustain a few, you need the many. The whale model only works if millions of ordinary users keep the water churning, scrolling, posting, and reacting, creating the illusion of an infinite ocean.
But make no mistake: the goal is for everyone who will sink deeper… to sink deeper. The system obviously doesn’t merely depend on whales; it’s designed to create them. Each interaction is both bait and self-conditioning, teaching users to invest a little more time, a little more attention, a little more money.
It’s not just behavioral design; it’s class.
People understand that “The Media” is an establishment institution; they may not use these words, but they grasp that it is owned by capital and is an instrument of the ruling class. What people do not realize is that social media is a more advanced, individual-level evolution of mass media.
“Media” is the plural of the Latin “medium,” meaning “the middle, midst, center; interval.” Legacy media is very obviously a mediating force between you and the world; social media is less obviously so.
Social media isn’t merely a place where people share content, and it’s certainly not the people doing the sharing. Social media is the platforms. Social media is all the addictive features and characteristics previously mentioned; it’s the algorithms.
And the algorithms are not mystical forces nor are they beings of any kind. They do not have a will of their own. They are not ethereal; they are not an unexplained, natural force. Algorithms are machines that produce something; under the current paradigm, algorithms are productive property (a means of production).
The concepts of an algorithm or machine learning are neither good nor bad. They have the potential to be either, but a machine implemented to exploit people will do so regardless of its potential as a force for good.
This is to say, “algorithms” are not the problem (a point where, to be fair, the analogy with addictive substances breaks down). The people who own the algorithms, who spend capital producing them to a spec that trains us, via the logic of addiction, to endlessly “doomscroll,” are the problem.
And it’s not even really the people themselves. In most cases, these people are doing what is logical for someone in their position.
It’s the positions they occupy.
Imagine an algorithm whose sole purpose is to find you something you like, want, or need, rather than to keep you on a site or app as long as possible. This is the logic of abundance: all the bills paid, all the mouths fed, and a surplus of time not committed to simply surviving.
All of the shit people are endlessly sifting through and arguing about starts sounding like a different kind of problem, doesn’t it?


