How recommendation systems turned human curiosity into a closed loop.
Hey culturonauts—
Let’s slow this down. Because this story didn’t break all at once. It drifted into place, quietly, like a habit you don’t notice until it owns you.
There was a moment — a very real one — when the internet felt expansive. Not because it offered infinity, but because it offered accident. You could trip over an Icelandic jazz trio, stumble into a Korean indie film, fall headfirst in love with a painter from Nairobi who didn’t care about being “algorithm-friendly.”
Discovery wasn’t gamified. It wasn’t a KPI. It wasn’t domesticated. It just… happened. Like every good thing in culture always has.
And then the machines learned what we didn’t want them to learn: that curiosity is volatile, expensive, slow. Repetition is none of those things.
The Silent Reversal
The biggest cultural plot twist of the last decade is that the very systems built to liberate our taste ended up compressing it. At first, gently. Then aggressively. Then structurally.
Not by limiting choice — that would be too obvious. Too crude.
Instead, they limited surprise.
And surprise is the oxygen of human taste.
We didn’t notice the shift because the interfaces stayed friendly. The playlists got prettier. The thumbnails got cleaner. The recommendations got “smarter.”
But beneath that surface, something fundamental broke:
the relationship between what we like and what we could like.
That gap — that gorgeous, essential, maddening gap — is where culture grows.
And the algorithm closed it.
The Optimization Trap
Let’s name the villain correctly: optimization.
Algorithms didn’t collapse culture because they’re evil. They collapsed it because we asked them to keep us watching, listening, scrolling.
We handed them the keys and said:
“Show me more of what works.”
And the machine — obedient, literal, ruthlessly efficient — did exactly that.
It learned the shape of your attention.
Then it built a cage around it.
Spotify’s Discover Weekly turned into an echo chamber with better cover art. Netflix became a hall of mirrors, polished weekly. Instagram stopped showing you the world and started showing you a refined simulation of your previous week.
This is not personalization. This is pattern reinforcement.
And patterns do not evolve. They calcify.
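You can see the whole trap in about twenty lines. Here’s a deliberately crude sketch (the catalog, the genre tags, and the one-click user are all invented; no real platform is this simple): a recommender whose only rule is “show me more of what works.”

```python
import random

# Toy catalog: items tagged by genre. Everything here is invented
# for illustration; real systems use learned embeddings, not labels.
CATALOG = {
    "pop_hit_1": "pop", "pop_hit_2": "pop", "pop_hit_3": "pop",
    "icelandic_jazz": "jazz", "korean_indie_film": "film",
    "nairobi_painter": "art", "slow_cinema": "film",
}

def recommend(history, k=1):
    """Show more of what worked: only items whose genre the user
    has already engaged with are ever candidates."""
    liked = {CATALOG[item] for item in history}
    pool = [item for item, genre in CATALOG.items()
            if genre in liked and item not in history]
    return random.sample(pool, min(k, len(pool)))

history = ["pop_hit_1"]              # a single click seeds the loop
for step in range(3):
    picks = recommend(history)
    history += picks                 # the user takes what is served
    print(f"step {step}: served {picks}")

# The jazz trio, the Korean film, the Nairobi painter: none of them
# can ever enter the pool. The cage is the candidate set itself.
```

Run it and watch the pool empty out. The system never fails; it just runs out of you.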
Culture Under Compression
Here’s the part we underestimate: algorithms don’t just react to culture; they preempt it.
Artists now create with a third collaborator in the room:
- the algorithm’s taste for hooks in the first seven seconds
- the algorithm’s preference for familiar genre blends
- the algorithm’s impatience with ambiguity or slowness
- the algorithm’s obsession with clarity at the expense of texture
This creates a new kind of cultural gravity: everything pulled toward the median.
The bold gets softened. The strange gets normalized. The difficult becomes invisible.
We call it personalization, but what it really is — if we’re honest — is cultural flattening disguised as convenience.
And the tragedy is that we’re not losing quality. We’re losing range.
Prediction Is Not Understanding
Here’s the uncomfortable truth we avoid because it feels too philosophical for a Tuesday morning:
Algorithms do not model who you are. They model who you’ve already been.
You are not being understood. You are being reduced. Compressed into categories. Distilled into clusters. Smoothed into probabilities.
Taste, in its natural form, is chaotic. Contradictory. Incoherent. Embarrassing. Evolving.
The algorithm hates all of that.
Its job is not to explore you — its job is to stabilize you.
A stable user is predictable. A predictable user is profitable. A profitable user is easy to retain.
And retention is the only religion the algorithm believes in.
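If “compressed into categories” sounds abstract, here is roughly what it means in practice, as a minimal sketch assuming the standard cluster-assignment approach (the cluster names and weights are invented): your history becomes a frequency vector, the vector snaps to the nearest precomputed centroid, and from then on the system addresses the label, not you.

```python
import math

# Hypothetical taste clusters; real systems learn thousands of these.
CLUSTERS = {
    "mainstream_pop": {"pop": 0.9, "jazz": 0.05, "film": 0.05},
    "arthouse":       {"pop": 0.1, "jazz": 0.3,  "film": 0.6},
}

def profile(history):
    """Compress a viewing/listening history into genre frequencies.
    Note what survives the compression: counts. Nothing else."""
    counts = {}
    for genre in history:
        counts[genre] = counts.get(genre, 0) + 1
    total = sum(counts.values())
    return {g: c / total for g, c in counts.items()}

def assign_cluster(history):
    """Snap the user to the nearest precomputed centroid (Euclidean)."""
    p = profile(history)
    def dist(centroid):
        genres = set(p) | set(centroid)
        return math.sqrt(sum((p.get(g, 0) - centroid.get(g, 0)) ** 2
                             for g in genres))
    return min(CLUSTERS, key=lambda name: dist(CLUSTERS[name]))

# Every input is the past; the output is a label, not a person.
print(assign_cluster(["pop", "pop", "pop", "jazz"]))  # -> mainstream_pop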
The Death of Serendipity
Serendipity isn’t randomness. It’s an intrusion — a disruption — a friction that forces your inner world to expand.
Think about the last time you encountered something that didn’t fit your profile, but stayed with you anyway. That piece of music you weren’t supposed to like. That film you almost skipped. That book recommended by a human who didn’t know your “taste graph” and didn’t care.
That’s how taste evolves: through discomfort. Through contradiction. Through exposure to what sits outside your predicted perimeter.
But machines don’t like those edges. Machines optimize for consensus.
And consensus is where culture goes to die.
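The bitter footnote is that machine learning has a dial for exactly this. In the bandit literature it’s called exploration; epsilon-greedy is the textbook version, sketched below with invented item pools. The question was never whether the machines could surprise us. It’s where the dial sits when retention is the scoreboard.

```python
import random

POPULAR = ["pop_hit_1", "pop_hit_2", "pop_hit_3"]   # the consensus pool
LONG_TAIL = ["icelandic_jazz", "korean_indie_film",
             "nairobi_painter", "slow_cinema"]       # everything else

def recommend(epsilon=0.1):
    """Epsilon-greedy: with probability epsilon, serve something
    outside the predicted perimeter; otherwise serve the safe bet."""
    if random.random() < epsilon:
        return random.choice(LONG_TAIL)   # exploration: engineered serendipity
    return random.choice(POPULAR)         # exploitation: consensus

# epsilon=0 is the pure-retention machine: the long tail
# structurally cannot surface.
for eps in (0.0, 0.3):
    served = {recommend(eps) for _ in range(1000)}
    print(f"epsilon={eps}: saw {len(served & set(LONG_TAIL))} long-tail items")
```

Every tick upward on that dial is paid for in short-term engagement, which is the whole conflict of this essay in a single parameter.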
Reintroducing Friction (A Human Luxury)
If we want to reclaim our taste — really reclaim it — we need to reintroduce the very things our systems are engineered to delete:
- the off-note
- the difficult book
- the unmarketable film
- the song that takes three listens to open up
- the idea that challenges rather than soothes
- the artist who refuses to be legible
Discovery was never supposed to be convenient. It was supposed to be transformative.
And transformation never comes from the familiar.
We have to allow culture to surprise us again — to break our patterns, disturb our certainty, and remind us that curiosity is not a demographic metric but a human impulse.
Because a system that only feeds you what you already like doesn’t serve you.
It cages you.
Until next time, stay unpredictable.
Alex
At Kredo Marketing, we help companies think beyond the algorithm—designing strategies that protect curiosity, not just performance.