
Spotify Wrapped is a brilliant marketing play. Every year, millions of people gleefully share their top songs, favorite artists, and most-listened-to genres, essentially turning their personal data into free advertising for the streaming giant. But while it’s fun and feels personalized, it also sheds light on something deeper—and a little unsettling—about the world we live in today.
This year, my #1 song was “Boulder to Birmingham” by Emmylou Harris. It’s a beautiful, haunting tribute to Gram Parsons, her mentor, who died of a drug overdose. My wife and I sing along every time it plays. The thing is, I never asked Spotify to play that song. Not once. And yet, it kept showing up in my mix, again and again. Along with it were several John Prine tracks I didn’t seek out either. In fact, I didn’t actively choose any of the top songs Spotify says I loved this year.
This might sound like a minor quirk in an otherwise delightful digital experience, but it’s actually symbolic of a much larger issue. Increasingly, the world around us is being curated not by us, but for us, by algorithms that interpret our past behavior and then decide what we should see, hear, and engage with next.
Sure, this applies to music. But it also affects our news feeds, our product recommendations, our search results, our social media content, the ads we’re exposed to, and even the lies we’re told. The more we click, the more the algorithm “learns” about us. And the more it learns, the narrower our world becomes.
This is the filter bubble in action. Over time, our exposure to new ideas, unfamiliar perspectives, or even just different kinds of content diminishes. We’re not discovering anymore; we’re being fed what the machine thinks we already like, or worse, what will keep us clicking.
On the surface, this kind of personalization feels convenient. But in practice, it can be dangerously limiting. It traps us in echo chambers, reinforces existing biases, and makes it harder to challenge our assumptions or grow intellectually and emotionally. When the only ideas we hear are the ones we already agree with, how do we grow?
This isn’t just a tech problem. It’s a human one. And it has consequences that reach far beyond our Spotify playlists. It’s affecting how we think, how we relate to others, and how we understand the world. It’s part of what’s driving polarization, misinformation, and a culture of outrage. We’re not just being shaped by what we consume; we’re being shaped by what we’re allowed to consume.
It doesn’t have to be this way.
The good news is that we still have agency. We can choose to seek out opposing views. We can read books from outside our usual genres. (I'm not an architect, but Christopher Alexander is one of my favorite authors.) We can listen to podcasts that challenge our thinking. We can actively resist the passivity that algorithms encourage. But we must do it intentionally.
Because left to their own devices, the algorithms will not feed us what we need—they will feed us what keeps us comfortable, entertained, and clicking.
I didn’t always see it this way. For a long time, I appreciated the convenience of curated content. But now I’m convinced: algorithms, as they are currently used, are making us intellectually lazy. They’re dulling our curiosity. They’re making us stupid—not in terms of raw intelligence, but in terms of awareness, perspective, and growth.
So here’s my challenge to myself—and maybe to you too: go out and discover something. Don’t wait for it to be served up by a machine. Curate your own experience. Choose what you want to see, hear, and learn next.
Because if you don’t, someone else, or something else, will do it for you. And you might just find yourself singing along to a song you never chose in the first place.
