In the introduction to her book ‘If You Liked That, You Will Love This: On Sameness-Based Algorithmic Recommendation Systems,’ Susana Tosca retells the Greek myth of Echo.
According to the Roman poet Ovid, Jupiter needed to distract his wife, Juno, so he could pursue romantic adventures with the nymphs he admired. Echo, a nymph, was charged with holding long conversations with Juno to keep her occupied.
Juno found out about Jupiter’s scheme. Livid, she stripped Echo of her voice and of the ability to form her own thoughts. Echo’s frustrating fate was to repeat only the sounds and last words of others.
For Tosca, Echo’s fate is our own: we are condemned to an algorithmically driven echo chamber of people who are just like us.
Given how easily we can connect with people living very different lives from our own, we should be in an age of constant newness. Yet cultural critics seem to think we are, instead, becoming less interesting.
Brand consultancy strategist Beth Bentley argues that it’s more difficult than ever to put “new, original, interesting, provocative, strange, challenging things out into the world.”
Why do algorithms favor the familiar over the innovative? Because people engage with the palatable more easily, and more quickly.
According to Bentley, Meta’s Q3 2024 report notes how Instagram prioritizes recommending content that’s considered “high-engagement format alignment.”
What does “high-engagement format alignment” even mean? Essentially, that the platform promotes content in formats that are already performing well.
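The logic Bentley describes can be sketched in a few lines of toy code. Everything below is a hypothetical illustration, not any platform's actual system: a feed that ranks posts by the past engagement of their format will keep resurfacing whatever format already performs, which is exactly the self-reinforcing sameness at issue.

```python
# Toy sketch of engagement-first ranking (purely illustrative; the data and
# scoring are invented, not Meta's real code). Formats that already perform
# well get boosted, so the feed converges on more of the same.

def rank_feed(posts):
    """Sort posts by the average past engagement of their format."""
    format_scores = {}
    for p in posts:
        format_scores.setdefault(p["format"], []).append(p["engagement"])
    avg = {f: sum(s) / len(s) for f, s in format_scores.items()}
    # Higher-performing formats rise to the top of the feed.
    return sorted(posts, key=lambda p: avg[p["format"]], reverse=True)

posts = [
    {"id": 1, "format": "dance-trend", "engagement": 9.0},
    {"id": 2, "format": "dance-trend", "engagement": 8.0},
    {"id": 3, "format": "experimental-film", "engagement": 3.0},
]
for p in rank_feed(posts):
    print(p["id"], p["format"])
```

In this sketch the experimental post always sinks to the bottom, not because it is worse, but because its format has no engagement history to reward.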
Fashion industry executives are using algorithms and AI tools to predict what’s next for fashion, effectively reducing designers’ creative capacity to reproducing whatever the data shows has been most profitable.
By investing in what’s familiar rather than risking anything new and original, fashion is betting on our long-term commitment to sameness.
A survey conducted by the consultancy firm Deloitte found that over 70% of Gen Z customers prefer to buy products they already recognize.
Bentley says it’s not just algorithms, it’s a social “meh-chinery” that “rewards conformity over creativity,” rewarding high-performing sameness.
In fact, Bentley points to the Pantone Color Institute’s choice for color of the year. The color, PANTONE 17-1230 Mocha Mousse, is a very muted pinkish-brown.
The choice lives up to the Institute’s own description of the hue, which it says answers “our desire for comfort.” Indeed, the color is at the very least uninspiring, and at most a placation of any aesthetic challenge in favor of “comfort.”
Likewise, the songs in Spotify’s Top 100 of 2024 scored as more similar to one another than in any previous year in the company’s history.
It’s clear that this repetitive reproduction of culture is not just homogenizing consumers, which is likely to have unfavorable societal repercussions; it’s also making it particularly challenging to come up with interesting and innovative concepts.
The selling point of AI is its potential to push us into a new age of technological advancement that will radically change how we live, think, and engage with the world, but it's becoming a machine of reproduction.
The epic tool of creative possibility, with its vaunted capacity to create “imaginary worlds,” behaves more like a talking parrot, mimicking derivatives of the same.
AI functions on the basis of machine learning: it takes human input and, more often than not, rehashes it. Unless AI can independently innovate, and sell that innovation to consumers, it’s hard to imagine it pushing us into a future that’s different, inspiring, or particularly interesting.
A visit to your local hipster café confirms the sentiment. Flat whites and digital nomads on the Wi-Fi chomping down avocado toast tell us a lot about how our online behavior is directly reflected in our social realities.
Since politics and art, and thus aesthetics, cannot be disentangled from one another, how we live and what we value are certainly shaped by what we consume, why, and how.
French sociologist Pierre Bourdieu argued that people’s taste is largely class-based; taste is, in Tosca’s reading of Bourdieu’s work, “a question of belonging.”
Perhaps the online world eroded that sharp differentiation between classes, since so many cultural products are now a click away. But the algorithm still groups people together based on their online behavior, location, and tendencies.
This means that recommendations tailored to you imply not only that the algorithms have deep insight into your specific personal taste, but also that their choices contribute to your personal fulfillment or happiness.
Tosca makes an important distinction between algorithmic suggestions and those emerging from your friend group. The difference, she notes, lies in “scale and filter.”
Your friends, for example, will only tell you about cultural products that left significant impressions on them. They’ll likely share their recommendations, in addition to their critiques.
The algorithm may suggest a cultural product, a film on Netflix, say, that seems appealing, helping us cut through an overwhelming mass of options. But, as Tosca rightly argues, that doesn’t mean its suggestions are actually suited to our personal taste.
Tosca notes that there is a “mutual domestication” between us and the algorithm: we tell it what we like, and it continuously redraws ever-narrower boundaries around its perception of our taste.
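That feedback loop can be made concrete with a small simulation. This is an illustrative assumption about how such loops behave in general, not a model taken from Tosca's book: each time the system recommends a genre and the user clicks, the system's belief in that genre strengthens, so what it recommends keeps narrowing.

```python
import random

# Toy simulation of a recommendation feedback loop (hypothetical genres and
# weights, invented for illustration). A click reinforces the system's belief,
# a rich-get-richer dynamic that shrinks the range of what gets recommended.

def simulate(rounds=50, seed=0):
    rng = random.Random(seed)
    weights = {"jazz": 1.0, "pop": 1.0, "folk": 1.0, "noise": 1.0}
    for _ in range(rounds):
        total = sum(weights.values())
        # Recommend a genre in proportion to the system's current beliefs.
        pick = rng.choices(list(weights), [w / total for w in weights.values()])[0]
        weights[pick] += 1.0  # the user's click reinforces that belief
    return weights

final = simulate()
top = max(final, key=final.get)
print(f"{top} now accounts for {final[top] / sum(final.values()):.0%} of inferred taste")
```

Even though all four genres start with equal weight, early random clicks compound, and one genre ends up dominating the recommendations: a simple picture of taste boundaries being drawn ever tighter.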
Sources: (‘If You Liked That, You Will Love This: On Sameness-Based Algorithmic Recommendation Systems’) (Sociology of Business) (D1A) (FlowingData)
Is popular culture making us less interesting? Brand consultancy strategist Beth Bentley, founder of Tomorrowism, seems to think so. In what Bentley calls "innovation stagnation," she argues that there's a "cultural flattening" of sorts happening in which our engagement with social media and the way algorithms shuffle information to users is ultimately resulting in a sort of general sameness.
Not accounting for rich cultural diversity, we're left with a rather sad, one-dimensional sense of bland repetition in which design, style, and even opinions, seem to exist in an echo chamber. Aesthetically and intellectually, we are gravitating more and more toward what is easy to consume, refusing any and all attempts to broaden our perspectives and putting personal taste to the wayside in favor of what's familiar.
Cultural meh-fication: why everything looks the same