If you live in modern times you've probably experienced Algorithm Anxiety—the feeling that your hard work to tailor a personalization engine will be easily undone. Engines that govern content recommendations are tailored by our interactions, even micro-interactions, whose signals are so plenteous they cannot be numbered.

I faced this feeling recently. I was accidentally logged in to Spotify on the family iPad, and my beautiful nine-year-old kiddo decided to marathon CG5, BTS, and a bevy of Minecraft- and Among Us-themed music.

It took me 4 years to perfect my Spotify.

And it took my daughter a single afternoon to ruin it. My recommendations are now irreparably damaged.

The paranoia associated with Algorithm Anxiety often leads us to depend more on the algorithm than on human recommendations, which leads to isolation. Our dependency on the recommendation engine has made us conscious of an ever-watchful, seemingly omniscient algo, leaving us with a latent, low-humming, unceasing anxiety.

The result? We're less experimental, hesitant to take suggestions, and inevitably become trapped in a genre prison.

Seeding and Serendipity

The average Spotify subscriber is acutely aware that their past listening habits shape future recommendations, but they may not realize the depth of the data Spotify has on them. Spotify doesn't just track what you "heart," or like, but also how much of a song you listen to before skipping, as well as genre affinity.

Spotify is merely good at this; Instagram perfected the art of behavioral analysis in its recommendation engine.

Your micro-interactions give off signals to the algo about your nuanced tastes. The engine then takes that data and begins to feed you suggestions based on the behavior of other people, often people whose tastes resemble your own.

This process, called seeding, is tested against a control: your normal, everyday behavior. Based on your interactions, new information is gained, not just about your tastes and interests, but about how susceptible those tastes and interests are to manipulation. Often, lookalikes aren't fully anonymized. If the algorithm in question knows about your social graph (e.g. you logged in with Insta or FB), it may take the behavior of your closest contacts into account.
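None of these systems are public, but the basic mechanics can be sketched. Here is a toy illustration, not Spotify's or Instagram's actual implementation, with every track name and weight invented, of how implicit signals like completion-before-skip might feed a "lookalike" recommendation:

```python
import math

# Toy implicit-feedback profiles: fraction of each track listened to
# before skipping (1.0 = played to the end). All data is invented.
profiles = {
    "you":      {"track_a": 0.95, "track_b": 0.10, "track_c": 0.80},
    "stranger": {"track_a": 0.90, "track_c": 0.85, "track_d": 0.99},
    "kiddo":    {"track_b": 1.00, "track_e": 1.00},
}

def cosine(u, v):
    """Cosine similarity over the union of tracks two users have played."""
    tracks = set(u) | set(v)
    dot = sum(u.get(t, 0.0) * v.get(t, 0.0) for t in tracks)
    norm = math.sqrt(sum(x * x for x in u.values())) * \
           math.sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

def seed_recommendations(user, profiles):
    """Recommend tracks the most similar *other* listener finished,
    which `user` has never played -- the 'lookalike' step."""
    others = {name: p for name, p in profiles.items() if name != user}
    lookalike = max(others, key=lambda n: cosine(profiles[user], others[n]))
    return [t for t, score in profiles[lookalike].items()
            if t not in profiles[user] and score > 0.5]

print(seed_recommendations("you", profiles))  # -> ['track_d']
```

Your skip of `track_b` pushes you away from the kiddo's profile and toward the stranger's, so the stranger's finished tracks become your suggestions. Production systems add many more signal types, but the shape of the inference is the same.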

In Episode 105 of Future Commerce we discussed how Instagram's engine shapes taste over time:

Phillip: [00:09:07] If that's something that caught your eye [Instagram] knows contextually [that you scrolled back to it, and paused...] so they can feed more information to you and they can reinforce that over and over and over.

It's the reason why I believe I am into sneakers. It's not because... I've loved sneakers my whole life. That happened when I was 35. It's because Instagram kept shoving it down my throat.

Maybe I'm happy being a sneakerhead. Maybe. Or maybe it's the reason that I buy anything nowadays.

This awareness that the algorithm is "watching" invites a well-documented phenomenon called the Hawthorne effect, in which people behave differently because they know they are being watched.

I often think about a tweet from 2018, about a Lyft driver offering a form of praise to an unknown force, the algorithm, for its providence:

What we gain in experience, we trade off in a background static of worry and paranoia about ruining some perceived gain or progress: a squandered time investment made in "perfecting" the engine. When the engine works in our favor, it feels like a divine blessing. When it works against us, it feels like punishment.

It seems to me, then, that Algorithm Anxiety has symptoms and effects similar to those of psycho-spiritual stressors. We've created new invisible gods whose blessings are fleeting and whose curses are everlasting. Except instead of life eternal, we discover more bands like Anderson.Paak.

Creating new gods to please. How very human of us.

This week I realized the power of my chronophobia over not just my Spotify algorithm, but over all the algorithms that power my experiences. These include, but are not limited to:

  • My TikTok FYP
  • Ad recommendations on various websites
  • Relevant Google News article push notifications
  • Instagram Explore
  • My Poshmark style

The more "artificially intelligent" the world becomes, the more cognitive weight it places on the consumer to continuously maintain a garden that a single pest could destroy.

Algorithmic Predestination

The Roman poet Ovid wrote of Pygmalion, a sculptor who became so enamored with his own creation that he begged the gods to bring her to life, and they did.

Have you ever found it curious that science fiction writers of the early 1900s were eerily accurate in their depiction of the future? It's no coincidence. The Pygmalion Effect would suggest that we built that future on purpose, based on our foreknowledge, making science fiction into science fact. The technologies we have today are the fulfillment of work of creative imagination that took place in prior decades.

High-speed rail, global telecommunications, pocket-sized touch screen devices—these are all examples of The Pygmalion Effect.

The Pygmalion Effect is a form of other-imposed self-fulfilling prophecy: the way you treat someone has a direct impact on how that person acts. If another person thinks something will happen, they may consciously or unconsciously make it happen through their actions or inaction.

We're living out the Pygmalion Effect today with current talk of the Metaverse. While Instagram has the power to shape consumer desire, Meta chief executive Mark Zuckerberg has the power to shape where his competitors make investments. Meta, the rebranded parent company of Facebook, is making investments in VR and ancillary technologies, virtually willing the Metaverse into existence.

Figure 1: The Pygmalion Effect in action.

When thinking about these past ten months of 2021—the collective excitement around crypto, the bull market in NFTs, the hordes of people using services like Discord to organize, and the rise of Decentralized Autonomous Organizations and the new breeds of corporations they enable—the pace of adoption, and the appetite for disruption among developers and community members, is astonishing. Their efforts have pushed Zuckerberg into a full-on change of narrative, and he's betting the future of his company on it.

The Metaverse becoming self-fulfilling, based on external stimulus, is analogous to a 1968 study performed by psychologists Rosenthal and Jacobson. They conducted an experiment to see whether student achievement could be self-fulfilling, based on the expectations of their teachers. From Simply Psychology:

Rosenthal and Jacobson gave elementary school children an IQ test and then informed their teachers which children were going to be average and which children were going to be ‘Bloomers’, the twenty percent of students who showed “unusual potential for intellectual growth”.

However, unknown to the teachers, these students were selected randomly and may or may not have met those criteria. After eight months, the researchers returned and retested the children's intelligence.

The results showed that the Bloomers (the experimental group) had significantly higher IQ gains than the average students (the control group), even though the academic bloomers were chosen at random. The bloomers gained an average of two IQ points in verbal ability, seven points in reasoning, and four points in overall IQ.

The Twitter algorithm may be to blame for Zuck's Meta-pivot.

For years, the Twitter algo has been uniquely tuned to provide contextual suggestions for tweets from people outside a user's immediate network. But during Covid, lockdowns and stimulus contributed to an unusual acceleration of interest in speculative assets like crypto, and to the aggregation of adjacent communities. Crypto suggestions began to fill timelines, and by the summer of 2021, DTC Twitter became NFT bull market Twitter. Ecommerce Twitter avatars changed to PFP (profile pic) images of cats and monkeys. VC Twitter turned to pumping various PFP projects. The general tulip mania of "internet money" achieved escape velocity, and the Twitter algorithm created the environment that we're now in.

What I witness now is a general anxiety of being left behind as a technological revolution takes hold. If you spend any time on Twitter, you've no doubt witnessed the NFT bull market in motion; and you may have begun some research on Google, or joined a Discord, or participated in a Twitter Spaces discussion to learn more about web3 and NFTs. If so, you've participated in training a host of algorithms to proliferate the sense of rapid adoption, the need to become educated, the FOMO, and eventually, "aping in".

The metaverse as a concept is so broad that it's effectively all-consuming; as it can mean community, economy, currency, pseudo-anonymity, corporate and governance structures, and even play-to-earn video games that double as a form of income and work.

In our version of this reality, Meta (the company) will bring the metaverse (the thing) to consumers whether they're ready for it or not, based on the external stimuli of an incredibly passionate and vocal community of crypto and DeFi enthusiasts.

Why? All because of algorithmic predestination.

The Meta-verse and Algorithm Anxiety

To be fair, the bet seems like a sound strategy. In the shift to mobile devices, Facebook ceded its fate to carrier networks and device manufacturers; namely, Apple, Google, and Samsung. Now, through its ownership of Oculus, it is positioning itself to own the hardware gateway to the Metaverse, with AR and VR as a meta-layer on top of the real world.

Meta's attempt to own both the devices and the platform seems counter to the narrative that makes most Web3 proponents so bullish. To date, the crypto community relishes the ability to remain anonymous. VC firms are investing in pseudonymous founders of crypto projects. This is fundamentally at odds with Facebook's view of the metaverse: real identities, and real people, in a virtual world.

Facebook has survived at least three fundamental shifts in technology in just twenty years: it won social media, it survived the smartphone wave, and it changed culture forever (for better or worse). Now, for it to survive the next cultural shift, it must fulfill our predetermined future.

This poses a problem for people suffering from Algorithm Anxiety.

The awareness of being watched, and of our actions tattling on us, is shaping how we spend money and who we interact with online, and it is eroding our tolerance for risk. It's changing how we interact socially. Without controls in place to allow us to have behavior "modes"—the ability to take on pseudonymous personalities while in the metaverse—we run the risk of having our virtual-world experiences shape our real-world behavior. Algorithm Anxiety could intensify and spill over IRL.

Awareness that these apps passively alter our behavior is the first step for brand, eCom, and business leaders toward creating more equitable experiences for customers.

Platforms like Spotify lack simple user controls to prevent Algorithm Anxiety—for instance, the ability to clear browsing data from a particular window of time. Heck, let's put Incognito Mode into everything. Isn't that what it's for? So that we're not tracked when we're behaving outside of the norm?
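No such control exists today, but the mechanics are not hard to imagine. A minimal sketch, assuming a hypothetical listening log (this is not any real Spotify API; the data model is invented), of rebuilding a taste profile while excluding a window of time:

```python
from collections import Counter
from datetime import datetime

# Hypothetical listening log: (timestamp, genre) pairs. Invented data.
history = [
    (datetime(2021, 10, 1, 9, 0),   "indie rock"),
    (datetime(2021, 10, 16, 14, 0), "minecraft parody"),  # the kiddo's afternoon
    (datetime(2021, 10, 16, 15, 0), "k-pop"),
    (datetime(2021, 10, 20, 8, 0),  "indie rock"),
]

def taste_profile(history, exclude_start=None, exclude_end=None):
    """Genre counts, optionally ignoring plays inside an excluded window."""
    kept = [
        genre for ts, genre in history
        if not (exclude_start and exclude_end
                and exclude_start <= ts <= exclude_end)
    ]
    return Counter(kept)

# "Clear" the afternoon the iPad was logged in to the wrong account:
profile = taste_profile(
    history,
    exclude_start=datetime(2021, 10, 16, 0, 0),
    exclude_end=datetime(2021, 10, 17, 0, 0),
)
print(profile)  # -> Counter({'indie rock': 2})
```

The point isn't the implementation; it's that a one-afternoon pest could be excised from the garden with a single, user-facing control.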

The stakes are low when our children ruin our musical suggestions. The stakes are much higher when we make business decisions in VR on Facebook Horizon.