How the Algorithmic Gaze is Leading to the End of Human Autonomy

Exploring the Creation of Panopticism, the Problem with Behavioral Prediction, and the Destruction of the Bourgeoisie and the Proletariat with AI

A few days ago, my 13-year-old brother and I talked about adopting a puppy. Later that night, as I was scrolling through Instagram, I saw ads for pet toys, which really creeped me out. The creepy part wasn't the pet toys themselves, but the realization that my phone seemed to have been listening to our conversation.

Have you ever felt that your phone was listening to you too?

These eerie coincidences are not just random. They are part of a grand design to predict and influence your behavior, almost like a Panopticon.

The Panopticon: A Modern Reincarnation

The concept of the Panopticon was developed by the English philosopher Jeremy Bentham in the late 18th century. It featured a circular building with a central watchtower surrounded by inmate cells. A single guard could observe all inmates without them knowing if they were being watched at any given moment.

Panopticon blueprint by Jeremy Bentham, 1791. (Courtesy: researchgate.net)

This design created a psychological effect. The constant possibility of being observed was meant to encourage inmates to regulate their own behavior. Never knowing when they were being watched compelled them to internalize the gaze of authority.

Today, we live in digital Panopticons. We are under the constant surveillance of not just a single guard, but millions of algorithms designed to predict and influence our actions. Social media platforms were already quietly predicting our behavior; AI has taken this to new heights with Reinforcement Learning from Human Feedback (RLHF).

RLHF allows AI models to learn directly from human preferences, helping them align more closely with human behavior and expectations. As these models become more accurate, they can just as easily be exploited to serve hyper-personalized content that distorts our realities.
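
For readers curious about the mechanics, here is a minimal, hypothetical Python sketch of the preference-learning step at the heart of RLHF: fitting a reward model so that the response a human preferred scores higher than the one they rejected (a Bradley-Terry style objective). The embeddings, network sizes, and data below are invented for illustration; real systems fit this model on top of a large language model and then use it to steer the model's behavior.

```python
# Minimal, illustrative sketch of reward-model training from human preference
# pairs, the core of RLHF. All names and numbers here are hypothetical; random
# feature vectors stand in for real text embeddings.
import torch
import torch.nn as nn

class RewardModel(nn.Module):
    """Maps a response embedding to a single scalar reward."""
    def __init__(self, dim: int):
        super().__init__()
        self.scorer = nn.Sequential(nn.Linear(dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.scorer(x).squeeze(-1)

def preference_loss(reward_chosen: torch.Tensor, reward_rejected: torch.Tensor) -> torch.Tensor:
    # Bradley-Terry objective: maximize the log-probability that the
    # human-preferred response outscores the rejected one.
    return -torch.nn.functional.logsigmoid(reward_chosen - reward_rejected).mean()

# Toy "dataset": embeddings of chosen vs. rejected responses (hypothetical).
dim, n_pairs = 16, 256
chosen = torch.randn(n_pairs, dim) + 0.5
rejected = torch.randn(n_pairs, dim) - 0.5

model = RewardModel(dim)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(200):
    loss = preference_loss(model(chosen), model(rejected))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# In a full RLHF pipeline, this fitted reward model would then be used to
# optimize the AI's outputs toward whatever humans clicked "prefer" on --
# which is exactly the lever this essay is worried about.
print(f"final preference loss: {loss.item():.3f}")
```

The same feedback loop that makes a model more helpful is also what makes it more persuasive: it learns, pair by pair, what keeps us engaged.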

We've stopped questioning whether this should even exist; instead, we now actively incentivize RLHF as the way to train AI models. This short-sighted approach is like drinking poison just because it tastes sweet.

We’re embracing technology with far-reaching consequences without examining its long-term impacts on society and our autonomy.

The Noble Lie: The Illusion of Freedom

The tale doesn’t stop at behavior prediction. The rise of social surveillance has given birth to a new phenomenon: choice hacking.

What possible explanation could justify taking away choice and autonomy from another human being?

Orestes Pursued by the Furies by William-Adolphe Bouguereau (shown in a black-and-white filter), 1862. It depicts a dramatic scene from Greek mythology: Orestes, the son of Agamemnon, tormented and chased by the Furies (Erinyes), the ancient goddesses of vengeance. The Furies pursue Orestes because he killed his mother, Clytemnestra, to avenge his father's murder. The artwork captures intense guilt and fear, highlighting the psychological torment of conscience. (Courtesy: Wikipedia)

Plato thought about it deeply and introduced the “Noble Lie” within the context of his ideal society, the Kallipolis, in "The Republic."

The Noble Lie is a myth propagated by rulers. It asserts that people are born with different types of metal in their souls, determining their social roles as rulers (gold), warriors (silver), or producers (bronze/iron).

This myth promotes social harmony and ensures stability. It's called "noble" because its intention is to achieve a greater good. However, it undermines individual autonomy and promotes deception.

Today, we live in an age of the Noble Lie. We are given the illusion of freedom, but our choices are being engineered.

Our predictability is exploited with subtle nudges, targeted ads, personalized content, and neuromarketing tricks. As globalization pushes the world into a more homogeneous state, the exploitation of consumer behavior leaks into our political opinions, social interactions, and even our self-perception.

In the process, we become products under the illusion that we are consumers. It's a classic Master-Slave dialectic!

But how does this affect the world we live in?

The Economic Impact: A New Class Divide

In 1848, Karl Marx and Friedrich Engels published "The Communist Manifesto," framing the struggle between the bourgeoisie and the proletariat while critiquing capitalism.

The bourgeoisie is the class that owns the means of production, such as factories, land, and capital. They benefit from the labor of others. The proletariat is the working class who do not own the means of production and must sell their labor to survive. They are the wage laborers in factories, mines, and other industries.

Fun fact: Karl Marx loved storytelling and often entertained his children with tales and jokes. He loved nicknames, and his friends called him "Moor". (Image courtesy: Britannica)

Today, big tech companies have become the new bourgeoisie, controlling the digital means of production. The proletariat, in this context, are not just workers, but users of these platforms whose data and attention are the commodities being traded.

Basic skills are being rendered obsolete as machines take over roles in customer service, content creation, and even emotional labor. The proletariat's value is diminishing as their labor is co-opted by algorithms that can perform their jobs more efficiently and at a lower cost, while their data is effectively stolen from them.

So with the yin-yang dissolving into shades of gray, where is this world headed?

The Evolution of Labor: From Physical to Emotional

Historically, human labor was predominantly physical. Our ancestors exerted immense physical energy to survive. As societies evolved, so did the nature of work.

Intellectual labor began to replace physical labor, emphasizing cognitive abilities. Problem-solving, creativity, and analytical skills became paramount, driving us into a knowledge economy.

Today, we find ourselves in a new era where emotional labor has emerged as a critical component of the workforce. The ability to understand, manage, and effectively express emotions is increasingly valued. From customer service to leadership, emotional intelligence is becoming a cornerstone of work.

The Rock Drill by Jacob Epstein (1913). It combines human and machine elements, showing the blend of man and technology. While machines are becoming more and more like us, we, too, are turning more like them with trends like cyborgism. (Courtesy: blarse)

Paradoxically, while emotional intelligence is gaining recognition, a contradictory trend is emerging with AI companions and chatbots. Designed to provide emotional support, they diminish the value placed on human connection and empathy. This raises profound questions about the future of work, relationships, and the essence of human experience.

The question then becomes: if we've already commoditized physical, intellectual, and emotional labor, what else can a human offer? How far are we from having nothing left to capitalize on? What is the meaning of creating “value” then? And what “value” can a human really offer? What does it even mean to be human?
