Noetic Sovereignty in the Age of Experience Machines

 

In 1974, the philosopher Robert Nozick posed a thought experiment about a neurostimulation device that could perfectly simulate any imaginable experience. This device, which Nozick dubbed the Experience Machine, could simulate the experience of writing a great novel, making friends, or being a celebrity. The Experience Machine offers its users boundless simulated pleasures so convincing that the user believes they are really happening. The question Nozick posed is: “Should you plug into this machine for life, preprogramming your life’s experiences?”

Nozick’s Experience Machine was an argument against ethical hedonism, the widely held moral view that pleasure is the ultimate good and that nothing improves a person’s well-being except pleasure. At the time Nozick was writing, the Experience Machine was purely hypothetical. The core technologies required to bring it to life—hyperreal computer-generated simulations and precise neurostimulation devices—were decades away from commercialization.

But as we approach the 50th anniversary of Nozick’s famous thought experiment, the Experience Machine has evolved from a philosophical problem into a practical one. Today, a growing number of devices combine virtual reality with neurofeedback systems to deliver fully immersive virtual experiences that are increasingly indistinguishable from reality. Although their simulations are not yet as seamless as Nozick’s Experience Machine, their designers clearly aspire to this lofty goal. For now, they are mostly used as medical devices, but they are rapidly expanding into the consumer market, as demonstrated by Apple’s recent unveiling of the Vision Pro.

The question facing users of these devices is slightly different from the one Nozick posed half a century ago. Whereas Nozick imagined that users would be able to choose and “preprogram” their Experience Machine, the experience machines being built today deny their users even this pretense of agency. What Nozick failed to anticipate in his thought experiment was the arrival of generative AI, which will dynamically generate simulated experiences tailored to each individual user based on their neurological and physiological data. These real-world Experience Machines, in other words, will entirely determine the user’s subjective conscious experiences (i.e., qualia).

These real-world Experience Machines did not spring from nothing. We can see traces of their engineering and design philosophy in technologies as disparate as smartphones, social media, and television. For decades, consumer technologies whose primary purpose is serving content have perfected the art of engagement. A large body of research shows that one reason television, social media, and video games are so engaging is that they activate the brain’s mesolimbic pathway: the dopaminergic system that is largely responsible for our feelings of pleasure. VR devices enhanced with generative AI and neurofeedback systems are the logical end state of this technological paradigm: by monitoring a user’s brain state, the device can generate hyperrealistic content perfectly tailored to maximize that individual user’s pleasure.
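
For technically minded readers, here is a deliberately simplified sketch of that closed-loop paradigm. Every function name and number below is hypothetical; it does not describe any real device’s API, only the measure-generate-render cycle in which the user figures solely as a signal to be optimized against.

```python
"""Hypothetical sketch of the closed-loop engagement paradigm described above.
Nothing here corresponds to a real device API; all names and values are illustrative."""

import random


def read_reward_signal():
    # Stand-in for a neurofeedback measurement of reward-system activation.
    return random.random()


def generate_scene(reward_history):
    # Stand-in for a generative model that conditions the next scene on the
    # user's measured responses, with engagement as its only objective.
    recent = reward_history[-10:]
    bias = sum(recent) / max(len(recent), 1)
    return f"scene tuned toward measured reward level {bias:.2f}"


def engagement_loop(cycles=5):
    """Measure -> generate -> render -> repeat. The user appears in this loop
    only as a source of signals to optimize against, never as an author of content."""
    history = []
    for _ in range(cycles):
        history.append(read_reward_signal())
        scene = generate_scene(history)
        print("rendering:", scene)


if __name__ == "__main__":
    engagement_loop()
```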

The problem with these Experience Machines is that they deny the user’s noetic sovereignty: their inalienable right to determine the content of their own consciousness. Although path dependency in the evolution of technology is real, it is not inescapable. We must insist on protecting our noetic sovereignty in the coming age of experience machines, which means exploring alternative technological paradigms that respect the user’s right to determine the content of their conscious experience.

Noetic sovereignty is a concept downstream of the broader movement for neurorights and cognitive liberty. It is critically important to recognize that noetic sovereignty is not an argument for Luddism in the face of emerging technologies such as neurodevices, generative AI, and computer simulations. These technologies are already having a profound and largely positive impact on the world. Instead, noetic sovereignty rejects the confluence of these technologies in devices that deny the user agency in determining the content of their consciousness.

Noetic sovereignty is also not a reaction against the use of technology to induce desirable brain states in users. Consider, for example, the recent research from a team at the University of Arizona that used transcranial focused ultrasound to induce deep meditative states. This is a positive use of neurostimulation: the user is aided in achieving a specific brain state but remains ultimately in control of the content of their conscious experience. By way of analogy, this type of neurostimulation is like giving the user a window onto the world (i.e., a brain state), while the user still decides where to direct their attention within that field of view (i.e., the content of their conscious experience).

At Prophetic, we believe that using AI and neurostimulation technologies to induce and stabilize lucid dreams can retain all of the positive aspects of Experience Machines without violating the user’s noetic sovereignty. A user who chooses to induce a lucid dream is opting into a well-defined brain state, but they remain in control of the conscious content of that experience. Unlike a device that collects neurodata to generate content designed to target the user’s reward system and keep them engaged for as long as possible, a lucid dream is a naturally time-limited and unmediated phenomenon in which the user creates the content of their own conscious experience.
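
For comparison, here is an equally hypothetical sketch of a device designed around noetic sovereignty: it helps the user reach a brain state they opted into and then steps aside, generating no content. The function names are illustrative only and are not a description of Prophetic’s actual system.

```python
"""Hypothetical sketch of a sovereignty-respecting design: assist entry into an
opted-into brain state, then get out of the way. Illustrative names only."""


def user_opted_in():
    # The session exists only because the user explicitly chose it.
    return True


def detect_rem_sleep(signal_window):
    # Stand-in for a sleep-staging model; here, a trivial threshold on a fake feature.
    return signal_window.get("rem_likelihood", 0.0) > 0.8


def deliver_lucidity_cue():
    # Stand-in for a neurostimulation cue that nudges the sleeper toward lucidity.
    print("stimulation cue delivered")


def lucidity_session(signal_windows):
    """Detect the target state and assist in reaching it; generate nothing.
    The content of the dream is left entirely to the dreamer."""
    if not user_opted_in():
        return
    for window in signal_windows:
        if detect_rem_sleep(window):
            deliver_lucidity_cue()
            break  # the device's job ends here


if __name__ == "__main__":
    lucidity_session([{"rem_likelihood": 0.3}, {"rem_likelihood": 0.9}])
```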

This is a radical departure from an Experience Machine, where the user’s conscious content is determined algorithmically in ways that are opaque to the user. That paradigm is ripe for abuse. If the recent history of proto-Experience Machines (e.g., social media) is any guide, we should expect the architects of Experience Machines to ruthlessly optimize them for engagement and gradually erode the user’s desire for non-simulated experiences.

As Nozick noted, “plugging into the machine is a kind of suicide” because simulated experiences deny the user the opportunity to develop as a person. In lucid dreams, by contrast, this relationship is flipped on its head. It is precisely what we bring to our dreams as individuals (our courage, intelligence, wit, and affection) that determines how we interact with our dream worlds. Lucid dreaming fundamentally encourages our development as people in the real world, rather than rejecting it. The Experience Machine offers endless pleasure and assumes that this is the user’s deepest desire. But our natural revulsion at the idea of plugging into an Experience Machine for the rest of our lives suggests that other elements of our existence matter to us besides pleasure. In lucid dreams, we can engage with those other elements, and doing so only enhances the quality of the dream.

This is the promise of the Halo and other technologies that respect the user’s noetic sovereignty. It is critical that we use this emerging framework as a guide to the brave new frontier of consciousness technology in the age of Experience Machines. Consciousness is the most precious gift we possess, and we must endeavor to protect our individual capacity to direct it as we see fit.

If you want to help us meet this challenge, reserve a headset and join our movement.
