Brain-computer interfaces are no longer science fiction. For most, the idea brings to mind “Severance,” Apple’s unsettling series about surgically divided minds. Yet beyond television, companies like Neuralink are already conducting human trials, Chinese firms are investing heavily in neurotechnology, and research labs are learning to decode brain activity in real time. The question is no longer whether BCIs will arrive, but who will control the signals they produce—and what that ownership will mean.
At their core, BCIs are direct links between neural activity and machines. Scientists have been recording brain signals for decades, but only recently have implants and algorithms become advanced enough to translate thoughts into actions. Applications remain limited, but the potential is vast. Today, these devices can enable a paralyzed person to type by imagining hand movements, control a robotic arm with their mind, or even play chess using only their thoughts. In the future, BCIs could restore speech to those who have lost it, return autonomy to patients with paralysis, and even unlock new forms of creativity by translating imagination directly into art or music. Few technologies hold more potential to expand human ability—and with billions in private investment, progress is rapidly accelerating.
That acceleration raises an uncomfortable truth: brain signals are data. And in the 21st century, data is money. With social media companies already profiting from likes and clicks, imagine the value corporations can derive from decoding attention, emotions, or intent before action. A BCI does not just track where your eyes linger; it can expose whether you’re bored, anxious, or distracted. Naturally, a critical question arises: who owns that information—the individual or the company?
Today, U.S. law offers no clear answer. Medical records are covered by HIPAA, but neural data collected by a commercial BCI does not necessarily fall under those protections—only data from medical BCIs may be covered. A company could claim ownership of the signal streams its device generates, just as media platforms assert rights over user content. Legal precedent points the same way: in the 1990 case Moore v. Regents of the University of California, the California Supreme Court ruled that a patient’s discarded blood and tissue samples were not his personal property, and that he therefore had no right to a share of any profit earned from commercial products or research derived from his cells. That precedent could be used to defend companies selling neural data to other entities.
On the other end, governments might demand access in the name of security. Individuals may assume the data is theirs, but without legal precedent, that assumption is fragile. Until legal frameworks address these concerns, the most intimate data humans produce could be the least protected.
Rapid technological upheaval is a recurring pattern in history. The printing press unleashed mass communication in the 15th century. Centuries later, the telephone, the internet, and then social media each redefined how humans connect across distance. Artificial intelligence simmered in academia for years before suddenly spilling into daily life, transforming how people work and learn. BCIs could follow the same arc: clunky one year, mainstream the next. As these technologies have entered the mainstream, they have changed how we think about privacy and ownership. But unlike social media or AI, which only analyze the information we feed them, BCIs tap directly into the raw material of thought itself. That makes them not just another tool, but the ultimate surveillance machine.
Geopolitics will raise the stakes. The Chinese government is already outlining policy for investment in neurotechnology companies with both medical and military uses. In Washington, this alone may be enough to end any talk of restraint. As with AI, lawmakers could justify pushing forward, whatever the ethical cost, for fear of falling behind. The real race won’t just be about implant speed or accuracy, but about who controls the flow of brain data. Much as the United States has relied on Palantir to compile datasets on Americans, BCI firms could be enlisted to funnel neural information to the state—framing it as a necessary response to China’s intelligence apparatus. In that sense, the contest over brain data may become less about innovation than about national security, driving a new kind of “data race.”
But the competition won’t just be between nations—it will play out within them. If enhancement BCIs become real, it is unlikely access will be equally distributed. Affluent users may be able to enhance memory or focus with upgraded implants, while the rest risk becoming sources of raw data—mined, monitored, and sold. The same technology that promises empowerment could also deepen class divides, creating a world where some lease out their thoughts while others pay to sharpen theirs.
Nonetheless, the impact will not be confined to wealth—it will extend into classrooms. Education already struggles with AI, which has blurred the lines between learning and outsourcing. BCIs could erase that line entirely. Why memorize formulas if knowledge could be downloaded instantly? Why study history if dates are already preinstalled? The danger is not academic laziness so much as it is commodification. Universities already mine student writing with AI-powered plagiarism detectors—BCIs could push this further, turning even private thought into a dataset to be tracked or distributed.
From there, the implications become existential. Religious critics warn that a chip governing thought resembles the “mark of the beast”—a loss of autonomy disguised as progress. Philosophers wonder whether a self extended into machines is still a self, or something new entirely. Both perspectives converge on the same fear: once thought is externalized, it can be manipulated, copied, and controlled by outside forces. If our ideas become just another dataset, then identity, memory, and even free will risk becoming commodities in a marketplace.
For all its drawbacks, the optimistic vision of BCIs remains compelling. Imagine composing a symphony simply by thinking about it, or bridging language barriers with real-time neural translation. For patients with neurological disease, BCIs could restore lost senses, reconnect damaged circuits, and transform quality of life. But every advance comes with a trade-off.
“Severance” imagined a future where the mind could be split in two. The reality ahead may be stranger still: a world where human thought blends with machine code, and the essence of being human is for sale. BCIs promise breakthroughs in medicine, communication, and human potential, but they also threaten to expose the most private part of ourselves to corporate profit, government oversight, and spiritual crisis. We’ve already surrendered our clicks and searches. If we lose control of our thoughts, what do we have left?
Patrick Sliz ’27 (psliz@college.harvard.edu) is the Multimedia Director for the Independent.
