Your brain’s privacy is at risk. The US just took its first big step toward protecting it.

Colorado passed legislation to prevent companies from selling your brainwaves. But is it enough to stop the likes of Meta and Apple?

[Illustration: small overlapping squares of paper in the shape of a brain. Getty/Paige Vickers for Vox]
Sigal Samuel is a senior reporter for Vox’s Future Perfect and co-host of the Future Perfect podcast. She writes primarily about the future of consciousness, tracking advances in artificial intelligence and neuroscience and their staggering ethical implications. Before joining Vox, Sigal was the religion editor at the Atlantic.

If you take it for granted that nobody can listen in on your innermost thoughts, I regret to inform you that your brain may not be private much longer.

You may have heard that Elon Musk’s company Neuralink surgically implanted a brain chip in its first human. Dubbed “Telepathy,” the chip uses neurotechnology in a medical context: It aims to read signals from a paralyzed patient’s brain and transmit them to a computer, enabling the patient to control the computer with thought alone. Neurotech used in a medical context like this is subject to federal regulations.

But researchers are also creating noninvasive neurotech. There are already AI-powered brain decoders that can translate the unspoken thoughts swirling through our minds into text, no surgery required, although this tech is not yet on the market. In the meantime, you can buy lots of devices off Amazon right now that record your brain data (like the Muse headband, which uses EEG sensors to read patterns of activity in your brain, then cues you on how to improve your meditation). Since these aren’t marketed as medical devices, they’re not subject to federal regulations; companies can collect your data and sell it.

Luckily, the brain is lawyering up. Neuroscientists, lawyers, and lawmakers have been teaming up to pass legislation that would protect our mental privacy.

In a first for the US, Colorado passed new legislation this week that amends the state’s privacy law to include the privacy of neural data. Now, just as fingerprints and facial images are protected under the Colorado Privacy Act, the whisperings of the brain are, too. Signed into law by Gov. Jared Polis, the bill had impressive bipartisan support, passing by a 34-to-0 vote in the state Senate and 61-to-1 in the House.

California is taking a similar approach. The state’s Senate Judiciary Committee passed a bill this week that brings brain data into the category of “sensitive personal information.” Next, the bill heads to the Appropriations Committee for consideration.

Minnesota may be next. The state doesn’t have a comprehensive privacy law to amend, but its legislature is considering a standalone bill that would protect mental privacy and slap penalties on companies that violate its prohibitions.

This type of legislation is coming not a moment too soon. With companies like Meta and Snapchat exploring neurotechnology, and Apple patenting a future version of AirPods that would scan your brain activity through your ears, we could soon live in a world where companies harvest our neural data just as 23andMe harvests our DNA data. These companies could conceivably build databases with tens of millions of brain scans, which could be used to find out whether someone has a disease like epilepsy even when they don’t want that information disclosed, and which could one day be used to identify individuals against their will.

Already, several consumer neurotech companies are gathering brain data — and perhaps selling it, according to a major report released this week by the nonprofit Neurorights Foundation. Analyzing the privacy policies and user agreements of 30 companies, the report found that a majority could share neural data with third parties.

“So if you’re worried about what might happen with your neural data and mental privacy, you need to be worried right now about that,” Jared Genser, general counsel at the Neurorights Foundation, told me. “Because people are buying these devices all around the world.”

And while the legislation in states like Colorado is promising, preventing a company from harvesting brain data in one state, or even one country, is not much use if it can simply do so elsewhere. The holy grail would be federal, or even global, legislation. So how do we protect mental privacy worldwide?

Your brain needs new rights

Rafael Yuste, a Columbia University neuroscientist, started to get freaked out by his own neurotech research a dozen years ago. At his lab, employing a method called optogenetics, he found that he could manipulate the visual perception of mice by using a laser to activate specific neurons in the visual cortex of the brain. When he made certain images artificially appear in their brains, the mice behaved as though the images were real. Yuste discovered he could run them like puppets.

He’d created the mouse version of the movie Inception. And mice are mammals, with brains similar to our own. How long, he wondered, until someone tries to do this to humans?

In 2017, Yuste gathered around 30 experts to meet at Columbia’s Morningside campus, where they spent days discussing the ethics of neurotech. As Yuste’s mouse experiments showed, it’s not just mental privacy that’s at stake; there’s also the risk of someone using neurotechnology to manipulate our minds. While some brain-computer interfaces only aim to “read” what’s happening in your brain, others also aim to “write” to the brain — that is, to directly change what your neurons are up to.

The group of experts, now known as the Morningside Group, published a Nature paper later that year making four policy recommendations, which Yuste later expanded to five. Think of them as new human rights for the age of neurotechnology:

1. Mental privacy: You should have the right to seclude your brain data so that it’s not stored or sold without your consent.

2. Personal identity: You should have the right to be protected from alterations to your sense of self that you did not authorize.

3. Free will: You should retain ultimate control over your decision-making, without unknown manipulation from neurotechnologies.

4. Fair access to mental augmentation: When it comes to mental enhancement, everyone should enjoy equality of access, so that neurotechnology doesn’t only benefit the rich.

5. Protection from bias: Neurotechnology algorithms should be designed in ways that do not perpetuate bias against particular groups.

But Yuste wasn’t content to just write academic papers about how we need new rights. He wanted to get the rights enshrined in law.

“I’m a person of action,” Yuste told me. “It’s not enough to just talk about a problem. You have to do something about it.”

How do we get neurorights enshrined in law?

So Yuste connected with Jared Genser, an international human rights lawyer who has represented clients like the Nobel Peace Prize laureates Desmond Tutu and Aung San Suu Kyi. Together, Yuste and Genser created the Neurorights Foundation to advocate for the cause.

They soon notched a major win. In 2021, after Yuste helped craft a constitutional amendment with a close friend who happened to be a Chilean senator, Chile became the first nation to enshrine the right to mental privacy and the right to free will in its national constitution. Mexico, Brazil, and Uruguay are already considering something similar.

Even the United Nations has started talking about neurotech: Secretary-General António Guterres gave it a shoutout in his 2021 report, “Our Common Agenda,” after meeting with Yuste.

Ultimately, Yuste wants a new international treaty on neurorights and a new international agency to make sure countries comply with it. He imagines the creation of something like the International Atomic Energy Agency, which monitors the use of nuclear energy. But establishing a new global treaty is probably too ambitious as an opening gambit, so for now, he and Genser are exploring other possibilities.

“We’re not saying that there necessarily need to be new human rights created,” Genser told me, explaining that he sees a lot of promise in simply updating current interpretations of human rights law — for example, extending the right to privacy to include mental privacy.

That’s relevant both on the international level — he’s talking to the UN about updating the provision on privacy that appears in the International Covenant on Civil and Political Rights — and on the national and state levels. While not every nation will amend its constitution, states with a comprehensive privacy law could amend that to cover mental privacy.

That’s the path Colorado is taking. If US federal law were to follow Colorado in recognizing neural data as sensitive health data, that data would fall under the protection of HIPAA, which Yuste said would alleviate much of his concern. Another possibility would be to get all neurotech devices recognized as medical devices so they would have to be approved by the FDA.

When it comes to changing the law, Genser said, “It’s about having options.”

A version of this story originally appeared in the Future Perfect newsletter.

Update, April 18, 12:05 pm ET: This story was originally published on February 21 and has been updated to reflect news about the Colorado legislation.
