The question "What's a thought?" is no longer strictly a philosophical one. Like anything measurable, our thoughts are subject to increasingly technical answers, with data captured by tracking brainwaves. That breakthrough also means the information is commodifiable, and captured brain data is already being bought and sold by companies in the wearable consumer technology space, with few protections in place for users.
In response, Colorado recently passed a first-in-the-nation privacy act aimed at protecting these rights. The measure falls under the existing Colorado Privacy Act, which aims to protect "the privacy of individuals' personal data by establishing certain requirements for entities that process personal data [and] includes additional protections for sensitive data."
The key language in the Colorado act is the expansion of the term "sensitive data" to include "biological data," which covers various biological, genetic, biochemical, physiological, and neural properties.
Elon Musk's Neuralink is probably the most famous example of how technology is being integrated with the human mind, though it is not alone in the space, with Paradromics emerging as a close competitor, alongside devices that have restored speech to stroke victims and helped amputees move prosthetic limbs with their minds. All of those products are medical devices that require implantation and are protected under HIPAA's strict privacy requirements. The Colorado law is focused on the rapidly growing consumer technology sphere, where devices do not require medical procedures, have no analogous protections, and can be bought and used without medical oversight of any kind.

There are dozens of companies making wearable technology products that capture brain waves (aka neural data). On Amazon alone, there are pages of products, from sleep masks designed to optimize deep sleep or promote lucid dreaming, to headbands promising to promote focus, and biofeedback headsets that can take your meditation session to the next level. These products, by design and necessity, capture neural data through small electrodes that produce readings of brain activity, with some deploying electric impulses to affect brain activity.
The laws in place for handling all of that brain data are virtually nonexistent.
"We've entered the world of sci-fi here," said Representative Cathy Kipp, the lead sponsor of the Colorado bill. "As with all advances in science, there should be guardrails."
‘ChatGPT-moment’ for consumer brain tech
A recent study by The NeuroRights Foundation found that of 30 companies examined that make wearable technology capable of capturing brainwaves, 29 "provide no meaningful limitations to this access."
"This revolution in consumer neurotechnology has been centered on the increasing ability to capture and interpret brainwaves," said Dr. Sean Pauzauskie, medical director at The NeuroRights Foundation. Devices using electroencephalography, a technology already available to consumers, make up "a multibillion-dollar market that is set to double over the next five or so years," he said. "Over the next two to five years it isn't implausible that neurotechnology might see a ChatGPT moment."
How much data can be collected depends on several factors, but the technology is rapidly advancing and could lead to an exponential increase in applications, with the tech increasingly incorporating AI. Apple has already filed patents for brain-sensing AirPods.
"Brain data are too important to be left unregulated. They reflect the inner workings of our minds," said Rafael Yuste, professor of biological sciences and director of the NeuroTechnology Center at Columbia University, as well as chairman of the NeuroRights Foundation and a leading figure in the neurotech ethics organization Morningside Group. "The brain isn't just another organ of the body," he added. "We need to engage private actors to ensure they adopt a responsible innovation framework, because the brain is the sanctuary of our minds."
Pauzauskie said the value to companies comes in the interpretation, or decoding, of the brain signals collected by wearable technologies. As a hypothetical example, he said, "if you were wearing brain-sensing earbuds, not only would Nike know that you browsed for running shoes from your browsing history, but it could now know how interested you were as you browsed."
A wave of biological privacy laws may be needed
The concerns targeted by the Colorado law may result in a wave of similar laws, with heightened attention to the mingling of rapidly advancing technologies and the commodification of user data. So far, consumer rights and protections have lagged behind innovation.
"The best and most recent tech/privacy analogies may be the internet and consumer genetic revolutions, which largely went unchecked," Pauzauskie said.
A similar arc could follow unchecked advancements in the collection and commodification of consumer brain data. Hacking, corporate profit motives, ever-changing privacy agreements for users, and few to no laws covering the data are all major risks, Pauzauskie said. Under the Colorado Privacy Act, brain data is extended the same privacy rights as fingerprints.
According to Professor Farinaz Koushanfar and Associate Professor Duygu Kuzum of the Department of Electrical and Computer Engineering at UC San Diego, it is still too early to know the limitations of the technology, as well as the depths of the potentially intrusive data collection.
Tracking neural data could mean tracking a broad range of cognitive processes and functions, including thoughts, intentions, and memories, they wrote in a joint statement sent via email. At one extreme, tracking neural data might mean accessing medical information directly.
The broad range of possibilities is itself a problem. "There are too many unknowns still in this field and that is worrisome," they wrote.
If these laws become widespread, companies may have no alternative but to overhaul their current organizational structure, according to Koushanfar and Kuzum. There may be a need to establish new compliance officers, and to implement methods such as risk assessment, third-party auditing and anonymization as mechanisms for meeting the requirements placed on the entities involved.
On the consumer side, the Colorado law and any subsequent efforts represent important steps toward better educating users, as well as giving them the tools needed to understand and exercise their rights should those rights be infringed.
“The privacy law [in Colorado] regarding neurotechnology might stand as a rare exception, where rights and regulations precede any widespread misuse or abuse of consumer data,” Pauzauskie said.