They Can Read Your Mind: Your Brain Data May Be Sold, and It's Totally Legal
A new generation of consumer neurotechnology is collecting sensitive brainwave data under the radar of existing privacy laws. Three U.S. Senators are urging federal regulators to investigate and establish clear safeguards before this form of biometric surveillance becomes widespread.
Although promoted as wellness tools, these devices can gather deeply personal neural information without strong consent frameworks or oversight—leaving consumers vulnerable to unregulated data exploitation.
Wearable neurotech headsets are being sold directly to consumers with promises to improve focus, reduce stress, or enhance emotional balance. Marketed as non-medical gadgets, they bypass the regulatory frameworks that govern clinical devices. This positioning has created what lawmakers describe as a legal “gray area,” allowing companies to collect and use neural data with little restriction.
According to ZME Science, Senators Chuck Schumer, Maria Cantwell, and Ed Markey recently sent a formal letter to the Federal Trade Commission, demanding an investigation into how these companies handle the data they gather. The lawmakers argue that neural data is not only highly sensitive but uniquely revealing of mental health conditions, emotional states, and cognitive patterns—even in anonymized form.
Unlike medical brain-computer interfaces, wellness neurotechnology products are not covered by the Health Insurance Portability and Accountability Act (HIPAA). This leaves a loophole that allows companies to operate without the consent standards required in clinical settings. ZME Science reports that only two U.S. states—Colorado and California—have passed legislation to protect neural data, defining it as “sensitive” and subject to more rigorous privacy controls. These state measures, however, remain isolated efforts in the absence of federal regulation.
According to a 2024 Neurorights Foundation report cited in the article, a review of 30 consumer neurotech companies found that 29 collect user brain data with almost no restrictions. Fewer than half of the companies gave users the ability to delete their information, and many lacked clear options for withdrawing consent. Lawmakers argue that such conditions create a significant risk of misuse or unauthorized sharing of this intimate data.
Stephen Damianos, executive director of the Neurorights Foundation, compared the extent of neural data collection to searching a person's home without knowing what might be discovered. He emphasized the difficulty of explaining to consumers exactly what could be decoded from their brainwaves, now or in the future. This, he warned, makes meaningful informed consent nearly impossible to obtain.
According to Damianos, the line between medical and wellness neurotechnology is often blurred. Some devices claim to "optimize mood" while avoiding the regulatory scrutiny applied to medical treatments for depression. This ambiguity can mislead users into believing the tools are medically approved when they are not.
Senators say that without transparency and clearer rules, consumers may unknowingly expose their most private information to third-party use. Schumer told The Verge that companies are already gathering neural data "with vague policies and zero transparency."
In their letter to the Federal Trade Commission, the three Senators outlined several specific measures. They requested an investigation into whether companies are in violation of consumer protection laws and recommended the application of existing children’s privacy regulations to neurotech devices. They also called for new rulemaking to set standards for data collection, use, storage, and consent.
One of the most pressing concerns, according to the Senators, is the potential for secondary uses of brain data, such as training artificial intelligence systems or powering behavior-based advertising. They argue that without firm boundaries, the commercialization of brain data could outpace protections for individual privacy.
Damianos emphasized that the goal is not to halt the development of neurotechnology. While these tools may offer transformative benefits, he said, as quoted by ZME Science, "enormous risks come from that." The priority, lawmakers say, is to put safeguards in place before those risks become consequences.