
Guarding our minds: The battle for cognitive liberty


In a world of mass surveillance that increasingly resembles the dystopian vision of George Orwell’s “1984,” the right to privacy is rapidly diminishing. Technological advancements have enabled the collection of big data, giving governments and corporations access to unprecedented levels of personal information. From the websites we visit and the social media posts we like to the in-store purchases we make and our financial records, almost every facet of our daily lives is being monitored. Now, through the collection and analysis of brain data, this surveillance extends into one of our most intimate domains: our minds.

As these technologies become more widespread, the balance between innovation and ethical consideration becomes integral. Cognitive liberty — the freedom of an individual to control their own thoughts and cognition — emerges as a key right that should be explicitly protected. Neuroethicist Wrye Sententia and legal theorist Richard Glen Boire coined this term in response to technology’s growing capacity to surveil the mind. The current regulatory environment is ill-equipped to address these challenges, with a government that remains glaringly illiterate on technology.

Traditionally, brain data has been collected within academia, with the goal of better understanding underlying neural mechanisms and cognitive processes.

Emily Grossman, a UC Irvine professor of cognitive sciences, elaborated on how academia has historically used brain data.

“Two main branches that have historically partnered with academia. One is defense … It’s usually in cognitive research of development of new ideas, therapy, Air Force, Army … And then the other branch has always been the public health system … tasked with disease treatments, but also interventions and preventative medicine,” Grossman said in an interview with New University.

In the age of big data, a new player has emerged: the tech industry, which is driving innovation and expanding the scope of brain data collection. Companies such as Neurable, Neuralink and Muse are among a growing field of neurotechnology enterprises using brain data in consumer products. For example, Neurable is developing headphones that read and interpret brain signals to provide insights into wellness and enhance productivity. With this expansion, the commodification of brain data has emerged as a privacy concern, as these technologies gather information on the most personal aspects of our bodies.

What was once the dream of science fiction — the monitoring of cognitive processes — has now become a tangible reality. In China, reports have emerged of companies outfitting factory workers and train conductors with safety helmet-like caps that monitor brain waves to detect emotional spikes. While the stated aim of such devices is to boost productivity and increase safety, they raise serious concerns about potential misuse of sensitive data.

To effectively combat this invasion of privacy, there is a critical need for an independent third-party organization to regulate the commodification of brain data.

“I think third-party oversight is crucial. It can be federal, or even better, it can be independent … I think all entities that are conducting research where there’s personal information being reflected should have to pay and support that agency … that’s all pressing,” Grossman said. 

Fortunately, established frameworks for ethical oversight already exist within academia. 

“In academia, we are really careful. We have a lot of restrictions on how data can be coded,” Grossman said.

These academic principles can be adapted to guide the development of brain data regulation, ensuring ethical standards are maintained as neurotechnology advances. For example, Institutional Review Boards often require research consent forms to be written in language an eighth grader can understand, guaranteeing that participants are fully informed about how their data is being used. This is a stark contrast to the lengthy, jargon-filled terms of service in the tech industry, which few people actually read through or understand.

Adopting such transparent and accountable consent practices would cultivate a culture of trust in the government’s handling of brain data.

“If we can cultivate a process by which people feel like there’s a sense of fairness, privacy and transparency and unbiased decision making, then, even if the outcome is suboptimal, people will nonetheless have respect, for you know the intent and the process,” Grossman said.  

The current environment of mistrust needs to change. We live in an era where public confidence in how governments and corporations handle personal data is alarmingly low. Scandals involving data breaches, misuse of information and questionable data practices have significantly damaged trust. Companies must be transparent about how they collect, store and use brain data, while individuals should have the capacity to understand and control their own cognitive information.

With the constant advancements in neurotechnology, establishing clear ethical guidelines and independent oversight is crucial in preserving cognitive liberty. Drawing from existing frameworks in academia ensures that innovation respects autonomy and privacy. Now is the time to act to restore public confidence and safeguard our cognitive freedom in this rapidly evolving digital age.

Sriskandha Kandimalla is an Opinion Staff Writer. She can be reached at skandima@uci.edu

Edited by Zahira Vasquez, Trista Lara and Annabelle Aguirre.