
Neuroethics: Where Do We Draw the Line?


Black Mirror is a hit show that imagines a not-too-distant future in which neurotechnology wields enormous power, with equally large ethical implications. For example, one episode, “The Entire History of You,” shows the dangers of a hypothetical technology that would allow you to rewind and rewatch everything you see and experience. While some episodes of Black Mirror may seem futuristic, is it really all science fiction? Are we reaching an age where people are becoming more like cyborgs than humans? With the many advancements happening today, it is worth examining some of the areas of ethical concern associated with neurotechnology.

 

Neuromarketing:

Neuromarketing began in 2002, when experiments using fMRI and sometimes EEG measured consumer brain activity during advertisements to see which ones piqued people’s interest. While undue influence and the marketing of illegal or deceptive products might be a concern in neuromarketing, these companies often do not have as much control over consumer choice as we think, and they are not usually trying to advertise dangerous products. Additionally, neuromarketing does not typically conduct tests on young children, which could be another area of concern since children are more easily manipulated and swayed. Most ethical issues in neuromarketing have more to do with the ethics of the advertisement itself than with the technology used to monitor its effectiveness.

While neuromarketing may not spark the most heated ethical debates, there is another area of neurotechnology that warrants more consideration: the Brain Computer Interface (BCI).

Image from: http://s3-ap-south-1.amazonaws.com/startupuploads/wp-content/uploads/2017/01/17114404/Capture19.png

 

Brain Computer Interface:

BCI is a powerful technology, but its very purpose already elicits concerns over responsible usage. BCI allows your thoughts and brain activity to be translated into an external output, such as moving a computer cursor with your mind, so it is understandable that merging such technology with humanity may raise some red flags if it is used irresponsibly. There are many ethical considerations, particularly with invasive BCI. Such concerns include device dependency (which in turn raises the worry that cognitive functioning may be altered or even diminished once the BCI is removed), technological failures, and brain tissue infection from the implanted devices. Additionally, people may become very frustrated when using BCI, since the technology requires a great deal of cognitive concentration. There are also issues of autonomy involved. Who should be able to use BCI, and when is it warranted as medically necessary? Obviously, someone in a vegetative state or with paralysis would benefit immensely from BCI, but where do we set the threshold for warranted BCI usage? BCI could grant autonomy to people who have impairments, but what about enhanced autonomy for people who function “normally”? And what is normal functioning anyway? For example, neurotechnological advancements in the military may allow soldiers to effortlessly control machinery with their minds, giving them enhanced autonomy in a more interactive and advanced type of warfare.

Image from: https://s3.amazonaws.com/dsg.files.app.content.prod/gereports/wp-content/uploads/2017/06/17150117/GettyImages-157313890-e1497726422726.jpg

Image from: http://cyborgdatabase.org/images/cossy6.jpg
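To make the basic idea a little more concrete, here is a minimal sketch (in Python) of how a BCI decoder might turn recorded brain activity into cursor movement. It is purely illustrative: the features, weights, and step size are all made up, and it is not quoting any real BCI system.

```python
import numpy as np

# Hypothetical example: a linear decoder that turns a vector of neural
# "features" (e.g., firing rates or EEG band power) into a 2-D cursor velocity.
# The weights below are random placeholders; a real system would fit them
# during a calibration session.

rng = np.random.default_rng(0)

n_features = 8                         # e.g., 8 recorded channels
W = rng.normal(size=(2, n_features))   # decoder weights: features -> (vx, vy)

def decode_velocity(features):
    """Map a feature vector to a cursor velocity (vx, vy)."""
    return W @ features

# One simulated "moment" of brain activity and the resulting cursor step.
features = rng.normal(size=n_features)
cursor_position = np.zeros(2)
cursor_position += 0.1 * decode_velocity(features)  # small step per update
print(cursor_position)
```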

Not only is there concern about the power neurotech gives humans, but what about the power that humans give to the BCIs they create? Artificial intelligence is advancing quickly, and BCI, especially invasive BCI, is often bi-directional, which in very simplistic terms means that the device can “learn.” For example, many BCI systems incorporate learning algorithms so that they can adapt, accommodate, and assimilate to the user’s brain patterns and intentions. This happened in the case of Matthew Nagle, whom I discussed in my blog post “How Did We Get to This Point?” Nagle, who was paralyzed from the neck down, was the first human test subject of the BCI “BrainGate.” This invasive BCI began to pick up on Nagle’s patterns of neuronal firing and to adapt and improve itself, allowing the machine to produce better and more efficient outcomes from the same levels of concentration and the same intentions on Nagle’s part. This adaptive intelligence, and the nature of such invasive BCI as an integrated part of someone’s nervous system, definitely raises concerns about responsible usage.
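For a rough sense of what this kind of adaptation looks like, here is a toy sketch of a decoder that updates its own weights from feedback. It is not BrainGate’s actual algorithm, just a generic online learning loop on simulated data, but it shows how the same neural input can gradually produce a better output as the device “learns.”

```python
import numpy as np

# Toy illustration of "bi-directional" adaptation: the decoder refines its
# weights from feedback so the same neural input gradually produces a better
# output. This is NOT BrainGate's actual algorithm; it is a generic online
# least-squares update on simulated data.

rng = np.random.default_rng(1)
n_features = 8
true_W = rng.normal(size=(2, n_features))   # the (unknown) ideal mapping
W = np.zeros((2, n_features))               # decoder starts out naive
lr = 0.05                                   # learning rate

for step in range(500):
    x = rng.normal(size=n_features)         # simulated neural features
    intended = true_W @ x                   # what the user meant to do
    decoded = W @ x                         # what the decoder produced
    error = intended - decoded
    W += lr * np.outer(error, x)            # adapt toward the user's intent

print("remaining error:", np.linalg.norm(true_W - W))
```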

While invasive BCI produces clearer signals with less background noise, the ethical lines blur more and more as we move from non-invasive to invasive BCI. Reversibility is a large focus for ethicists, many of whom take the stance that neurotechnology should be reversible, so that if the effects are undesirable or get out of hand, the device can be easily removed. Another concern with invasive BCI is privacy, since others could access someone’s cognitive information without that person’s informed consent, and there is the chance of interference by hackers who may be able to manipulate the BCI or extract sensitive information.

 

Neurostimulation:

Neurostimulation can range from technology that sharpens and improves memory to technology that regulates abnormal electrical brain activity. You can probably imagine the potential ethical issues of technology that can enhance cognitive functioning if it were used irresponsibly, but I am going to focus on one form of neurostimulation in particular: Deep Brain Stimulation (DBS). DBS uses implanted electrodes to deliver electrical impulses that can regulate abnormal brain activity or even target certain problem cells or chemical imbalances in the brain. While DBS has been approved as a treatment for conditions such as Parkinson’s disease and epilepsy, there are two main ethical considerations to keep in mind. First, patient consent is a huge part of using DBS. Take, for example, the case of Dr. Robert G. Heath, who used DBS in an unethical attempt to “cure” a homosexual man in the early 1970s. The other big ethical concern in DBS is neuroaugmentation, similar to the concern I introduced earlier regarding BCI. When deciding who should be treated with DBS, which can improve cognitive functioning, where do we draw the line as to who has reduced cognitive functioning that warrants neurostimulation in the first place? Where should the boundary be placed to separate “abnormal” mental states from “normal” mental states?

Image from: http://www.neurosurgery.pitt.edu/sites/default/files/centers/dbs/dbs-or.jpg
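For illustration, the short sketch below simulates a DBS-style pulse train using parameter values in the range commonly reported in the literature (high-frequency stimulation around 130 Hz with very brief pulses). The exact numbers are placeholders, since real devices are programmed clinically for each individual patient.

```python
import numpy as np

# Illustrative only: a simulated pulse train with parameter values in the range
# commonly reported for DBS (high-frequency stimulation with very brief pulses).
# Real devices are programmed clinically for each individual patient.

rate_hz = 130          # pulse frequency (a commonly cited DBS setting)
pulse_width_s = 60e-6  # pulse width of 60 microseconds
amplitude_ma = 2.0     # stimulation amplitude in milliamps (placeholder value)
duration_s = 0.05      # simulate 50 milliseconds of stimulation
fs = 100_000           # sampling rate of the simulation in Hz

t = np.arange(0, duration_s, 1 / fs)
signal = np.zeros_like(t)
signal[(t % (1 / rate_hz)) < pulse_width_s] = amplitude_ma  # mark each brief pulse

n_pulses = int(np.ceil(duration_s * rate_hz))
print(f"simulated {n_pulses} pulses over {duration_s * 1000:.0f} ms")
```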
 

What Can, or Should, We Do?

Image from: http://www.peterbartreiner.com/uploads/1/2/0/0/120094820/published/mulitplce-minds.jpg?1528548764

Because neurotechnology gives people the potential power to access, alter, and distribute other people’s cognitive and neural information, Marcello Ienca and Roberto Andorno, two bioethicists based in Switzerland, argue that current human rights must be expanded to protect people against potential misuses of devices produced by the burgeoning neurotechnology industry. The extensions they propose are the right to cognitive liberty, the right to mental integrity, the right to psychological continuity, and the right to mental privacy. Now, I am by no means saying that neurotechnology is a bad thing. I believe that neurotechnology is a powerful tool that can be used in medicine and other sectors to uncover valuable information about the mysteries of the brain, as well as to help people with physical or mental disabilities achieve a better quality of life. However, as with any technology, it is important to consider both the benefits and the risks, and to make informed judgments about where we should draw the line in terms of ethics.

 

Works Cited:

https://lsspjournal.biomedcentral.com/articles/10.1186/s40504-017-0050-1

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5680604/

https://www.wired.com/2005/03/brain-3/

https://journalofethics.ama-assn.org/article/ethical-and-social-challenges-brain-computer-interfaces/2007-02

https://www.mayoclinic.org/tests-procedures/deep-brain-stimulation/about/pac-20384562

https://www.express.co.uk/news/science/991411/army-technology-mind-neurotechnology-DARPA-usa-military-news

http://nuffieldbioethics.org/wp-content/uploads/novel_neurotechnologies_consultation_TipuAziz_AlexGreen.pdf

https://www.ama.org/publications/MarketingNews/Pages/what-are-the-ethics-of-neuromarketing.aspx
