Bioethicists call for oversight of direct-to-consumer “neurotechnologies”

The marketing of direct-to-consumer “neurotechnologies” can be enticing: apps that diagnose a mental state and brain devices that improve cognition or “read” one’s emotional state. However, many of these increasingly popular products are not fully supported by science and face little to no regulatory oversight, posing potential health risks to the public. In a new piece published in the journal Science this week, two bioethicists from Penn Medicine and the University of British Columbia suggest the creation of a working group that would further study, monitor, and provide guidance for this growing industry – which is expected to top $3 billion (€2.63 billion) by 2020.
“There’s a real thirst for knowledge about the efficacy of these products from the public, which remains unclear because of this lack of oversight and gap in knowledge,” said lead author Anna Wexler, PhD, an instructor in the Department of Medical Ethics and Health Policy at the Perelman School of Medicine at the University of Pennsylvania. “We believe a diverse, dedicated group would help back up or refute claims made by companies, determine what’s safe, better understand their use among consumers, and address possible ethical concerns.”
The group – made up of researchers, ethicists, funders, and industry experts, among others – would, the authors wrote, serve as a clearinghouse for regulatory agencies such as the U.S. Food and Drug Administration (FDA) and the Federal Trade Commission (FTC), for third-party organizations that monitor advertising claims, and for industry, social and medical scientists, funding agencies, and the public at large.
While some of these techniques are used in clinical and research laboratory settings – for example, electroencephalography (EEG) devices are used to diagnose and treat epilepsy – many consumer-grade versions of neurotechnology devices are only loosely based in science. It is unclear whether the laboratory data collected to test the underlying techniques apply to consumer-grade products, leading many in the scientific community to question their efficacy and to advocate for increased regulation of these readily available techniques and products.
For example, some consumer neurostimulation devices may pose dangers, such as skin burns. There are also potential psychological harms from many consumer EEG devices that purport to “read” one’s emotional state.
“If a consumer EEG device erroneously shows that an individual is in a stressed state, this may cause him or her to become stressed or to enact this stressed state, resulting in unwarranted psychological harm,” the authors wrote. Likewise, a smartphone wellness app that diagnoses symptoms of depression does so without medical support structures, such as a psychologist or mental health counsellor.

Source: Penn Medicine – https://tinyurl.com/ydfk5fdg