
Reply to “Separating neuroethics from neurohype”


Ienca et al. reply — In her response to our Commentary in the September 2018 issue1, Wexler makes incorrect statements on factual issues, misrepresents our analysis, and suggests a perspective on the (neuro)ethical debate that we find deeply troubling. First, on the facts, we have numerous disagreements related to the status of neurotechnology. For instance, Wexler is wrong in asserting that “most consumer EEG devices do not provide access to raw EEG data.” The real-time display of raw electroencephalography (EEG) data is reported among the key features of several software packages built for direct-to-consumer (DTC) neuroheadsets. For instance, EmotivPRO, an integrated software solution built for the Emotiv EPOC+ and Insight, advertises “a real-time display of EMOTIV headset data streams including raw EEG” (emphasis added). Similarly, eegID, an Android app for the NeuroSky MindWave, claims to enable users to “view EEG data,” including “EEG raw values.” NeuroSky’s FAQ section states: “Our technology is remarkable in how well our headsets read raw brainwave data.” Access to raw data has also been obtained by reverse-engineering these software tools. In 2010, a hacker going by the name Cody Brocious cracked Emotiv’s encryption, built a decryption routine and created an open-source library for reading EEG data directly from the headset. In addition, although Wexler claims that no single tDCS (transcranial direct current stimulation), tACS (transcranial alternating current stimulation) or tRNS (transcranial random noise stimulation) device has obtained US Food and Drug Administration (FDA) approval, the Fisher Wallace Stimulator, a wearable tACS device2, has FDA clearance for cranial electrotherapy stimulation, and the Activadose II from Caputron is cleared for iontophoresis and requires no modification or additional hardware to be used for tDCS.
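To illustrate what “reading raw EEG data directly from the headset” amounts to in practice, the sketch below parses one frame of raw samples from a byte stream. It is a hypothetical illustration only: the channel count, sample format, and microvolt scaling are invented for the example and do not describe any vendor’s actual protocol or the decryption routine mentioned above.

```python
import struct

import numpy as np

# Illustrative constants, NOT any vendor's real protocol:
N_CHANNELS = 14    # e.g. a 14-electrode consumer headset
UV_PER_LSB = 0.51  # assumed microvolts-per-count scaling factor

def parse_packet(packet: bytes) -> np.ndarray:
    """Decode one frame of little-endian int16 counts into microvolts."""
    counts = struct.unpack("<%dh" % N_CHANNELS, packet)
    return np.array(counts, dtype=float) * UV_PER_LSB

# Synthetic packet standing in for one sample frame from a headset.
raw = struct.pack("<14h", *range(14))
sample_uv = parse_packet(raw)
print(sample_uv[:3])  # first three channels, in microvolts
```

The point is only that once the transport layer is opened (officially or via reverse engineering), raw per-electrode voltages are directly available to any downstream software.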
Wexler also claims we are incorrect in referring to “wearable neuroimaging headset[s].” However, “wearable neuroimaging headset” is not our phrase but one coined by prototype developers. Indeed, a 2018 paper entitled “Moving MEG towards real-world applications with a wearable system”3 reflects a long-term interest in advancing research in the direction of wearable neuroimaging devices. These devices are not here—nor did we claim they are here. But they are under active research—which is all we claimed. It is ethically imperative to take this direction seriously. Perhaps Wexler thinks that the title is wrong, but we reject the old-fashioned role of philosophy as ‘word police’—that is, telling researchers which concepts apply to their own work and which ones do not. Second, Wexler embraces a fallacy of composition. She rejects our claim that brain-derived data should be entitled to special protections because she contends that data sources such as EEG do not constitute a valuable source of personal information. This, however, is only true for EEG data taken in isolation and processed using conventional analytic techniques. In the consumer neurotechnology domain, it is extremely hard to separate EEG signals from the myriad pieces of contextual information about users easily available on the internet. Although consumer neurotechnology service providers usually de-identify EEG data, these data are typically merged with other information, such as demographic data, geolocation and data from linked online profiles in social media. In addition, advances in big data analytics have shown that privacy-sensitive information can be obtained by combining heterogeneous datasets that are individually not sensitive4. A recent systematic review has shown that more than 30 studies have applied deep learning algorithms to EEG signals to make inferences about emotional and cognitive states, including mental workload5.
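The linkage risk described above—that “de-identified” EEG records become identifying once merged with auxiliary data—can be sketched in a few lines. The records, field names, and quasi-identifiers below are entirely invented for illustration; the mechanism (joining on surviving attributes such as age and location) is the general re-identification pattern discussed in the text.

```python
# Hypothetical sketch: "de-identified" EEG session records still carry
# quasi-identifiers (age, postal code) that survive anonymization.
eeg_sessions = [
    {"session": "s1", "age": 34, "zip": "8001", "focus_score": 0.71},
    {"session": "s2", "age": 29, "zip": "8400", "focus_score": 0.44},
]

# Invented auxiliary dataset, e.g. scraped public social-media profiles.
public_profiles = [
    {"name": "A. Example", "age": 34, "zip": "8001"},
]

def relink(sessions, profiles):
    """Join on (age, zip): in a small pool this can single out a user."""
    index = {(p["age"], p["zip"]): p["name"] for p in profiles}
    return {s["session"]: index.get((s["age"], s["zip"])) for s in sessions}

print(relink(eeg_sessions, public_profiles))
# {'s1': 'A. Example', 's2': None}
```

Neither dataset is sensitive on its own; the join is what attaches a name to a brain-derived measurement.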
And the insights will only become greater with the increasing online availability of EEG data, more sophisticated sensor technology, and further AI research on classification algorithms. Finally, the privacy implications of side-channel attacks on DTC brain–computer interfaces should be evaluated not only on the basis of their predictive power, but also on their reusability for malware programming in real-world settings (for example, via abusive third-party apps). Most importantly, we have a fundamental disagreement with Wexler on the role of bioethics and neuroethics. Wexler argues that “[i]f we, as bioethicists and neuroethicists, become concerned with every theoretical feasibility demonstration, then we are a field that is very much in trouble.” We advocate the exact opposite approach. If bioethicists do not take current feasibility demonstrations seriously, it is then that our field will be in trouble. Neuroethics must be proactive and anticipate—and propose norms to effectively regulate and govern—the future, not simply react to what technology has already achieved. Certainly, there is hype in neurotechnology, but one does not need to be “credulous” to critically examine potential consequences of current and probable future research paths. We take the aspirations of scientists as setting research agendas that will produce technologies, and consider Wexler’s proposal to ignore them perilous. It is precisely the scarcity of ex ante inspection of novel technological developments in the consumer domain and the absence of adaptive oversight mechanisms that caused ethical lapses in areas such as social networking6 and facial recognition technology7. Today, these technologies are already too entrenched to be substantially modifiable via ethically aligned design and enforceable governance. Privacy may already have been irreversibly jeopardized by the digital data ecosystem8.
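As a minimal sketch of the kind of inference the cited studies perform on EEG signals, the example below extracts a single classic feature—spectral power in the 8–12 Hz alpha band, which is well known to increase in relaxed, eyes-closed states—from synthetic data. The sampling rate, signal amplitudes, and state labels are illustrative assumptions, not a validated classifier.

```python
import numpy as np

FS = 128  # Hz; an assumed consumer-headset sampling rate

def alpha_power(signal: np.ndarray) -> float:
    """Mean spectral power in the 8-12 Hz alpha band."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / FS)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    band = (freqs >= 8) & (freqs <= 12)
    return float(psd[band].mean())

t = np.arange(FS * 4) / FS  # four seconds of synthetic signal
relaxed = np.sin(2 * np.pi * 10 * t)        # strong 10 Hz alpha rhythm
engaged = 0.1 * np.sin(2 * np.pi * 10 * t)  # suppressed alpha

print(alpha_power(relaxed) > alpha_power(engaged))  # True
```

Even this crude hand-built feature separates the two simulated states; the deep learning pipelines in the review cited above learn far richer representations from the same kind of raw signal.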
Current data privacy scandals, such as those involving Facebook and Cambridge Analytica, have warned the scientific community about the need to update data ethics standards and to put them in place before technology becomes widespread rather than after9. Despite the current data privacy landscape and the reported penetration of actors like Facebook in the neurotechnology domain, Wexler suggests that “it may not be the appropriate time for regulatory recommendations.” Quite the contrary. It is particularly when foundational research and commercial interest seem to coincide that ethical challenges should be elucidated and regulatory oversight put in place. We simply do not share Wexler’s view on the role of ethics and regulation in addressing problems that are likely to arise in the next decades. As we explicitly stated in our article, the hype surrounding consumer neurotechnology must be deflated through scientific evidence. But doing so should not entail abdicating proactive ethical scrutiny.


Journal Title: Nature Biotechnology
Year Published: 2019
