The Many Ways Elon Musk’s Neuralink Could Go Wrong
The brain-machine interface will require innovative ways of thinking about risk. We mapped it out.
--
Several weeks ago, I was asked to write a commentary for the Journal of Medical Internet Research (JMIR) on the ethics of Elon Musk and Neuralink’s much-touted brain-machine interface. JMIR had accepted Musk’s paper on the technology for publication and wanted to accompany its release with a set of invited papers exploring it from different angles.
That commentary has just been published alongside Musk and Neuralink’s paper. Rather than writing yet another piece on the ethical challenges of cutting-edge brain tech, I set out to apply our work on risk innovation to the technology’s development. As a result, my colleague Marissa Scragg and I carried out a unique assessment of what it might take for Musk and Neuralink to create a product that is good for society as well as the company’s bottom line.
What is risk innovation?
Risk innovation is a concept acknowledging that emerging technologies and trends are creating a risk landscape so different from what came before that conventional ways of thinking about risk are simply not up to the task of navigating it.
Just as innovation is ultimately about creating value through the practical application of creativity to unmet needs and opportunities, risk innovation focuses on creating value through creative approaches to potential dangers and pitfalls.
This is at the heart of the Arizona State University Risk Innovation Nexus, where we’re transforming the ideas behind risk innovation into practical tools and resources that help organizations plan for and navigate potentially blindsiding risks. We call these “orphan risks” because they are often ignored, yet they are frequently pivotal to an enterprise’s success or failure. Many of them are hard-to-quantify social risks that can derail an organization’s path to success: the consequences of ethical missteps, threats to privacy and autonomy, or actions that undermine social justice.