Exciting times – and big headaches – for neural technology.

It’s mind-blowing for those of us who were born before microwave ovens and cellular phones.

Both Facebook and Neuralink are building technology that aims to read your mind. Facebook is funding research on brain-computer interfaces (BCIs) that could pick up thoughts directly from neurons and translate them into words in real time. And Elon Musk’s Neuralink has created implantable, flexible ‘threads’ that could one day allow you to control your computer or smartphone with your thoughts. Musk hopes to begin human testing by the end of next year.

It’s a gob-smacking new frontier. There are indeed ethical applications for this type of technology. Case in point: it has already begun helping people with paralysis regain some autonomy via robotic limbs.

These are wonderful developments. But as per this Vox article: “Your brain, the final privacy frontier, may not be private much longer.”

Marcello Ienca is a researcher at ETH Zurich who, in 2017, released a paper outlining the four rights for the neurotechnology age he feels should be enshrined in law. They are (as outlined by the Vox article above):

  • The right to cognitive liberty,
  • The right to mental privacy,
  • The right to mental integrity, and
  • The right to psychological continuity.

Clearly there is a lot to be considered before widespread implementation of this technology. And just as the Vox article hit our inbox, so did this one from TechNewsWorld that followed a similar vein.

The article outlines a recent conversation between Musk and Jack Ma (Alibaba) that touts AI (and Neuralink’s technology) as a way to achieve a 12-hour workweek and greater leisure time. But as the article points out, neither has considered how salaries and compensation figure into this plan or – in the case of neural AI – the natural capacities of the human brain.

Both articles are worth reading because in this exciting new frontier, knowledge is power. And careful consideration will be key.