Brain-computer interfaces (BCIs) create “a bridge between your brain and an external device” (Gonfalonieri, 2020). Today, non-invasive devices such as headbands or earbuds can monitor brain activity, and sensors and algorithms can analyze that activity to find patterns. The technology was originally developed to help paralyzed individuals use their thoughts to control assistive devices such as prosthetic limbs. It’s also being used to help those who can’t speak, e.g. individuals who have lost speech as a result of ALS. (Gonfalonieri, 2020)
But more recently, companies are looking at using BCIs for non-medical purposes. This Harvard Business Review article dives into scenarios where managers could use the technology to monitor employees’ attention and productivity during work (I don’t know about you, but I’d prefer my employer not see how attentive I am throughout the day). Researchers are also exploring the idea of “passthoughts” – using your thoughts to log in to an account rather than typing in a password.
“When it comes to collecting brain data, the potential for abuse is frightening: Even when used with the best of intentions, companies could risk becoming overly dependent on using brain data to evaluate, monitor, and train employees, and there are risks associated with that.” (Gonfalonieri, 2020)
In addition to the risks that companies may face when using this technology on their employees, there are security risks from hackers, too. BCIs can be hacked, exposing sensitive brain data, including those “passthoughts”.
Aside from the workplace, companies are exploring other uses for BCI technology. Facebook (of course, it always has to be Facebook) is extremely interested in BCIs and supports research efforts concerning them. Facebook Reality Labs (FRL) is developing AR glasses and could potentially use BCI technology as a way for people to control the glasses. Facebook will face privacy challenges when attempting to launch this product.
All of this brings up serious privacy concerns. I can think of a myriad of risks and possible ill effects of this tech, even though it’s genuinely beneficial for those who can’t move or speak.
The Future of Privacy Forum recommends five safeguards for privacy when developing BCI technology:
“Because the collection and use of neuroinformation involves a number of privacy and ethical concerns that go beyond current laws and regulations, stakeholders working in this emerging field should follow these principles for mitigating privacy risks:
- Employ Privacy Enhancing Technologies to Safeguard Data – BCI providers should integrate recent advances in privacy enhancing technologies (PETs), such as differential privacy, in accordance with principles of data minimization and privacy by design.
- Ensure On/Off User Controls – Wherever appropriate, BCI users should have the option to control when their devices are on or off. Some devices may need to always be on in order to fulfill their functions—for example, a BCI that treats a neurological condition. However, when being always on is not an essential feature of the device, users should have a clear and definite way to turn off their device. As with other devices, there are considerable privacy risks when a BCI is always gathering data or can be turned on unintentionally.
- Enshrine Purpose Limitation — BCI providers should state the purpose for collecting neuroinformation and refrain from using that information for any other purpose absent user consent. For example, if an educational BCI gauges student attentiveness for the purpose of helping a teacher engage the class, it should not use attentiveness data for another purpose—like ranking student performance—without express and informed consent. Additionally, BCI providers should also consider limiting unnecessary cross-device collection.
- Focus on Data Quality — Providers should strive to use the most accurate data collection processes and machine-learning tools available to ensure accuracy and precision. Algorithmic explainability and reproducibility of results are critical components of accuracy. It is important for BCIs to be both accurate (turning neural signals into correct neuroinformation) and precise (consistently reading the same signals to mean the same thing).
- Promote Security — BCI providers should take appropriate measures to secure neuroinformation. BCI devices should be secure against hacking and malware, and company servers should be secure against unauthorized access and tampering. Furthermore, data transfers should be accomplished by secure means, subject to strong encryption.” (Ringrose, 2020)
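The first principle above names differential privacy as a privacy enhancing technology. To make that concrete, here is a minimal sketch of how a BCI provider might release an aggregate statistic (say, an average attention score) with Laplace-mechanism noise so that no single user’s reading can be inferred. The data, parameter values, and function names are illustrative assumptions, not anything from the article:

```python
import math
import random


def laplace_noise(scale: float) -> float:
    """Sample zero-mean Laplace noise via the inverse-CDF method."""
    u = random.random() - 0.5  # uniform on (-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))


def dp_average(values, lower, upper, epsilon):
    """Differentially private mean of bounded sensor readings.

    Each reading is clipped to [lower, upper], so one person's data can
    shift the mean by at most (upper - lower) / n -- the sensitivity.
    Adding Laplace noise with scale sensitivity / epsilon masks any
    individual's contribution; smaller epsilon means stronger privacy.
    """
    n = len(values)
    clipped = [min(max(v, lower), upper) for v in values]
    true_mean = sum(clipped) / n
    sensitivity = (upper - lower) / n
    return true_mean + laplace_noise(sensitivity / epsilon)


# Hypothetical per-user attention scores on a 0-100 scale
scores = [72.0, 65.5, 88.0, 40.0, 91.0]
noisy_mean = dp_average(scores, lower=0.0, upper=100.0, epsilon=1.0)
```

The point of the sketch is data minimization in action: the provider publishes only a noised aggregate, never the raw neuroinformation, which is exactly the spirit of the principle quoted above.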
What are your thoughts on using this tech outside of the medical community? Do you think the benefits of this technology outweigh the risks?
- Gonfalonieri, Alexandre. “What Brain-Computer Interfaces Could Mean for the Future of Work.” Harvard Business Review, 6 Oct. 2020. Accessed 25 Oct. 2020. https://hbr.org/2020/10/what-brain-computer-interfaces-could-mean-for-the-future-of-work