Dark Data 2023

Sharing Your Heartbeat?

Sidsel Ostbjerg

For years, patients with embedded technology such as pacemakers and insulin pumps have been begging for improvements in their data privacy and security. This fall, new regulations from the FDA entered into force, and they might affect patients and Elon Musk differently than you think.


I was nine when I learned what a pacemaker was. On a February night in 2005, my family and I were watching a Champions League handball game when the Danish folk hero and Olympic champion Anja Andersen collapsed. I remember thinking she would break into little pieces. Terrifying images unfolded across tiny pixels on the TV as we all held our breath. Silence spread from the commentators’ booth into the thousands of living rooms tuned into the game. It felt like forever before Andersen finally moved again. My mom, a nurse, explained what the seemingly all-knowing commentators could not. Kneeling down to meet my nine-year-old eyes, she described how Anja Andersen’s heart wasn’t like hers or mine: it needed support from a machine to keep beating properly. Twelve years later, this machine would make it off the distant pixelated handball court and into my own family: when my grandfather’s heart weakened, he underwent a pacemaker implantation to survive.

The first successful implantation of a pacemaker happened in Sweden in 1958. A lot has changed since then: implanting the devices is now a routine operation all over the world, with more than 3 million people having received one and approximately 1.43 million actively carrying one today. Unfortunately, these technological improvements have been made without appropriate accompanying regulation. As cloud and wireless technologies evolve, regulations regarding data privacy, protection, and security are lagging behind. Until now.

Supposedly.

New regulations regarding cybersecurity in embedded medical devices have arrived. But do they actually provide desperately-needed safeguards?

Neta Alexander was 33 years old when she received her pacemaker. Ever since, she has acted as an ambassador and activist for fellow people with pacemakers. She complains that she has no idea who has access to her data. Why is this a problem? Pacemakers collect sensitive personal information, such as sleeping schedules, and share it with complete strangers. The information is tracked for multiple reasons. One is for the pacemaker to deliver an electrical impulse if the heart beats fewer than 25 times per minute, maintaining a healthy heart rhythm. Another is for doctors to monitor the patterns of a patient's heart over a longer period of time and draw medical conclusions. Alexander describes how the very data that might be keeping her alive by being available to doctors could, in the wrong hands, make her a victim of a potentially lethal hacking attack.

In 2013, CNN and Cylance CEO Stuart McClure reported that pacemakers and insulin pumps were at high risk of being hacked by bad actors. The report revealed that it takes only a small antenna and a line of code to break into nearly any wireless medical device, including pacemakers, insulin pumps, and defibrillators. In 2018, at the Black Hat convention in Las Vegas, this issue was emphasized through a live demonstration of this exact hacking technique. Pacemakers were easily hackable because their parent company, Medtronic, failed to incorporate encryption in their programming, enabling anyone to see and modify the code. This carried potentially disastrous consequences: an intruder in the system could adjust when to deliver electric impulses to the heart, and when not to. Both scenarios could be lethal. Even though Medtronic resolved one of the cloud vulnerabilities mentioned at the Black Hat conference, researchers Billy Rios and Jonathan Butts remained convinced the security was lacking.

It has been over a decade since the first concerns regarding patient safety and data security were raised by researchers like Billy Rios and Jonathan Butts and patients like Neta Alexander. And although medical companies have been adjusting their practices, it took until March of this year for the FDA to commit to a new cybersecurity law regulating how medical companies secure the private data of patients with embedded technology in their bodies. Initially, the new FDA regulations focus on premarket medical devices, ensuring a higher level of security before a new generation of devices and software is placed in patients’ bodies. In the Regulatory Review at the University of Pennsylvania Carey Law School, Professor Christopher S. Yoo and Bethany Lee reported that the FDA guidelines only vaguely defined the “trustworthiness” of medical devices. Yoo and Lee argue that ‘trustworthy’ medical devices are ones that “(1) are reasonably secure from cybersecurity intrusion and misuse; (2) provide a reasonable level of availability, reliability, and correct operation; (3) are reasonably suited to performing their intended functions; and (4) adhere to generally accepted security procedures.” Moving forward, it will be interesting to follow how new devices on the market will differ from previous ones. Although the new trustworthy products will improve future patients’ security, I am skeptical as to how they fix this data security issue for the millions of existing patients. On November 15, the FDA contracted with MITRE, a not-for-profit R&D company, to investigate the next steps in improving legacy implementations. Due to the novelty of this decision, its implications remain to be seen.

These new guidelines are a great first step. However, I assume the FDA is not motivated only by the concerns regarding hacking and privacy covered in this and countless pieces published over the past decade. Other interests might relate to profits, as the development of wearable and embedded technology booms. In September of this year, Elon Musk’s company Neuralink was approved by the FDA for human trials. Neuralink intends to embed wireless brain computers in paralyzed patients, with the intention of expanding to other neurological conditions. On paper, those intentions sound great, but Musk’s controversial track record invites caution: Neuralink was accused of animal cruelty over its testing of brain chips in monkeys last year. These new FDA regulations will hopefully hold innovative technology companies like Neuralink to a higher ethical code than previously expected.

The newly effective FDA regulations will shape how we assess the trustworthiness of Neuralink and every other medical company. Cyborgism and embedded technology are a future we can neither outrun nor hide from, even if we wanted to. These regulations will be an important factor in determining how future technology and data handling are developed. While this change will not resolve Neta Alexander's concerns about her own pacemaker, it validates those concerns, and she can be hopeful that positive change is coming to the space. I am excited to follow how the FDA proceeds to improve patients’ privacy and safety with legacy devices.