Death by Technology: The Need for Law to Mitigate Risk

June 2015 | Vol. 2, Iss. 2 | CIP Feature

Posted: June 15, 2015 | By: Celeste Richardson

Barnaby Jack was a brilliant New Zealand grey hat hacker, a subclass of ethical hacker. “In general terms, ethical hackers are authorized to break into supposedly ‘secure’ computer systems without malicious intent, but with the aim of discovering vulnerabilities in order to bring about improved protection.” [1] Jack was not typically authorized to hack the systems he infiltrated, but he informed the companies he hacked of the software vulnerabilities he discovered, making their products more secure. Jack demonstrated his ability to control an insulin pump over radio waves and deliver a lethal dose of insulin. He next set out to show the hacker community he could “hack into a wireless communications system that linked implanted pacemakers and defibrillators with bedside monitors that gather information about their operations.” [2] By gaining access to that system, Jack could control what the monitor told the pacemaker or defibrillator to do, with potentially lethal consequences. [2] A couple of weeks before he was to present his findings on hacking pacemakers and cardiac defibrillators at the 2013 Black Hat Conference, an annual hacker convention, Jack died of an accidental drug overdose. His work, however, started the discussion on the security of implanted medical devices.

Pacemakers are known to be vulnerable to electromagnetic pulses. Although everyday electromagnetic fields, such as those from metal detectors, usually have no effect on pacemakers, an intentional electromagnetic pulse attack is fairly simple to carry out. Other implanted devices, such as cardiac defibrillators and insulin pumps, are vulnerable to hacking with potentially fatal consequences as well. The holy trinity of technical security is confidentiality, integrity and availability. In biomedical devices, a compromise of integrity and/or availability can cause the device to behave unpredictably or a patient to be misdiagnosed, resulting in injury or death. The Food and Drug Administration suggests manufacturers take pertinent measures to ensure their devices are hardened against cyberattack.

Figure 1. The image shows a cross-section of a chest with a pacemaker. Image A shows the location and general size of a double-lead, or dual-chamber, pacemaker in the upper chest. The wires with electrodes are inserted into the heart's right atrium and ventricle through a vein in the upper chest. Image B shows an electrode electrically stimulating the heart muscle. Image C shows the location and general size of a single-lead, or single-chamber, pacemaker in the upper chest. (Released)


The FDA audits cardiac defibrillators and insulin pumps for unintentional threats, but does not reprimand manufacturers when vulnerabilities in their devices are exploited via cyberattacks that could endanger the patient or compromise patient data. As networked biomedical devices become more commonplace, it is critical that the government develop and enforce security standards to reduce the risk of cybersecurity vulnerabilities in these devices. Should security standards for implanted medical devices carry legal ramifications to ensure medical manufacturers follow security best practices? If the government does not enforce security in today’s implanted medical devices, the situation could worsen as the technology advances.

Weak standards

On July 14, 2013, 11 days before the death of Barnaby Jack, the FDA admitted:
“The need for effective cyber security to assure medical device functionality has become more important with the increasing use of wireless Internet- and network-connected devices and the frequent electronic exchange of medical device-related health information.
Failure to maintain cyber security can result in compromised device functionality, loss of data availability or integrity or exposure of other connected devices or networks to security threats. These, in turn, have the potential to result in patient illness, injury or death.” [3]

As of February 2015, the FDA has only released recommendations, not requirements, for implanted medical device security. The FDA says the recommendations “can help provide reasonable assurance of safety and effectiveness for medical devices that incorporate radio frequency wireless technology.” [3] “Reasonable” is undefined. The FDA does define “should” as “something suggested or recommended, but not required.” [3] “Should” appears in sentences such as “issues relating to the integrity of the data transmitted wirelessly and safety-related requirements of your device should be considered.” [3]


The Revo MRI SureScan Pacing System delivers standard pacing therapy to patients with slow heart rates. This particular pacemaker permits patients meeting certain criteria to receive an MRI scan. (Courtesy of the U.S. FDA/Released)

In 1990, Congress passed the Safe Medical Devices Act. This act makes it mandatory for facilities and manufacturers to report problems with medical devices. The Act has been revised several times, and in some respects, the latest revision seems to give medical manufacturers more leeway. The changes made in the 1998 revision are as follows:

“Manufacturers and distributors/importers do not need to submit annual certification.

Domestic distributors are no longer required to file [Medical Device Reporting] reports, but must continue to maintain complaint files. [Importers (initial distributors for devices manufactured overseas and imported into the USA) must continue to file MDR reports.]

User facilities must now file an annual report instead of semiannual reports to summarize their adverse event reports.” [4]

The highest civil penalty for violating this act is $1 million, and there are no criminal penalties in place. The act has not been updated to specifically cover cybersecurity issues.

International Organization for Standardization standard 14971 does address risk management for medical devices. The standard makes determining, defining and announcing risk a manufacturer obligation. However, there is no penalty for failing to meet the standard. Companies can become ISO certified, which means they have proved they meet the standard. Once certified, companies become more attractive as business partners or vendors. Hospitals, however, are not required to work only with certified medical manufacturers. In most cases, insurance companies determine which medical devices will be covered for patient use.

The International Electrotechnical Commission 60601-1-2 standard states that electronic medical devices should be accompanied by a usability document. This should ensure users operate the product correctly, but security best practices are not part of this standard. Jay Radcliffe recalls changing the battery in his implanted insulin pump one night before bed and nearly dying because he did not read the sentence on page 147 of his 300-page user manual stating that changing the battery would change user settings. [5] The company’s response was “It’s in the manual,” and the FDA took no action either.

It is important to note that many of these standards are not freely available to the public. Viewing International Electrotechnical Commission Technical Report 80001-2-3, for example, costs around $239.48, and viewing ISO 14708 costs about $185. These, and other, standards are issued by private organizations. These organizations help guide the industry so consumers can expect the same level of service nationally and internationally. Companies benefit from following these standards when trading or collaborating with other companies. While these standards have commendable intentions, they cannot be legally enforced on their own because they come from privately owned entities.

Issues, suggested standards and law

Securing implanted medical devices is a challenging task because of their severe resource constraints in terms of energy supply, processing power and storage space. Adding any security application to a device may drain its battery. Unlike general medical sensors, which may use replaceable AA-type batteries, an IMD typically uses silver vanadium oxide batteries. [6]

These batteries are vulnerable to resource depletion due to extra code, and changing the battery inside an IMD requires surgery.

Another problem unique to wireless security in implanted medical devices is the need for emergency access. For machines like laptops and cell phones, emergency access is never a life-or-death matter. With IMDs, a doctor or first responder may need to access the device to save a life, so encryption and passwords could block critical access. Common security measures, such as passwords and encrypted traffic to and from the device, cannot be used in this situation.

One possible solution to this security issue is to moderate access based on distance. If an external device, such as a bracelet the user wears, is nearby, the IMD will not accept incoming commands. When the external device is not within a couple of meters of the patient, the IMD will accept incoming commands. Emergency responders would then have access to the IMD during an emergency, when the external device is not sensed. The user wears the bracelet daily to protect the IMD from attacks; during an emergency, the doctor can remove the bracelet and have full access to the device. The idea was created by researchers Tamara Denning, Kevin Fu and Tadayoshi Kohno, who call the external device a Communication Cloaker. The concept solves many of the unique issues with IMD security: “safety and open access in emergencies, security and privacy under adversarial conditions, battery life and response time.” [7]
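The core of the scheme described above is a fail-open access policy: block direct wireless commands while the cloaker is sensed, and grant open access once it is removed. A minimal sketch, with hypothetical names (the actual Communication Cloaker design is more involved, e.g. the cloaker can relay authenticated clinician commands while worn):

```python
def imd_accepts_command(cloaker_nearby: bool) -> bool:
    """Fail-open policy: reject direct wireless commands while the wearable
    cloaker is sensed; allow open access once it is removed (an emergency)."""
    return not cloaker_nearby

# Day-to-day: bracelet worn, direct radio commands are ignored.
assert imd_accepts_command(cloaker_nearby=True) is False
# Emergency: responder removes the bracelet, IMD grants open access.
assert imd_accepts_command(cloaker_nearby=False) is True
```

The design choice worth noticing is the failure direction: the device defaults to accessible, trading some security for the guarantee that a first responder is never locked out.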

An additional issue with IMD security is upgrading hardware already in use. In many cases, accessing current medical devices means surgery or risky operations. Security standards must be developed not only for future devices, but also for devices already in use. Shyamnath Gollakota, Haitham Hassanieh, Benjamin Ransford, Dina Katabi and Kevin Fu created a shield that can provide security for current medical devices by using “a novel radio design that can act as a jammer-cum-receiver. This design allows it to jam the implanted medical device’s messages, preventing others from decoding them while being able to decode them itself.” [8] The shield is similar to the Communication Cloaker, but it works on the radio frequencies already in use.
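The shield's trick can be illustrated with a toy model. This is not the actual radio design from Gollakota et al., which cancels analog self-interference; here XOR with a random stream stands in for that cancellation, and all names are illustrative:

```python
import os

def jam_and_capture(message: bytes, jam: bytes) -> bytes:
    """What an eavesdropper observes: the message superimposed with the
    shield's jamming signal (XOR models the superposition)."""
    return bytes(m ^ j for m, j in zip(message, jam))

def shield_decode(captured: bytes, jam: bytes) -> bytes:
    """The shield knows its own jamming signal, so it can cancel it
    and recover the IMD's original message."""
    return bytes(c ^ j for c, j in zip(captured, jam))

message = b"pacing telemetry"
jam = os.urandom(len(message))   # the shield's random jamming stream
on_air = jam_and_capture(message, jam)
assert shield_decode(on_air, jam) == message  # shield recovers the message
```

An eavesdropper who captures `on_air` but does not know `jam` learns nothing useful, which is why the legacy IMD itself needs no modification.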

In any crime, forensic evidence must be collected to bring the perpetrator to justice. In cybersecurity, investigators reconstruct a crime largely through reverse engineering, using audit trails to piece together what happened during an incident. IMDs should also keep audit trails so the culprit can be identified in case of a breach, but IMD storage lacks the space needed to archive audit data. An external device, such as a cell phone or wearable technology, could perform this function instead.
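One way such an offloaded audit trail could resist tampering is hash-chaining: each entry commits to the digest of the previous one, so rewriting history breaks the chain. A hypothetical sketch of the external device's log (the class and field names are assumptions, not any vendor's API):

```python
import hashlib
import json
import time

GENESIS = "0" * 64  # digest placeholder before the first entry

class AuditTrail:
    """Append-only, hash-chained event log kept on an external device
    (e.g. a phone) on the IMD's behalf."""

    def __init__(self):
        self.entries = []
        self._last_digest = GENESIS

    def record(self, event, timestamp=None):
        # Each entry carries the previous entry's digest, forming a chain.
        entry = {
            "time": timestamp if timestamp is not None else time.time(),
            "event": event,
            "prev": self._last_digest,
        }
        digest = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        entry["digest"] = digest
        self.entries.append(entry)
        self._last_digest = digest

    def verify(self):
        # Recompute every digest; any edited or reordered entry breaks the chain.
        prev = GENESIS
        for entry in self.entries:
            body = {k: entry[k] for k in ("time", "event", "prev")}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if entry["prev"] != prev or entry["digest"] != expected:
                return False
            prev = entry["digest"]
        return True
```

Verification can run anywhere the log is copied, so investigators can trust entries even if the phone's owner, or an attacker, had later access to the file.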

Chest X-Ray with implanted pacemaker device. (Courtesy of U.S. FDA/Released)


Conclusion

Security should not slow the progression of technology, only complement its use. Without fail, an insecure system will yield unpredictable results, no matter the intent of the creator. Healthcare technology should not be unpredictable. These devices should allow patients to feel more secure. Strict laws with repercussions for manufacturers in the case of security neglect, along with standards focused on cybersecurity, need to be implemented for networked implanted medical devices today to protect the patients of tomorrow.

About the author:

Celeste Richardson began her career and education as a cybersecurity analyst with a Bachelor of Science in computer science from Hampton University. She obtained her master’s degree in Healthcare Informatics from Northeastern University. Richardson has a passion for being at the forefront of emerging cybersecurity issues and enjoys all phases of the cyberincident management process. She is pursuing her doctorate in emergency management from Jacksonville State University and works for the Federal Emergency Management Agency in a security role.

References:

[1] Bodhani, A. (2012). Ethical hacking: bad in a good way. Engineering and Technology Magazine, 7(12).
[2] Finkle, J. (2013) Famed hacker Barnaby Jack dies a week before hacking convention. Reuters.
[3] U.S. Department of Health and Human Services, Food and Drug Administration, Center for Devices and Radiological Health, Office of Science and Engineering Laboratories, & Center for Biologics Evaluation and Research. (2013) Radio Frequency Wireless Technology in Medical Devices. [PDF file] Guidance for Industry and Food and Drug Administration Staff.
[4] U.S. Food and Drug Administration. (2000) Medical Device Reporting (MDR). Medical Devices.
[5] Crenshaw, A. (2013) BSidesLV 2013 1 1 3 Mom! I Broke My Insulin Pump Again! Jay R [Video file].
[6] Du, X., & Hei, X. (2013). Security for Wireless Implantable Medical Devices. Philadelphia, PA: Springer.
[7] Denning, T., Fu, K., Kohno, T. (2008). Absence Makes the Heart Grow Fonder: New Directions for Implantable Medical Device Security. Usenix, pp.4.
[8] Fu, K., Gollakota, S., Hassanieh, H., Katabi, D., & Ransford, B. (2011) They Can Hear Your Heartbeats: Non-Invasive Security for Implantable Medical Devices. SIGCOMM’11, pp.4.

