Guest Column | January 7, 2019

The FDA, Device Cybersecurity, And What To Expect In 2019

By Carl Carpenter, Arrakis Consulting


As in other industries, medtech cybersecurity has taken on heightened importance in the age of interconnected systems and app-controlled devices. The U.S. Food and Drug Administration (FDA) has pushed to take a stronger stance on medical device cybersecurity, most recently with its Content of Premarket Submissions for Management of Cybersecurity in Medical Devices draft guidance, released in October 2018.  I say, it’s about time!  However, we must ask: what impact does this document have, and what might we expect from the final version?

Hackers of all types, from script kiddies and organized crime to nation state-sponsored hacking groups, actively are seeking ways to acquire your data — whether for leverage, lending an air of legitimacy to a scam, outright theft, or some other nefarious purpose. The FDA’s recent draft cybersecurity guidance follows a precedent: the California Consumer Privacy Act of 2018 (CaCPA), which goes into effect on Jan. 1, 2020, attempts to give consumers greater control of their data, and the European Union’s (EU) General Data Protection Regulation (GDPR) took effect in May 2018. Further, in his Consumer Data Protection Act (CDPA) of 2018 draft, U.S. Sen. Ron Wyden has proposed sentencing executives to prison time for failing to follow mandates regarding the use and protection of Americans’ data.

So, is the FDA just following along or are they out front, leading the charge to ensure patient safety?

Tell Me More, Tell Me More…

My initial thought about the FDA’s draft cybersecurity guidance — it is far too short. Its 24 pages, if you trim the fluff, comprise about 15-16 pages of guidance; compare that to NIST 800-53, which has over 400 pages. Now, security isn’t about the number of pages in a document, but document length can be indicative of how much clearly articulated detail those pages contain. The FDA’s brief draft likely leaves companies with quite a bit of wiggle room to do what they think is adequate, meeting the guidelines but, in reality, falling far short of reliable cybersecurity. The most pertinent change I think you will see in the final version is the inclusion of considerably more detailed, relevant content.

Something Old, Something New, Something Borrowed

The FDA appears to have incorporated into its draft guidance aspects from other guidelines, rules, or regulations, such as GDPR, CaCPA, etc. This is a good thing.

First, regulations can only be viewed in the context of what currently is in place, such as HIPAA or GDPR; GDPR, for example, applies regardless of whether the data is medically related, whereas HIPAA focuses solely on medical data. From the GDPR standpoint, data is sensitive, and violations incur steep penalties for the offending company.  Steep, as in 4 percent of gross global revenue, or €20 million (whichever is greater).  Regardless of a company’s size or revenues, neither option is of a magnitude to brush off.

Second, which regulation takes priority, and what aspects of each regulation are enforceable?  Additionally, what power will each company wield to counteract penalties from those regulations? More to the point, is the FDA counting on this draft reinforcing existing regulations, or will it be expected to stand on its own? I found the FDA draft to align with other regulations in that a security-minded approach is expected throughout the entire product lifecycle, rather than addressing security solely as a postmarket concern.

Having worked in device development and being a former device tester, I can say that I do not believe all devices are thoroughly tested for cybersecurity.  This generally is due to a company wanting to reduce time to market, or lacking funds to perform such testing. Essentially, the company is doing less than it should, and hoping nothing happens, in an attempt to reach market faster. This sometimes can be attributed to the encouragement of “that executive,” whose bonus likely depends on timely release to market.  

So, what do we mean by “that executive”?  Those in the regulatory, compliance, governance, or even IT fields, in general, will easily remember a time when the company really needed some mission-critical piece of technology or a process change, but “that executive” denied the request because it either cost too much, delayed time to market, or reduced the profit margin. You can’t really blame these individuals for having that mindset, as their job is to ensure the highest profit margin possible and to make these hard decisions. Unfortunately, hard decisions are not always the right decisions, and more companies are being held accountable for them.

Testing Doesn’t Make The Grade

Medtech product testing should occur throughout the product development life cycle, as well as postmarket. The FDA requires lifecycle testing, as well as proof of testing, when devices are submitted for preapproval. Similar to GDPR, the FDA draft guidance pulls supplier companies into that lifecycle testing — via the cybersecurity bill of materials (CBOM) — implying that device manufacturers must require their suppliers to be equally (cyber)secure.  At least, let’s hope so.

Examples of device cybersecurity testing include checking for unencrypted or weakly encrypted connections from the device to the Internet, or from the device to a nearby external control device.  One obvious flaw might be a device with a unique identifier that allows for easy guessing of another patient’s information (e.g., Patient 1 = abcd0001, while Patient 2 = abcd0002). All of this would seem to be common sense, and it is touched upon in the draft guidance, but the draft fails to address testing of multiple connections from the same medical device.  It would seem, to increase security, that a medical device should allow only a single external connection at a time, rather than multiple connections simultaneously. While there may be a medical reason to have multiple, simultaneous connections, such a scenario also presents an increased security risk, and I really didn’t find the FDA draft to be very restrictive in this area.
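
To make the identifier example concrete, here is a minimal sketch in Python (my choice of language, not anything the FDA prescribes) of both halves of such a test: generating patient tokens that cannot be guessed from one another, and a crude check that flags the abcd0001/abcd0002 pattern described above. The function names are hypothetical and purely for illustration.

```python
import re
import secrets

def generate_patient_token() -> str:
    """Return a non-guessable identifier instead of a sequential one."""
    # 16 random bytes -> 32 hex characters; knowing one patient's token
    # reveals nothing about any other patient's token.
    return secrets.token_hex(16)

def looks_sequential(identifiers) -> bool:
    """Flag identifier schemes like abcd0001, abcd0002, ... as guessable."""
    trailing = []
    for ident in identifiers:
        match = re.search(r"(\d+)$", ident)
        if not match:
            return False  # no trailing counter, so not an obviously sequential scheme
        trailing.append(int(match.group(1)))
    trailing.sort()
    return all(b - a == 1 for a, b in zip(trailing, trailing[1:]))

# The scheme from the example above is flagged; random tokens are not.
print(looks_sequential(["abcd0001", "abcd0002", "abcd0003"]))           # True
print(looks_sequential([generate_patient_token() for _ in range(3)]))   # False
```

A test this simple obviously doesn’t replace a real penetration test, but it illustrates the kind of concrete, repeatable check a manufacturer could fold into lifecycle testing.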

I also absolutely love that the draft guidance stipulates that proof of testing from third parties will help the preapproval process. The FDA had little choice in adding this language, I believe, for one simple reason: it would be illogical to expect a medical device manufacturer to suddenly have the in-house cybersecurity expertise to perform hostile penetration testing against its own devices. 

Room For Interpretation

Similar to GDPR, the FDA’s draft cybersecurity guidance is vague enough that the agency can technically say “we told you so,” while not really telling you enough to figure out exactly what some of that vagueness means. From the FDA’s standpoint, this is a good thing, because it has the option to enforce whatever its interpretation of the guidance is that day.

One case in point is the draft guidance’s “cryptographically strong” metric, wherein the FDA sets a bar that “…authoritative sources in cryptography would consider sufficiently secure.” However, the FDA does not indicate what is considered an authoritative source. ISO? NIST? This ambiguity is not harmful to the FDA, but poses problems for a company trying to become — or remain — compliant. Consider, instead, if that passage had defined “cryptographically strong” as an “equivalent level of cryptographic security as it relates to US Government data found at the SECRET level.” Such wording would be simpler to use, as well as being rooted in a standard already deemed acceptable and in use.
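
For illustration, here is a minimal sketch of what “cryptographically strong” could look like in practice: AES-256-GCM authenticated encryption of a patient record, using the third-party Python cryptography library. The record contents and device identifier are invented for the example; the point is simply that the algorithm and key length are ones authoritative sources such as NIST already accept.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

def encrypt_record(key: bytes, patient_record: bytes, device_id: bytes):
    """Encrypt and authenticate a patient record with AES-256-GCM."""
    nonce = os.urandom(12)        # unique per message; never reuse a nonce with the same key
    ciphertext = AESGCM(key).encrypt(nonce, patient_record, device_id)
    return nonce, ciphertext      # both are safe to store or transmit alongside the data

def decrypt_record(key: bytes, nonce: bytes, ciphertext: bytes, device_id: bytes) -> bytes:
    """Raises cryptography.exceptions.InvalidTag if the record was tampered with."""
    return AESGCM(key).decrypt(nonce, ciphertext, device_id)

key = AESGCM.generate_key(bit_length=256)   # 256-bit key
nonce, blob = encrypt_record(key, b"glucose=112;ts=2019-01-07T08:00Z", b"pump-serial-0042")
assert decrypt_record(key, nonce, blob, b"pump-serial-0042") == b"glucose=112;ts=2019-01-07T08:00Z"
```

Had the guidance pointed to something this specific, a manufacturer would know exactly which bar to clear rather than guessing whose definition of “authoritative” applies.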

Another example of this indeterminate language is where the FDA indicates “…device manufacturers may need to establish a cybersecurity vulnerability and management approach, where appropriate.” From a regulatory standpoint, using the word “may” never is recommended; it only opens the door to potential confusion. While I honestly can’t fathom a medtech company operating in today’s world without a cybersecurity program in place, I also can’t imagine the FDA intends to give companies a choice in the matter.

The draft cybersecurity guidance does touch upon aspects of patient harm (in which actual harm could befall the patient), but it doesn’t really explore the potential for patient harm when a hostile party simply has access to patient information. Quite often, compromised patient data can lead to a worse situation than physical manipulation of an actual device. I would suspect that, at some point, the FDA will refer back to other data privacy guidelines, as they relate to the theft of patient data, rather than reinvent the wheel. It would seem logical that the FDA focus on something similar to GDPR as a starting point.

One other area where I think the FDA erred was in defining the different tiers of devices (Tier 1 or Tier 2), designating the higher tier as devices “capable of connecting (e.g., wired, wirelessly) to another medical or non-medical product, or to a network, or to the Internet” and (paraphrased) capable of causing harm to the patient, or multiple patients. Later in the document, the FDA stipulates that such upper-tier devices must have advanced cybersecurity detection, logging, anti-virus scanning, forensic data capture, etc. — unrealistic for a small device such as an implantable.

While, clearly, the FDA’s intent was to increase security where increased security is possible, I think the intent could easily be confused as meaning that a pacemaker should have each of the listed security measures, which seems quite difficult with today’s technology. The list of security features itself is quite extensive, and logical, but it is unrealistic when applied to smaller devices, and I suspect the FDA will clarify the guidance in this respect.

General Guidelines Will Become More Specific Regulations

As a result of the FDA’s draft cybersecurity document — and its eventual final version — the future is likely to include more restrictive device regulation, requiring extensive technical and safety testing of the device, coupled with extensive cybersecurity testing. You are likely to see increased spending in the development process, as well as more independent, third-party certification of devices before they are submitted to the FDA. Requirements surrounding encryption (e.g., of patient data hosted on a device, or of patient data in transit) will be expanded and strengthened. Device operating system (OS) real-time upgrades are likely to become more prevalent, as well, in order to apply security patches. However, safe methods to do so will be under increased scrutiny.
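
On the patching point, the core safety question is whether a device can prove an update actually came from its manufacturer before applying it. Below is a minimal sketch of that verification step, assuming an Ed25519 signature and the third-party Python cryptography library; the key handling and image contents are simplified, hypothetical illustrations rather than anything drawn from the FDA draft.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

def verify_update(firmware_image: bytes, signature: bytes, vendor_public_key: bytes) -> bool:
    """Apply the patch only if it was signed by the manufacturer's private key."""
    try:
        Ed25519PublicKey.from_public_bytes(vendor_public_key).verify(signature, firmware_image)
        return True
    except InvalidSignature:
        return False

# Demo only: in practice the private key stays in the manufacturer's build system,
# and the public key is baked into the device's read-only storage.
signing_key = Ed25519PrivateKey.generate()
public_key = signing_key.public_key().public_bytes(
    serialization.Encoding.Raw, serialization.PublicFormat.Raw
)
patch = b"os-security-patch-2019.01"
print(verify_update(patch, signing_key.sign(patch), public_key))              # True
print(verify_update(b"tampered image", signing_key.sign(patch), public_key))  # False
```

Signature verification alone doesn’t make over-the-air updates safe, but without it a patching mechanism becomes exactly the kind of attack surface regulators will scrutinize.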

I also suspect you will see at least one company held up as an example of what happens when a medtech company fails to meet the envisioned cybersecurity standard. I expect the FDA’s upcoming final cybersecurity guidance will expand greatly to clarify vagueness littered throughout the draft document, as well as add language concerning penalties.

About The Author

Carl Carpenter is a professional with Arrakis Consulting who has decades of experience in Information Security, Governance, Risk, and Compliance, working with companies of all sizes regardless of industry.  Carl is also retired military and a former CISO of a $6BN organization with 15,000 employees, where he was responsible for FFIEC, HIPAA, PCI, and other federally regulated environments.  Carl specializes in reducing risk, implementing security measures, and increasing compliance for regulated entities.