By John Giantsidis, president, CyberActa, Inc.
Mobile digital health applications store sensitive personal data, from pulse rate and sleep records to medication plans, prescriptions, and medical certificates. They connect users to appropriate services and act as communication hubs. A compromised smartphone can unintentionally disclose a patient's or user's entire digital life. Establishing suitable security standards for mobile applications is therefore critical.
Two European countries have been leading the way in developing minimum security requirements for digital health applications. In December 2019, Germany introduced the Digital Healthcare Act, which allows physicians to prescribe digital health applications and patients to seek reimbursement from health insurers for apps that meet certain criteria.1 Beyond assessing the digital health app for safety and suitability for use, Germany’s Federal Institute for Drugs and Medical Devices (BfArM) performs an extensive evaluation of data protection, information security and quality, and interoperability. There is an explicit requirement for a state-of-the-art implementation of data protection and information security, under which the manufacturer must consider the risks involved in using the digital health app. In April 2020, the German Federal Office for Information Security (BSI) published draft technical guidelines on Security Requirements for Digital Health Applications, providing a minimum set of requirements for the safe operation of digital health apps.2
In September 2020, Spain’s National Cryptologic Center (CCN) issued a road map to its own minimum security requirements for digital health applications and e-health mobile apps.3 Based on the BSI guidance, these requirements are now in effect in Spain. This article summarizes the security areas described and the best practices identified in the road map. The goal is to inform and educate developers and manufacturers of health-related mobile apps in their product development and commercialization efforts beyond the U.S., where the Federal Trade Commission (FTC) has developed a mobile health apps interactive tool in conjunction with the Department of Health and Human Services’ Office of the National Coordinator for Health Information Technology (ONC), Office for Civil Rights (OCR), and the FDA.4 According to the CCN road map, the minimum requirements for digital health security fall into 11 categories, including:
- Source code
- Third-party software
- Data storage and data protection
- Platform interactions
- Network communications
The sections that follow highlight important recommendations from each category — and can function as a preliminary checklist when preparing for market entry in Germany and/or Spain.
- The developer must disclose the primary purpose of the app and obtain user consent before any collection of personal data.
- The application shall allow the user to withdraw consent and shall inform the user to what extent the application's behavior changes as a result.
- The application shall not display sensitive data on the screen, and such data shall not be shared with third parties unless required by the intended primary purpose of the application.
- The application shall inform the user in full about the consequences of any transfer of the data and obtain consent.
- Security shall be an integral part of the software development and life cycle for the entire application and reflect the secure collection, processing, storage, and deletion of sensitive data in a data life cycle.
- If the app uses a cloud as a back-end system, the cloud must have a C5 (Cloud Computing Compliance Controls Catalogue)5 or a similar certificate.
- Security functions shall always be implemented, both in the app and back end, as well as on all external interfaces and application programming interface (API) endpoints.
- If the application uses third-party frameworks or libraries, the developer must provide the user with information about the scope of use and the security mechanisms used. The application shall ensure that these features are used securely. The application must also ensure that unused features cannot be activated by third parties.
- The developer must provide the user an option to report security issues.
- User input must be validated to filter out potentially malicious values before processing.
- Error messages and notifications shall not contain sensitive data.
- Potential exceptions must be intercepted, handled, and documented in a controlled manner.
- All options to support development must be disabled in the released version.
- The developer must ensure that no debugging mechanisms remain in the released version.
- External libraries and frameworks must be regularly checked for vulnerabilities.
- Functions from libraries and frameworks that have known vulnerabilities shall not be used.
- Security updates for external libraries and frameworks must be applied promptly.
- Before using external libraries and frameworks, their source must be checked, and if third-party software is no longer maintained by its developer, it must not be deployed.
- The application must rely on a proven implementation of cryptographic primitives.
- The choice of cryptographic primitive must be suitable for the use case and correspond to the specifications of the current state of the art.
- The strength of the cryptographic keys must correspond to the current state of the art.
- The user must be authenticated using a second factor before sensitive data is processed in the application, and if usage deviates from familiar parameters, an additional authentication measure must be performed (step-up authentication).
- If authenticating by username and password, the strength of the password may be displayed to the user. Information about the strength of the password must not be retained in the application memory or the back end.
- If the application has been interrupted, reauthentication must be requested.
- The manufacturer must define the minimum quality and characteristics of a biometric sensor and ensure that the available hardware meets the specified requirement and that the sensor has biometric reference characteristics of the user available for comparison.
- The application must detect when the biometric reference features have been changed and refuse biometric authentication if the reference features have been subsequently changed.
- If an application session is terminated, the session identifier must be deleted, both on the device and on the back end.
- No sensitive data can be embedded in an authentication token.
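To make the last two requirements concrete, the sketch below issues opaque session identifiers that embed no user data and can be invalidated on the back end. It is a minimal illustration in Python; the class name and in-memory store are assumptions for illustration, not part of the CCN road map.

```python
import secrets


class SessionManager:
    """Issues opaque session identifiers: random values that embed no
    user data, so a leaked token reveals nothing by itself."""

    def __init__(self):
        # Server-side store mapping opaque token -> user id.
        self._sessions = {}

    def create_session(self, user_id):
        # 256 bits of randomness; the identifier carries no sensitive data.
        token = secrets.token_urlsafe(32)
        self._sessions[token] = user_id
        return token

    def end_session(self, token):
        # When the session terminates, the identifier is deleted on the
        # back end; the client must likewise discard its own copy.
        self._sessions.pop(token, None)

    def resolve(self, token):
        # Returns the user id for a live session, or None if it has ended.
        return self._sessions.get(token)
```

Because the identifier is opaque, deleting the server-side entry invalidates it immediately; there is no embedded payload that remains readable afterward.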
Data Storage and Data Protection
- The factory setting of the application must provide maximum data protection and maximum security.
- All sensitive data must be stored encrypted.
- Sensitive data must not be exported from the component on which it was generated.
- The application must ensure that all sensitive data is encrypted when the device is locked.
- The application must ensure that all sensitive data and application-specific credentials on the device are deleted when the application is uninstalled.
- The application must indicate to the user which services incur additional costs.
- The application must obtain the user's consent before performing paid actions.
- The application must obtain the user's consent before requesting access to paid resources.
- If the application offers paid features, the manufacturer must submit a concept that prevents third parties from tracking the cash flows for the use of application functions.
- The application must offer the user an overview of the costs incurred. If the cost is due to individual accesses, the application must provide an overview of the accesses.
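As an illustration of the cost-overview requirement, the following Python sketch records each paid access and can report both the total cost and an itemized overview of individual accesses. The class and field names are assumptions made for illustration only.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class PaidAccess:
    service: str      # name of the paid service or resource
    cost_cents: int   # charge for this single access


@dataclass
class CostLedger:
    """Keeps every paid access so the app can show both the total cost
    incurred and, for per-access charges, an overview of each access."""
    accesses: List[PaidAccess] = field(default_factory=list)

    def record(self, service: str, cost_cents: int) -> None:
        self.accesses.append(PaidAccess(service, cost_cents))

    def total_cents(self) -> int:
        return sum(a.cost_cents for a in self.accesses)

    def overview(self) -> List[str]:
        # One line per access, matching the requirement to itemize
        # costs that arise from individual accesses.
        return [f"{a.service}: {a.cost_cents / 100:.2f} EUR"
                for a in self.accesses]
```

In a real app the ledger would be persisted and surfaced in the UI; the point here is simply that every paid action leaves an auditable entry the user can review.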
- Any network communication of the application must be encrypted with Transport Layer Security (TLS).
- The configuration of the TLS connections must correspond to the current state of the art and follow current best practice recommendations.
- The application must use either the security functionality of the operating system platform used or security-checked frameworks or libraries to build secure communication channels.
- An aborted start must be logged as a security event in the back end.
- To use the application, the device must have device protection enabled. The manufacturer must inform the user about the consequences of leaving device protection deactivated.
- The application must request only the permissions necessary to fulfill its primary purpose.
- The application must inform the user of the purpose of the permissions and the effects that occur if the user does not grant them.
- The application must implement access restrictions on all data.
- The application must limit broadcast messages to authorized applications.
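As an illustration of a state-of-the-art TLS configuration built on the platform's own security functionality, the sketch below uses Python's standard `ssl` module. The TLS 1.2 floor reflects common current best practice rather than a value fixed by the road map.

```python
import ssl


def make_tls_context() -> ssl.SSLContext:
    # create_default_context enables certificate verification and
    # hostname checking, and loads the system trust store.
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    # Refuse legacy protocol versions; TLS 1.2 is a widely used floor today.
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx
```

Using the platform's default context, rather than a hand-rolled configuration, keeps certificate validation and hostname checks on by default and inherits future hardening from the platform.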
The application shall:
- Provide the user with low-barrier best practice recommendations for the safe handling of the application and its configuration;
- Detect and respond to rooted or jailbroken devices according to the current state of the art. The developer must present the risks to the user's data if the app continues to run on such a device;
- Abort its start-up if it is launched under unusual user rights;
- Check the integrity of the device before processing sensitive data;
- Check its integrity before accessing the back end; and
- Implement strong measures against reverse engineering and use obfuscation measures such as code obfuscation and string encryption.
It is expected that digital health app users can rely on the developers and manufacturers to implement measures to protect confidentiality, availability, and integrity of data. The minimum security requirements from Germany’s draft guidance and Spain’s road map further illustrate the global emphasis on security by design and instilling security as part of a management process within digital health app developers. For those companies that wish to eliminate multiple development processes, streamline testing and validation, and expand their offerings and services to multiple countries, these guidelines offer a great opportunity to adapt and expand.
About The Author:
John Giantsidis is the president of CyberActa, Inc., a boutique consultancy empowering medical device, digital health, and pharmaceutical companies in their cybersecurity, privacy, data integrity, risk, SaMD regulatory compliance, and commercialization endeavors. He is also a member of the Florida Bar’s Committee on Technology and a Cyber Aux with the U.S. Marine Corps. He holds a Bachelor of Science degree from Clark University, a Juris Doctor from the University of New Hampshire, and a Master of Engineering in Cybersecurity Policy and Compliance from The George Washington University. He can be reached at firstname.lastname@example.org.