By Mark Leimbeck
The expanding use of extended reality (XR) in medical applications presents unique opportunities and challenges. Notably, there are risks of exposure of legislatively controlled[1] personally identifiable information (PII), because XR devices can capture such information both directly and indirectly. At the same time, this information has the potential to support important clinical indicators of progressive diseases, such as Alzheimer's. From a purely clinical standpoint, risk management of XR applications requires demonstrating that the health benefits of the foreseeable uses of the device outweigh the risks to patients or other individuals (e.g., caregivers). The international risk management standard for medical devices (ISO 14971)[2] can be applied in assessing the acceptability of these risks.
Because XR devices rely extensively on software and networked data, cybersecurity risks must be addressed during their design, development, and use. And, of course, compliance with all regulations concerning privacy and control of PII is essential. After considering the various risk factors and unique trade-offs involved, the benefit/risk balance for these devices must be weighed during design and development and continually monitored in the post-market phase.
Current Uses Of XR Technology
XR technology is currently being used in a variety of medical applications, including:[3,4]
- medical education
- surgical training
- pre-procedural planning
- clinical trials
- cardiac interventions
- stroke rehabilitation
- physical therapy
- pain management
- awareness of diseases (Parkinson’s, Alzheimer’s, etc.)
Direct risks reported for some of these applications have included user/clinician nausea, headaches, dizziness, and blurred vision. Indirect risks have arisen when the sensors supporting the XR experience fail, for example due to poor or absent network connectivity. Additionally, the lack of mechanical feedback can create human factors issues that require analysis and suitable mitigation.
Clinical benefits of some XR applications have been clearly demonstrated. Medical students have shown improved engagement and a better understanding of anatomy with 3D XR views than with traditional learning methods. In pre-procedural planning, clinicians using 3D displays achieved faster interpretation times for CT angiography than with traditional tomographic readings.[5] In diagnosis, research continues to show promise for the ability of augmented reality devices to aid in identifying individuals at risk of cognitive decline.[6]
Objective assessment of the clinical risks and benefits is a key element of the validation of any new medical technology. In addition, XR applications present other types of risks and benefits that must be considered, particularly those related to:
- software design and implementation
- cybersecurity
- privacy of personal data
Privacy Risks Vs. Potential Benefits Of XR
The risks from using software as a medical device have been addressed in international standards (e.g., IEC 62304:2006/AMD 1:2015[7]) and regulatory guidance documents (e.g., the U.S. FDA Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices[8]), so software hazards are outside the scope of this paper. Similarly, risks associated with cybersecurity are addressed by standards (e.g., ANSI/UL 2900-1[9]) and regulatory guidance documents (e.g., FDA's Content of Premarket Submissions for Management of Cybersecurity in Medical Devices[10]) and are outside the scope of this paper.
Privacy is another matter. Although international standards for the protection of PII have been developed (e.g., ISO/IEC 27701:2019[11]), the importance of balancing the clinical benefits against the privacy risks has yet to be specifically considered for XR applications. The U.S. Department of Labor defines PII as:
Any representation of information that permits the identity of an individual to whom the information applies to be reasonably inferred by either direct or indirect means. Further, PII is defined as information: (i) that directly identifies an individual (e.g., name, address, social security number or other identifying number or code, telephone number, email address, etc.) or (ii) by which an agency intends to identify specific individuals in conjunction with other data elements, i.e., indirect identification. (These data elements may include a combination of gender, race, birth date, geographic indicator, and other descriptors.) Additionally, information permitting the physical or online contacting of a specific individual is the same as personally identifiable information. This information can be maintained in either paper, electronic or other media.
The following are examples of PII:[12]
- Government-issued identifiers, including:
- Social Security numbers,
- ID and driver's license numbers issued by any state,
- Tax ID numbers,
- Military ID numbers,
- Passport numbers, and
- Any other unique number issued on a government document commonly used to verify the identity of a specific individual.
- Health information, including:
- an individual's medical history, mental or physical condition, or medical treatment or diagnosis by a healthcare professional;
- health insurance information, including an individual's health insurance policy number or subscriber identification number, any unique identifier used by a health insurer to identify the individual, or any information in an individual's application and claims history, including any appeals records.
- Unique biometric data generated from measurements or technical analysis of human body characteristics, such as fingerprint, retina, or iris image, which can be used to authenticate the identity of a specific individual.
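The "indirect identification" described in the definition above — a combination of quasi-identifiers such as gender, birth date, and geographic indicator singling out one person — can be illustrated with a short sketch. All records, field names, and values below are invented for illustration:

```python
from collections import Counter

# Illustrative only: records stripped of direct identifiers can still be
# unique on a combination of quasi-identifiers. Data are fabricated.
records = [
    {"gender": "F", "birth_date": "1950-02-14", "zip": "63110", "dx": "MCI"},
    {"gender": "M", "birth_date": "1948-07-01", "zip": "63110", "dx": "none"},
    {"gender": "M", "birth_date": "1948-07-01", "zip": "63110", "dx": "AD"},
]

def unique_combinations(records, quasi=("gender", "birth_date", "zip")):
    """Return quasi-identifier combinations held by exactly one record."""
    counts = Counter(tuple(r[q] for q in quasi) for r in records)
    return [combo for combo, n in counts.items() if n == 1]

# The first record is unique on (gender, birth_date, zip), so it could be
# indirectly identified even though no name or ID number is present.
print(unique_combinations(records))  # [('F', '1950-02-14', '63110')]
```

A record that is unique on its quasi-identifier combination offers no anonymity in practice; this is why regulators treat such combinations as PII.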
PII is regulated and controlled in many jurisdictions, with serious consequences for organizations and individuals who fail to safeguard it. A privacy impact assessment conducted by the U.S. Department of Veterans Affairs as part of its Genomic Information System for Integrated Science research program[13] identified specific risks associated with PII. For example:
- Information could be re-linked to a specific individual, notwithstanding assurances that the information will be anonymous or not identifiable.
- Participants could misunderstand or underestimate the extent to which they have consented to share their personal data.
- Perceived loss of privacy could lead to undesirable changes in behavior.
- Embarrassment or stigma could result from disclosure of certain information if tied to the individual.
- Information could be used to enable discrimination against individuals in certain contexts, such as employment or insurance eligibility.
At the same time, the XR devices and associated sensors required for real-time interaction with a virtual environment often necessitate the transmission, storage, and use of personal data, which may include biometrically inferred personal data. Although existing laws and regulations should ideally be sufficient to control the use of PII and prevent its misuse, such controls have historically failed to keep pace with technological change.[14] Given that technological change will most likely continue at an exponential rate[15] and that cases have already been documented in which de-identified PII was subsequently re-identified through machine learning techniques,[16] the privacy of PII is clearly a rapidly evolving problem.
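The classic mechanism behind such re-identification is a linkage attack: a "de-identified" clinical table is joined to a named public dataset on shared quasi-identifiers. The sketch below is a deliberately minimal illustration with fabricated data and hypothetical field names, not any specific documented attack:

```python
# Illustrative linkage sketch: re-linking a "de-identified" table to a
# named roster via shared quasi-identifiers. All data are invented.
deidentified = [
    {"gender": "F", "birth_date": "1950-02-14", "zip": "63110", "dx": "MCI"},
    {"gender": "M", "birth_date": "1948-07-01", "zip": "63109", "dx": "AD"},
]
public_roster = [  # e.g., a hypothetical voter roll or membership list
    {"name": "Jane Doe", "gender": "F", "birth_date": "1950-02-14", "zip": "63110"},
    {"name": "John Roe", "gender": "M", "birth_date": "1952-03-09", "zip": "63109"},
]

QUASI = ("gender", "birth_date", "zip")

def relink(deid_rows, roster):
    """Join the two tables on quasi-identifiers; any match is re-identified."""
    index = {tuple(p[q] for q in QUASI): p["name"] for p in roster}
    matches = []
    for row in deid_rows:
        key = tuple(row[q] for q in QUASI)
        if key in index:
            matches.append((index[key], row["dx"]))
    return matches

print(relink(deidentified, public_roster))  # [('Jane Doe', 'MCI')]
```

No direct identifier ever appears in the clinical table; the diagnosis leaks purely through the join, which is why removing names and ID numbers alone does not make XR sensor data anonymous.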
In addition to the inherent privacy risks that exist from the capture, use, and dissemination of medically relevant PII, it is important not to lose sight of the potential benefits to patients that may also exist. For example, certain disorders can be identified through specific symptoms, such as loss of neurological control of body movements or impaired memory, disorientation, and other cognitive problems. If the performance data collected during an XR experience is captured and retained, it may be possible to compare it with earlier data to identify early onset and/or progression of these disorders; further, such comparative benchmarks may be useful for the evaluation of treatment responses.
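The comparison against earlier data described above can be sketched very simply: score a new XR session metric against the user's own historical baseline and flag large negative drift for clinical review. The metric name, values, and threshold here are hypothetical, chosen only to illustrate the idea:

```python
from statistics import mean, stdev

# Hypothetical XR session metric (e.g., a memory-task recall rate)
# recorded over earlier sessions, plus the latest session's value.
baseline_recall = [0.82, 0.79, 0.85, 0.81, 0.80]
current_recall = 0.62

def drift_score(history, current):
    """Z-score of the current value against the personal baseline."""
    mu, sigma = mean(history), stdev(history)
    return (current - mu) / sigma

z = drift_score(baseline_recall, current_recall)
if z < -2.0:  # threshold chosen for illustration only
    print(f"flag for clinical review (z = {z:.1f})")
```

A real system would need clinically validated metrics and thresholds, but even this toy version shows why retaining earlier sessions matters: the signal is the change relative to the individual's own history, not the absolute score.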
The medical use of XR sensors combined with artificial intelligence is advancing rapidly and may prove equal to traditional clinical screening methods.[17] Although there are known risks associated with the storage and use of PII, there may also be benefits to using such data for comparison and early diagnosis of certain disorders. Therefore, for some patients at least, the potential benefits of storing PII may outweigh the risks.
The use of XR in medical applications offers many unique opportunities and challenges. Coupled with evolving data capture and analysis methods (for example, artificial intelligence), the potential uses for XR will likely create new, unimagined trade-offs between risks and benefits that will only be identified as further experience is gained. Evaluating the balance between risk and benefit is always challenging and may require clinical studies in addition to expert medical judgment. This subject will require constant vigilance in the design, development, and post-market implementation of XR devices today and into the future.
- Health Insurance Portability and Accountability Act (HIPAA), General Data Protection Regulation (GDPR), and other legally binding requirements.
- ISO 14971:2019 Medical devices — Application of risk management to medical devices. Available at www.ISO.org.
- Extended Reality in Medical Practice, Christopher Andrews, Ph.D., Michael K. Southworth, MS, Jennifer N. A. Silva, MD, and Jonathan R. Silva, Ph.D., https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6919549/
- Extended Reality in Life Sciences and Healthcare, Sebastian Wurst, September 1, 2020 https://medium.datadriveninvestor.com/extended-reality-in-life-sciences-and-healthcare-3780d06f8cac
- First-in-Man Computed Tomography-Guided Percutaneous Revascularization of Coronary Chronic Total Occlusion Using a Wearable Computer: Proof of Concept, PMID: 26608117
- Digital biomarker‐based individualized prognosis for people at risk of dementia, Maximilian Buegler, Robbert L. Harms, Mircea Balasa, Irene B. Meier, Themis Exarchos, Laura Rai, Rory Boyle, Adria Tort, Maha Kozori, Eutuxia Lazarou, Michaela Rampini, Carlo Cavaliere, Panagiotis Vlamos, Magda Tsolaki, Claudio Babiloni, Andrea Soricelli, Giovanni Frisoni, Raquel Sanchez‐Valle, Robert Whelan, Emilio Merlo‐Pich, Ioannis Tarnanas, Alzheimer's & Dementia: Diagnosis, Assessment & Disease Monitoring, published by Wiley Periodicals, Inc. on behalf of the Alzheimer's Association
- IEC 62304:2006/AMD 1:2015 Medical device software — Software life cycle processes. Available at https://www.iso.org/standard/38421.html
- Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices, United States Food and Drug Administration, Document issued on: May 11, 2005
- ANSI/UL 2900-1 Standard for Safety, Standard for Software Cybersecurity Network-Connectable Products, Part 1: General Requirements, First Edition, 2017
- Content of Premarket Submissions for Management of Cybersecurity in Medical Devices, United States Food and Drug Administration, Document issued: October 2014
- ISO/IEC 27701:2019, Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management — Requirements and guidelines
- Privacy Impact Assessment for the VA Information Technology System called: Million Veteran Program – Genomic Information System for Integrated Science (GENISIS), Department of Veterans Affairs, Veterans Health Administration, Office of Research and Development (ORD) (JULY 2016)
- Laws and Ethics Can’t Keep Pace with Technology, Vivek Wadhwa, MIT Technology Review, April 15, 2014
- The Law of Accelerating Returns, Ray Kurzweil, March 7, 2001
- El Emam K, Jonker E, Arbuckle L, Malin B (2011), A Systematic Review of Re-Identification Attacks on Health Data. PLoS ONE 6(12): e28071, doi:10.1371/journal.pone.0028071
- Ahmed Hosny, Chintan Parmar, John Quackenbush, Lawrence H. Schwartz, and Hugo J. W. L. Aerts, Artificial intelligence in radiology, PMCID: PMC6268174 2018 Nov 30
About The Author
Mark Leimbeck has led and supported the implementation of various corporate improvement and development programs through initiatives that include quality system and regulatory compliance programs, new product development, IT systems and enterprise resource planning implementation, and quality/Lean Six Sigma process improvements. He has served as an operations manager and has expertise in risk management, quality management, project management, software application development, and engineering. Mark participates in standards and guidance development committees. He has authored numerous papers, presentations, and training programs, including development and implementation of a large-scale training program for a Fortune Global Top 100 company.