By Bethany Corbin, Nixon Gwilt Law
The health technology (health tech) industry has experienced rapid growth and adoption by consumers who are eager to track, monitor, and improve their health and wellness outside the confines of medical facilities. Riding the coattails of the digital health revolution, health technology has redefined consumers’ role in healthcare by allowing them to better understand and control their bodies and health. This newfound autonomy has contributed to the rise of a robust medical device, wearable, and application (app) ecosystem in which consumer data is collected and analyzed to provide actionable health insights. In this landscape, consumers willingly relinquish their data rights in exchange for cutting-edge algorithmic predictions, device functionality, and health tracking capabilities.
The privacy practices and policies of health technology companies, however, have been subjects of debate and scrutiny over the years. Most health technology companies and devices operate outside the bounds of the Health Insurance Portability and Accountability Act (HIPAA) and its Privacy and Security Rules. As such, privacy in the health technology industry is governed principally by state legislation related to health data privacy (if it exists in the relevant jurisdiction) and the Federal Trade Commission’s (FTC) prohibition against unfair and deceptive acts and practices. The lack of an overarching privacy framework for all health data has thus resulted in a wild west of privacy practices, with some health technology companies prioritizing consumer privacy in device design and development, while other companies adopt a race-to-market strategy in which consumer privacy is an afterthought. This wide variance in privacy practices complicates consumers’ choice about which apps, wearables, and devices to trust.
The leak of the U.S. Supreme Court’s draft decision in Dobbs v. Jackson Women’s Health Organization,1 however, has reignited demand for consumer-first privacy standards at health technology companies. The leaked opinion, which included an unflinching repudiation of federal protection for abortion rights, has caused many consumers in the female health technology (femtech) industry to fear that the data collected by their femtech apps and devices could be used against them in criminal prosecutions. Indeed, there have been numerous calls throughout the media for women to delete their data from period-tracking and fertility apps out of fear that law enforcement could access and use this data.2 The result has been an atmosphere of concern and panic for users of women’s health apps and an eroding sense of trust in the health tech community. This article explores the shifting privacy landscape for femtech devices, explains how such shifts will impact the privacy practices of general health technology companies, and identifies strategies for privacy prioritization.
Mistrust And Criminalization: Can Femtech Surf The Privacy Waves?
First coined in 2016 by Ida Tin, femtech refers to health technology solutions designed specifically to investigate, diagnose, manage, and treat women’s health conditions. As an essential driver of change for women’s health, femtech affords individuals identifying as female autonomy and control over their bodies and healthcare decisions. To accomplish this goal, femtech must fight against cultural and societal stigmas and taboos that have historically relegated women’s health to the unspoken private sphere. With a long-term goal of using health data to derive actionable insights for women’s health treatment, femtech seeks to bridge the gender gap in available clinical health data and redefine healthcare delivery for women. Common femtech products on the market today include period-tracking apps, ovulation and fertility tracking apps, maternal health solutions, menstruation products, and much more.
Following the leaked draft of Dobbs, users of femtech applications and products have voiced concern about and distrust of femtech privacy practices. Consumers fear that femtech companies may disclose their reproductive health data to law enforcement officers, who could use that data to prosecute illegal abortions in a post-Roe v. Wade world. This concern is especially heightened for consumers in trigger states – i.e., states with laws criminalizing or banning abortion that would take effect immediately upon the reversal of Roe. The fear is that women’s own health data could be used against them because femtech companies do not properly protect this data.
This fear of data disclosure is not unfounded. Between 2015 and 2019, police and prosecutors conducted at least 50,000 data extractions of digital devices to prosecute a wide variety of crimes, including shoplifting, car crashes, prostitution, drug possession, and unlawful surveillance.3 In 2017, law enforcement officers obtained data regarding a woman’s online search history, which was used in a criminal case to show her intention to obtain an abortion. In 2019, former Missouri state health director Dr. Randall Williams obtained data in the form of a spreadsheet tracking the menstrual periods of women who had visited Planned Parenthood.4 Most recently, VICE reported that a data broker was selling abortion-related location data for $160.5 This data included the name of the abortion clinic, how long the person stayed at the clinic, where the person came from, and where the person went afterward. While these data disclosure statistics and incidents are alarming, they assume heightened importance in a post-Roe world where women could be prosecuted and serve jail time based on a downstream disclosure of their private reproductive health data.
The possibility of prosecution associated with female health data has resulted in unprecedented consumer scrutiny of femtech privacy policies and practices. New studies are examining the data-hungriness of common femtech apps,6 along with the level of privacy protection afforded to consumers under existing femtech privacy policies. The results have forced consumers to rethink their overall health data disclosure practices and risk tolerance and have caused consumers to delete femtech apps from their suite of health monitoring tools.
The deletion of femtech apps and the corresponding loss of data inputs is problematic for long-term women’s health insights and solutions. Decreased usage shrinks the diversity of available data, which can in turn produce health inequities. Specifically, African American and Hispanic women, who constitute a large percentage of overall abortion seekers (particularly in trigger states),7 may be more inclined to remove their health data from existing femtech apps and to stop providing data in the future if the abortion landscape changes. This loss of diversity can make long-term women’s health research less effective for excluded populations. Further, digital platforms, including femtech, have historically enhanced access to healthcare regardless of location, wealth, or social status. Given decreased consumer trust in digital health platforms, women who forgo femtech apps and devices may be unable to access necessary healthcare by other means, which can negatively impact women’s healthcare outcomes. Accordingly, femtech companies are under immense pressure to rework and strengthen their privacy frameworks in light of changing abortion regulations.
Impact Of The Shifting Femtech Privacy Landscape On Health Technology Companies
While the eye of the privacy storm is firmly centered on femtech, it is only a matter of time before consumers extrapolate the privacy risks to general health technology. The privacy infrastructure and practices of femtech and health technology companies are highly comparable, with the key differentiator being the sensitive nature of reproductive health data collected by femtech apps and devices. As consumers demand privacy-centric solutions in femtech, such demands will begin to bleed over into general health technology.
There are three important reasons why this shift is expected to occur. First, while reproductive health information undoubtedly qualifies as sensitive data, it is not the only category of sensitive health data that needs protection. For example, mental health data and substance use disorder data are two categories of sensitive health data that require enhanced privacy protections. A recent Mozilla study, however, found that mental health apps (which routinely deal with sensitive issues like depression, suicide, anxiety, domestic violence, post-traumatic stress disorder, and eating disorders) have vague and poorly written privacy policies and routinely share data downstream to third parties.8 Similarly, a study by ExpressVPN found that opioid addiction treatment and recovery apps regularly shared sensitive information with third parties.9 Thus, femtech serves as the case study through which enhanced privacy protections will be demanded and tested, with the expectation that such privacy protections will need to be implemented in other health technology verticals.
Second, while femtech apps are the primary collectors of women’s reproductive health data, they are not the exclusive collectors. General health technology apps and devices may also collect reproductive health data, such as cycle length, while measuring or tracking non-gendered diseases or general wellness. Health tech companies that collect any form of reproductive health data should be prepared to implement more robust privacy practices, especially in a post-Roe world.
Finally, the FTC has signaled a renewed focus on consumer data protection this summer.10 This means that rulemaking efforts to limit health technology companies’ collection and use of consumer data will likely be a priority. These rulemaking efforts will not be limited to femtech and will likely reach the entire health technology industry. As a result, health technology companies must be prepared to review and overhaul their privacy practices and protections in a changing regulatory landscape.
Privacy Prioritization Strategies
Given consumers’ increased focus on and sensitivity to data privacy, it is important that health technology companies regain consumer trust. If consumers do not trust the devices, apps, and platforms that are expected to revolutionize healthcare, then healthcare innovation will be delayed. As such, it is crucial that health technology companies begin to view privacy through the consumer lens and rethink existing privacy strategies to ensure market competitiveness and promote consumer trust.
To do this, health technology companies should first realize the value that a consumer-centric privacy framework can provide for their business. Privacy is not merely a consumer protection tool, nor is it simply a reactionary concept designed to mitigate data breaches. Rather, developing a strong consumer-centric privacy structure can provide health technology companies with at least two business advantages. First, as consumers continue to become educated on privacy, they will increasingly examine the privacy policies and practices of health technology companies. Companies that develop stronger privacy protections now will have a market advantage. In a time when many consumers are fearful of the shifting abortion landscape, privacy-centric health technology companies can promote their privacy practices and attract positive consumer attention.
Second, a strong privacy framework can help health technology companies and startups secure venture capital funding and close industry partnership deals faster. Venture capitalists and industry partners (especially those in the pharmaceutical industry) require minimum privacy protections, standards, and protocols to be in place before agreeing to fund the venture or enter into a partnership agreement, respectively. Oftentimes, the privacy requirements imposed contractually by industry partners will be much stricter than the general privacy landscape applicable to health technology companies. Companies that do not have privacy-centric structures in place will need to create such frameworks, which can delay deals by six to nine months. Thus, privacy prioritization at early stages can benefit health tech companies.
Although privacy prioritization takes time to implement, there are three steps that health technology companies can take now to begin this journey.
- First, health technology companies should construct a data map that identifies the types of data collected, the entry and exit points for data collection and disclosure, the locations in which data is stored, and all individuals who have access to this data. From the data map, companies should determine whether each type of data collected is necessary for the functioning of the app, device, product, or wearable. In other words, does every data category serve an essential function in effectuating the overall purpose of the device or the app’s algorithmic prediction? If not, determine whether any data categories can be removed. Companies should strive to only collect the minimum necessary data for the app or device to function properly.
- Second, ensure internal access to health data is limited to only those who need the data to perform job functions. Data access should be role-based and should not be available to all employees. Create a chart with each employee’s title and role and determine which roles require data access. If a role does not require data access, terminate access rights as soon as possible.
- Finally, if possible, restrict the sale and disclosure of data downstream. To do this, compile a list of all downstream vendors and partners. For each vendor or partner, identify if data is disclosed or sold. If data is disclosed or sold to a vendor or partner, determine the purpose of such disclosure or sale and whether the disclosure or sale is necessary to fulfill a core business function. If the disclosure or sale of data is not necessary for core business operations, consider eliminating or restricting the sale or disclosure of data to that vendor or partner.
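The three steps above amount to a structured inventory exercise, which a company could capture in something as simple as the following sketch. Every data category, role, and vendor name here is an invented placeholder for illustration, not something drawn from this article; the point is only to show how a data map, a role-access chart, and a vendor list support the necessity checks described above.

```python
from dataclasses import dataclass

# Hypothetical sketch of the three-step privacy audit described above.
# All category, role, and vendor names are invented placeholders.

@dataclass
class DataCategory:
    name: str
    essential: bool              # Step 1: needed for the app's core function?
    accessed_by_roles: set       # Step 2: roles that truly need this data

@dataclass
class Vendor:
    name: str
    receives_data: bool
    core_business_need: bool     # Step 3: is the disclosure necessary?

def flag_for_removal(data_map):
    """Step 1: non-essential data categories are candidates for removal."""
    return [c.name for c in data_map if not c.essential]

def roles_to_revoke(data_map, all_roles):
    """Step 2: revoke access for any role that no data category requires."""
    needed = set().union(*(c.accessed_by_roles for c in data_map))
    return sorted(all_roles - needed)

def disclosures_to_restrict(vendors):
    """Step 3: flag downstream disclosures with no core business need."""
    return [v.name for v in vendors
            if v.receives_data and not v.core_business_need]

# Sample inventory (placeholder values)
data_map = [
    DataCategory("cycle_length", essential=True,
                 accessed_by_roles={"data_scientist"}),
    DataCategory("precise_location", essential=False,
                 accessed_by_roles={"marketing"}),
]
vendors = [
    Vendor("ad_network", receives_data=True, core_business_need=False),
    Vendor("cloud_host", receives_data=True, core_business_need=True),
]
```

In this sketch, the audit would flag "precise_location" for removal, revoke access for any role (such as "support") that no data category requires, and mark the "ad_network" disclosure for restriction. The same logic scales to a real inventory regardless of how it is stored.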
Data privacy is a key consideration for building and maintaining consumer trust. As healthcare innovation continues to revolutionize care standards and delivery, it is essential that health technology companies do not harm their customer base. One way to promote consumer trust in health technology is to prioritize consumer-centric privacy frameworks that minimize downstream data disclosure and emphasize consumer access and ownership over their own data. Change is coming, and while it may start with the femtech industry, it will inevitably impact the entire health technology industry in the upcoming years.
1. Josh Gerstein & Alexander Ward, Supreme Court has Voted to Overturn Abortion Rights, Draft Opinion Shows, Politico (May 2, 2022), https://www.politico.com/news/2022/05/02/supreme-court-abortion-draft-opinion-00029473.
2. See, e.g., Hannah Norman & Victoria Knight, Should You Worry About Data From Your Period-Tracking App Being Used Against You?, KHN (May 13, 2022), https://khn.org/news/article/period-tracking-apps-data-privacy/; Vittoria Elliott, Fertility and Period Apps Can Be Weaponized in a Post-Roe World, Wired (June 7, 2022), https://www.wired.com/story/fertility-data-weaponized/.
3. Cynthia Conti-Cook, Surveilling the Digital Abortion Diary, 50 Univ. Baltimore L. Rev. 1, 36 (2020), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3666305.
4. Norman & Knight, supra note 2.
5. Joseph Cox, Data Broker Is Selling Location Data of People Who Visit Abortion Clinics, VICE (May 3, 2022), https://www.vice.com/en/article/m7vzjb/location-data-abortion-clinics-safegraph-planned-parenthood.
6. Martynas Klimas, The Data Flows: How Private are Popular Period Tracker Apps?, Surfshark (May 25, 2022), https://surfshark.com/blog/period-track-app-data-privacy.
7. James Studnicki et al., Perceiving and Addressing the Pervasive Racial Disparity in Abortion, Health Servs. Research & Managerial Epidemiology (Aug. 18, 2020), https://journals.sagepub.com/doi/full/10.1177/2333392820949743.
8. Top Mental Health and Prayer Apps Fail Spectacularly at Privacy, Security, Mozilla (May 2, 2022), https://foundation.mozilla.org/en/blog/top-mental-health-and-prayer-apps-fail-spectacularly-at-privacy-security/.
9. Apps for Opioid Addiction Treatment and Recovery: Data Sharing and Privacy Risks, ExpressVPN (2021), https://www.expressvpn.com/digital-security-lab/opioid-telehealth-research?=06072021; see also Carly Page, Opioid Addiction Treatment Apps Found Sharing Sensitive Data with Third Parties, TechCrunch (July 7, 2021), https://techcrunch.com/2021/07/07/opioid-addiction-treatment-apps-found-sharing-sensitive-data-with-third-parties/.
10. Andrea Vittorio, Data Privacy Takes Priority for FTC Chief as Dems Break Deadlock, Bloomberg Law (June 9, 2022), https://www.bloomberglaw.com/bloomberglawnews/privacy-and-data-security/XEAMJ72O000000?bna_news_filter=privacy-and-data-security#jcite.
About The Author:
Bethany Corbin, senior counsel at Nixon Gwilt Law, is a healthcare innovation, femtech, and privacy attorney. She empowers femtech and healthcare innovation companies to achieve their goals with legal and strategic guidance. She is CIPP/US certified and has a healthcare LL.M. She can be reached at Bethany.Corbin@nixongwiltlaw.com or on LinkedIn.