Identifying the Privacy & Ethical Pitfalls in Connected Medical Devices

Bethany Corbin details the importance of understanding these issues, citing femtech examples like reproductive health privacy post-Dobbs v. Jackson Women’s Health Organization and potential alienation of intended users.

Susan Shepard

April 26, 2024


Digital health exploded during and after the COVID-19 pandemic, and with it came some real concerns about data privacy, security, and ethics in connected health devices, said Bethany Corbin, managing partner of Women’s Health Innovation Consulting, and the co-founder of FemInnovation, in a recent interview with MD+DI.

“What happened was a lot of healthcare organizations were forced to incorporate some aspects of digital technology to make them more accessible to patients who weren't able to or didn't want to come into the healthcare office to get device readings or see their providers for device adjustments,” she explained.

During this time, companies were racing to get products to market and many of the regulations around data security were put on pause or relaxed. “It was a little bit of a wild west and a lot of companies quickly adopted technology during that time for their healthcare products and solutions without a lot of thought about privacy and security,” she said.

With the pandemic largely abated, Corbin said companies now need to step back and evaluate their privacy and security protections and ask themselves if these systems are going to make sense for the long term.

Along with privacy concerns, there are ethical issues that the use of artificial intelligence (AI) brings to connected devices, Corbin said. “We know regulation is going to be coming in AI,” she said. “It hasn't hit yet, but there's really a need for companies that are thinking about incorporating AI into their medical devices or that have incorporated it, to right now take a step back and think about transparency, how you're ensuring that implicit bias isn't woven into the medical device with the data sets that you're using to train your device on.”

In her keynote address at IME South, Corbin will talk about these legal and ethical challenges that medical device engineers, founders, and companies face today, and she aims to help them understand the privacy, security, and inclusivity landscape, obligations, and best practices. She’ll use case studies in the Femtech sector to further explore the harms and inequities that may result from the device design process and the increasing use of AI.


What are some of the legal and ethical challenges in the connected health and wellness market that medical device companies are facing?

Corbin: One of the most common legal issues I see is how these companies collect and maintain the data they get from their connected devices in a way that preserves patient privacy, patient security, and data confidentiality, but that also integrates with electronic medical record (EMR) systems and other types of health records.

Another legal challenge that I commonly see for these devices is security; specifically, how these interconnected devices are secured so that they can't be hacked by third parties, and how to ensure that internal employees who don't need access to the data can't get it.

Part of what we see here is a lot of connected devices that are outdated. They're still in use but the hospital systems and healthcare systems haven't properly updated them. As a result, a lot of these types of legacy devices on a healthcare system’s network don't have proper security and hackers are able to get into those devices and use them as a steppingstone into the larger healthcare organization’s database and network.

And then the general landscape gets into ethics. We do live in a post-Dobbs v. Jackson Women’s Health Organization world. Now, whenever we think about anything that's collecting data from individuals who identify as female, that data has an extra layer to it, because there is the possibility, depending on the state and the jurisdiction, that the data could be used to prosecute a reproductive-health crime.

And so, there's an ethical conundrum about how much data should these companies be collecting in their connected devices, whether that data is adequately protected, and whether they're going to turn that data over to law enforcement if it's requested.

I think how we treat data related to reproductive health has started to get a lot of attention over the last couple of months. That really comes down to thinking about the security of those devices, how that data can be used, and whether we should be collecting this data under the existing privacy frameworks or whether we need a larger, more comprehensive approach to data privacy, one that also encompasses devices and healthcare services that may fall outside the bounds of HIPAA.

Speaking of reproductive health data, why did you choose femtech case studies to explore the harms and inequities that may result from the device design process and the use of AI in your presentation?

Corbin: Femtech is interesting in that it's a relatively new subset of the medical technology field. It focuses on those issues that are relevant to women's healthcare or that impact women specifically. The term was coined only in 2016, so it’s pretty new and it's already become an industry valued at over a billion dollars.

It arose from the need to make sure that women's health concerns were being considered in the device design, creation, and solution phases, and, more generally, to try to bridge the gender health gap that we have in healthcare, because women were so often excluded from medicine.

I'm really interested in Femtech as somebody who uses those devices, who works a lot in this space, and who had a women's health condition very unexpectedly and saw those discrepancies firsthand. I'm also one of Femtech's most vocal critics because I think the industry has so much potential, but there are serious gaps in terms of privacy, bias, and clinical accuracy that must be addressed. And I think that we have the potential with Femtech to also do harm in a way that disempowers women if the right protections aren't in place.

My goal is to make sure that that doesn't happen.

What are some of those potential harms?

Corbin: Some of the unique things in the Femtech sector are around device design and the use of AI. The incorporation of design assumptions has actually caused microaggressions or the othering of certain subsets of users.

For instance, one of the most popular period-tracking apps was actually created by four men, and it made assumptions about what women were using it for. The assumption was that they wanted to use it to get pregnant. And so, it had pink fluffy clouds and other design choices that a lot of individuals felt were very feminine in nature and a bit demeaning to the user base. It would ask, for instance, about sexual encounters and sexual positions, and it would only give users design choices featuring bananas or a banana with a condom, which effectively excluded the LGBTQ community.

It was those kinds of design assumptions, especially at the beginning of Femtech, that actually excluded a lot of the intended user base. We saw this even with Fitbit, when it initially only allowed data inputs for a woman's period up to 10 days. If you had a period longer than 10 days, you could not input any of that data into your Fitbit. It created a lot of false ideas about what was normal for a woman's body and that also led to some of the harms and exclusion that women face.

In terms of AI, we see a lot of bias being built into these products, both from a gender perspective, but also from an ethnicity and racial perspective, because a lot of the data that we have is not diverse. If companies are not being very purposeful in the data that they're using and collecting, then that leads to products that are going to be clinically inaccurate, biased, and scientifically unusable.

There have been studies showing that about 85% of the Femtech apps on the market today do not meet certain clinical and quality thresholds. For instance, there's data showing that Caucasian women and Indian women have different menopause symptoms and experience menopause at different ages. If we have an app that's only trained on data from Caucasian women, it's going to give inaccurate results to individuals of other ethnicities and races. There are also some Femtech applications on the market that use outdated and disproven scientific methodologies or don't cite any clinical literature.

I don't think a lot of people really think about how that translates downstream into the different types of problems that the user base can experience or the inaccuracies that can happen. There's this race to market when it comes to Femtech and other types of medtech products where we're not necessarily taking that step back to make sure that we've got data that is as unbiased as possible, or that we have inclusivity and diversity in the data.

What do you hope attendees take away from your presentation?

Corbin: I hope they take away that this is a challenging landscape to navigate and that they need to do so with as much transparency and proactivity as possible so that their consumer base feels protected. Also, to ensure that they're designing with privacy- and security-centric principles in mind for a target population, considering diversity and implicit bias.

I hope they also come away with an understanding of the cybersecurity risks, especially with medical devices, and how to mitigate those risks.

Last, I hope they learn the importance of continuing to update and monitor their privacy and cybersecurity frameworks so that they’re able to look at the devices that they have commissioned or that are within their companies and understand which ones they need to roll out, and which ones are going to be their greatest security risks, so that they are protecting themselves and their patients and consumers against those bad actors.

Who would you like to see attend your session?

Corbin: Anybody who builds, uses, commissions, or sells these types of medical devices, whether they're FDA-approved medical devices, Femtech, or consumer wearables and consumer devices, and anybody hoping to integrate them into their company, healthcare system, or healthcare practice. They would be a really good target audience for this session because we're going to really dive into what they need to think about as they build or commission these products, the harm that can happen, and how to build them from the ground up with ethical principles and in compliance with the law.

Corbin will present the keynote address: “The Surface Pressure of Medical Device Design: Navigating Legal and Ethical Pitfalls in Privacy, Security, and Inclusivity,” on Tuesday, June 4, from 1 to 2 p.m., in the Medtech Theater at IME South.

About the Author(s)

Susan Shepard

Susan Shepard is a freelance contributor to Design News and MD+DI.
