In recent years, the rapidly growing “femtech” industry has transformed how many women monitor and manage their health. The field encompasses products ranging from period trackers to AI-assisted cancer diagnostics. While these innovations offer benefits, they also raise questions about privacy, bias, regulation, and the ethics of new technologies in healthcare.
Cancer Center at Illinois (CCIL) member Sara Gerke is working to navigate these issues. A health law scholar and bioethicist, Gerke focuses on AI and digital health safety. Her recent paper, “Effective regulation of technology in women’s health and healthcare,” published in The BMJ with co-authors Sara Raza, Eric Bressman, and Carmel Shachar, addresses ethical and legal issues surrounding femtech, highlighting the lack of data privacy regulation for health apps trusted with personal information.
Gerke’s role bridges law, ethics, and medicine, emphasizing how early collaboration among experts can lead to better outcomes for patients.
“We are seeing a lot being developed right now, especially in AI and digital health technologies, and I’m very much a pro-technology person,” Gerke said. “But at the same time, I think it’s extremely important to have legal and ethical safeguards around these tools. We need to conduct ethics by design from the beginning and carefully consider some of the ethical questions as we develop and deploy these systems.”
Femtech products collect sensitive information, yet femtech companies and the data they gather are typically not covered by the U.S. Health Insurance Portability and Accountability Act (HIPAA). Regulatory frameworks elsewhere, such as Europe’s General Data Protection Regulation, provide more consistent protections than those currently available in the U.S.
“Many direct-to-consumer women’s health apps fall outside of HIPAA protection, which means the very sensitive data they collect can be vulnerable,” Gerke said. “Depending on where you live and what the terms of the user agreements say, your information might be shared or sold without you being fully aware. That’s especially concerning because this is data about menstruation, sexual activity, and fertility—information people usually want to keep private.”
Bias in AI systems remains a persistent problem, often stemming from training models on incomplete or unrepresentative datasets. If the training data do not reflect the intended patient population, the resulting tools risk worsening existing health disparities.
“AI tools are prone to bias, particularly when the data they are trained on isn’t representative,” Gerke said. “Consider an AI tool for skin cancer detection that was trained mostly on images of lighter skin. The same tools might likely not work effectively for people with darker skin tones. This isn’t just a technical problem; it can have serious health consequences if algorithms miss diagnoses or generate inaccurate results for certain groups.”
Gerke leads projects funded by the European Union (Horizon Europe) on the ethics and legal aspects of AI-assisted surgery and colonoscopy. She is working to address the evolving legal questions, including liability and the dynamic between AI tools and clinicians.
“We need to think carefully about what happens if AI leads to over-diagnosis in patients or de-skilling of physicians,” Gerke said. “These are new challenges, and we have limited case law so far about liability. When a clinician uses AI and something goes wrong, it’s not yet clear who is responsible. These legal questions are evolving alongside technology.”
At CCIL, Gerke also hosts conferences that bring together professionals across health and cancer-related disciplines to create safer clinical AI tools. She plans to help organize the next conference at the University of Illinois Urbana-Champaign in spring 2027.
“Often, stakeholders work separately, but it’s critical to bring everyone together from the beginning—developers, clinicians, ethicists, lawyers—to talk about the issues,” Gerke said. “That kind of collaboration helps create safer, more effective tools and builds trust between all parties.”
Sara Gerke
Richard W. and Marie L. Corman Scholar, College of Law
Associate Professor, Law
CCIL Research Program and Theme
- Program: Cancer Technology and Data Science
- Theme: Computational Engineering and Data Science
Research Focus
Sara Gerke’s research focuses on the ethical and legal challenges posed by artificial intelligence and big data in health care and health law in the United States and Europe. She also researches comparative law and the ethics of other issues at the cutting edge of medical developments, such as the clinical translation of stem cell research, biological products (such as somatic cells, tissues, and gene therapy), reproductive medicine (such as mitochondrial replacement techniques), and digital health.
Professor Gerke has led multiple funded research projects, including those funded by the NIH and the European Union (Horizon Europe). She has over 80 publications in health law and bioethics, with a focus on AI and digital health. Her work has appeared in leading law, medical, scientific, and bioethics journals, including The New England Journal of Medicine, JAMA, Science, Nature Medicine, and BMJ. Her scholarship has been featured in major media outlets such as The Wall Street Journal, The Economist, Forbes, Scientific American, and Bloomberg Law.
Editor’s notes:
Sara Gerke is the Richard W. and Marie L. Corman Scholar and an Associate Professor in the College of Law. She is also an Associate Professor in the European Union Center.
She can be reached at gerke@illinois.edu.
The paper, “Effective regulation of technology in women’s health and healthcare,” is published in The BMJ.
DOI: https://doi.org/10.1136/bmj-2025-086300
This story was written by Hailee Munno, CCIL Communications Intern.