As someone who has lived through the consequences of misdiagnosis for over 20 years, I watched the Microsoft Ignite keynote this year with mixed emotions. The spotlight was on Copilot’s integration with Epic, which promises to ease the load on overworked doctors by summarizing patient notes and streamlining clinical workflows. On the surface, this sounds revolutionary: automation that improves efficiency so doctors can focus on better care while navigating the overloaded schedules they have struggled with for years. But beneath the excitement lies a question that keeps me up at night:
What happens when artificial intelligence inherits the same biases that have harmed women and other marginalized groups, perpetuating decades of inequities embedded in medical research?
The Numbers
I cannot speak for everyone, so I am going to speak from my experience as a misdiagnosed neurodiverse woman. Let’s start with the numbers:
- 12 million adults in the U.S. are misdiagnosed annually, representing about 5% of all diagnoses.
- Women are 20–30% more likely to be misdiagnosed than men.
- ADHD affects 3–5% of adult women, but many go undiagnosed until their late 20s or 30s.
- Boys are diagnosed at roughly a 3:1 ratio compared to girls, despite similar prevalence, because diagnostic criteria were designed around boys.
- Women often mask symptoms to fit societal expectations, leading to burnout and misdiagnosis with anxiety or depression.
- Hormonal changes amplify ADHD symptoms, yet research on this intersection is minimal.
These aren’t just statistics. They represent years of pain, confusion, and emergency hospitalizations. Years of being told, “It’s just stress,” and hours spent in doctors’ appointments with questions left unanswered.
A 20-Year Journey
My story began at 12, when I started seeing a psychologist regularly. By college, I added a psychiatrist to my care team and followed every piece of medical advice I was given. I tried countless medications and therapies under diagnoses like anxiety, depression, and even bipolar disorder. None of them were accurate. For more than 20 years, I navigated treatments that never addressed the real issue.
What makes this even more alarming is that, despite my transparency and commitment to care, I was prescribed medications that put me at severe risk, ultimately leading to hospitalization at 22. (Spoiler alert: at 34, I finally learned I have ADHD.)
Divergent Mind by Jenara Nerenberg explores how neurodivergent women are often overlooked or misdiagnosed due to gendered expectations and outdated diagnostic frameworks. Reading this book for the first time felt like someone had been narrating my life all along. I finally felt understood and seen. (Thank you to the new psychiatrist who finally spotted the signs and recommended this 🫶.)
This lived experience is why I’m deeply concerned about removing detailed human review of patient notes in favor of AI-driven summarization. While the goal is to reduce clinician overload, eliminating nuanced human interaction introduces a dangerous gap, especially for patients whose conditions don’t fit traditional patterns.
When 12 million adults are misdiagnosed each year and women face a 20–30% higher risk, automation without detailed human oversight doesn’t just streamline workflows; it risks amplifying the very biases we need to fix.
Enter AI: Promise or Peril?
Microsoft’s keynote showcased Copilot summarizing patient notes as a feature that could transform healthcare documentation. But here’s my glaring concern…
AI learns from historical data, and that data is riddled with bias. If women, minorities, and neurodiverse individuals have been misdiagnosed for decades, what happens when those patterns become part of the algorithm?
When you hear that AI is “over 90% accurate,” it sounds impressive, but those numbers don’t come from healthcare. They come from benchmarks that measure how well AI understands language and general knowledge, not how well it diagnoses patients or interprets complex medical histories. Healthcare requires more than speed. It demands context, empathy, and safeguards against systemic bias. Without rigorous bias audits and human oversight, replacing detailed review with AI-driven summaries could perpetuate decades of inequities.
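To make that concern concrete, here is a minimal, hypothetical sketch of how a naive learner absorbs bias from its training labels. None of this is real clinical data; the sexes, labels, and percentages are invented purely to illustrate the mechanism, and the “model” is deliberately the simplest possible one, a majority-label-per-group predictor.

```python
import random

random.seed(0)

# Hypothetical training data: identical symptom profiles that were
# historically labeled "anxiety" for women and "ADHD" for men.
# The 70/30 and 90/10 splits are invented, illustrative numbers.
def historical_label(sex):
    if sex == "F":
        return "anxiety" if random.random() < 0.7 else "ADHD"
    return "ADHD" if random.random() < 0.9 else "anxiety"

records = [(sex, historical_label(sex)) for sex in ["F", "M"] * 5000]

# A "model" that predicts the most common historical label per group --
# the crudest way a learner can internalize its training data.
def fit_majority(records):
    counts = {}
    for sex, label in records:
        counts.setdefault(sex, {}).setdefault(label, 0)
        counts[sex][label] += 1
    return {sex: max(labels, key=labels.get) for sex, labels in counts.items()}

model = fit_majority(records)
print(model)  # the biased historical pattern becomes the prediction
```

A real diagnostic model is vastly more sophisticated than this, but the dynamic is the same: if the labels it learns from encode decades of misdiagnosis, the model reproduces that pattern at scale, and for patients whose symptoms were historically mislabeled, the “prediction” is simply the old bias repeated.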
In medicine, an error isn’t just a bug… it’s a life. I am all for AI in many areas of our lives. I am not interested in AI in my doctor’s office yet.
Sources
On Misdiagnosis & Neurodiversity
- Why are so many neurodivergent women misdiagnosed? – APS Insights
- ADHD in Women: Overwhelmingly Misunderstood – Understood.org
- Miss. Diagnosis: A Systematic Review of ADHD in Adult Women – Journal of Attention Disorders
- ADHD Gender Bias Causes Underdiagnosis – ADDitude
On AI Bias in Healthcare
- Addressing AI Algorithmic Bias in Health Care – JAMA Network
- Responsible Use of AI in Healthcare – Joint Commission & CHAI Guidance
- Bias Recognition and Mitigation Strategies in AI Healthcare – Nature npj Digital Medicine