Refuse AI notes

Many medical offices are integrating AI-mediated note taking. In a different world where LLMs prioritized accuracy and practices weren’t squeezed for every dime of profit, this wouldn’t necessarily be a bad thing. That’s not our world.

Offices experimenting with AI will ask your permission. This protects them from liability, but you can refuse to be a guinea pig.

[Image: a headline from an AI journal documenting common issues with AI note-taking in medical settings.]

Why we do it:

Our medical system is collapsing as we strip public funding, corporations squeeze practices for profit, and insurance companies take their pound from the middle. Protecting our own health is critical to our ability to resist, and to community health.

Medical note-taking mistakes aren’t new. I know someone diagnosed with diabetes during pregnancy based on weight alone, despite healthy tests. She spent her third trimester fighting to have her high-risk designation corrected before labor. Another friend has a specious cardiac diagnosis that pops up on her chart periodically. No one knows where it came from. It gets removed, then shows up again. Medical notes are incredibly persistent.

Unfortunately, AI note-taking is making mistakes more prevalent. Errors are so common that articles now exist to train medical staff to identify the most frequent ones. Imaginary additions will be particularly catastrophic if the GOP ever manages to reverse the ACA's ban on pre-existing condition exclusions. Missing or misfiled data can be dangerous, too.

AI has valid uses in medicine. Pattern recognition has been improving radiology since the '80s. But the models used to record notes indulge in pattern-matching inappropriate to the task, and with medical staff stripped bare, those notes won't be carefully checked at every appointment. To get accurate notes, ask your doctor to make them herself, during the appointment, as they used to.

Until we have protocols that prioritize accuracy, guarantee time for doctor review adjacent to the visit (not in "spare time" or at the end of the day), and provide a legally protected pathway for correcting mistakes, allowing your medical records to be compiled by tools built to socialize, not to perform accurately, is risky.

Protect yourself from this grand corporate experiment by asking your medical professionals if they use AI note-taking aids. If they do, ask them not to. Tell them you understand they are overworked, but if they don’t have time to take notes during the visit, you are skeptical they will be given sufficient time to review the notes later.

Or simply say no. No is a complete sentence.
