Subscribe on: RSS | SPOTIFY | APPLE PODCAST | GOOGLE | BREAKER | ANCHOR
On this episode of The Digital Patient, Dr. Joshua Liu, Co-founder & CEO of SeamlessMD, and colleague, Alan Sardana, chat with David Lovinger, MD, FHM, FACP, Associate Chief Medical Officer and Chief Informatics Officer at Carle Health, about "How a Sculptor Became a CMIO, Why Pop-Ups Are Almost Never the Right Answer, How Design Thinking Fixes What Informatics Degrees Miss, and more..." Click the play button to listen or read the show notes below.
Guest(s):
- David Lovinger, MD, FHM, FACP, Associate Chief Medical Officer and Chief Informatics Officer at Carle Health
- Joshua Liu, MD (@joshuapliu), Co-founder & CEO at SeamlessMD
Episode 224 - Show Notes:
[00:00:07] Episode preview
[00:06:00] How a decade in fine arts revealed the shape of broken clinical workflows — Dr. Lovinger explains how his background in sculpture informs his approach to clinical workflow design. Good design, he argues, is fundamentally about making things work better—like OXO kitchen tools engineered for people with rheumatoid arthritis. That same lens applies to the EMR.
[00:07:11] What EMR design constraints reveal about clinical work — Dr. Lovinger describes watching clinicians perform unnecessary steps in the EMR without noticing, and explains why the best design solution is often to start from scratch—though that's almost never an option with commercial EMRs. Improvement within constraints is the daily reality of health IT.
[00:08:11] Why AI belongs in the workflow optimizer lane, not the diagnostic driver's seat — Dr. Lovinger lays out his core argument: AI makes mistakes humans wouldn't (citing a Google AI tool that recommended eating small rocks based on an Onion article), and medicine is fundamentally a human endeavor rooted in empathy and understanding that AI cannot truly replicate.
[00:09:11] What AI can do that humans cannot—and why that still earns it a short leash — From detecting patterns in digital pathology slides to identifying a patient's race from a chest X-ray through mechanisms no one fully understands, Dr. Lovinger acknowledges AI's unique pattern-recognition capabilities—while arguing it still warrants a "very short leash" in clinical settings for the foreseeable future.
[00:12:11] Why deploying AI responsibly means owning the liability — Drawing on his earlier writing about systems-based accountability, Dr. Lovinger argues that AI cannot take responsibility for clinical decisions—and that organizations must be willing to own that liability themselves before deploying AI responsibly. "EMR companies are famous for not owning any liability."
[00:15:11] How good safety design makes the wrong action nearly impossible—and why alert fatigue is a design failure — Using the example of industrial paper-cutting machines that require both hands and a foot to trigger, Dr. Lovinger argues that the goal is to make errors structurally difficult. The challenge in medicine: too many exceptions to every rule, which leads to a flood of low-signal, ignored alerts.
[00:17:11] The 70% pop-up reduction that reframed how his team approaches clinical decision support — Dr. Lovinger's health system eliminated 70% of its pop-up alerts—not because the warnings were wrong, but because they were badly designed. Clinician attention is a finite resource: "Your attention as a clinician is a very important commodity, and I think that gets lost."
[00:18:11] The TAVR case study: how a pop-up request became a better patient list — A cardiologist wanted an alert every time one of his TAVR patients was admitted within 30 days. Dr. Lovinger walks through why the approach was flawed—wrong recipient, wrong time, wrong trigger—and how redesigning it as a nurse-reviewed patient list led to a better outcome. The cardiologist was initially resistant, then came around after a month.
[00:21:11] What AI-enabled self-service workflows could mean for standardization—and why that's complicated — As AI lowers the technical barrier for clinicians to build their own workflows, Dr. Lovinger cautions against assuming personalization is always better. Referencing Atul Gawande's "The Checklist Manifesto," he argues that standardization protects patients even when individual clinicians believe they have a better way. "If everybody practices differently, what is the standard of care? There really isn't one."
[00:24:11] Why clinical notes are a design problem AI might actually help solve — Dr. Lovinger calls note-writing "one of the worst areas of the EHR"—bloated, inconsistent, and often unintelligible even to colleagues. He sees AI chart summarization and query tools as a genuine opportunity to extract meaning from documentation that no one can currently parse efficiently.
[00:26:11] The fancy knife principle: AI enhances skilled practitioners but doesn't create them — Dr. Lovinger uses a culinary analogy to frame AI's real value proposition: "Just like a fancy pan or an incredible knife does not make you a good cook, but if you are a good cook, it can help you be better." Underlying expertise and judgment remain irreplaceable.
[00:29:11] How Dr. Lovinger evaluates new AI tools—and why pilot testing is the only real answer — Determining whether a tool crosses the sniff test is harder than it sounds; he's been regularly surprised in both directions. Draft InBasket reply AI offers a telling example: it paradoxically supplies a human touch—empathetic language clinicians don't have time to type themselves—despite being imperfect.
[00:33:21] The ambient AI ROI problem: why reducing pajama time is valuable but hard to monetize — Time saved by ambient AI often goes toward reducing after-hours clinical work rather than increasing patient volume. Dr. Lovinger acknowledges that clinician retention is enormously costly, and burnout reduction has real economic value—"but it is often very hard to measure, and that is a challenge right now that I don't think anyone has a good handle on."
[00:35:21] The financial paradox at the heart of clinical AI adoption — A story from Intermountain Health illustrates the tension: giving diabetic surgical patients perioperative antibiotics was the right clinical decision, but it reduced complications—and revenue. Highly evidenced AI tools like retinal scan models go unadopted, while lower-evidence LLMs spread widely—because incentives, not evidence, drive adoption.
[00:41:21] How Dr. Lovinger runs data-driven accountability without burning out clinicians — His sepsis program used detailed nursing and physician process metrics to drive compliance. The key was agreeing on a shared baseline of a few priorities rather than twenty-five, and accepting that 80–90% compliance is often the right target. "If you're doing it 90% and you deviate there, I don't care. But if you're doing it at 50%, then I do."
[00:43:21] The hidden cost of success: when the EMR stops being the problem — At Carle, Dr. Lovinger's team drove clinician satisfaction with the EMR from the 54th to the 90th percentile. The unintended consequence: clinicians are no longer providing feedback, because the EMR is no longer their primary frustration. Success creates its own blind spots.
[00:44:21] Why the EMR is often the vehicle for clinicians' pain—not the source of it — Dr. Lovinger closes with a reframe: many EMR frustrations trace back to decisions made by healthcare organizations and government regulators, not the software itself. "The EMR is the vehicle for your pain, but it is not the driving force. And understanding that is something I wish that people had a better grasp of."
Fast 5 Lightning Round:
- What is your favorite book or book you’ve gifted the most?
The Visual Display of Quantitative Information by Edward Tufte
- If you could instantly master any skill, what would it be?
Basketball.
- Would you rather have super strength, super speed, or the ability to read people’s minds?
Super speed.
- What is something in healthcare you believe others might find insane?
"The degree to which people get along and work well together." (Dr. Lovinger describes an in-flight medical emergency where he and several other physicians and a nurse naturally coordinated to treat a hypoglycemic passenger—no one assigned roles; they just gravitated to what needed to be done.)
- What is the last movie or TV show you saw, and what did you think of it?
The Americans
The Digital Patient has been recognized as Feedspot's #1 Patient Engagement Podcast of 2025. Thank you to our listeners for making this happen!