–Normalize early. Ask upfront: “What did AI tell you?” This surfaces misconceptions efficiently before they drive anxiety or derail the visit.
–Reframe AI’s role. Position AI as a conversation starter, not a second opinion. You bridge the gap between generic outputs and their individualized care plan.
–Partner, don’t scold. Applaud engagement while gently correcting misinformation. Sort signal from noise together rather than competing with the technology.
When I see patients in the clinic, the familiar "I Googled this potential treatment option…" opener has been replaced by "I prompted ChatGPT to tell me about the symptoms to expect on this therapy, and here are the dozen questions we need to get through." For oncologists, this changes the rhythm of the first encounter. Instead of starting from scratch, many visits begin with "I read online that…" followed by an AI-derived conclusion. A patient might cling to an overly optimistic survival estimate (which oncologists are also prone to, by the way) or become terrified by grim statistics pulled from populations that have little to do with their own tumor biology.
Getting through cancer treatment (and supporting a loved one going through it) is no small feat, and we should applaud our patients for taking time to think critically about their disease and what the next-line treatment options upon progression might look like. Many patients now walk into oncology visits armed with AI-generated explanations of staging, prognosis, side effects, and "best" treatments; others arrive with concerns about a specialized therapy that are misleading or simply wrong in ways that are hard for patients or caregivers to recognize.
The hardest part of discussing AI tools is nuance. AI can list treatment options more fluently than the tools that came before it, but it cannot fully explain why one approach fits one specific patient while another does not. It does not know a patient's comorbidities, performance status, access barriers, or values and priorities unless those are supplied up front. A recent study of different AI tools found that they still struggle to answer more complex medical cases that require subspecialty-level knowledge, and that their concordance with human experts is relatively low.
Clinicians are left to bridge the gap between generic AI outputs and personalized care plans that account for this complexity. One of the most effective strategies I use in the clinic is to normalize the use of AI. Saying, "A lot of my patients use AI tools – what did it tell you?" invites honesty instead of defensiveness, and heads off the risk of a long tangent when three other patients are already roomed and waiting to see you. When oncologists proactively ask patients to share what they saw, the visit becomes more efficient because misconceptions are surfaced early instead of quietly driving fear or false hope. The goal is not to scold patients for using AI but to partner with them in sorting signal from noise.
Over time, oncology teams may adopt their own vetted AI tools for patient education, symptom triage, or visit preparation. One area where clinicians and administrators are already aligned is the use of AI scribes that transcribe visits, so that we can spend more time talking to our patients instead of frantically typing while facing the computer screen. When clinicians control which tools are used, and how, they can harness the benefits of clear language and rapid summarization while maintaining oversight of the clinical message. AI works best when it is framed as a conversation starter, not a second opinion. Ultimately, AI has added a new layer to cancer communication. Patients may arrive more engaged and more curious, but also with expectations misaligned with their actual clinical picture.
Oncologists who embrace that curiosity while firmly guiding the conversation back to individualized, evidence-based care will be best positioned to navigate this new reality. Perhaps we can find the specific AI technologies that not only increase patient satisfaction but also allow oncologists to spend more time with family and make it home for dinner every night. The aim is not to compete with AI, but to contextualize it.