As AI allows any patient to generate well-reasoned, personalized treatment plans, the medical system will face pressure to evolve beyond rigid standards. This will necessitate reforms around liability, data access, and a patient's "right to try" non-standard treatments that are demonstrably well-researched via AI.
While GPT-5 Pro provides exhaustive, expert-level readouts, the speaker found a presumed Gemini 3 checkpoint superior for his use case. It delivered equally sharp analysis in a much faster, more focused, and easier-to-digest format; the experience felt like a conversation with a brilliant yet efficient expert.
While doctors focused on the immediate, successful treatment, the speaker used AI to research and plan for the low-probability but high-impact event of a cancer relapse. This involved proactively identifying advanced diagnostics (ctDNA) and compiling a list of relevant clinical trials to act on immediately if needed.
By continuously feeding lab results and treatment updates into GPT-5 Pro, the speaker created an AI companion to validate the medical team's decisions. This not only caught minor discrepancies but, more importantly, provided immense peace of mind that the care being administered was indeed state-of-the-art.
AI identified circulating tumor DNA (ctDNA) testing as a highly sensitive method for detecting cancer recurrence earlier than scans or symptoms. Despite skepticism from oncologists who deemed it unproven, the speaker plans to use it for proactive monitoring—a strategy he would not have known about otherwise.
The speaker faced two simultaneous crises: his son's cancer and a basement flood. AI was able to bridge the knowledge gap between oncology (immunosuppression risks) and mold remediation (air quality management), providing a synthesized, actionable plan that no single human expert could realistically offer.
When a lab report screenshot included a note dismissing an abnormal value as "hemolysis," both the human doctors and a vision-enabled AI overlooked the same critical data point. This shows how AI can inherit human biases embedded in how data is presented, underscoring the need to test models with varied formats of the same information.
The speaker found Claude's response alarmist and overly emotional compared to GPT-4's clinical tone. In hindsight, however, that alarming style may have been exactly what was needed to overcome his initial denial and act with the urgency the situation demanded, particularly for a user prone to downplaying concerns.
The speaker regrets not using AI to guide a physical exam of his son. A key diagnostic breakthrough occurred when a doctor found a specific point of pain on his son's abdomen. This suggests a powerful, untapped use case for AI in helping patients or caregivers identify crucial physical symptoms that might otherwise be missed.
For a $200/month subscription, AI provided analysis and peace of mind potentially worth tens of thousands of dollars, representing less than 0.2% of the total estimated medical costs. In a high-stakes crisis, the speaker notes he would have willingly paid $10,000/month, highlighting AI's immense, under-captured value.
