We scan new podcasts and send you the top 5 insights daily.
Nektar's initially poor trial results were skewed by two unlucky events: one patient missed their week 12 appointment, and a rare outlier response occurred in the placebo arm. Together these suppressed the reported drug efficacy, creating a major misperception of the drug's potential.
When a company reports an 'efficacy estimate,' it often excludes patients who dropped out of a trial, inflating perceived success. Investors should demand the 'treatment regimen estimate,' which includes all participants and aligns with what the FDA actually considers for drug approval.
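A hypothetical illustration of the gap (the numbers are invented for the example, not drawn from any actual trial): excluding dropouts can nearly double the headline response rate versus counting every randomized patient.

```python
# Hypothetical numbers for illustration only; not from any actual trial.
enrolled = 100        # patients randomized to the drug arm
dropouts = 40         # left before the final assessment
responders = 30       # completers who met the response criterion

completers = enrolled - dropouts

# Completers-only rate: the flattering number a press release may quote
completer_rate = responders / completers      # 30 / 60 = 50%

# All-participants rate, conservatively counting dropouts as non-responders
# (closer to what regulators weigh for approval)
all_patient_rate = responders / enrolled      # 30 / 100 = 30%

print(f"Completers only: {completer_rate:.0%}")
print(f"All randomized:  {all_patient_rate:.0%}")
```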
To frame its trial results positively, Compass Pathways used less stringent definitions for key endpoints. It defined 'clinically meaningful reduction' and 'remission' at levels below the common standard, a tactic that calls into question the true magnitude of the drug's benefit.
The market soured on Nektar's alopecia data because of low overall response rates. That reading misses that the drug is slow-acting and that nearly half of patients dropped out before it could take effect. The real efficacy is likely much higher among patients who complete the full treatment course.
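A back-of-the-envelope sketch of the dilution effect (invented numbers, not Nektar's actual data): if dropouts leave before a slow-acting drug can work, the observed responders are concentrated among completers, so the completer response rate is far above the headline rate.

```python
# Invented numbers for illustration; not Nektar's actual data.
overall_response = 0.20    # response rate across everyone enrolled
dropout_fraction = 0.45    # share who left before the drug could take effect

# Assuming dropouts left too early to respond, back out the rate
# among patients who completed the full treatment course:
completer_response = overall_response / (1 - dropout_fraction)
print(f"Implied response among completers: {completer_response:.0%}")
```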
Progress in drug development often hides inside failures. A therapy that fails in one clinical trial can provide critical scientific learnings. One company leveraged insights from a failed study to redesign a subsequent trial, which was successful and led to the drug's approval.
Despite reporting positive Phase 2 asthma data that met the company's stated goals for 12-week dosing, Upstream Bio's stock dropped significantly. The CEO attributes this to the 24-week dosing data being less robust on the primary endpoint, highlighting the gap between achieving clinical goals and meeting nuanced market expectations for a best-case scenario.
Gossamer's Phase 3 drug for PAH failed after being designed around a promising subgroup identified in a post-hoc analysis of a less-than-stellar Phase 2 trial. This outcome serves as a cautionary tale for clinical development, highlighting the high risk of basing expensive pivotal studies on retrospective data mining rather than robust, pre-specified endpoints.
After reacquiring a "failed" ALS drug, Neuvivo's team re-analyzed the trial's 200,000 pages of data. They discovered a programming error in the original analysis. Correcting this single mistake was a key step in reversing the trial's outcome from failure to success.
The placebo effect in gastrointestinal treatments is remarkably high, around 35-40%. This makes subjective patient feedback unreliable for assessing a therapy's true effectiveness and underscores the urgent need for objective, data-driven measurement tools.
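One practical consequence: a high placebo response forces larger trials to detect a genuine drug effect. A rough sketch using the standard normal-approximation sample-size formula for comparing two proportions, with illustrative rates assumed (a ~38% placebo response, mid-range of the 35-40% figure, versus a hypothetical 50% drug response):

```python
import math

# Assumed rates for illustration: ~38% placebo response vs. a
# hypothetical 50% drug response. Not from the episode.
p_placebo, p_drug = 0.38, 0.50

z_alpha = 1.96   # two-sided alpha = 0.05
z_beta = 0.84    # 80% power

# Normal-approximation sample size for a two-proportion comparison
p_bar = (p_placebo + p_drug) / 2
n_per_arm = (
    (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
     + z_beta * math.sqrt(p_placebo * (1 - p_placebo) + p_drug * (1 - p_drug))) ** 2
    / (p_placebo - p_drug) ** 2
)
print(f"Patients needed per arm: {math.ceil(n_per_arm)}")
```

With a 12-percentage-point true benefit on top of a 38% placebo response, the calculation lands in the high hundreds of patients per arm, which is why objective measurement tools matter: they shrink the noise the trial must overcome.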
The study presented three different datasets over a short period. While efficacy endpoints like PFS and OS changed, the toxicity data remained identical. This is highly unusual, as resolving censored patient data for efficacy should also lead to updated toxicity information, suggesting a rushed or incomplete analysis process.
In the ASCENT-07 trial, blinded central review showed no benefit for sacituzumab, while treating investigators saw a clear benefit. This discrepancy arose because clinicians acted on new lesions or effusions that central reviewers deemed "unclear," showing how rigid trial criteria can miss nuanced clinical signals.