Novartis's CEO highlights a surprising inefficiency: clinical trial nurses often record patient data on paper, which is then manually entered into multiple digital systems. This archaic process creates immense friction, cost, and risk of error, representing a huge, unsolved "boring problem" in biotech.

Related Insights

A COVID-19 trial struggled to recruit patients because its sign-up form had 400 questions, and the only person who could edit the underlying PHP file was a grad student. This illustrates how tiny, absurd operational inefficiencies, trapped in silos, can accumulate and severely hinder massive, capital-intensive research projects.

The pharmaceutical industry's historically high profitability created a lack of urgency for technological innovation beyond basic ERP systems. It wasn't until patent cliffs and messy M&A integrations squeezed margins that companies began seriously investing in modern data platforms and cloud infrastructure to improve efficiency.

While the FDA is often blamed for high trial costs, a major culprit is the consolidated Contract Research Organization (CRO) market. These entrenched players lack incentives to adopt modern, cost-saving technologies, creating a structural bottleneck that prevents regulatory modernization from translating into cheaper and faster trials.

A significant portion of biotech's high costs stems from its "artisanal" nature, where each company develops bespoke digital workflows and data structures. This inefficiency arises because startups are often structured for acquisition after a single clinical success, not for long-term, scalable operations.

A primary barrier to modernizing healthcare is that its core technology, the Electronic Health Record (EHR), is often built on archaic foundations from the 1960s-80s; several major EHR platforms still run on MUMPS, a database language dating to 1966. This makes building modern user experiences on top of them incredibly difficult.

While AI can accelerate the ideation phase of drug discovery, the primary bottleneck remains the slow, expensive, and human-dependent clinical trial process. We are already "drowning in good ideas," so generating more with AI doesn't solve the fundamental constraint of testing them.

Despite a threefold increase in data collection over the last decade, the methods for cleaning and reconciling that data remain antiquated. Teams apply old, manual techniques to massive new datasets, creating major inefficiencies. The solution lies in applying automation and modern technology to data quality control, rather than throwing more people at the problem.
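To make "automation applied to data quality control" concrete, here is a minimal sketch (not from the source; the system names, column names, and the tolerance threshold are all illustrative assumptions). It cross-checks the same lab value from two hypothetical exports and flags discrepancies for follow-up, replacing a manual line-by-line reconciliation pass:

```python
import pandas as pd

# Hypothetical exports from two systems that should agree:
# the electronic data capture (EDC) system and the site's lab feed.
edc = pd.DataFrame({
    "subject_id": ["S001", "S002", "S003"],
    "visit": ["week4", "week4", "week4"],
    "hemoglobin": [13.2, 11.8, 15.1],
})
lab = pd.DataFrame({
    "subject_id": ["S001", "S002", "S003"],
    "visit": ["week4", "week4", "week4"],
    "hemoglobin": [13.2, 12.6, 15.1],
})

# Join on the shared keys and flag any value that disagrees beyond
# an (assumed) tolerance, instead of having a human eyeball both lists.
merged = edc.merge(lab, on=["subject_id", "visit"], suffixes=("_edc", "_lab"))
merged["mismatch"] = (merged["hemoglobin_edc"] - merged["hemoglobin_lab"]).abs() > 0.1

# Each flagged row becomes a data query routed back to the site.
queries = merged[merged["mismatch"]]
print(queries[["subject_id", "visit", "hemoglobin_edc", "hemoglobin_lab"]])
```

The same pattern scales to thousands of subjects and dozens of fields: the check runs on every new data load, and human reviewers see only the exceptions rather than the entire dataset.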

The process of testing drugs in humans—clinical development—is a massive, under-studied bottleneck, accounting for 70% of drug development costs. Despite its importance, there is surprisingly little public knowledge, academic research, or even basic documentation on how to improve this crucial stage.

Despite major scientific advances, the key metrics of drug R&D—a ~13-year timeline, 90-95% clinical failure rate, and billion-dollar costs—have remained unchanged for two decades. This profound lack of productivity improvement creates the urgent need for a systematic, AI-driven overhaul.

Novartis's CEO views AI not as a single breakthrough technology but as an enabler that creates small efficiencies across the entire R&D value chain. The real impact comes from compounding these small gains to shorten drug development timelines by years and improve overall success rates.