Improving Trial Designs for Heterogeneous Populations
Accurately projecting outcomes for diverse patient populations – from the wealth of genomic, phenotypic and outcomes data available through genome sequencing and electronic health records – holds the potential to transform the effectiveness and efficiency of drug and medical device development.
Yet the computations required for statisticians to explore, understand and interpret these enormous multivariate and often poorly structured data take a prohibitively long time to complete. In many cases, it would take months and often years to transform the raw data into informative, scientifically sound, and statistically significant results using even the most advanced conventional computers.
Quantum computing may change that. In theory, quantum computers are capable of computational power many orders of magnitude beyond today’s conventional computers. Quantum computers promise to be faster and more reliable in correctly identifying patterns and dynamic trends in massive noisy data sets similar to those that would be useful for mapping dose response and other complex therapeutic relationships.
Quantum computing could prove useful in designing clinical trials. For example, statistical methods of trial design that fuse combinatorial and optimal experimental design techniques have been developed to potentially reduce by half the number of sites and patients needed to select the best combination treatment for targeting multiple cancer types and various biomarkers.
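By way of illustration, the sketch below shows the optimal experimental design ingredient of such methods in miniature: an exhaustive search for a small D-optimal design over candidate dose-by-biomarker combinations. The candidate grid, the two-factor model with interaction, and the brute-force search are assumptions chosen for brevity, not the published methodology.

```python
import itertools
import numpy as np

def model_matrix(points):
    # Assumed two-factor model with interaction: columns [1, x1, x2, x1*x2]
    x1, x2 = points[:, 0], points[:, 1]
    return np.column_stack([np.ones(len(points)), x1, x2, x1 * x2])

def d_optimal_subset(candidates, n_runs):
    # Exhaustive search for the n_runs-point design maximising det(X'X),
    # i.e. the D-optimality criterion (feasible only for tiny candidate sets)
    best_idx, best_det = None, -np.inf
    for idx in itertools.combinations(range(len(candidates)), n_runs):
        X = model_matrix(candidates[list(idx)])
        det = np.linalg.det(X.T @ X)
        if det > best_det:
            best_idx, best_det = list(idx), det
    return best_idx, best_det

# Hypothetical candidate design points: coded dose levels crossed with a
# binary biomarker status
cands = np.array(list(itertools.product([-1.0, 0.0, 1.0], [-1.0, 1.0])))
design, det = d_optimal_subset(cands, n_runs=4)
print("Chosen design points:\n", cands[design])
print("det(X'X):", round(det, 2))
```

Real trial-design engines replace the exhaustive loop with exchange-type algorithms and far richer models; that combinatorial burden is exactly where additional computing power becomes attractive.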
That emerging approach was among the advances presented at a quantum computing workshop this spring, co-sponsored by the ICON Innovation Centre with the George Washington University (GWU) Department of Statistics and Lockheed Martin Corporation. Speakers representing GWU, Lockheed, ICON, the National Institute of Standards and Technology, the University of Wisconsin-Madison, and Nokia Bell Labs addressed more than 50 participants from academia, government, and industry.
The workshop featured quantum computing overviews, talks on quantum algorithms and their links with statistics, as well as case studies and a round table discussion. It exemplifies the cross-industry collaborative approach that is moving this potentially revolutionary computing technology from theory to practical reality.
We asked Sergei Leonov, VP of Clinical Trial Methodology at ICON, to reflect on discussions at the conference about how quantum computing is evolving for clinical applications.
When will quantum computing be available for use in clinical trials?
Quantum computing is only beginning to move from the realm of demonstrations to practical application. Medical product development may be one of the earliest likely applications.
However, harnessing the potential computational power of quantum computing will require significant additional advances in both mathematics and technology. The machine learning algorithms and statistical methods needed are in development, but far from mature.
Significant hardware and software challenges also exist. One is providing a super-cool (no pun intended) environment. Because even minor interactions with the external world can change the state of a quantum system, a stable quantum system must be isolated from its surroundings and kept at temperatures close to absolute zero (0 K = -273.15°C = -459.67°F). For instance, the latest generation D-Wave system operates at 15 millikelvins [1]. Given the physical limits involved, developing software and interfaces capable of reliably working with such fragile quantum states is difficult.
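As a quick check of the temperature figures above, a few lines of Python convert kelvin to Celsius and Fahrenheit (a trivial illustrative sketch, not part of any quantum software stack):

```python
def kelvin_to_celsius(k):
    return k - 273.15

def kelvin_to_fahrenheit(k):
    return k * 9.0 / 5.0 - 459.67

# Absolute zero and the quoted D-Wave operating point (15 millikelvins)
for k in (0.0, 0.015):
    print(f"{k} K = {kelvin_to_celsius(k):.3f} C = {kelvin_to_fahrenheit(k):.3f} F")
```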
While it will take time and effort to apply quantum computing to clinical development, it may be the only practical way to process the masses of data now available. Even partial success would represent a “quantum leap” in computing power and, with it, development efficiency, so the effort is well worth it. We expect to make major progress in the next couple of years.
What is quantum computing?
Quantum computing is an alternate approach to solving complex mathematical problems, often involving vast amounts of data.
The theoretical power advantage quantum computing holds over conventional computing primarily relates to two quantum mechanical properties – superposition and entanglement. Together these greatly increase both the quantity of data a computer can process, and the ways in which these data can be combined [2].
For conventional computing, the basic unit, the bit, exists in one state at a time, and this state is deterministic, either 0 or 1. Superposition means that the basic unit of quantum computing, the qubit, can exist in a combination of both states at once, each weighted by a probability, with the probabilities summing to 1. Therefore, whereas classical computing is limited to manipulating binary bits in a linear, deterministic stream, quantum computing can manipulate vast data sets simultaneously in a probabilistic ocean. The magnitude of this difference is hinted at by the number of states a quantum computer can represent. Today, the most powerful commercially available quantum computer operates on quantum systems of up to 2,000 qubits, which could exist in a superposition of as many as 2^2,000, or about 10^600, quantum states. For context, it is estimated that there are about 10^80 atoms in the known, observable universe.
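The state-count arithmetic is easy to verify; the short sketch below assumes nothing beyond the 2,000-qubit figure quoted above:

```python
from math import log10

n_qubits = 2000
# An n-qubit register spans 2**n basis states; express that as a power of ten
decimal_exponent = n_qubits * log10(2)
print(f"2^{n_qubits} is roughly 10^{decimal_exponent:.0f}")  # about 10^602
```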
The second advantage of quantum computing is entanglement, which means individual qubits can interact directly with each other, even at great distances, altering each other’s states simultaneously without intermediate causal connections. By comparison, conventional bits interact only in a linear sequence, changing each other’s state one at a time in an extended chain of binary operations. So a quantum computer can potentially use computational “shortcuts” not available to conventional computers.
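To make entanglement slightly more concrete, the toy simulation below (plain numpy, unrelated to any particular quantum hardware) prepares a two-qubit Bell state and samples measurement outcomes; only the correlated results 00 and 11 ever appear:

```python
import numpy as np

rng = np.random.default_rng(0)

# Bell state (|00> + |11>) / sqrt(2) over the two-qubit basis |00>, |01>, |10>, |11>
state = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
probs = np.abs(state) ** 2  # Born rule: probability of each measurement outcome

outcomes = rng.choice(["00", "01", "10", "11"], size=10, p=probs)
print(outcomes)  # e.g. ['11' '00' '11' ...] -- the two qubits always agree
```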
References
[1] Latest Generation D-Wave System.
[2] Nielsen, M.A., Chuang, I.L. (2010). Quantum Computation and Quantum Information: 10th Anniversary Edition. Cambridge University Press.