Scientific advances in genomics, proteomics, mathematics, and computer software engineering have spawned emerging technologies with the potential to make cancer diagnosis and treatment more precise. But because major challenges have arisen in translating genomics- and proteomics-based technologies into useful, reliable precision applications, the National Cancer Policy Forum (NCPF) of the National Academies of Sciences, Engineering, and Medicine held a workshop to explore the realities of using complex computational omics-based tools in clinical oncology.
This was the second NCPF workshop highlighting strategies to improve cancer diagnosis and care. As previously reported in Oncology Times, the first meeting, held in February 2018, focused on increasing patient access to high-quality imaging and pathology. The second meeting ventured into the areas of complex computational biology modeling, machine learning, and artificial intelligence (AI) to achieve a precise diagnosis and guide therapy based on a cancer patient's individual genomic makeup.
Computational Biology Methods
Computational biology is a growing field that develops algorithms, statistical analysis methods, and ultimately biological models to diagnose disease and predict treatment response. Technically sophisticated pattern-recognition algorithms are also used to store and transmit images. A written summary report from this second NCPF meeting on precision oncology will be forthcoming from the National Academies Press.
The NCPF noted that the cancer community is in need of clear direction "on how and when to apply complex computational biology methods in precision cancer care to ensure both patient safety and responsible use of scientific findings."
Keynote speaker Atul Butte, MD, PhD, who participated remotely, agreed. "I don't think we're training oncologists in how to read them," he said of the results of complex genomic tools. Butte is the Priscilla Chan and Mark Zuckerberg Distinguished Professor and inaugural Director of the Bakar Computational Health Sciences Institute at the University of California, San Francisco, as well as the Chief Data Scientist for the entire University of California Health System.
Imaging Advances
Currently the FDA is working on a guidance document for quantitative imaging tools to ensure that they meet certain technical specifications for accuracy and precision, said Nicholas Petrick, PhD, Deputy Director for the Division of Imaging, Diagnostics, and Software Reliability within FDA's Center for Devices and Radiological Health.
In February 2018, an oncology imaging software product, the Oncology AI suite, became the first of its kind to receive FDA clearance. This cloud-based AI medical imaging tool uses deep learning to improve imaging accuracy and consistency, supporting a more precise diagnosis.
Today, "we're still in the discovery stage," noted Constantine Gatsonis, PhD, the Henry Ledyard Goddard University Professor and founding Chair of the Department of Biostatistics and the Center for Statistical Sciences at Brown University School of Public Health, Providence, R.I., pointing out that software and technology are constantly evolving, and thus constitute a moving target.
Precision Diagnosis
To illustrate the complexities of precision diagnosis and care in day-to-day clinical oncology, NCPF member Christopher R. Cogle, MD, Chair of this second NCPF workshop, presented the case of one of his patients with dysplastic hematopoiesis who underwent whole exome sequencing. The woman, 76, had hundreds of single nucleotide polymorphisms and gene copy number variations, said Cogle, who is Professor of Medicine and Pierre Chagnon Professor of Stem Cell Biology and Bone Marrow Transplant at the University of Florida. "How can we as oncologists handle this type of patient clinically?" asked Cogle, citing constant time pressures in clinical practice, along with the need to interpret complicated data quickly and then make treatment recommendations.
Stressing that the days of single-gene, single-drug matching are over, Cogle said the goal today in precision oncology is to map disease biology, predict patient response, and choose the correct molecularly targeted drug, a process in which computational biology modeling can help. Cogle said his hope is that this emerging technology will "progress in an orderly and efficient manner so that groups like ASCO can embrace it."
"I do believe that the future of medicine lies in integrating human elements and technology...we need more than genes, we need an integrated approach," said NCPF member Hedvig Hricak, MD, PhD, Chair of the Department of Radiology at Memorial Sloan Kettering Cancer Center, Member of the Molecular Pharmacology and Chemistry Program at the Sloan Kettering Institute, and Professor in the Gerstner Sloan Kettering Graduate School of Biomedical Sciences. Hricak chaired the first NCPF meeting on improving cancer diagnosis and care and was a member of the workshop planning committee for the second NCPF meeting on this topic.
"What we really need is a framework to demonstrate the clinical utility of this technology," NCPF member Richard L. Schilsky, MD, FACP, FSCT, FASCO, told Oncology Times. Schilsky, a panelist at this NCPF meeting on cancer diagnosis and care, is Senior Vice President and Chief Medical Officer of ASCO, and was a member of the planning committee of the first NCPF meeting on improving diagnosis in cancer care.
For example, he explained that a helpful framework might be a clinical trial showing that patients diagnosed and subsequently treated using a computational algorithm fared better than those diagnosed and treated without it. Commenting during the meeting, Schilsky said oncologists don't care so much about what is inside a "black box" machine learning tool as about whether it really helps them make better clinical decisions for their patients.
"We are awash in messages about the value of genomics in health," said Steven Goodman, MD, MHS, PhD, Associate Dean for Clinical and Translational Research, Professor of Medicine and Health Research & Policy, and Chief of the Division of Epidemiology at Stanford University School of Medicine. Goodman stressed the need for stringent oversight of computational tools used in precision oncology and the need to ensure their reproducibility, repeatability, and reliability.
"The ultimate truth is whether the patient will be better off if we use a computational tool," Goodman said. "That's a very high bar." He also advocated raising standards for preclinical cancer research that uses computational tools, noting that these tools frequently do not go through the traditional phase I, phase II, and phase III clinical trial process. What is needed is "validation, validation, validation," emphasized Goodman, who is co-founder and Co-Director of the Meta-Research Innovation Center at Stanford (METRICS), whose mission is to examine and improve the reproducibility and efficiency of biomedical research.
Caution Needed
Goodman mentioned a cautionary specter haunting the development of computational methods in precision oncology: a case of premature use of flawed gene-expression tests developed by cancer researchers at Duke University, tests employed in three clinical trials to determine which chemotherapy treatment patients with lung or breast cancer would receive. In July 2010, more than 30 outside scientists raised concerns about the validity of the gene-expression tests and asked the NCI to intervene and suspend the trials until the tests could be thoroughly reviewed.
A report published by the National Academies Press in 2012, "Evolution of Translational Omics: Lessons Learned and the Path Forward," concluded that constant vigilance is needed to ensure the validity of gene-based tests and tools. It stated, "Unfortunately, multiple systems put in place by Duke University to ensure the integrity and rigor of the scientific process failed. However, Duke University is not unique."
Among the flaws discovered with the Duke data were lack of confirmation of the omics discovery using an independent sample set, and lack of analytical and clinical/biological validation of the omics-based test prior to beginning clinical trials, according to the report on translational omics.
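The first of those flaws can be made concrete with a small simulation. The sketch below is purely illustrative: it uses synthetic data and a deliberately naive single-threshold classifier, not the actual Duke signatures. It shows how a predictor tuned on a small discovery set can look better than chance there even when the marker carries no real signal, which only an independent sample set reveals:

```python
import random

random.seed(0)

def simulate_samples(n, signal=0.0):
    """Synthetic (marker value, response label) pairs.
    With signal=0.0 the marker carries no information about response."""
    samples = []
    for _ in range(n):
        label = random.choice([0, 1])
        samples.append((random.gauss(label * signal, 1.0), label))
    return samples

def accuracy(samples, threshold):
    return sum((value > threshold) == bool(label)
               for value, label in samples) / len(samples)

def fit_threshold(samples):
    """Choose the cutoff that maximizes accuracy on this set:
    a toy stand-in for tuning an omics signature on discovery data."""
    candidates = [value for value, _ in samples]
    return max(candidates, key=lambda t: accuracy(samples, t))

discovery = simulate_samples(50)      # small discovery cohort
independent = simulate_samples(500)   # independent confirmation set
cutoff = fit_threshold(discovery)
disc_acc = accuracy(discovery, cutoff)
indep_acc = accuracy(independent, cutoff)
# The discovery estimate is inflated by the tuning itself;
# the independent set gives the honest estimate.
print(f"discovery: {disc_acc:.2f}  independent: {indep_acc:.2f}")
```

Because the threshold was optimized on the discovery samples, its apparent accuracy there overstates performance; the independent set, untouched during tuning, is what the 2012 report says was missing before the Duke tests entered clinical trials.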
"Omics tools are only as good as the data they're based on," said Sean Khozin, MD, MPH, Associate Director at the FDA Oncology Center of Excellence and founding Director of Information Exchange and Data Transformation (INFORMED) at the agency. "Data quality is everything," he added, highlighting a recurring theme at this meeting.
Agreeing was David Magnus, PhD, the Thomas A. Raffin Professor of Medicine and Biomedical Ethics, Professor of Pediatrics and Medicine, and Director of the Center for Biomedical Ethics at Stanford University. Magnus, a panelist and member of the planning committee for the second NCPF meeting on precision oncology, stressed that the clinical application of a computational tool is "only as good as the data it operates on."
Also in agreement was Lisa M. McShane, PhD, Acting Associate Director for the Division of Cancer Treatment and Diagnosis at NCI, where she heads the Biometric Research Program. "I get to see some of the crash and burn stories," said McShane. For example, she cited a study in which only 50 percent of frozen study samples collected were fit for assay a few years later because they had degraded. She recommended rigorous requirements for validation of omics prediction tools.
In addition to the quality of data underlying new diagnostic technologies, those data must be representative of the population being studied and free of bias, said Kadija Ferryman, PhD, a postdoctoral scholar at the Data & Society Research Institute in New York. Ferryman, who leads the Fairness in Precision Medicine research study, pointed out that the potential for bias and discrimination in predictive precision medicine exists if the data don't reflect population diversity.
For example, "AI-driven dermatology could leave dark-skinned patients behind," said Ferryman. She noted that most digital images of the skin feature lighter skin tones, and thus they do not capture serious lesions such as melanoma on darker skin. "How can we prevent these problems at the outset in use of AI tools?" asked Ferryman. She advocated thinking about data and the possibility for bias up front as new computational tools advance.
NCPF member Amy P. Abernethy, MD, PhD, who is Chief Medical Officer, Chief Scientific Officer, and Senior Vice President for Oncology at Flatiron Health, agreed on the need to base new diagnostic tools for precision oncology on truly representative data. "There's a risk of bias if you don't have adequately representative datasets," said Abernethy, a member of the meeting's planning committee. She recommended evaluating a dataset for race, ethnicity, and age, matching the data against a reference standard. She also stressed the importance of the reliability and quality of the data underlying a computational algorithm, as well as the need for such data to be recent and available in order to inform clinical decision-making. "We're expecting these datasets to do a lot of work for us," she said, so they need to be of the highest quality.
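The kind of check Abernethy describes, comparing a dataset's demographic makeup against a reference standard, can be sketched in a few lines. Everything here is hypothetical: the group names, counts, reference shares, and the 5-percentage-point tolerance are stand-ins for illustration, not figures from the meeting:

```python
# Hypothetical cohort counts and reference population shares
# (the reference shares stand in for, e.g., census figures).
dataset_counts = {"group_a": 820, "group_b": 95, "group_c": 60, "group_d": 25}
reference_share = {"group_a": 0.60, "group_b": 0.18, "group_c": 0.13, "group_d": 0.09}

def representation_gaps(counts, reference, tolerance=0.05):
    """Flag groups whose share of the dataset differs from the
    reference share by more than `tolerance` (as a proportion)."""
    total = sum(counts.values())
    gaps = {}
    for group, ref in reference.items():
        share = counts.get(group, 0) / total
        if abs(share - ref) > tolerance:
            gaps[group] = round(share - ref, 3)
    return gaps

gaps = representation_gaps(dataset_counts, reference_share)
print(gaps)  # positive gap = over-represented, negative = under-represented
```

In this toy cohort, group_a is heavily over-represented and the other groups are under-represented relative to the reference, exactly the situation in which a model trained on the data risks performing worse for the under-represented groups.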
Finally, in the effort to enhance precision oncology with new technologies, it must not be forgotten that what cancer patients also really need is effective new therapies and effective new combinations of therapies, said Keith T. Flaherty, MD, Director of the Henri and Belinda Termeer Center for Targeted Therapy and Director of Clinical Research at Massachusetts General Hospital, and Professor of Medicine at Harvard Medical School.
"From here on out it's going to be about combinations; we need more drugs and the ability for them to be paired," said Flaherty, principal investigator of NCI's Molecular Analysis for Therapy Choice (MATCH) Trial. MATCH is the first NCI-sponsored trial assigning patients to targeted therapy independent of tumor type based on DNA sequencing detection of oncogenes.
Peggy Eastman is a contributing writer.