AI Tool Automates High-Quality Brain Tumor Evaluation via PET Imaging

Reston, VA - A new artificial intelligence tool provides a fully automated, easy-to-use, and objective way to detect and evaluate brain tumors, according to new research published in the October issue of The Journal of Nuclear Medicine. Developed for use with amino acid PET scans, the deep learning-based segmentation algorithm can also assess brain tumor patients' response to treatment with quality comparable to that of an experienced physician, but in a fraction of the time.

PET has become increasingly important in brain tumor diagnostics, complementing structural MRI. In recent years, several studies have demonstrated the diagnostic value of the metabolic tumor volume for assessing treatment response in brain tumor patients. However, because measuring changes in metabolic tumor volume is time-consuming, it is usually not part of the routine clinical assessment.
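To illustrate the quantity at the center of this work: once a tumor has been segmented, metabolic tumor volume is simply the volume of the segmented voxels. The sketch below is a minimal, hypothetical example of that computation; the mask array, voxel sizes, and toy numbers are assumptions for illustration, not taken from the study.

```python
import numpy as np

def metabolic_tumor_volume_ml(mask: np.ndarray, voxel_size_mm: tuple) -> float:
    """Metabolic tumor volume in milliliters from a binary PET segmentation mask.

    mask          -- 3-D boolean (or 0/1) array marking tumor voxels
    voxel_size_mm -- (x, y, z) voxel edge lengths in millimeters
    """
    voxel_volume_mm3 = float(np.prod(voxel_size_mm))  # volume of one voxel
    n_tumor_voxels = int(mask.sum())                  # count of segmented voxels
    return n_tumor_voxels * voxel_volume_mm3 / 1000.0  # mm^3 -> mL

# Toy example: a 10x10x10-voxel tumor block on a 2 mm isotropic grid
mask = np.zeros((64, 64, 64), dtype=bool)
mask[20:30, 20:30, 20:30] = True
print(metabolic_tumor_volume_ml(mask, (2.0, 2.0, 2.0)))  # 1000 voxels * 8 mm^3 = 8.0 mL
```

The time-consuming part in practice is producing the mask itself, which is exactly what the segmentation algorithm automates; the volume calculation that follows is trivial.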

"The fact that metabolic tumor volume is not routinely assessed in clinical practice suggests that the time and effort required for volumetric amino acid PET segmentation still exceeds the clinical benefit," said Philipp Lohmann, PhD, assistant professor (Habilitation) in Medical Physics and team leader for Quantitative Image Analysis & AI at the Institute of Neuroscience and Medicine, Research Center Juelich, Germany. "In response, our team developed a deep learning-based segmentation algorithm for a robust and fully automated volumetric evaluation of amino acid PET data and evaluated its performance for response assessment in patients with gliomas."

Researchers retrospectively evaluated 699 18F-FET PET scans (acquired at initial diagnosis or during follow-up) from 555 brain tumor patients. The deep learning-based segmentation algorithm was developed on a training dataset and evaluated on a separate test dataset, and changes in metabolic tumor volume were measured. Additionally, the algorithm was applied to data from a recently published 18F-FET PET study on response assessment in glioblastoma patients treated with adjuvant temozolomide chemotherapy. The response assessment from the algorithm was then compared with the assessment of an experienced physician, as reported in that study.

In the test dataset, 92 percent of lesions with increased uptake and 85 percent of lesions with isometabolic or hypometabolic uptake were correctly identified by the algorithm. Change in metabolic tumor volume, as detected by the algorithm, was a significant determinant of progression-free and overall survival, in agreement with the physician's assessment.
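Response assessment of this kind typically reduces to the relative change in metabolic tumor volume between baseline and follow-up. The sketch below shows the general idea; the +/-25 percent thresholds and category names are illustrative placeholders, not the cutoffs used in the cited study.

```python
def classify_mtv_response(mtv_baseline_ml: float, mtv_followup_ml: float,
                          decrease_threshold: float = 0.25,
                          increase_threshold: float = 0.25) -> str:
    """Classify treatment response from the relative change in metabolic tumor volume.

    The +/-25% thresholds are hypothetical defaults for illustration only.
    """
    if mtv_baseline_ml <= 0:
        raise ValueError("baseline MTV must be positive")
    rel_change = (mtv_followup_ml - mtv_baseline_ml) / mtv_baseline_ml
    if rel_change <= -decrease_threshold:
        return "response"        # tumor volume shrank beyond the threshold
    if rel_change >= increase_threshold:
        return "progression"     # tumor volume grew beyond the threshold
    return "stable"

print(classify_mtv_response(12.0, 6.0))   # 50% decrease -> "response"
print(classify_mtv_response(10.0, 14.0))  # 40% increase -> "progression"
```

Automating the segmentation step means such volume-based criteria can be evaluated consistently across serial scans without manual delineation.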

"These findings highlight the value of the deep learning-based segmentation algorithm for the improvement and automation of clinical decision-making based on the volumetric evaluation of amino acid PET," stated Lohmann. "The segmentation tool developed in our study could be an important platform to further promote amino acid PET and to strengthen its clinical value, which may give brain tumor patients access to important diagnostic information that was previously unavailable or difficult to obtain."

To facilitate clinical implementation, the segmentation algorithm is freely available and can be executed on a conventional GPU-equipped computer in less than two minutes without preprocessing. "We hope to encourage and support treating physicians in neuro-oncology centers to consider amino acid PET for their patients, even if they have little or no prior experience," said Lohmann. "Every patient with a brain tumor should have access to amino acid PET."

Figure 6. Representative 18F-FET PET images at baseline and follow-up of glioma patients with favorable (top row) and unfavorable (bottom row) outcomes after 2 cycles of adjuvant temozolomide. OS = overall survival; PFS = progression-free survival; TBRmean = mean tumor-to-brain ratio.

The authors of "Automated Brain Tumor Detection and Segmentation for Treatment Response Assessment Using Amino Acid PET" include Robin Gutsche, Institute of Neuroscience and Medicine, Forschungszentrum Juelich GmbH, Juelich, Germany, and RWTH Aachen University, Aachen, Germany; Carsten Lowis and Philipp Lohmann, Institute of Neuroscience and Medicine, Forschungszentrum Juelich GmbH, Juelich, Germany; Karl Ziemons, Medical Engineering and Technomathematics, FH Aachen University of Applied Sciences, Juelich, Germany; Martin Kocher, Institute of Neuroscience and Medicine, Forschungszentrum Juelich GmbH, Juelich, Germany, and Department of Stereotaxy and Functional Neurosurgery, Faculty of Medicine and University Hospital Cologne, University of Cologne, Cologne, Germany; Garry Ceccon, Department of Neurology, Faculty of Medicine and University Hospital Cologne, University of Cologne, Cologne, Germany; Cláudia Régio Brambilla, Institute of Neuroscience and Medicine, Forschungszentrum Juelich GmbH, Juelich, Germany, and JARA-BRAIN-Translational Medicine, Aachen, Germany; Nadim J. Shah, Institute of Neuroscience and Medicine, Forschungszentrum Juelich GmbH, Juelich, Germany, JARA-BRAIN-Translational Medicine, Aachen, Germany, and Department of Neurology, University Hospital RWTH Aachen, Aachen, Germany; Karl-Josef Langen, Institute of Neuroscience and Medicine, Forschungszentrum Juelich GmbH, Juelich, Germany, Department of Nuclear Medicine, University Hospital RWTH Aachen, Aachen, Germany, and Center for Integrated Oncology, Universities of Aachen, Bonn, Cologne, and Duesseldorf, Germany; Norbert Galldiks, Institute of Neuroscience and Medicine, Forschungszentrum Juelich GmbH, Juelich, Germany, Department of Neurology, Faculty of Medicine and University Hospital Cologne, University of Cologne, Cologne, Germany, and Center for Integrated Oncology, Universities of Aachen, Bonn, Cologne, and Duesseldorf, Germany; and Fabian Isensee, Applied Computer Vision Lab, Helmholtz Imaging, Heidelberg, Germany, and Division of Medical Image Computing, German Cancer Research Center, Heidelberg, Germany.

Visit the JNM website for the latest research, and follow our new Twitter and Facebook pages @JournalofNucMed or follow us on LinkedIn.
