Machine learning model builds on imaging methods to better detect ovarian lesions

Although ovarian cancer is the deadliest of the gynecologic cancers, only about 20% of cases are found at an early stage, as there is no reliable screening test and few symptoms prompt testing. Ovarian lesions are also difficult to diagnose accurately: so difficult, in fact, that more than 80% of women who undergo surgery to have lesions removed and tested show no sign of cancer.

Quing Zhu, the Edwin H. Murty Professor of Biomedical Engineering at Washington University in St. Louis' McKelvey School of Engineering, and members of her lab have applied a variety of imaging methods to diagnose ovarian cancer more accurately. Now they have developed a machine learning fusion model that uses known ultrasound features of ovarian lesions to train the model to recognize whether a lesion is benign or cancerous from reconstructed images taken with photoacoustic tomography. Machine learning has traditionally focused on single-modality data, but recent findings have shown that multi-modality machine learning performs more robustly than single-modality methods. In a pilot study of 35 patients with more than 600 regions of interest, the model's accuracy was 90%.

It is the first study to use ultrasound to enhance machine learning-based photoacoustic tomography reconstruction for cancer diagnosis. The results were published in the December issue of the journal Photoacoustics.

"Existing modalities are mainly based on the size and shape of the ovarian lesions, which do not provide an accurate diagnosis for earlier ovarian cancer and for risk assessment of large adnexal/ovarian lesions," said Zhu, also a professor of radiology at the School of Medicine. "Photoacoustic imaging adds more functional information about vascular contrast from hemoglobin concentration and blood oxygen saturation."

Yun Zou, a doctoral student in Zhu's lab, developed the fusion model by combining an ultrasound neural network with a photoacoustic tomography neural network to diagnose ovarian lesions. Cancerous lesions of the ovaries can present in several different morphologies on ultrasound: some are solid, while others have papillary projections inside cystic lesions, making them more difficult to diagnose. To improve on ultrasound diagnosis alone, the researchers added total hemoglobin concentration and blood oxygen saturation from photoacoustic imaging, both of which are biomarkers for cancerous ovarian tissue.
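
To make the two-branch idea concrete, the following PyTorch sketch shows one common way to fuse an ultrasound encoder with a photoacoustic encoder; it is a minimal illustration of the general multi-modality approach described here, not the authors' published architecture, and the layer sizes, input shapes, and class names are assumptions.

```python
import torch
import torch.nn as nn

class FusionClassifier(nn.Module):
    """Minimal sketch of a dual-branch ultrasound/PAT fusion model.

    One branch encodes the ultrasound region of interest, the other the
    reconstructed PAT functional maps (e.g., total hemoglobin and sO2
    channels); the two feature vectors are concatenated before a
    benign-vs-malignant classification head.
    """

    def __init__(self, pat_channels: int = 2, feat_dim: int = 64):
        super().__init__()
        def branch(in_ch):
            return nn.Sequential(
                nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.Linear(32, feat_dim), nn.ReLU(),
            )
        self.us_branch = branch(1)               # grayscale ultrasound ROI
        self.pat_branch = branch(pat_channels)   # THb + sO2 maps
        self.head = nn.Linear(2 * feat_dim, 2)   # benign vs. malignant logits

    def forward(self, us_roi, pat_maps):
        fused = torch.cat([self.us_branch(us_roi),
                           self.pat_branch(pat_maps)], dim=1)
        return self.head(fused)

# Example: a batch of 4 co-registered 64x64 regions of interest.
model = FusionClassifier()
logits = model(torch.randn(4, 1, 64, 64), torch.randn(4, 2, 64, 64))
```

Late fusion of this kind keeps each modality's encoder simple while letting the classifier weigh morphological (ultrasound) and functional (photoacoustic) evidence jointly.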

"Our results showed that the ultrasound-enhanced photoacoustic imaging fusion model reconstructed the target's total hemoglobin and blood oxygen saturation maps more accurately than other methods and provided an improved diagnosis of ovarian cancers from benign lesions," Zou said.


This research was funded by the National Institutes of Health (R01 CA237664, R01 CA228047).

The McKelvey School of Engineering at Washington University in St. Louis promotes independent inquiry and education with an emphasis on scientific excellence, innovation and collaboration without boundaries. McKelvey Engineering has top-ranked research and graduate programs across departments, particularly in biomedical engineering, environmental engineering and computing, and has one of the most selective undergraduate programs in the country. With 165 full-time faculty, 1,420 undergraduate students, 1,614 graduate students and 21,000 living alumni, we are working to solve some of society's greatest challenges; to prepare students to become leaders and innovate throughout their careers; and to be a catalyst of economic development for the St. Louis region and beyond.
