AI Models to Revolutionize Pathology, Mass General Brigham Claims

Mass General Brigham

Foundation models, advanced artificial intelligence systems trained on large-scale datasets, hold the potential to provide unprecedented advancements for the medical field. In computational pathology (CPath), these models may excel in diagnostic accuracy, prognostic insights, and predicting therapeutic responses. Researchers at Mass General Brigham have designed two of the largest CPath foundation models to date: UNI and CONCH. These foundation models were adapted to over 30 clinical and diagnostic tasks, including disease detection, disease diagnosis, organ transplant assessment, and rare disease analysis. The new models overcome limitations of existing approaches, performing well not only on the clinical tasks the researchers tested but also showing promise for identifying rare and challenging diseases. Papers on UNI and CONCH are published today in Nature Medicine.

UNI is a foundation model for understanding pathology images, from recognizing disease in histology regions of interest to interpreting gigapixel whole-slide images. Trained on a database of over 100 million tissue patches and over 100,000 whole-slide images, it stands out for its broad applicability across anatomic pathology. Notably, UNI employs transfer learning, applying previously acquired knowledge to new tasks with high accuracy. Across 34 tasks, including cancer classification and organ transplant assessment, UNI outperformed established pathology models, highlighting its versatility and potential applications as a CPath tool.
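The transfer-learning workflow described above, reusing a pretrained encoder's features for a new diagnostic task, can be sketched generically. The encoder, data, and labels below are synthetic stand-ins (the actual UNI model and its interface are not shown); the sketch illustrates only the common "linear probe" pattern, in which the foundation model stays frozen and a small classifier is trained on its features.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen pretrained encoder (NOT the real foundation
# model): maps a tissue-patch vector to a compact feature embedding.
W_enc = rng.normal(size=(64, 16)) / 8.0
def frozen_encoder(patches):
    return np.tanh(patches @ W_enc)

# Synthetic "patches"; labels are constructed to be linearly separable
# in feature space so the probe has signal to learn.
X = rng.normal(size=(200, 64))
feats = frozen_encoder(X)            # encoder weights are never updated
v = rng.normal(size=16)
y = (feats @ v > 0).astype(int)      # e.g., tumor vs. normal (toy labels)

# Transfer learning as a linear probe: train only a small softmax
# classifier on top of the frozen features, via gradient descent on
# the cross-entropy loss.
W = np.zeros((16, 2))
onehot = np.eye(2)[y]
for _ in range(500):
    logits = feats @ W
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    W -= 0.5 * feats.T @ (p - onehot) / len(y)

acc = ((feats @ W).argmax(axis=1) == y).mean()
print(f"linear-probe training accuracy: {acc:.2f}")
```

Because only the small probe is trained, the same frozen encoder can be adapted cheaply to many downstream tasks, which is what makes the foundation-model approach attractive for the dozens of clinical tasks evaluated in the paper.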

CONCH is a foundation model for understanding both pathology images and language. Trained on a database of over 1.17 million histopathology image-text pairs, CONCH excels in tasks like identifying rare diseases, segmenting tumors, and understanding gigapixel images. Because CONCH is trained on text as well as images, pathologists can query the model in natural language to search for morphologies of interest. In a comprehensive evaluation across 14 clinically relevant tasks, CONCH outperformed standard models and demonstrated its effectiveness and versatility.
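The image-language pairing described above is what enables text-driven search: models of this kind embed images and text prompts into a shared space and score them by cosine similarity. The sketch below illustrates that general pattern only; the embeddings and prompt strings are fabricated stand-ins, not CONCH's actual encoders or outputs.

```python
import numpy as np

rng = np.random.default_rng(1)

def unit(v):
    # Normalize to unit length so dot products equal cosine similarity.
    return v / np.linalg.norm(v)

# Hypothetical text prompts a pathologist might search with.
prompt_names = ["an image of invasive ductal carcinoma",
                "an image of normal breast tissue"]

# Synthetic stand-ins for text-encoder embeddings in a shared space.
text_emb = np.stack([unit(rng.normal(size=32)) for _ in prompt_names])

# Synthetic "image embedding", constructed to lie near the first
# prompt's embedding, as would happen for a matching slide.
image_emb = unit(text_emb[0] + 0.2 * rng.normal(size=32))

# Zero-shot classification / retrieval: score the image against every
# prompt; the highest-scoring prompt is the predicted description.
scores = text_emb @ image_emb        # cosine similarities (all unit-norm)
best = prompt_names[int(scores.argmax())]
print(best)
```

In a real system the same scoring step works in reverse as well: a text prompt can be compared against embeddings of many image regions to retrieve the morphologies it describes.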

The research team is making the code publicly available for other academic groups to use in addressing clinically relevant problems.

"Foundation models represent a new paradigm in medical artificial intelligence," said corresponding author Faisal Mahmood, PhD, of the Division of Computational Pathology in the Department of Pathology at Mass General Brigham. "These models are AI systems that can be adapted to many downstream, clinically relevant tasks. We hope that the proof-of-concept presented in these studies will set the stage for such self-supervised models to be trained on larger and more diverse datasets."

Authorship: Mass General Brigham co-authors of the UNI paper include Richard J. Chen, Tong Ding, Ming Y. Lu, Drew F. K. Williamson, Guillaume Jaume, Andrew H. Song, Bowen Chen, Andrew Zhang, Daniel Shao, Muhammad Shaban, Mane Williams, Lukas Oldenburg, Luca L. Weishaupt, Judy J. Wang, Anurag Vaidya, Long Phi Le, Georg Gerber, Sharifa Sahai, and Walt Williams.

Mass General Brigham co-authors of the CONCH paper include Ming Y. Lu, Bowen Chen, Drew F. K. Williamson, Richard J. Chen, Ivy Liang, Tong Ding, Guillaume Jaume, Igor Odintsov, Long Phi Le, Georg Gerber, and Andrew Zhang. Additional co-authors include Anil V. Parwani.

Disclosures: Richard J. Chen, Ming Y. Lu, Bowen Chen and Faisal Mahmood are inventors on two provisional US patents filed corresponding to the methodological aspects of this work.

Funding: This work was supported in part by the BWH president's fund, BWH and MGH Pathology, NIH NIGMS R35GM138216 and the Massachusetts Life Sciences Center.

Papers cited:

  1. Lu MY et al. "A visual-language foundation model for computational pathology" Nature Medicine DOI: 10.1038/s41591-024-02856-4

  2. Chen RJ et al. "Towards a general-purpose foundation model for computational pathology" Nature Medicine DOI: 10.1038/s41591-024-02857-3
