AI Tool by Mayo Clinic Spots Infections in Patient Photos

Mayo Clinic

ROCHESTER, Minn. — A team of Mayo Clinic researchers has developed an artificial intelligence (AI) system that can detect surgical site infections (SSIs) with high accuracy from patient-submitted postoperative wound photos, potentially transforming how postoperative care is delivered.

Published in the Annals of Surgery, the study introduces an AI-based pipeline that can automatically identify surgical incisions, assess image quality and flag signs of infection in photos submitted by patients through online portals. The system was trained on over 20,000 images from more than 6,000 patients across nine Mayo Clinic hospitals.

"We were motivated by the increasing need for outpatient monitoring of surgical incisions in a timely manner," says Cornelius Thiels, D.O. , a hepatobiliary and pancreatic surgical oncologist at Mayo Clinic and co-senior author of the study. "This process, currently done by clinicians, is time-consuming and can delay care. Our AI model can help triage these images automatically, improving early detection and streamlining communication between patients and their care teams."

The AI system uses a two-stage model: it first detects whether an image contains a surgical incision, then evaluates whether that incision shows signs of infection. The model, based on a Vision Transformer, achieved 94% accuracy in detecting incisions and an area under the curve (AUC) of 0.81 in identifying infections.
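To make the two-stage idea concrete, the sketch below shows one way such a cascade could be wired together in Python with off-the-shelf Vision Transformer classifiers: a first model screens a photo for a surgical incision, and only images that pass that check are sent to a second model that estimates the likelihood of infection. This is an illustrative assumption of the general approach, not the authors' actual code; the model choice (torchvision's vit_b_16), thresholds, and function names are hypothetical.

```python
# Illustrative sketch of a two-stage incision/infection cascade (not the Mayo Clinic implementation).
import torch
from torchvision import transforms
from torchvision.models import vit_b_16

# Standard ImageNet-style preprocessing for ViT input.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# Stage 1: does the photo contain a surgical incision?
# Stage 2: does the incision show signs of infection?
# Both are binary classifiers, so each outputs a single logit.
incision_model = vit_b_16(weights=None, num_classes=1).eval()
infection_model = vit_b_16(weights=None, num_classes=1).eval()

@torch.no_grad()
def triage_photo(pil_image, incision_threshold=0.5, infection_threshold=0.5):
    """Return a triage result for one patient-submitted wound photo (thresholds are placeholders)."""
    x = preprocess(pil_image).unsqueeze(0)                     # shape: [1, 3, 224, 224]
    p_incision = torch.sigmoid(incision_model(x)).item()
    if p_incision < incision_threshold:
        # No incision found; in practice the patient might be asked to resubmit the photo.
        return {"incision": False, "flag": "no incision detected"}
    p_infection = torch.sigmoid(infection_model(x)).item()
    flag = "possible infection" if p_infection >= infection_threshold else "no infection signs"
    return {"incision": True, "infection_probability": p_infection, "flag": flag}
```

In a cascade like this, only photos that clear the incision check reach the infection classifier, which is one way a screening tool could prioritize which images a care team reviews first.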

"This work lays the foundation for AI-assisted postoperative wound care, which can transform how postoperative patients are monitored," says Hala Muaddi, M.D., Ph.D., a hepatopancreatobiliary fellow at Mayo Clinic and first author. "It's especially relevant as outpatient operations and virtual follow-ups become more common."

The researchers are hopeful that this technology could help patients receive faster responses, reduce delays in diagnosing infections and support better care for those recovering from surgery at home. With further validation, it could function as a frontline screening tool that alerts clinicians to concerning incisions. This AI tool also paves the way for developing algorithms capable of detecting subtle signs of infection, potentially before they become visually apparent to the care team. This would allow for earlier treatment, decreased morbidity and reduced costs.

"For patients, this could mean faster reassurance or earlier identification of a problem," says Dr. Muaddi. "For clinicians, it offers a way to prioritize attention to cases that need it most, especially in rural or resource-limited settings."

Importantly, the model performed consistently across diverse patient groups, addressing concerns about algorithmic bias.

While the results are promising, the team says that further validation is needed.

"Our hope is that the AI models we developed — and the large dataset they were trained on — have the potential to fundamentally reshape how surgical follow-up is delivered," says Hojjat Salehinejad, Ph.D. , a senior associate consultant of health care delivery research within the Kern Center for the Science of Health Care Delivery and co-senior author. "Prospective studies are underway to evaluate how well this tool integrates into day-to-day surgical care."

This research was supported by the Dalio Philanthropies Artificial Intelligence/Machine Learning Enablement Award and the Simons Family Career Development Award in Surgical Innovation.
