AI Revolutionizes Gel Electrophoresis Analysis

University of Edinburgh

University of Edinburgh scientists have harnessed the power of AI in a new tool that promises to speed up analysis of data from gel electrophoresis experiments.

The technique is widely used across the biological sciences to separate and analyse biomolecules, and it routinely informs studies of processes such as genomic manipulation and DNA supercoiling, or the success or failure of assembling a bionanostructure or artificial conjugate.

The core principle of gel electrophoresis is simple: biomolecules are loaded into inset wells in a gel matrix, a voltage is applied, and the charged molecules are driven through the matrix. Differences in size and charge cause molecules to migrate at different rates, producing a barcode-like pattern of 'bands' extending from each well along a 'lane'. These patterns can be photographed and interpreted to yield both qualitative and quantitative information about the contents of a sample.

Despite the unprecedented advances in image processing in recent years, software for the analysis of gel images has remained essentially unchanged for decades. Most, if not all, approaches involve a manual or semi-automatic process of digitally carving lanes and bands out of an image and then summing the pixel intensities within each band. This process is tedious, prone to user error, and relies on assumptions that make it difficult to handle bands with irregular shapes or curved trajectories.
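The conventional workflow described above boils down to integrating pixel intensities over a hand-drawn region. A minimal sketch of that idea, assuming a rectangular band region chosen by the user and an optional flat background estimate (the function and parameter names here are illustrative, not taken from any particular gel-analysis package):

```python
import numpy as np

def band_volume(gel, row_range, col_range, background=None):
    """Sum pixel intensities inside a rectangular band region.

    gel: 2-D array of pixel intensities (bands assumed bright after
    inversion). row_range/col_range: (start, stop) bounds selected
    manually, as in conventional gel-analysis workflows.
    """
    roi = gel[row_range[0]:row_range[1], col_range[0]:col_range[1]]
    if background is not None:
        # Subtract a flat background estimate (e.g. the median of an
        # empty stretch of the lane) before integrating.
        roi = np.clip(roi - background, 0, None)
    return float(roi.sum())

# Toy example: a synthetic 8x8 "gel" with one bright band.
gel = np.zeros((8, 8))
gel[3:5, 2:6] = 10.0                     # the band: 2 x 4 pixels
volume = band_volume(gel, (3, 5), (2, 6))
```

The fragility of this approach is visible even in the sketch: the result depends entirely on where the user draws the rectangle, and a curved or irregular band will not fit neatly inside one.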

However, by framing the extraction and analysis of gel bands from an image as an AI task, a machine learning model can automate most of the tedious steps in the analysis process, while also eliminating biases and assumptions inherent to manual approaches.
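In machine-learning terms, this framing treats band extraction as an image-segmentation problem: a model predicts, for every pixel, the probability that it belongs to a band. A minimal sketch of the downstream step, assuming such a probability map is already available from a trained model (the names and threshold here are illustrative, not GelGenie's actual API):

```python
import numpy as np
from scipy import ndimage

def quantify_bands(gel, band_probs, threshold=0.5):
    """Turn a per-pixel band-probability map into per-band volumes.

    band_probs would come from a trained segmentation model; here it
    is simply an array the same shape as `gel`. Connected regions of
    the thresholded mask are treated as individual bands, so no lane
    or band shape needs to be drawn by hand.
    """
    mask = band_probs > threshold
    labels, n_bands = ndimage.label(mask)
    # Integrate gel intensity over each labelled band region.
    volumes = ndimage.sum_labels(gel, labels, index=range(1, n_bands + 1))
    return labels, list(volumes)

# Toy example with two "bands" predicted at high confidence.
gel = np.zeros((10, 10))
probs = np.zeros((10, 10))
gel[2:4, 1:9] = 5.0;  probs[2:4, 1:9] = 0.9
gel[6:8, 1:9] = 2.0;  probs[6:8, 1:9] = 0.9
labels, volumes = quantify_bands(gel, probs)
```

Because the band regions follow the predicted mask rather than a user-drawn rectangle, irregular shapes and curved trajectories are handled by the same code path.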

The Edinburgh team began by assembling an extensive dataset of more than 500 human-labelled gel images covering a range of common experimental scenarios. They used this dataset to train a lightweight neural network to accurately identify bands in images. The resulting model identifies bands regardless of image quality or background intensity, and even copes with unexpected discontinuities such as torn gel chunks. Furthermore, the approach produced quantitation results that matched or surpassed those generated with conventional tools.
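Training a network on labelled images amounts to fitting a pixel classifier by gradient descent on annotated examples. As a vastly simplified stand-in for the team's actual model, the sketch below fits a logistic regression that separates "band" from "background" pixels using synthetic intensity data; the real system uses a neural network on full images, and every detail here is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in training data: 1-D pixel intensities with labels marking
# band (1) vs background (0). A real dataset would hold whole images
# with per-pixel annotations.
X = np.concatenate([rng.normal(0.8, 0.1, 500), rng.normal(0.2, 0.1, 500)])
y = np.concatenate([np.ones(500), np.zeros(500)])

# Logistic regression trained by gradient descent on cross-entropy
# loss: the same learning principle, minus the network architecture.
w, b = 0.0, 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(w * X + b)))   # sigmoid prediction
    grad_w = np.mean((p - y) * X)            # cross-entropy gradients
    grad_b = np.mean(p - y)
    w -= 5.0 * grad_w                        # gradient-descent step
    b -= 5.0 * grad_b

preds = (1.0 / (1.0 + np.exp(-(w * X + b)))) > 0.5
accuracy = float(np.mean(preds == y))
```

Once trained, such a classifier is applied to every pixel of a new image to produce the probability map that drives band extraction.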

To enable others to apply the technique to their data, the team also developed GelGenie, an open-source graphical application that allows users to extract bands from gel images on their own devices, with no expert knowledge or experience required.

The entire dataset, model weights and scripting framework have also been released to allow others to use or fine-tune the models for more specialized applications or their own custom pipelines.

Dr Matthew Aquilina, who co-led the project while at the University of Edinburgh and is now a postdoctoral research fellow at Harvard University and the Dana-Farber Cancer Institute, said: 'To the best of our knowledge, GelGenie is the first software platform to investigate universal gel analysis using AI. We hope our platform has set the stage for a truly universal gel analysis framework that others will integrate into their workflow and continue to iterate on with further refinements and improved functionality.'

Dr Katherine Dunn, University of Edinburgh, School of Engineering, who co-led the project and supervised Dr Aquilina, said: 'Gel electrophoresis is used widely across academia and industry, but most scientists use relatively unsophisticated methods to analyse gel electrophoresis data. Our new tool harnesses the power of artificial intelligence to bring the analysis of gel electrophoresis data firmly into the 21st century.'

/Public Release. This material from the originating organization/author(s) may be of a point-in-time nature and has been edited for clarity, style and length. Mirage.News does not take institutional positions or sides, and all views, positions, and conclusions expressed herein are solely those of the author(s).