New software enables brain simulations that both imitate the processes in the brain in detail and solve challenging cognitive tasks. The program was developed by a research team at the Cluster of Excellence 'Machine Learning: New Perspectives for Science' at the University of Tübingen. The software thus forms the basis for a new generation of brain simulations that allow deeper insights into the functioning and performance of the brain. The Tübingen researchers' paper has been published in the journal Nature Methods.
For decades, researchers have been trying to create computer models of the brain in order to better understand the organ and the processes that take place in it. Using mathematical methods, they have simulated the behavior and interaction of nerve cells and their connections. However, previous models had significant weaknesses: they were either based on oversimplified neuron models and therefore strayed far from biological reality, or they depicted the biophysical processes within cells in detail but were incapable of carrying out tasks similar to those the brain performs. "Either the path is similar to that in the brain, but the result is not, or the result is correct but the process that leads there does not compare with the processes in the brain," explains Michael Deistler, first author of the study and researcher in Professor Jakob Macke's research group. Jaxley, as the new program is called, allows brain models to be trained so that both conditions are met, an important step toward being able to draw conclusions from the model about the actual processes in the brain.
This has been achieved using a method that is also used to train artificial neural networks: 'backpropagation of error'. With the aid of backpropagation, an artificial neural network adjusts its parameters during training so that a given input results in a desired output. The network keeps adapting itself until it reliably achieves the desired task. In this way, the network learns which features and connections in the data are important to a specific process, so that it can also deliver correct results for new, similar examples. The Tübingen researchers have transferred this training principle to brain simulations.
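The principle can be illustrated with a minimal, hypothetical sketch written in JAX, the framework Jaxley is built on. The network architecture, the XOR task, and all parameter choices below are illustrative assumptions, not code from the study: the point is only that gradients of the error flow back through the network and nudge its weights until the given inputs produce the desired outputs.

```python
# Hypothetical sketch of 'backpropagation of error' in JAX (not code from
# the study): a tiny two-layer network adjusts its weights by gradient
# descent until each given input produces the desired output (here, XOR).
import jax
import jax.numpy as jnp

def predict(params, x):
    """Two-layer network: hidden tanh layer followed by a linear readout."""
    (w1, b1), (w2, b2) = params
    hidden = jnp.tanh(x @ w1 + b1)
    return hidden @ w2 + b2

def loss(params, x, y):
    """Mean squared error between network output and desired output."""
    return jnp.mean((predict(params, x) - y) ** 2)

# The desired input-output mapping the network should learn (XOR).
x = jnp.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = jnp.array([[0.], [1.], [1.], [0.]])

# Random initial weights: two inputs, eight hidden units, one output.
k1, k2 = jax.random.split(jax.random.PRNGKey(0))
params = [(jax.random.normal(k1, (2, 8)), jnp.zeros(8)),
          (jax.random.normal(k2, (8, 1)), jnp.zeros(1))]

# Backpropagation: jax.grad computes the error gradient for every weight,
# and each step nudges the weights to reduce the error.
grad_fn = jax.jit(jax.grad(loss))
for _ in range(2000):
    grads = grad_fn(params, x, y)
    params = jax.tree_util.tree_map(lambda p, g: p - 0.1 * g, params, grads)
```

The same loop structure carries over to brain simulations: only the function whose parameters are being adjusted changes, from an artificial network to a biophysical model.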
Detailed brain models carry out challenging tasks
When the brain carries out a task, hundreds of important parameters in the neurons involved come into play, for example the size of the neurons, the strength of their connections, or the number of ion channels. "Many of these parameters cannot be measured. Until now this has made it impossible to develop exact simulations that produce good results," says Deistler. "Jaxley can train these non-measurable parameters in brain models. The software repeatedly changes their values, readjusting again and again, until the simulation reaches the desired result." After training, the resulting brain models were capable, for example, of classifying images or storing and accessing memories.
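To make the idea concrete, here is a minimal sketch of the underlying principle rather than Jaxley's actual interface. The single-compartment leaky-integrator model, the choice of "non-measurable" parameters (leak conductance and membrane capacitance), and the use of the optax optimizer are all assumptions made for illustration: because the simulation is differentiable, gradients through it can repeatedly readjust the parameters until the simulated activity matches a target.

```python
# Minimal illustration of differentiable simulation (not Jaxley's API):
# a single-compartment leaky integrator whose hard-to-measure parameters
# (leak conductance, membrane capacitance; chosen here for illustration)
# are tuned by gradient descent until the voltage trace matches a target.
import jax
import jax.numpy as jnp
import optax

def simulate(params, current, dt=0.1, n_steps=300):
    """Leaky integrator: C dV/dt = -g_leak * V + injected current."""
    g_leak, capacitance = params
    def step(v, _):
        v = v + dt * (-g_leak * v + current) / capacitance
        return v, v
    _, trace = jax.lax.scan(step, 0.0, None, length=n_steps)
    return trace

def loss(params, current, target):
    """How far the simulated trace is from the desired result."""
    return jnp.mean((simulate(params, current) - target) ** 2)

# Target trace produced by "true" parameters that are normally unknown.
target = simulate(jnp.array([0.3, 1.5]), current=1.0)

# Start from a wrong guess and let gradients through the simulation
# repeatedly readjust the parameters until the output matches the target.
params = jnp.array([1.0, 1.0])
optimizer = optax.adam(learning_rate=0.02)
opt_state = optimizer.init(params)
grad_fn = jax.jit(jax.grad(loss))
for _ in range(1000):
    grads = grad_fn(params, 1.0, target)
    updates, opt_state = optimizer.update(grads, opt_state)
    params = optax.apply_updates(params, updates)
```

In this toy setting the loop recovers the hidden parameter values from the target trace alone; the study applies the same gradient-based training to far larger biophysical models and to tasks such as image classification and memory.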
"Thanks to Jaxley we can now study how neuronal mechanisms contribute to solving tasks," says Jakob Macke, Professor of Machine Learning in Science at the University of Tübingen and last author of the study. "The software will allow neuroscientists to investigate the complexity of the brain and depict it in computer simulations." Long-term such simulations could also be applied in medicine, for instance in order to understand neurological diseases better or virtually study the effect of medicines in advance.
The president of the University of Tübingen, Professor Dr. Dr. h.c. (Dōshisha) Karla Pollmann, says: "This work is a striking demonstration of how machine learning can enrich other areas of science: artificial intelligence is a key technology that opens up new horizons for basic research."
Publication:
Michael Deistler, Kyra L. Kadhim, Matthijs Pals, Jonas Beck, Ziwei Huang, Manuel Gloeckler, Janne K. Lappalainen, Cornelius Schröder, Philipp Berens, Pedro J. Gonçalves, Jakob H. Macke: Jaxley: Differentiable simulation enables large-scale training of detailed biophysical models of neural dynamics, Nature Methods (2025). https://doi.org/10.1038/s41592-025-02895-w