How brain research is helping artificial intelligence

Symbolic image for the article (Image: PantherMedia/Kheng Ho Toh)

FAU researchers paving the way for more efficient AI

Artificial intelligence (AI), especially the training of AI systems such as ChatGPT, consumes enormous amounts of energy. If AI could work more like the human brain, it would be much more efficient. Dr. Achim Schilling and Dr. Patrick Krauss from the Neuroscience Laboratory of the Department of Otorhinolaryngology – Head and Neck Surgery, Universitätsklinikum Erlangen, together with colleagues Dr. Richard Gerum from Canada and Dr. André Erpenbeck from the USA, have developed a method for modifying artificial nerve cells so that they behave more like the nerve cells in the brain.

The researchers’ work at FAU aims to support the development of artificial intelligence systems that require fewer resources such as energy or computing power. Their study was recognized as the best publication at the world’s largest neural network conference, selected from more than 1,800 submitted and more than 1,000 accepted papers.

Conventional AI systems are built from units that are loosely modeled on nerve cells. However, they compute with continuous numerical values, while nerve cells in the brain process information using binary electrical impulses, called spikes. This makes the activity of the human brain much more efficient, because the information is encoded not in the strength of these impulses but in the time intervals between them.
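The difference can be sketched in a few lines of Python (an illustrative comparison only; all variable names and values here are invented for the example):

```python
import numpy as np

# Conventional artificial neuron: the information sits in the exact
# real-valued activation produced by a weighted sum.
def continuous_neuron(inputs, weights):
    return np.tanh(np.dot(weights, inputs))

inputs = np.array([0.2, -0.5, 0.9])
weights = np.array([0.4, 0.1, -0.3])
print(continuous_neuron(inputs, weights))  # a graded value, about -0.235

# Spiking neuron: every pulse has the same amplitude; the information
# lies in *when* the spikes occur, i.e. in the inter-spike intervals.
spike_times_ms = np.array([3.0, 9.0, 11.0, 20.0])  # hypothetical spike train
print(np.diff(spike_times_ms))  # [6. 2. 9.] -- these intervals carry the signal
```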

AI needs many times more energy than the brain

Spikes are voltage pulses about a millisecond in length and always of equal amplitude – the information lies in the time between their occurrences. AI systems, in contrast, multiply very large matrices of real numbers – here the information is contained in the exact values, i.e. the activations of the artificial neurons. This consumes vast amounts of energy. By comparison, the brain needs around 20 watts to process information – roughly the power of a light bulb. Even simple graphics processors for AI applications consume several hundred watts.
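As a rough sketch of why event-driven spike processing saves arithmetic, the following Python snippet contrasts a dense matrix multiplication with an update that only touches the weights of neurons that actually fired; the matrix size and the 2% firing rate are assumed figures for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((1000, 1000))

# Dense AI-style update: all 1,000,000 weights take part in a
# multiply-accumulate, regardless of the input values.
x = rng.standard_normal(1000)
y_dense = W @ x

# Event-driven spike-style update: the spike vector is mostly zeros,
# so only the columns of the ~2% of neurons that fired are touched,
# and each contribution is an addition rather than a multiplication.
spikes = rng.random(1000) < 0.02
y_spiking = W[:, spikes].sum(axis=1)  # ~20 column additions instead of 10^6 MACs
```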

Improving AI systems also requires a great deal of energy and hardware, because the systems are mainly improved by training them on ever larger amounts of data, such as text corpora from the Internet, while the number of trainable parameters also keeps growing.

Together with their colleagues Gerum and Erpenbeck, the FAU researchers Schilling and Krauss, who work at the interface between AI and brain research, have focused their work on a special type of artificial nerve cell: LSTM (long short-term memory) units, which can “remember” previous inputs and, by means of gates, can be made to forget information that is no longer needed.
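For orientation, one step of a standard LSTM cell (the textbook form, not the researchers’ modified version) can be written as follows; the parameter dictionary `p` and the toy dimensions are naming conventions for this sketch:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, p):
    f = sigmoid(p["Wf"] @ x + p["Uf"] @ h_prev + p["bf"])  # forget gate: deletes old information
    i = sigmoid(p["Wi"] @ x + p["Ui"] @ h_prev + p["bi"])  # input gate: admits new information
    o = sigmoid(p["Wo"] @ x + p["Uo"] @ h_prev + p["bo"])  # output gate
    g = np.tanh(p["Wg"] @ x + p["Ug"] @ h_prev + p["bg"])  # candidate memory
    c = f * c_prev + i * g  # cell state: the unit's "memory"
    h = o * np.tanh(c)      # hidden state passed on to the next time step
    return h, c

# Tiny usage example with made-up dimensions
n, m = 4, 3  # hidden size, input size
rng = np.random.default_rng(0)
p = {k: rng.standard_normal((n, m) if k.startswith("W") else (n, n)) * 0.1
     for k in ["Wf", "Wi", "Wo", "Wg", "Uf", "Ui", "Uo", "Ug"]}
p.update({b: np.zeros(n) for b in ["bf", "bi", "bo", "bg"]})
h, c = lstm_step(np.ones(m), np.zeros(n), np.zeros(n), p)
```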

The researchers have now modified the LSTM units in such a way that they behave like brain nerve cells, which use spikes to transmit and process information. They used the properties of LSTM neurons to mimic the membrane potential – the voltage – of biological cells. This allowed the input signals arriving from other neurons to be summed.
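A minimal sketch of this membrane-potential idea in Python, in the style of a leaky integrate-and-fire unit: incoming spikes are summed into a voltage-like state, and the unit emits an all-or-nothing spike once a threshold is crossed. This illustrates only the general principle, not the published model, which adapts the LSTM internals themselves; the leak and threshold values below are assumptions:

```python
import numpy as np

def spiking_step(spikes_in, W, v_prev, leak=0.9, threshold=1.0):
    v = leak * v_prev + W @ spikes_in     # sum incoming spikes into the membrane potential
    out = (v >= threshold).astype(float)  # fire an all-or-nothing spike at threshold
    v = np.where(out > 0, 0.0, v)         # reset the potential of units that fired
    return out, v

# Tiny usage example with made-up sizes and a random input spike train
rng = np.random.default_rng(1)
W = rng.standard_normal((4, 8)) * 0.5
v = np.zeros(4)
for t in range(5):
    spikes_in = (rng.random(8) < 0.3).astype(float)
    out, v = spiking_step(spikes_in, W, v)
```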

Tested on images – very promising results

The FAU researchers tested the modified LSTM units on four image data sets that are commonly used to train AI systems, in order to find out whether their units were as powerful as existing artificial nerve cells. The modified units achieved similarly good results. The researchers now plan to apply the algorithms to more complex data such as speech and music.

The special feature of the FAU researchers’ work is that their approach combines the advantages of AI and brain research. Their findings could pave the way for developing AI systems that work more like the human brain and solve complex tasks quickly – and with a lower consumption of resources.

About the Best Paper Award

The significance of the research by Schilling, Krauss and their international partners was recognized by the Best Paper Award at the International Joint Conference on Neural Networks 2023, in a competition involving more than 1,000 other published works. It is the world’s largest conference on artificial neural networks, with a high-profile expert audience, and the most renowned platform for the exchange of ideas between scientists from the various fields of artificial neural networks, neuroinformatics and neurotechnology.

Further information

Dr. Achim Schilling
Neuroscience Laboratory of the Department of Otorhinolaryngology – Head and Neck Surgery at Universitätsklinikum Erlangen
Achim.Schilling@uk-erlangen.de