Brain Cell Variability: A Potential Key to Learning in Humans and AI


By: Nour Hany

 

Brain cell variability may improve brain performance and speed up the learning process in both humans and artificial intelligence (AI), researchers discovered in a recent study.

Before we get into the core of the study, let us define an important term first: neurons. Neurons, or nerve cells, are the essential units of the nervous system and the brain; billions of them send motor commands to our muscles, receive sensory input from the external world, and transform and relay electrical signals at every step in between. Vast neural networks connect these cells and allow us to learn. From a distance, all neurons sound alike, but if we look close enough, we will not find two neurons that look exactly alike, which gives them a snowflake vibe!

 

In AI, every cell in an artificial neural network (the foundation of AI technology) is identical; only their connectivity varies. As advanced and smart as this technology clearly is, it has not quite reached the intelligence level of an actual human brain. Artificial networks do not learn as quickly or as accurately as humans do, which led researchers to speculate that the cause is their lack of cell variability.
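To make that contrast concrete, here is a minimal sketch of the point above; the function, weights, and inputs are illustrative assumptions, not taken from the study. In a standard artificial neural network, every unit runs the same computation, and units differ only in their connection weights:

```python
import math

def unit(inputs, weights):
    """Every unit is identical: a weighted sum followed by the same nonlinearity."""
    s = sum(x * w for x, w in zip(inputs, weights))
    return math.tanh(s)

x = [0.5, -1.0, 2.0]
a = unit(x, [0.1, 0.2, 0.3])   # two units computing the same function...
b = unit(x, [0.9, -0.4, 0.0])  # ...distinguished only by their weights
```

Training such a network adjusts only the weights; the cells themselves never change, which is exactly the uniformity the researchers set out to question.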

 

"Evolution has given us incredible brain functions, most of which we are only just beginning to understand. Our research suggests that we can learn vital lessons from our own biology to make AI work better for us," said lead author Dr. Dan Goodman of Imperial College London's Department of Electrical and Electronic Engineering.

 

The study found that when the electrical properties of individual cells were tweaked in simulations of brain networks, those networks learned faster than simulations built from identical cells. The varied networks also needed fewer cells to achieve the same results, and the method was less energy-intensive than training networks of identical cells.

"The brain needs to be energy efficient while still being able to excel at solving complex tasks. Our work suggests that having a diversity of neurons in both brains and AI systems fulfills both these requirements and could boost learning," said first author Nicolas Perez, PhD student at Imperial College London's Department of Electrical and Electronic Engineering.

In the study, researchers mainly focused on tweaking each cell's "time constant": "how quickly each cell decides what it wants to do based on what the cells connected to it are doing," as ScienceDaily puts it. Some cells look only at what the connected cells have just done and decide what they want to do very quickly; other cells react more slowly, basing their decision on what other cells have been doing for a while.
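The study used detailed spiking-network simulations, but the idea of a time constant can be sketched with a simple leaky integrator; the model, names, and numbers below are illustrative assumptions, not the study's actual code. A small tau means a cell's state chases its input almost immediately, while a large tau means it averages over a longer history:

```python
def simulate(tau, inp, dt=1.0, steps=50):
    """Evolve state v toward input `inp`; a larger tau reacts more slowly."""
    v = 0.0
    for _ in range(steps):
        v += (inp - v) * dt / tau   # exponential approach toward inp at rate 1/tau
    return v

fast = simulate(tau=2.0, inp=1.0)    # small tau: decides quickly
slow = simulate(tau=50.0, inp=1.0)   # large tau: integrates over a longer window
```

After the same number of steps, the fast cell has essentially caught up with its input while the slow cell is still partway there; mixing both kinds in one network gives it access to information on several timescales at once.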

After the cells' time constants had been varied, researchers tasked the network with performing several benchmark machine learning tasks: recognizing human gestures, identifying spoken digits and commands, and classifying images of clothing and handwritten digits.

The results showed that allowing the network to combine slow and fast information made it better at solving tasks in more complicated, real-world settings. When the researchers changed the amount of variability in the simulated networks, the best-performing ones matched the amount of variability seen in the brain. This finding suggests that the human brain may have evolved to include just the right amount of variability needed for optimal learning.

The findings of this study could teach us why our brains are so good at learning. They may also help us build better AI systems, such as digital assistants that recognize faces and voices, or self-driving car technology. "We demonstrated that AI can be brought closer to how our brains work by emulating certain brain properties. However, current AI systems are far from achieving the level of energy efficiency that we find in biological systems. Next, we will look at how to reduce the energy consumption of these networks to get AI networks closer to performing as efficiently as the brain," Perez added.

 

References

www.sciencedaily.com

qbi.uq.edu.au