Artificial Intelligence Researchers Explore the Dendritic Architecture of the Human Brain
When does the human brain begin learning? Amazingly, brain development and preparation for cognitive function begin before birth! According to Kadic & Kurjak (2018), “Fetal action planning is established by 22 weeks and investigations using four-dimensional ultrasound reveal that complexity of fetal motor action and behavior increases as pregnancy progresses… The capacity of the fetus to learn and memorize is prodigious.”[i] Thus, by the time an infant is born and takes its first breath, the neuroanatomy needed for learning is well developed.
The basic anatomical unit of the brain is a type of cell called the neuron. The neuron is often likened to a tree, with roots, a trunk, branches, and even leaves:
- The axon (similar to the root) is the cell’s output structure: it communicates with other neurons by electrically triggering the release of biochemical messengers (neurotransmitters) into the synapse (the gap) between itself and the receiving neurons.
- The soma (similar to the trunk) is the cell body; it contains the nucleus with the cell’s DNA and manufactures proteins that are transported throughout the axon and dendrites.
- Dendrites (the branches) extend toward other neurons and receive the biochemical messages arriving across the synapses at their tips. Dendrites carry spines (similar to leaves) that form part of their receptor apparatus. (A simple code sketch of this tree structure follows the list.)
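For readers who like to see ideas in code, here is a minimal sketch in Python of the neuron-as-tree picture above. The class and field names are our own illustrative choices, not terms from any neuroscience library:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Dendrite:
    """A branch that receives chemical messages; spines act as receptor sites."""
    spines: int = 0                                            # receptor-bearing "leaves"
    children: List["Dendrite"] = field(default_factory=list)  # sub-branches

@dataclass
class Neuron:
    """A toy model: a soma (cell body) with one axon output and many dendrite inputs."""
    dendrites: List[Dendrite] = field(default_factory=list)     # input branches
    axon_targets: List["Neuron"] = field(default_factory=list)  # neurons this axon reaches

    def total_spines(self) -> int:
        """Count receptor sites across the whole dendritic tree."""
        def count(d: Dendrite) -> int:
            return d.spines + sum(count(c) for c in d.children)
        return sum(count(d) for d in self.dendrites)

# A neuron with two dendrites, one of which branches again:
n = Neuron(dendrites=[Dendrite(spines=3),
                      Dendrite(spines=2, children=[Dendrite(spines=4)])])
print(n.total_spines())  # 9
```

The point of the sketch is simply that a neuron’s inputs form a recursive, branching structure rather than a single wire, and that branching is the property the rest of this post turns on.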
It is estimated that the human brain contains 86 billion neurons, which together make up a complex learning system that trains itself by recognizing patterns in the environment, reinforcing its own successes, and correcting its own errors. As the newborn begins its life journey, its brain learns at a remarkable rate. Some of that learning comes from guided examples, and some occurs without any guidance at all. All of this is part of what we call intelligence, and it’s part of human nature.
Artificial Intelligence and the brain
The development of Artificial Intelligence (AI) began with the effort to build machines that “think” like the brain. Its earliest forms were built on our understanding of local connections between any two neurons, as if there were a straight line from neuron A to neuron B to neuron C. But remember, each neuron has dendrites that branch out and can therefore contact branches from a multitude of other neurons, essentially creating a “forest” of transmissions in the brain.
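To picture what that early “straight line” model looks like in code, here is a minimal sketch of a chain of classic artificial neurons, each one just a weighted sum passed through a nonlinearity. The weights and inputs are illustrative numbers, not values from any real system:

```python
import numpy as np

def artificial_neuron(inputs, weights, bias):
    """The classic local model: a weighted sum of inputs passed through a nonlinearity."""
    return np.tanh(np.dot(weights, inputs) + bias)

# A "straight line" chain: neuron A feeds neuron B, which feeds neuron C.
a_out = artificial_neuron(np.array([0.5, -0.2]), np.array([0.8, 0.3]), 0.1)
b_out = artificial_neuron(np.array([a_out]), np.array([1.2]), -0.05)
c_out = artificial_neuron(np.array([b_out]), np.array([0.9]), 0.0)
print(c_out)
```

Each unit here has exactly one path forward; nothing in this chain captures the branching “forest” of a real dendritic tree.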
It’s important to understand how Mother Nature equipped the brain to learn. Paradoxically, the relationship runs both ways: studying the brain’s neuronal networks and pathways helps us design AI, and designing deeper and deeper AI processes in turn guides research into the brain’s own method of interconnecting neurons through their multitudinous dendrites, known as dendritic learning. Analyzing biological dendritic learning then helps shape the creation of a type of AI called Deep Learning (DL). DL is exciting because, after a sort of basic training, it begins to learn without guidance, very much like the infant brain figuring out the world without being directed by a specific model.
A recent example of applying human dendritic learning to AI offers the tantalizing possibility that DL can actually outperform the brain! In April 2022, the journal Scientific Reports published an open-access article by Hodassman et al. with the (perhaps daunting) title “Efficient dendritic learning as an alternative to synaptic plasticity hypothesis.”[ii] The authors point out that simply building artificial learning tools on local, linear models would require exponential amplification that would be relatively clumsy, and slow to detect and correct erroneous pathways.
Therefore, they propose a DL model based on dendritic interconnections with nonlinear activations at the nodes (intersections). The model uses not only feedforward transmission but also a backpropagation signal, deep learning’s standard error-correcting mechanism, sent backward through the pervasive treelike architecture. This can minimize the need for long-term memory and accelerate error detection and correction.
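To give a feel for what “nonlinear activations at tree nodes plus a backpropagated error signal” means in practice, here is a minimal sketch in Python. It was written for this post and is not the Hodassman team’s code; the tiny architecture, weights, and training task are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy "dendritic tree": 6 inputs split across 3 branches of 2 inputs each.
n_branches, per_branch = 3, 2
w_branch = rng.normal(size=(n_branches, per_branch))  # input-to-branch weights
w_soma = rng.normal(size=n_branches)                  # branch-to-soma weights

def forward(x):
    """Feedforward pass with a nonlinear activation at every branch node."""
    h = np.tanh(np.einsum("bi,bi->b", w_branch, x))  # nonlinear node outputs
    y = np.tanh(w_soma @ h)                          # soma output
    return h, y

def train_step(x, target, lr=0.1):
    """Backpropagation: push the output error backward through the tree."""
    global w_branch, w_soma
    h, y = forward(x)
    dy = (y - target) * (1 - y ** 2)   # error signal at the soma (tanh derivative)
    dh = dy * w_soma * (1 - h ** 2)    # error signal sent back to each branch node
    w_soma -= lr * dy * h              # update soma weights
    w_branch -= lr * dh[:, None] * x   # update branch weights

# Teach the tree to output +1 for one input pattern and -1 for its opposite.
x_pos = np.ones((n_branches, per_branch))
x_neg = -x_pos
for _ in range(300):
    train_step(x_pos, 1.0)
    train_step(x_neg, -1.0)
print(forward(x_pos)[1], forward(x_neg)[1])  # outputs approach +1 and -1
```

Notice that the error computed at the “soma” is sent backward to every branch node in a single sweep. That backward sweep is standard in DL, and it is precisely the ingredient the authors flag as biologically implausible, as the next paragraph explains.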
While their model takes advantage of the branching dynamics of neurons as the human brain performs them, the authors point out that adding backpropagation makes the process “intrinsically different” from the brain, in which such a mechanism is “biologically implausible.” They note, “The emergence of many input crosses as a byproduct of nonlinear amplification of dendritic segments differentiates between the computational power of a single dendrite or neuron…”; it is that single-neuron computational power on which traditional AI programs were built.
The Hodassman team has laid important groundwork for taking DL to a new level. By capitalizing on the dendritic architecture of the brain’s neurons, they have generated a model that transcends the one-neuron-to-the-next imitative process that launched AI in the late 20th century. Stay tuned to our AI blogs, where we report ongoing developments in this exhilarating field and how they can be applied in medicine.
NOTE: This content is solely for purposes of information and does not substitute for diagnostic or medical advice. Talk to your doctor if you are experiencing pelvic pain, or have any other health concerns or questions of a personal medical nature.
[i] Kadic AS, Kurjak A. Cognitive Functions of the Fetus. Ultraschall Med. 2018 Apr;39(2):181-189.
[ii] Hodassman S, Vardi R, Tugendhaft Y, Goldental A, Kanter I. Efficient dendritic learning as an alternative to synaptic plasticity hypothesis. Sci Rep. 2022 Apr 28;12(1):6571.
CATEGORY: Artificial Intelligence