Written by Brinda Aparajita Cheema
The neural network in artificial intelligence is inspired by that of the brain. Even so, it differs in many ways from its biological counterpart. The inspiration for planes came from birds, and early cars drew on the horse-drawn carriage, yet none of today's machines are metal structures capable of living, breathing, or self-replicating. Nevertheless, our limited machines are even more powerful, and therefore more useful, than humans within their own domains.
What is a neural network?
In layman's terms, a neural network is one that receives an input, processes it, and generates an output while learning from the data it acquires. The brain's neural network consists of billions of nerve cells called neurons. Neurons use electrical signals to communicate. Dendrites, tree-like structures on the neuron, receive signals from other neurons and pass them through the cell body. The axon then passes the signal on to other neurons. This process happens across billions of neurons throughout the brain, creating a vast system that eventually becomes self-sustaining (capable of thoughts, emotions, and desires). In contrast, the neural network in AI mimics only certain parts of the neuron, such as dendrites, cell bodies, and axons, using simplified mathematical models. It is specialized for a specific task: it cannot create or destroy connections between neurons, and it ignores the timing of signals.
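The dendrite/cell-body/axon analogy can be made concrete with a minimal sketch of a single artificial neuron. This is an illustrative simplification, not any particular library's implementation: the input list stands in for dendrites, the weighted sum and activation for the cell body, and the returned value for the axon's outgoing signal.

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of incoming signals ("dendrites" feeding the "cell body")
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Sigmoid activation squashes the sum into (0, 1) -- the "axon" output
    return 1 / (1 + math.exp(-total))

# Illustrative values: two inputs, two weights, one bias
print(neuron([0.5, -1.0], [0.8, 0.2], 0.1))
```

Note what is missing compared with a biological neuron: no spike timing, no growing or pruning of connections, just a fixed arithmetic recipe.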
Key differences: Neural networks in the brain and AI
Size: Our brain consists of about 86 billion neurons, while the number of 'neurons' in an AI network is typically on the order of 10 to 1,000. Taken out of context, that is a substantial gap, but artificial neurons are more powerful in certain respects. Perceptrons, the predecessors of artificial neurons, work in a linear fashion: each takes inputs on its 'dendrites' and generates outputs on its 'axon branches.' Several perceptrons lie in a single layer of a perceptron network but are not interconnected. Deep neural networks, in contrast, usually consist of input neurons, output neurons, and neurons in the hidden layers in between. Each layer is usually fully connected to the next, implying that artificial neurons, for the most part, have as many connections as there are artificial neurons in the preceding and following layers combined.
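The connection-counting claim above can be checked with a short sketch. The layer sizes here are illustrative, not from any real model: in a fully connected network, every neuron in one layer links to every neuron in the next, so a hidden neuron has (neurons in the previous layer + neurons in the next layer) connections.

```python
# Illustrative layer sizes: input, one hidden layer, output
layer_sizes = [4, 8, 3]

# Number of weights (connections) between each pair of consecutive layers
connections = [a * b for a, b in zip(layer_sizes, layer_sizes[1:])]
print(connections)       # connections per layer pair: 4*8 and 8*3
print(sum(connections))  # total connections in the network

# Each hidden neuron touches every neuron on either side of it:
# 4 incoming + 3 outgoing connections
per_hidden_neuron = layer_sizes[0] + layer_sizes[2]
print(per_hidden_neuron)
```

Even this tiny network already has dozens of connections; real models scale the same arithmetic up to millions or billions of weights.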
Speed: Biological neurons usually fire signals about 200 times a second. These signals travel at various speeds depending on the type of nerve impulse, ranging from 0.60 m/s up to 120 m/s. Information in artificial neurons is carried instead by continuous, floating-point synaptic weights (the strength or amplitude of a connection between two nodes). The speed of calculating an algorithm carries no information in itself; it only makes the model's training and execution faster. Artificial neurons also do not experience 'fatigue.' An artificial neural network can be understood as a bunch of matrix operations and derivative calculations; these can be highly optimized for vector processors and sped up using GPUs (Graphics Processing Units) or dedicated hardware.
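The "bunch of matrix operations" point can be illustrated with a minimal sketch in plain Python (no numerical library assumed): a whole layer's computation collapses into one matrix-vector product plus a bias. It is exactly this shape of computation that vector processors and GPUs accelerate.

```python
def matvec(W, x):
    # One row of W per output neuron; dot each row with the input vector
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

# Illustrative values: a 2x2 weight matrix, bias vector, and input
W = [[0.2, -0.5], [1.0, 0.3]]
b = [0.1, -0.1]
x = [0.7, 0.4]

# The entire layer's pre-activation output in one matrix operation
z = [zi + bi for zi, bi in zip(matvec(W, x), b)]
print(z)
```

In practice libraries hand this same product to optimized BLAS routines or GPU kernels, which is where the speed advantage over sequential biological signaling comes from.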
Fault Tolerance: Biological neural networks are fault-tolerant thanks to their distributed nature. Minor failures do not result in memory loss because the information is stored redundantly, and the brain can recover and heal to some extent. Artificial neural networks are not designed for fault tolerance or self-regeneration; they depend on the computing nodes that run them working correctly.
Learning: It is still a mystery how the brain learns and how redundant connections store and retrieve information. Fibers in the brain grow and reach out to connect to other neurons, neuroplasticity causes new connections to form and brain areas to shift and alter function, and synapses can be strengthened or weakened based on their importance. By learning, we build on information that is already stored in the brain. Our knowledge deepens through repetition and sleep, and tasks that once required focus can be performed automatically once mastered. On the other hand, artificial neural networks have a predefined model in which no further neurons or connections can be added or removed. During training, only the weights of the connections can change. Networks begin with random weight values and slowly attempt to reach a point where further weight changes would no longer improve performance.
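The training process described above can be sketched in a few lines. This is a toy example with made-up data, not a real training pipeline: the model's shape is fixed, a single weight starts at a random value, and gradient descent nudges only that weight until further changes stop reducing the error.

```python
import random

random.seed(0)
w = random.uniform(-1, 1)  # random initial weight, as described above
lr = 0.1                   # learning rate (step size)

# Toy data generated from the relationship y = 2x; the "right" weight is 2.0
data = [(x, 2.0 * x) for x in [1.0, 2.0, 3.0]]

for _ in range(100):
    for x, y in data:
        pred = w * x
        grad = 2 * (pred - y) * x  # derivative of squared error w.r.t. w
        w -= lr * grad             # only the weight changes, never the structure

print(round(w, 3))  # converges toward 2.0
```

Notice that nothing about the network's architecture changed during training, in contrast to the growing, rewiring brain: learning here is purely an adjustment of numbers.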
Artificial Intelligence can now beat people in many narrow areas. If sufficient data and examples are available digitally, they can be turned into numerical values without much difficulty, and the machine becomes proficient in a specific task. For example, AlphaGo, an AI program, can beat anyone in a game of Go, yet it could most likely be defeated in a game of Tic-Tac-Toe, as it has no information about games outside its domain. Machine learning models map input features to outputs more efficiently than humans. However, they have difficulty discovering and understanding additional features, so they cannot quickly update their world models based on them. Machine learning models learn relationships within a data representation. This also means that if the representation is vague or context-dependent, even the most accurate models will fail, because their outputs are valid only under the circumstances they were trained on.
That the brain figured out how to use fire, which marked the beginning of civilization, and even came to name itself, is a miracle that reveals how sophisticatedly the brain's networks are interlinked. AI neural networks cannot yet duplicate the natural brain. With further advancements in technology, it may become possible to create AI software that mimics the human brain far better than it does today. This endeavor will require more complex networks, large doses of data, and a great deal of time to approximate the vast complexity of the natural brain.