
The ANNABELL model is a cognitive architecture made up entirely of interconnected artificial neurons that learns to communicate in human language, starting from a state of ‘tabula rasa’, solely through communication with a human interlocutor.
Credit: Bruno Golosio
A computer simulation of a cognitive model made up entirely of artificial neurons learns to communicate through dialogue, starting from a state of tabula rasa.
A group of researchers from the University of Sassari (Italy) and the University of Plymouth (UK) has developed a cognitive model, made up of two million interconnected artificial neurons, that is able to learn to communicate in human language starting from a state of “tabula rasa”, solely through communication with a human interlocutor. The model, called ANNABELL (Artificial Neural Network with Adaptive Behavior Exploited for Language Learning), is described in an article published in the international scientific journal PLOS ONE. This research sheds light on the neural processes that underlie the development of language.
How does our brain develop the ability to perform complex cognitive functions, such as those needed for language and reasoning? This is a question we all ask ourselves at some point, and one that researchers cannot yet answer completely. We know that the human brain contains about one hundred billion neurons that communicate by means of electrical signals. We have learned a great deal about the mechanisms by which electrical signals are produced and transmitted among neurons. There are also experimental techniques, such as functional magnetic resonance imaging, that allow us to see which parts of the brain are most active when we are engaged in different cognitive activities. But detailed knowledge of how a single neuron works and of what the various parts of the brain do is not enough to answer the initial question.
We might think that the brain works in a similar way to a computer: after all, computers also work through electrical signals. Indeed, since the late 1960s many researchers have proposed models based on the brain-as-computer analogy. However, apart from the structural differences, there are profound differences between the brain and a computer, especially in their learning and information-processing mechanisms. Computers run programs developed by human programmers, and these programs encode the rules the computer must follow in handling information to perform a given task. There is no evidence of such programs in our brain. In fact, many researchers today believe that our brain develops higher cognitive skills simply by interacting with the environment, starting from very little innate knowledge. The ANNABELL model appears to confirm this perspective.
ANNABELL has no pre-coded language knowledge; it learns only through communication with a human interlocutor, thanks to two fundamental mechanisms that are also present in the biological brain: synaptic plasticity and neural gating. Synaptic plasticity is the ability of the connection between two neurons to increase its efficiency when the two neurons are often active simultaneously, or nearly simultaneously. This mechanism is essential for learning and for long-term memory. Neural gating mechanisms are based on the ability of certain neurons (called bistable neurons) to behave as switches that can be turned “on” or “off” by a control signal coming from other neurons. When turned on, the bistable neurons transmit a signal from one part of the brain to another; otherwise they block it. Thanks to synaptic plasticity, the model learns to control the signals that open and close the neural gates, and thereby to control the flow of information among its different areas.
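To make the interplay between these two mechanisms concrete, here is a minimal, self-contained sketch in Python. It is not ANNABELL's actual code: a simple Hebbian rule strengthens connections between co-active neurons, and a boolean gate signal decides whether activity is transmitted from one group of neurons to another. The function names (hebbian_update, gated_transmission) and all numerical values are illustrative assumptions, not taken from the published model.

```python
import numpy as np

# Toy illustration of the two mechanisms described above (assumed names and
# values; not the ANNABELL implementation).

rng = np.random.default_rng(0)

n_pre, n_post = 8, 8
weights = rng.normal(0.0, 0.1, size=(n_post, n_pre))  # synaptic strengths
learning_rate = 0.05

def hebbian_update(weights, pre, post, lr=learning_rate):
    """Synaptic plasticity: connections grow where pre- and post-synaptic
    activity coincide (outer product of the two activity vectors)."""
    return weights + lr * np.outer(post, pre)

def gated_transmission(pre, weights, gate_open):
    """Neural gating: a bistable control signal either lets activity flow
    to the downstream population or blocks it entirely."""
    if not gate_open:
        return np.zeros(weights.shape[0])  # gate closed: signal blocked
    return np.tanh(weights @ pre)          # gate open: signal transmitted

# One toy learning step: present an input pattern, open the gate so the
# signal reaches the downstream neurons, then strengthen the synapses
# between the neurons that were active together.
pre_activity = (rng.random(n_pre) > 0.5).astype(float)
post_activity = gated_transmission(pre_activity, weights, gate_open=True)
weights = hebbian_update(weights, pre_activity, post_activity)

print("mean |weight| after one Hebbian step:", np.abs(weights).mean())
```

In this sketch, learning only occurs when the gate is open, which loosely mirrors the idea in the article that plasticity and gating together determine which information flows between areas and which associations are reinforced.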
The cognitive model was validated using a database of about 1,500 input sentences, based on the literature on early language development, and responded by producing about 500 output sentences containing nouns, verbs, adjectives, pronouns, and other word classes, demonstrating a broad range of human language processing capabilities.
Read more: A network of artificial neurons learns to use human language