
via The Ohio State University.
Scientists develop the next generation of reservoir computing
A relatively new type of computing that mimics the way the human brain works was already transforming how scientists tackle some of the most difficult information-processing problems.
Now, researchers have found a way to make what is called reservoir computing work between 33 and a million times faster, with significantly fewer computing resources and less data input needed.
In fact, in one test of this next-generation reservoir computing, researchers solved a complex computing problem in less than a second on a desktop computer.
With current state-of-the-art technology, the same problem requires a supercomputer and still takes much longer to solve, said Daniel Gauthier, lead author of the study and professor of physics at The Ohio State University.
“We can perform very complex information processing tasks in a fraction of the time using much less computer resources compared to what reservoir computing can currently do,” Gauthier said.
“And reservoir computing was already a significant improvement on what was previously possible.”
The study was published today (Sept. 21, 2021) in the journal Nature Communications.
Reservoir computing is a machine learning algorithm developed in the early 2000s and used to solve the “hardest of the hard” computing problems, such as forecasting the evolution of dynamical systems that change over time, Gauthier said.
Dynamical systems, like the weather, are difficult to predict because just one small change in one condition can have massive effects down the line, he said.
One famous example is the “butterfly effect,” in which – in one metaphorical illustration – changes created by a butterfly flapping its wings can eventually influence the weather weeks later.
Previous research has shown that reservoir computing is well-suited for learning dynamical systems and can provide accurate forecasts about how they will behave in the future, Gauthier said.
It does that through the use of an artificial neural network, somewhat like a human brain. Scientists feed data on a dynamical system into a “reservoir” of randomly connected artificial neurons. The reservoir produces useful output that the scientists can interpret and feed back into the network, building an increasingly accurate forecast of how the system will evolve in the future.
The larger and more complex the system and the more accurate that the scientists want the forecast to be, the bigger the network of artificial neurons has to be and the more computing resources and time that are needed to complete the task.
One issue has been that the reservoir of artificial neurons is a “black box,” Gauthier said, and scientists have not known exactly what goes on inside of it – they only know it works.
The artificial neural networks at the heart of reservoir computing are built on mathematics, Gauthier explained.
“We had mathematicians look at these networks and ask, ‘To what extent are all these pieces in the machinery really needed?’” he said.
In this study, Gauthier and his colleagues investigated that question and found that the whole reservoir computing system could be greatly simplified, dramatically reducing the need for computing resources and saving significant time.
They tested their concept on a forecasting task involving a weather system developed by Edward Lorenz, whose work led to our understanding of the butterfly effect.
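The Lorenz system itself is simple to write down. A minimal forward-Euler sketch (step size and initial conditions are illustrative choices) shows why forecasting it is so hard: two nearly identical starting points rapidly diverge.

```python
import numpy as np

# Lorenz's 1963 weather-model equations, with his original parameter values.
sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0

def lorenz_step(state, dt=0.002):
    """Advance the Lorenz system by one small forward-Euler step (coarse but simple)."""
    x, y, z = state
    deriv = np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    return state + dt * deriv

# Two trajectories whose starting points differ by one part in 10^8.
a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-8, 0.0, 0.0])
for _ in range(15000):            # integrate for 30 time units
    a, b = lorenz_step(a), lorenz_step(b)
print(np.linalg.norm(a - b))      # the tiny gap has grown by many orders of magnitude
```

That exponential growth of tiny differences is the butterfly effect in miniature, and it is exactly what makes the Lorenz system a standard benchmark for forecasting methods like reservoir computing.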
Their next-generation reservoir computing was a clear winner over today’s state-of-the-art on this Lorenz forecasting task. In one relatively simple simulation done on a desktop computer, the new system was 33 to 163 times faster than the current model.
But when the aim was for great accuracy in the forecast, the next-generation reservoir computing was about 1 million times faster. And the new-generation computing achieved the same accuracy with the equivalent of just 28 neurons, compared to the 4,000 needed by the current-generation model, Gauthier said.
An important reason for the speed-up is that the “brain” behind this next generation of reservoir computing needs a lot less warmup and training compared to the current generation to produce the same results.
Warmup data is input that must be fed into the reservoir computer to prepare it for its actual task.
“For our next-generation reservoir computing, there is almost no warming time needed,” Gauthier said.
“Currently, scientists have to put in 1,000 or 10,000 data points or more to warm it up. And that’s all data that is lost, that is not needed for the actual work. We only have to put in one or two or three data points,” he said.
And once researchers are ready to train the reservoir computer to make the forecast, again, a lot less data is needed in the next-generation system.
In their test of the Lorenz forecasting task, the researchers could get the same results using 400 data points as the current generation produced using 5,000 data points or more, depending on the accuracy desired.
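The published simplification works by replacing the random reservoir with a nonlinear vector autoregression: the features are just a few time-delayed copies of the input plus their polynomial products, with a linear readout trained on top. A minimal sketch of that idea follows; the delay count, demo task, and sizes are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def ngrc_features(series, k=2):
    """Next-generation reservoir features for a scalar series: k delayed copies
    of the input plus all quadratic products of those copies, plus a constant."""
    # Row m of `lin` holds [u(t), u(t-1), ..., u(t-k+1)] for one time t.
    lin = np.stack([series[k - 1 - i : len(series) - i] for i in range(k)], axis=1)
    quad = np.stack([lin[:, i] * lin[:, j]
                     for i in range(k) for j in range(i, k)], axis=1)
    ones = np.ones((lin.shape[0], 1))
    return np.hstack([ones, lin, quad])

# Forecast the next value of a noiseless sine from just two delayed samples.
t = np.linspace(0, 20, 400)
u = np.sin(t)
X = ngrc_features(u[:-1], k=2)                 # features at each time step
Y = u[2:]                                      # one-step-ahead targets
W_out = np.linalg.lstsq(X, Y, rcond=None)[0]   # the only trained part
err = float(np.max(np.abs(X @ W_out - Y)))
print(err)  # tiny: a uniformly sampled sine obeys an exact two-term recurrence
```

There is no large random network to initialize and no long transient to flush, which is why this formulation needs only a handful of warmup points and far less training data than a conventional reservoir.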
“What’s exciting is that this next generation of reservoir computing takes what was already very good and makes it significantly more efficient,” Gauthier said.
He and his colleagues plan to extend this work to tackle even more difficult computing problems, such as forecasting fluid dynamics.
“That’s an incredibly challenging problem to solve. We want to see if we can speed up the process of solving that problem using our simplified model of reservoir computing.”
Original Article: A new way to solve the ‘hardest of the hard’ computer problems