
via www.interactive-biology.com
Narrowing the gap between biological brains and electronic ones
SINCE nobody really knows how brains work, those researching them must often resort to analogies. A common one is that a brain is a sort of squishy, imprecise, biological version of a digital computer. But analogies work both ways, and computer scientists have a long history of trying to improve their creations by taking ideas from biology. The trendy and rapidly developing branch of artificial intelligence known as “deep learning”, for instance, takes much of its inspiration from the way biological brains are put together.
The general idea of building computers to resemble brains is called neuromorphic computing, a term coined by Carver Mead, a pioneering computer scientist, in the late 1980s. There are many attractions. Brains may be slow and error-prone, but they are also robust, adaptable and frugal. They excel at processing the sort of noisy, uncertain data that are common in the real world but which tend to give conventional electronic computers, with their prescriptive arithmetical approach, indigestion. The latest development in this area came on August 3rd, when a group of researchers led by Evangelos Eleftheriou at IBM’s research laboratory in Zurich announced, in a paper published in Nature Nanotechnology, that they had built a working, artificial version of a neuron.
Neurons are the spindly, highly interconnected cells that do most of the heavy lifting in real brains. The idea of making artificial versions of them is not new. Dr Mead himself has experimented with using specially tuned transistors, the tiny electronic switches that form the basis of computers, to mimic some of their behaviour. These days, though, the sorts of artificial neurons that do everything from serving advertisements on web pages to recognising faces in Facebook posts are mostly simulated in software, with the underlying code running on ordinary silicon. That works, but as any computer scientist will tell you, creating an ersatz version of something in software is inevitably less precise and more computationally costly than simply making use of the thing itself.
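The software-simulated neurons described above are, at their simplest, just a weighted sum of inputs passed through a nonlinear "activation" function. A minimal illustrative sketch follows; the weights, bias and input values here are made up, and the sigmoid is only one of several activation functions in common use:

```python
import math

def artificial_neuron(inputs, weights, bias):
    """A software-simulated neuron: a weighted sum of the inputs,
    squashed into the range (0, 1) by a sigmoid activation."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid nonlinearity

# Example: a single neuron with two inputs (arbitrary values)
output = artificial_neuron([0.5, 0.8], [0.4, -0.2], bias=0.1)
```

Deep-learning systems wire millions of such units together and adjust the weights during training; the point of the IBM work is to replace this software emulation with a physical device that behaves like a neuron natively.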
Learn more: Artificial neurons – You’ve got a nerve
The Latest on: Neuromorphic computing
via Google News
- Hospital Tests Neuromorphic Chip-Powered Robotic Arm (February 24, 2021 at 1:29 pm)
The device, mounted on wheelchairs and powered by technology that imitates the way the human brain works, could provide patients new levels of independence.
- Computing goes ‘neuromorphic’ with nanotechnology (February 23, 2021 at 1:00 pm)
Nanotechnology is poised to transform today's conventional information processing systems. University of Canterbury (UC) researchers are leading the ...
- The Future of Computing: Hype, Hope, and Reality (February 19, 2021 at 12:18 am)
Hype, Hope, and Reality By Bill Reichert, Partner, Pegasus Tech Ventures - For roughly 75 years, the fundamental architecture of computers has not changed much. Certainly, the hardware has ...
- Neuromorphic Computing Opportunity Analysis 2020: Promising Start-Ups Developing Non-Invasive BCI Solutions Gaining Traction from VC Funds and Tier-1 (February 18, 2021 at 12:34 am)
Opportunity Analysis" report has been added to ResearchAndMarkets.com's offering. Conventional artificial intelligence (AI)-enabled processors are based on rule-based algorithms and are optimized for ...
- Global Neuromorphic Computing Market 2020 Growth Opportunities, Market Shares, Future Estimations and Key Countries by 2025 (February 17, 2021 at 10:38 pm)
Global Neuromorphic Computing Market 2020 by Company, Type and Application, Forecast to 2025 is very much needed for business growth and to thrive in the market. In this report ...
- FeFETs Bring Promise And Challenges (February 17, 2021 at 12:18 am)
Implementing logic using FeFETs is a significant topic in its own right, so the following will focus on the memory implementations, since they’re where most of the development focus is. Logic will be ...
- Advanced materials, hardware for next-generation computing (February 15, 2021 at 7:45 am)
Control Engineering - Increasing digitalization is constantly driving the demands on electronic hardware. Speed, performance, miniaturization and energy efficiency are becoming ...
- Physicists Discover Important and Unexpected Electronic Property of Graphene – Could Power Next-Generation Computers (February 13, 2021 at 6:18 pm)
Unconventional form of ferroelectricity could impact next-generation computing. MIT researchers and colleagues recently discovered an important — and unexpected — electronic property of graphene, a ma ...
- Neuromorphic Computer Technology: Analysis of Opportunities (February 12, 2021 at 4:46 am)
Reportlinker.com Announces Release of Report “Neuromorphic Computing: Opportunity Analysis” – With technological advancements in chipset architecture and algorithms, neuromorphic chipsets process ...
- Neuromorphic Computing: Opportunity Analysis (February 12, 2021 at 2:41 am)
Reportlinker.com announces the release of the report "Neuromorphic Computing: Opportunity Analysis" - With technology advancements in chipset architecture and algorithms, neuromorphic chipsets are ...
via Bing News