The promise of binarized neural networks for fast, accurate machine learning applications

via PACIFIC NORTHWEST NATIONAL LABORATORY

As anyone with a green thumb knows, pruning can promote thriving vegetation. A snip here, a snip there, and growth can be controlled and directed for a more vigorous plant.

The same principle can be applied to machine learning algorithms. Removing bits and pieces along an algorithm's branches can reduce the complexity of its decision trees and increase predictive performance.

Researchers at the U.S. Department of Energy’s Pacific Northwest National Laboratory (PNNL) have done just that. Working with binarized neural networks (BNNs), they used pruning principles to significantly reduce computational complexity and memory demands. BNNs are close cousins of deep neural networks, which require large amounts of computation. But BNNs differ in a significant way: they encode each neuron and each parameter with a single bit, so they need far less energy and power to compute.
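The bit-level trick behind those savings is simple to illustrate. The sketch below (Python with NumPy; the function names are illustrative and not drawn from PNNL's code) shows how a binarized neuron can replace floating-point multiply-accumulate operations with a count of agreeing sign bits:

```python
import numpy as np

def binarize(x):
    """Map real values to +1/-1; real BNN hardware stores each as one bit."""
    return np.where(x >= 0, 1, -1).astype(np.int8)

def binary_dot(a_bits, w_bits):
    """Dot product of two +1/-1 vectors.

    With bit-packed operands this reduces to an XNOR plus a population
    count, instead of one floating-point multiply-accumulate per weight.
    """
    n = a_bits.size
    agree = np.count_nonzero(a_bits == w_bits)  # positions where signs match
    return 2 * agree - n                        # recover the +1/-1 dot product

# Toy example: one binarized neuron with 256 inputs
activations = binarize(np.random.randn(256))
weights = binarize(np.random.randn(256))
pre_activation = binary_dot(activations, weights)
output = 1 if pre_activation >= 0 else -1  # the neuron's binarized output
```

On real hardware the +1/-1 values are packed into machine words, so a single XNOR and popcount instruction processes 32 or 64 weights at once; that packing is where the energy and memory savings come from.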

Pruning for faster growth

Researchers recognized the potential value of BNNs for machine learning starting in about 2016. If constructed–or pruned–just the right way, they consume less computing energy and are nearly as accurate as deep neural networks. That means BNNs have more potential to benefit resource-constrained environments, such as mobile phones, smart devices, and the entire Internet of Things ecosystem.

This is where pruning comes into play. As neural network research has grown in recent years, pruning has gained more interest among computing researchers.

“Pruning is currently a hot topic in machine learning,” said PNNL computer scientist Ang Li. “We can add software and architecture coding to push the trimming towards a direction that will have more benefits for the performance of computing devices. These benefits include lower energy needs and lower computing costs.”

Pruning for precision

Li was among a group of PNNL researchers who recently published results in the Institute of Electrical and Electronics Engineers (IEEE) Transactions on Parallel and Distributed Systems showing the benefits of selective pruning. The research demonstrated that pruning redundant bits of the BNN architecture led to a custom-built out-of-order BNN, called O3BNN-R. Their work shows that a highly condensed BNN model, which can already exhibit high-performance supercomputing qualities, can be shrunk significantly further without loss of accuracy.

“Binarized neural networks have the potential to bring the processing time of neural networks down to around microseconds,” said Tong “Tony” Geng, a Boston University doctoral candidate who, as a PNNL intern, assisted Li on the O3BNN-R project.

“BNN research is headed in a promising direction to make neural networks truly useful and readily adopted in the real world,” said Geng, who will rejoin the PNNL staff in January as a postdoctoral research fellow. “Our finding is an important step toward realizing this potential.”

Their research shows this out-of-order BNN can prune, on average, 30 percent of operations without any loss of accuracy. With further fine-tuning, in a step called “regularization at training,” performance can be improved by an additional 15 percent.
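The published O3BNN-R design is a custom hardware architecture, but the flavor of its operation-level pruning can be sketched in software. In the hypothetical snippet below (not the authors' implementation), a binarized neuron stops computing as soon as the remaining inputs can no longer change whether its running sum crosses the firing threshold; every early exit is pruned work:

```python
def binary_neuron_early_stop(a_bits, w_bits, threshold):
    """Binarized neuron that quits once its +1/-1 output is already decided.

    The output depends only on whether the finished dot product meets the
    threshold, so the loop exits as soon as the unseen terms cannot flip
    that comparison.
    """
    n = len(a_bits)
    running = 0
    for i, (a, w) in enumerate(zip(a_bits, w_bits)):
        running += 1 if a == w else -1          # XNOR-style agreement term
        remaining = n - (i + 1)                 # terms not yet processed
        if running - remaining >= threshold:    # guaranteed to fire
            return 1
        if running + remaining < threshold:     # can never fire
            return -1
    return 1 if running >= threshold else -1
```

Each early exit skips the rest of that neuron's work, which is the general kind of redundant computation the researchers report removing, realized in their case by a custom out-of-order architecture rather than a software loop.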

Pruning for power

In addition to this out-of-order BNN’s contributions to the Internet of Things, Li pointed to potential benefits for the energy grid. Deployed in the power grid, a modified BNN could boost existing software that guards against cyberattacks by helping sensors detect and respond to an attack, Li said.

“Basically,” said Li, “we are accelerating the speed of processing in hardware.”
