Researchers are developing computers capable of “approximate computing”: performing calculations that are good enough for tasks that do not require perfect accuracy, potentially doubling efficiency and reducing energy consumption.
“The need for approximate computing is driven by two factors: a fundamental shift in the nature of computing workloads, and the need for new sources of efficiency,” said Anand Raghunathan, a Purdue Professor of Electrical and Computer Engineering, who has been working in the field for about five years. “Computers were first designed to be precise calculators that solved problems where they were expected to produce an exact numerical value. However, the demand for computing today is driven by very different applications. Mobile and embedded devices need to process richer media, and are getting smarter – understanding us, being more context-aware and having more natural user interfaces. On the other hand, there is an explosion in digital data searched, interpreted, and mined by data centers.”
A growing number of applications are designed to tolerate “noisy” real-world inputs and use statistical or probabilistic types of computations.
“The nature of these computations is different from the traditional computations where you need a precise answer,” said Srimat Chakradhar, department head for Computing Systems Architecture at NEC Laboratories America, who collaborated with the Purdue team. “Here, you are looking for the best match since there is no golden answer, or you are trying to provide results that are of acceptable quality, but you are not trying to be perfect.”
However, today’s computers are designed to compute precise results even when it is not necessary. Approximate computing could endow computers with a capability similar to the human brain’s ability to scale the degree of accuracy needed for a given task. New findings were detailed in research presented during the IEEE/ACM International Symposium on Microarchitecture, Dec. 7-11 at the University of California, Davis.
Computing to a higher level of accuracy than a task actually requires is inherently inefficient and saps energy.
“If I asked you to divide 500 by 21 and I asked you whether the answer is greater than one, you would say yes right away,” Raghunathan said. “You are doing division but not to the full accuracy. If I asked you whether it is greater than 30, you would probably take a little longer, but if I ask you if it’s greater than 23, you might have to think even harder. The application context dictates different levels of effort, and humans are capable of this scalable approach, but computer software and hardware are not like that. They often compute to the same level of accuracy all the time.”
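Raghunathan’s division example can be sketched in code. The function below is an illustrative model of effort scaling, not the researchers’ actual design: it decides whether 500/21 exceeds a threshold by repeated subtraction, stopping as soon as the answer is known, so the work performed (the subtraction count) grows with the threshold, just as a person thinks harder about “greater than 23” than “greater than 1”.

```python
def quotient_exceeds(n, d, k):
    """Decide whether n/d > k without computing n/d exactly.

    Subtract d from n repeatedly, stopping the moment the answer is
    determined. Returns (answer, steps); `steps` is the "effort",
    which grows with the threshold k. Assumes positive integers.
    """
    remaining = n
    steps = 0
    while steps <= k:
        if remaining <= 0:
            # n ran out before k+1 subtractions: n/d cannot exceed k.
            return False, steps
        remaining -= d
        steps += 1
    # k+1 subtractions all started from a positive remainder, so n > d*k.
    return True, steps

# "Greater than 1" is settled after 2 subtractions;
# "greater than 23" takes 24 -- same question, scalable effort.
print(quotient_exceeds(500, 21, 1))   # (True, 2)
print(quotient_exceeds(500, 21, 23))  # (True, 24)
print(quotient_exceeds(500, 21, 24))  # (False, 24)
```

The point of the sketch is that the application context (the threshold) dictates how much computation is spent, whereas conventional hardware would perform the full-precision division in every case.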
Purdue researchers have developed a range of hardware techniques to demonstrate approximate computing, showing a potential for improvements in energy efficiency.
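The article does not detail the specific circuits involved, but one widely used approximate-computing hardware technique is precision scaling: ignoring the low-order bits of operands so that an adder’s carry chain is shorter and cheaper. The sketch below is a software model of that general idea, labeled as an assumption rather than the Purdue team’s design:

```python
def truncated_add(a, b, drop_bits):
    """Model an approximate adder that ignores the lowest `drop_bits`
    bits of each operand, trading a small, bounded error for a shorter
    (cheaper, lower-energy) carry chain. Illustrative only; not the
    specific hardware described in the paper.
    """
    mask = ~((1 << drop_bits) - 1)  # clear the low-order bits
    return (a & mask) + (b & mask)

exact = 1000 + 999
approx = truncated_add(1000, 999, 4)  # drop the 4 lowest bits
print(exact, approx)                  # 1999 1984
print(abs(exact - approx) / exact)    # relative error under 1%
```

Dropping `k` bits from each operand bounds the error at `2 * (2**k - 1)`, which is why such adders suit applications that tolerate “noisy” inputs: the error is small, predictable, and often invisible in the final result.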
The research paper presented during the IEEE/ACM International Symposium on Microarchitecture was authored by doctoral student Swagath Venkataramani; former Purdue doctoral student Vinay K. Chippa; Chakradhar; Kaushik Roy, Purdue’s Edward G. Tiedemann Jr. Distinguished Professor of Electrical and Computer Engineering; and Raghunathan.
Recently, the researchers have shown how to apply approximate computing to programmable processors, which are ubiquitous in computers, servers and consumer electronics.
“In order to have a broad impact we need to be able to apply this technology to programmable processors,” Roy said. “And now we have shown how to design a programmable processor to perform approximate computing.”