In creating an entirely new way to compress data, a team of researchers from the UCLA Henry Samueli School of Engineering and Applied Science has drawn inspiration from physics and the arts.
The result is a new data compression method that outperforms existing techniques, such as JPEG for images, and that could eventually be adopted for medical, scientific and video streaming applications.
In data communication, scientific research and medicine, an increasing number of today’s applications require the capture and analysis of massive amounts of data in real time.
But “big data,” as it’s known, can present big problems, particularly in specialized fields in which the events being studied occur at rates that are too fast to be sampled and converted into digital data in real time. For example, in order to detect rare cancer cells in blood, researchers must screen millions of cells in a high-speed flow stream.
To help improve the process, the UCLA group, led by Bahram Jalali, holder of the Northrop Grumman Opto-Electronic Chair in Electrical Engineering, and including postdoctoral researcher Mohammad Asghari, created an entirely new method of data compression. The technique reshapes the signal carrying the data in a fashion that resembles the graphic art technique known as anamorphism, which has been used since the 1500s to create optical illusions in art and, later, film.
The Jalali group discovered that it is possible to achieve data compression by stretching and warping the data in a specific fashion prescribed by a newly developed mathematical function. The technology, dubbed “anamorphic stretch transform,” or AST, operates both in analog and digital domains. In analog applications, AST makes it possible to not only capture and digitize signals that are faster than the speed of the sensor and the digitizer, but also to minimize the volume of data generated in the process.
AST can also compress digital records, such as medical data, so they can be transmitted over the Internet for a teleconsultation. The transformation reshapes the signal in such a way that its "sharp" features, its most defining characteristics, are stretched more than its "coarse" features.
The technique does not require prior knowledge of the data for the transformation to take place; it occurs naturally and in a streaming fashion.
“Our transformation causes feature-selective stretching of the data and allocation of more pixels to sharper features where they are needed the most,” Asghari said. “For example, if we used the technique to take a picture of a sailboat on the ocean, our anamorphic stretch transform would cause the sailboat’s features to be stretched much more than the ocean, to identify the boat while using a small file size.”
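The idea of allocating more samples to sharp features can be illustrated with a toy sketch. This is not the published anamorphic stretch transform; it is a simple non-uniform resampling scheme, with a hypothetical `feature_selective_resample` helper, that mimics the behavior described above: a monotone warping function built from the local gradient magnitude pulls sample positions toward sharp features, so a flat background (the "ocean") gets few samples while a sharp pulse (the "sailboat") gets many.

```python
import numpy as np

# Toy sketch of feature-selective resampling in the spirit of AST
# (illustrative only; not the published anamorphic stretch transform).
# Idea: warp the coordinate axis so regions with sharp features get
# more samples, then sample uniformly in the warped coordinate.

def feature_selective_resample(signal, n_out):
    """Resample `signal` to n_out points, biased toward sharp features."""
    grad = np.abs(np.gradient(signal))
    # Sampling density: a constant floor plus the normalized local
    # gradient, so flat regions still receive some samples.
    density = 0.1 + grad / (grad.max() + 1e-12)
    warp = np.cumsum(density)                       # monotone warping function
    warp = (warp - warp[0]) / (warp[-1] - warp[0])  # normalize to [0, 1]
    # Uniform samples in the warped coordinate map back to
    # non-uniform sample positions on the original axis.
    targets = np.linspace(0.0, 1.0, n_out)
    x = np.arange(len(signal))
    sample_positions = np.interp(targets, warp, x)
    samples = np.interp(sample_positions, x, signal)
    return sample_positions, samples

# A flat baseline with one sharp pulse: the samples should cluster
# around the pulse rather than being spread uniformly.
t = np.linspace(0.0, 1.0, 1000)
sig = np.exp(-((t - 0.5) ** 2) / (2 * 0.005 ** 2))
pos, vals = feature_selective_resample(sig, 64)
near_pulse = int(np.sum(np.abs(pos / 999.0 - 0.5) < 0.05))
print(near_pulse)  # well above the ~6 samples a uniform grid would place there
```

Keeping a uniform grid in the warped coordinate while the warp itself concentrates around high-gradient regions is what lets the same budget of samples describe the sharp feature in more detail; a real codec would also need to store (or standardize) the warp so the receiver can undo it.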
AST can also be used for image compression, either as a standalone algorithm or combined with existing digital compression techniques to enhance speed or quality, or to increase how much images can be compressed. Results have shown that AST can outperform the standard JPEG image compression format, with dramatic improvements in both image quality and compression factor.
The new technique has its origin in another technology pioneered by the Jalali group, time stretch dispersive Fourier transform, which is a method for slowing down and amplifying faint but very fast signals so they can be detected and digitized in real time.
High-speed instruments created with this technology enabled the discovery of optical rogue waves in 2007 and the detection of cancer cells in blood with one-in-a-million sensitivity in 2012. But these instruments produce a fire hose of data that overwhelms even the most advanced computers. The need to deal with such data loads motivated the UCLA team to search for a new data compression technology.
Jalali said the discovery is rooted in — and inspired by — both physics and the arts.
“Reshaping the data by stretching and warping it in the prescribed manner compresses it without losing pertinent information,” he said. “It emulates what happens to waves as they travel through physical media with specific properties. It also brings to mind aspects of surrealism and the optical effects of anamorphism.”