New research aims to open the ‘black box’ of computer vision
It can take years of birdwatching experience to tell one species from the next. But using an artificial intelligence technique called deep learning, Duke University researchers have trained a computer to identify up to 200 species of birds from just a photo.
The real innovation, however, is that the A.I. tool also shows its thinking, in a way that even someone who doesn’t know a penguin from a puffin can understand.
The team trained their deep neural network — an algorithm loosely modeled on the way the brain works — by feeding it 11,788 photos spanning 200 bird species, ranging from swimming ducks to hovering hummingbirds.
The researchers never told the network “this is a beak” or “these are wing feathers.” Given a photo of a mystery bird, the network is able to pick out important patterns in the image and hazard a guess by comparing those patterns to typical species traits it has seen before.
Along the way it spits out a series of heat maps that essentially say: “This isn’t just any warbler. It’s a hooded warbler, and here are the features — like its masked head and yellow belly — that give it away.”
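The mechanism described above — comparing patches of a new image against "typical traits" learned from training photos, and producing a heat map of where each trait matches — can be sketched as a prototype-similarity computation. The sketch below is a toy illustration, not the Duke team's actual model: the feature grid, prototype counts, and similarity formula are illustrative assumptions, with random numbers standing in for features a real convolutional backbone would extract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in: a 7x7 grid of 64-dim feature vectors, as a convolutional
# backbone might extract from one bird photo (shapes are assumptions).
features = rng.normal(size=(7, 7, 64))

# Each class owns a few learned "prototype" vectors, representing typical
# traits of that species ("masked head", "yellow belly", ...).
num_classes, protos_per_class = 3, 2
prototypes = rng.normal(size=(num_classes, protos_per_class, 64))

def similarity_heatmaps(features, prototypes):
    """For each prototype, compute a spatial map of how strongly every
    image patch resembles it; high values mark the evidence."""
    # Squared L2 distance between every patch and every prototype.
    diff = features[None, None] - prototypes[:, :, None, None, :]
    dist = np.sum(diff ** 2, axis=-1)            # (classes, protos, 7, 7)
    # Turn distance into similarity: small distance -> large, positive score.
    return np.log((dist + 1.0) / (dist + 1e-4))

heatmaps = similarity_heatmaps(features, prototypes)

# A class's score is the best match each of its prototypes achieves
# anywhere in the image, summed over that class's prototypes.
scores = heatmaps.max(axis=(2, 3)).sum(axis=1)   # (classes,)
predicted = int(np.argmax(scores))

# The winning class's heat maps show *where* its prototypes matched --
# the equivalent of highlighting the warbler's masked head.
print("predicted class:", predicted)
```

The key design point is that the explanation is not bolted on afterwards: the same similarity maps that justify the answer ("this patch looks like that trait") are the quantities the classification score is built from.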
Duke computer science Ph.D. student Chaofan Chen and undergraduate Oscar Li led the research, along with other team members of the Prediction Analysis Lab directed by Duke professor Cynthia Rudin.
They found their neural network identifies the correct species up to 84% of the time — on par with some of its best-performing counterparts, which don't reveal how they tell, say, one sparrow from the next.
Rudin says their project is about more than naming birds. It’s about visualizing what deep neural networks are really seeing when they look at an image.
Similar technology is used to tag people on social networking sites, spot suspected criminals in surveillance cameras, and train self-driving cars to detect things like traffic lights and pedestrians.
The problem, Rudin says, is that most deep learning approaches to computer vision are notoriously opaque. Unlike traditional software, deep learning software learns from the data without being explicitly programmed. As a result, exactly how these algorithms ‘think’ when they classify an image isn’t always clear.
Rudin and her colleagues are trying to show that A.I. doesn’t have to be that way. She and her lab are designing deep learning models that explain the reasoning behind their predictions, making it clear exactly why and how they came up with their answers. When such a model makes a mistake, its built-in transparency makes it possible to see why.
For their next project, Rudin and her team are using their algorithm to classify suspicious areas in medical images like mammograms. If it works, their system won’t just help doctors detect lumps, calcifications and other symptoms that could be signs of breast cancer. It will also show which parts of the mammogram it’s homing in on, revealing which specific features most resemble the cancerous lesions it has seen before in other patients.
In that way, Rudin says, their network is designed to mimic the way doctors make a diagnosis. “It’s case-based reasoning,” Rudin said. “We’re hoping we can better explain to physicians or patients why their image was classified by the network as either malignant or benign.”
Learn more: This A.I. Birdwatcher Lets You 'See' Through the Eyes of a Machine