
Research image. Credit: POSTECH
Breast cancer has the highest incidence rate of any cancer among women. Moreover, of the six major cancers, it is the only one whose incidence has continued to rise over the past 20 years.
The chances of survival are much higher when breast cancer is detected and treated early. After stage 3, however, the survival rate drops below 75%, which makes early detection through regular medical check-ups critical for reducing patient mortality. Recently, a research team at POSTECH developed an AI network system for ultrasonography that accurately detects and diagnoses breast cancer.
A team of POSTECH researchers led by Professor Chulhong Kim (Department of Convergence IT Engineering, Department of Electrical Engineering, and Department of Mechanical Engineering), together with Sampa Misra and Chiho Yoon (Department of Electrical Engineering), has developed a deep learning-based multimodal fusion network for the segmentation and classification of breast cancers using B-mode and strain elastography ultrasound images. The findings from the study were published in Bioengineering & Translational Medicine.
Ultrasonography is one of the key medical imaging modalities for evaluating breast lesions. To distinguish benign from malignant lesions, computer-aided diagnosis (CAD) systems have offered radiologists a great deal of help by automatically segmenting and identifying features of lesions.
Here, the team presented deep learning (DL)-based methods that first segment the lesions and then classify them as benign or malignant, using both B-mode and strain elastography (SE-mode) images. First, the team constructed a "weighted multimodal U-Net (W-MM-U-Net) model," in which an optimal weight is assigned to each imaging modality for lesion segmentation through a weighted-skip connection method; a minimal sketch of this idea follows below. The team also proposed a "multimodal fusion framework (MFF)" that operates on cropped B-mode and SE-mode ultrasound (US) lesion images to classify lesions as benign or malignant.
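The following is a minimal PyTorch sketch of the weighted-skip idea described above: skip features from a B-mode encoder and an SE-mode encoder are merged with learnable per-modality weights before being passed to a shared decoder. The two-level architecture, layer sizes, and module names are illustrative assumptions, not the published W-MM-U-Net.

```python
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )


class WeightedSkipFusion(nn.Module):
    """Merge same-level skip features from the two modalities with learnable weights."""
    def __init__(self):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(2))  # one learnable weight per modality

    def forward(self, feat_bmode, feat_semode):
        w = torch.softmax(self.logits, dim=0)       # normalized modality weights
        return w[0] * feat_bmode + w[1] * feat_semode


class TinyWeightedMultimodalUNet(nn.Module):
    """Toy two-encoder, one-decoder U-Net with weighted multimodal skip connections
    (an illustrative stand-in for the W-MM-U-Net concept, not the authors' model)."""
    def __init__(self, base=16):
        super().__init__()
        self.enc_b1, self.enc_se1 = conv_block(1, base), conv_block(1, base)
        self.enc_b2, self.enc_se2 = conv_block(base, 2 * base), conv_block(base, 2 * base)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        self.fuse1 = WeightedSkipFusion()
        self.fuse2 = WeightedSkipFusion()
        self.fuse_deep = WeightedSkipFusion()
        self.bottleneck = conv_block(2 * base, 4 * base)
        self.dec2 = conv_block(4 * base + 2 * base, 2 * base)
        self.dec1 = conv_block(2 * base + base, base)
        self.head = nn.Conv2d(base, 1, 1)           # per-pixel lesion logits

    def forward(self, bmode, semode):
        b1, s1 = self.enc_b1(bmode), self.enc_se1(semode)
        b2, s2 = self.enc_b2(self.pool(b1)), self.enc_se2(self.pool(s1))
        deep = self.bottleneck(self.pool(self.fuse_deep(b2, s2)))
        d2 = self.dec2(torch.cat([self.up(deep), self.fuse2(b2, s2)], dim=1))
        d1 = self.dec1(torch.cat([self.up(d2), self.fuse1(b1, s1)], dim=1))
        return self.head(d1)


# Example: paired B-mode and SE-mode patches produce a single lesion mask.
model = TinyWeightedMultimodalUNet()
mask_logits = model(torch.randn(2, 1, 128, 128), torch.randn(2, 1, 128, 128))
print(mask_logits.shape)  # torch.Size([2, 1, 128, 128])
```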
The MFF consists of an integrated feature network (IFN) and a decision network (DN). Unlike other recent fusion methods, the proposed MFF can simultaneously learn complementary information from convolutional neural networks (CNNs) trained on B-mode and SE-mode US images. The features from the two CNNs are ensembled using the multimodal EmbraceNet model, and the DN classifies the images from those fused features; a sketch of this fusion step follows below.
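Below is a hedged PyTorch sketch of this structure: two CNN backbones extract per-modality features, an EmbraceNet-style layer mixes them into a shared embedding, and a small decision network outputs benign-versus-malignant logits. The ResNet-18 backbones, embedding size, and equal selection probabilities are illustrative assumptions rather than the authors' exact configuration.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18


class EmbraceFusion(nn.Module):
    """Project each modality to a shared embedding and stochastically mix them
    element-wise, in the spirit of EmbraceNet."""
    def __init__(self, in_dims, embed_dim=256):
        super().__init__()
        self.dockings = nn.ModuleList(nn.Linear(d, embed_dim) for d in in_dims)

    def forward(self, feats, p=None):
        # feats: list of per-modality feature vectors, each of shape (B, in_dim)
        docked = torch.stack(
            [torch.relu(dock(f)) for dock, f in zip(self.dockings, feats)], dim=1
        )                                            # (B, M, D)
        B, M, D = docked.shape
        if p is None:
            p = torch.full((M,), 1.0 / M, device=docked.device)
        if self.training:
            # sample, per embedding index, which modality supplies that component
            idx = torch.multinomial(p, D, replacement=True)        # (D,)
            mask = nn.functional.one_hot(idx, M).T.float()         # (M, D)
            return (docked * mask.unsqueeze(0)).sum(dim=1)
        # at inference, use the probability-weighted (expected) combination
        return (docked * p.view(1, M, 1)).sum(dim=1)


class MFFClassifierSketch(nn.Module):
    """Illustrative MFF-like classifier: two-modality feature network + fusion + DN."""
    def __init__(self, num_classes=2):
        super().__init__()
        def backbone():
            m = resnet18(weights=None)
            m.conv1 = nn.Conv2d(1, 64, 7, stride=2, padding=3, bias=False)  # grayscale US input
            m.fc = nn.Identity()                     # expose 512-dim features
            return m
        self.cnn_bmode, self.cnn_semode = backbone(), backbone()   # feature extractors
        self.fusion = EmbraceFusion(in_dims=[512, 512], embed_dim=256)
        self.decision = nn.Sequential(nn.Linear(256, 64), nn.ReLU(), nn.Linear(64, num_classes))

    def forward(self, bmode, semode):
        feats = [self.cnn_bmode(bmode), self.cnn_semode(semode)]
        return self.decision(self.fusion(feats))     # benign vs. malignant logits


# Example: cropped B-mode and SE-mode lesion patches classified jointly.
clf = MFFClassifierSketch().eval()
logits = clf(torch.randn(2, 1, 224, 224), torch.randn(2, 1, 224, 224))
print(logits.shape)  # torch.Size([2, 2])
```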
According to the experimental results on clinical data, the method predicted the seven benign patients as benign in three out of five trials and the six malignant patients as malignant in five out of five trials. This indicates that the proposed method outperforms conventional single-modal and multimodal methods and could enhance radiologists' classification accuracy for breast cancer detection in US images.
Professor Chulhong Kim explained, “We were able to increase the accuracy of lesion segmentation by determining the importance of each input modality and automatically giving it the proper weight.” He added, “We trained each deep learning model and the ensemble model at the same time to achieve much better classification performance than conventional single-modal or other multimodal methods.”
Original Article: AI-powered ultrasound imaging that detects breast cancer