Poster Program · Diagnostic and Interventional Radiology Physics

An Interpretable Ensemble Machine Learning Framework for Early Breast Cancer Diagnosis Using Explainable AI

Abstract
Purpose

Early and accurate diagnosis of breast cancer is critical for improving patient survival rates; however, the clinical adoption of artificial intelligence–based diagnostic systems remains limited due to their lack of interpretability. The primary purpose of this study is to develop an interpretable ensemble machine learning framework that achieves high diagnostic accuracy while providing clinically meaningful explanations to support physician decision-making.

Methods

The proposed framework is evaluated on the Breast Cancer Wisconsin Diagnostic dataset. Extensive pre-processing is performed, including feature normalization, outlier removal, and correlation-based feature selection to eliminate redundancy in the feature set and improve model robustness. Several machine learning classifiers, namely Logistic Regression, Support Vector Machine, Random Forest, XGBoost, and Extra Trees, are each trained separately. To enhance predictive performance, a weighted soft-voting ensemble is used in which each model's weight is derived from its receiver operating characteristic area under the curve (ROC-AUC). To address the interpretability challenge, SHapley Additive exPlanations (SHAP) are used to obtain global and local feature importance, and Local Interpretable Model-Agnostic Explanations (LIME) are used to produce instance-level explanations for any individual prediction.
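The ROC-AUC-weighted soft-voting step described above can be sketched in scikit-learn. This is a minimal illustration, not the authors' implementation: XGBoost, SHAP, and LIME are omitted to keep the sketch dependency-light, the hyperparameters are placeholder assumptions, and weights are computed on a single validation split (in practice a separate test set would be held out for the final evaluation).

```python
# Sketch: weight each base classifier by its validation ROC-AUC,
# then combine them with soft (probability-averaging) voting.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier, ExtraTreesClassifier, VotingClassifier
from sklearn.metrics import roc_auc_score

# The Breast Cancer Wisconsin Diagnostic dataset ships with scikit-learn.
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

base = {
    "lr": make_pipeline(StandardScaler(), LogisticRegression(max_iter=5000)),
    "svm": make_pipeline(StandardScaler(), SVC(probability=True)),
    "rf": RandomForestClassifier(n_estimators=200, random_state=42),
    "et": ExtraTreesClassifier(n_estimators=200, random_state=42),
}

# Fit each base model and record its validation ROC-AUC as its vote weight.
weights = []
for name, clf in base.items():
    clf.fit(X_tr, y_tr)
    weights.append(roc_auc_score(y_val, clf.predict_proba(X_val)[:, 1]))

ensemble = VotingClassifier(
    estimators=list(base.items()), voting="soft", weights=weights
)
ensemble.fit(X_tr, y_tr)
auc = roc_auc_score(y_val, ensemble.predict_proba(X_val)[:, 1])
print(f"ensemble ROC-AUC: {auc:.3f}")
```

Soft voting averages the predicted class probabilities rather than the hard labels, so a well-calibrated strong model (high ROC-AUC weight) can outvote weaker ones on borderline cases.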

Results

The proposed ensemble framework achieves higher accuracy, ROC-AUC, sensitivity, and specificity than the individual classifiers and shows robust performance in diagnosing breast tumours at an early stage. The explainability analysis further indicates that clinically relevant features such as mean radius, texture, and concavity drive the classification of malignancies, consistent with established medical knowledge.

Conclusion

This study offers an interpretable AI-based diagnostic model that bridges the gap between predictive performance and clinical interpretability. The proposed approach has strong potential to increase trust in AI-assisted breast cancer diagnosis and can be extended to other medical decision-support settings.
