
Evaluation of Explainability Methods for Breast Cancer Histopathological Image Classification

Afshar Mazandaran, Pardis | 2025

  1. Type of Document: M.Sc. Thesis
  2. Language: Farsi
  3. Document No: 57912 (19)
  4. University: Sharif University of Technology
  5. Department: Computer Engineering
  6. Advisor(s): Fatemizadeh, Emadeddin; Rohban, Mohammad Hossein
  7. Abstract:
  8. The analysis of histopathological images is essential for the accurate diagnosis of cancer and the development of treatment plans. With significant advances in deep learning and the adoption of models such as convolutional neural networks, the accuracy and efficiency of image analysis have improved greatly. However, these models are often described as "black boxes": a major challenge is the lack of transparency in their decision-making, and this opacity reduces trust in their results, particularly in sensitive medical settings. To address this issue, the field of Explainable Artificial Intelligence (XAI) has emerged, aiming to provide clear and understandable explanations of the behavior and outputs of deep learning models. Despite notable progress, a key challenge remains the absence of standardized criteria and procedures for assessing the quality and reliability of the explanations these methods produce. In medical applications, simply providing explanations is insufficient; the explanations must be rigorously validated to ensure they are consistent with established scientific evidence. In this study, we introduce a framework for evaluating explainable AI methods in histopathological image analysis. The framework incorporates a novel occlusion strategy based on image inpainting, which improves on traditional occlusion strategies by avoiding unrealistic images and artifacts; by minimizing such artifacts, the model's predictions on occluded inputs remain trustworthy, which strengthens the reliability of the evaluation itself. The method uses a diffusion model to inpaint occluded regions, filling them with healthy, non-cancerous tissue (an illustrative sketch of this evaluation loop is given after the keyword list). The framework also allows different explainable AI methods to be compared, so that the quality and reliability of their explanations can be assessed; such comparisons identify the most effective method for histopathological image analysis and expose the strengths and weaknesses of each, guiding further refinement. Ultimately, this approach enables doctors and researchers to rely with greater confidence on the results of AI models in histopathological image analysis, increasing trust in the use of artificial intelligence for more accurate disease diagnosis.
  9. Keywords:
  10. Explainable Artificial Intelligence ; Cancer Diagnosis ; Breast Cancer ; Heatmap Evaluation ; Histopathological Image Analysis
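
The core of the framework described in the abstract is an occlusion test: the regions an XAI method marks as most important are masked, refilled with plausible tissue by an inpainting model, and the resulting change in the classifier's confidence is measured. The following minimal sketch illustrates that evaluation loop under stated assumptions only: `model` is any PyTorch classifier, `inpaint_fn` is a hypothetical stand-in for the diffusion-based inpainting step, and the top-10% threshold and confidence-drop score are illustrative choices, not the exact procedure used in the thesis.

```python
# Minimal sketch of occlusion-based faithfulness scoring with inpainted fills.
# `model`, `inpaint_fn`, and the quantile threshold are illustrative assumptions,
# not the thesis's actual implementation.
import torch

def occlusion_faithfulness(model, inpaint_fn, image, heatmap, target_class,
                           quantile=0.9):
    """Mask the most-attributed pixels, fill them via inpainting, and
    measure the drop in the model's confidence for the target class.

    model        classifier mapping a (1, C, H, W) tensor to class logits
    inpaint_fn   callable(image, mask) -> image with masked regions filled
                 (e.g. a diffusion inpainting model); hypothetical interface
    image        (C, H, W) tensor in the model's input range
    heatmap      (H, W) tensor of attribution scores from an XAI method
    """
    model.eval()
    with torch.no_grad():
        # Binary mask over the top-attributed pixels (here the top 10%).
        threshold = torch.quantile(heatmap.flatten(), quantile)
        mask = (heatmap >= threshold).float()            # (H, W)

        # Fill the masked regions with plausible tissue instead of a
        # constant value, avoiding out-of-distribution artifacts.
        occluded = inpaint_fn(image, mask)

        probs_orig = torch.softmax(model(image.unsqueeze(0)), dim=1)
        probs_occ  = torch.softmax(model(occluded.unsqueeze(0)), dim=1)

        # A larger drop means the explanation pointed at regions the model
        # actually relied on, i.e. a more faithful heatmap.
        drop = (probs_orig[0, target_class] - probs_occ[0, target_class]).item()
    return drop
```

Ranking several explanation methods by the average confidence drop they produce over a test set gives one possible way to compare them, in the spirit of the heatmap evaluation described in the abstract.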
