Publication Date

Fall 2023

Degree Type

Master's Project

Degree Name

Master of Science in Computer Science (MSCS)

Department

Computer Science

First Advisor

Genya Ishigaki

Second Advisor

Fabio Di Troia

Third Advisor

Navrati Saxena

Keywords

Convolutional Neural Networks, Explainable Artificial Intelligence, Local Interpretable Model-agnostic Explanations, Medical Image Classification.

Abstract

Diabetes, a chronic metabolic disorder, poses a significant health threat with potentially severe consequences, including diabetic retinopathy, a leading cause of blindness. In this project, we address this threat by developing a Convolutional Neural Network (CNN) to support diagnosis based on eye images. The aim is early detection and intervention to mitigate the effects of diabetes on eye health. To enhance transparency and interpretability, we incorporate explainable AI techniques. This research not only contributes to the early diagnosis of diabetic eye disease but also advances our understanding of how deep learning models arrive at their decisions, fostering trust and clinical applicability in healthcare diagnostics.
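
For context, the sketch below shows a minimal CNN image classifier in Keras of the kind described above. The architecture, input size (224x224 RGB), and two-class output are illustrative assumptions, not the exact model used in this project.

```python
# Minimal CNN sketch for eye-image classification (illustrative assumptions only).
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn(input_shape=(224, 224, 3), num_classes=2):
    # Simple stack of convolution + pooling blocks followed by a dense classifier.
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_cnn()
model.summary()
```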

Our results show that our CNN model performs exceptionally well in classifying ocular images, attaining a 91% accuracy rate. Furthermore, we implemented explainable AI techniques such as LIME (Local Interpretable Model-agnostic Explanations), which improve the transparency of our model’s decision-making. LIME highlighted the areas of interest in the eye images, enhancing our understanding of the model’s predictions. The high accuracy and interpretability of our approach demonstrate its potential for clinical applications and the broader field of healthcare diagnostics.
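
For readers unfamiliar with LIME, the sketch below shows how the `lime` package's image explainer is typically applied to a Keras classifier. It continues from the (untrained) CNN sketch above and uses a random placeholder image purely to make the snippet executable; the preprocessing, sample counts, and superpixel settings are assumptions, not the project's exact configuration.

```python
# Hedged sketch: explaining one prediction of an image classifier with LIME.
# Assumes `model` from the CNN sketch above; `image` is a placeholder, not real data.
import numpy as np
from lime import lime_image
from skimage.segmentation import mark_boundaries

image = np.random.rand(224, 224, 3)  # placeholder for a preprocessed eye image in [0, 1]

def predict_fn(batch):
    # LIME passes a batch of perturbed images; return class probabilities.
    return model.predict(np.asarray(batch), verbose=0)

explainer = lime_image.LimeImageExplainer()
explanation = explainer.explain_instance(
    image.astype("double"),  # image to explain
    predict_fn,              # wrapper around the CNN
    top_labels=1,
    hide_color=0,
    num_samples=200,         # perturbed samples; larger values give smoother explanations
)

# Overlay the superpixels that most support the top predicted class.
temp, mask = explanation.get_image_and_mask(
    explanation.top_labels[0],
    positive_only=True,
    num_features=5,
    hide_rest=False,
)
overlay = mark_boundaries(temp, mask)  # areas of interest highlighted on the image
```

In practice, the highlighted superpixels can be compared against the regions a clinician would examine, which is how this kind of visualization supports trust in the model's predictions.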
