A Hybrid Deep Learning and Explainable AI Framework for Early Detection of Type 2 Diabetes: A Multi-Factor Approach

Publisher

Brieflands

Abstract

Background: Diabetes mellitus is a growing global health issue, increasing the need for sophisticated prediction models to support early diagnosis and treatment. Deep learning (DL) models lack interpretability despite their accuracy, while traditional machine learning (ML) models occasionally overlook the complex interactions among genetic, lifestyle, and biological factors.

Objectives: This work presents a hybrid DL framework combining deep neural networks (DNNs) and Extreme Gradient Boosting (XGBoost) to enhance both explainability and predictive performance in early diabetes detection.

Methods: This study retrospectively examined 1,284 anonymized patient records collected from two hospitals in Sirjan (March 2023 - 2025), comprising both diabetic and non-diabetic individuals. Glucose level, hemoglobin A1c (HbA1c), insulin resistance, Body Mass Index (BMI), blood pressure, and cholesterol were identified as the most significant predictors using recursive feature elimination (RFE). All analyses were conducted in Python 3.10 using TensorFlow 2.12 and XGBoost 2.0, executed on an NVIDIA RTX 4090 GPU environment. The F1-score, accuracy, precision, recall, positive predictive value (PPV), and negative predictive value (NPV) were used to evaluate the hybrid model against logistic regression (LR), random forest (RF), support vector machine (SVM), standalone XGBoost, and a standalone DNN.

Results: With an accuracy of 94%, the hybrid model (DNN+XGBoost) outperformed the standalone XGBoost (89%) and DNN (91%) models as well as LR (78%), SVM (82%), and the remaining baselines (P = 0.006). The hybrid model attained 93% precision and 95% recall. SHapley Additive exPlanations (SHAP) analysis identified glucose (0.35) and HbA1c (0.30) as the most significant predictors, supporting the model's transparency and clinical usefulness.
Conclusions: The proposed hybrid AI model balances high accuracy with interpretability, suggesting its potential utility for AI-assisted diabetes prediction in future clinical settings, pending external validation. By applying SHAP-based explainability, the model also supports clinician trust.
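The two-stage pipeline described in the abstract (RFE feature selection, a neural network whose output augments the inputs to a boosted-tree classifier) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses a synthetic dataset and scikit-learn stand-ins (MLPClassifier for the DNN, GradientBoostingClassifier for XGBoost) to keep the example dependency-light, and all architecture choices (hidden layer sizes, test split, etc.) are assumptions.

```python
# Hypothetical sketch of the hybrid DNN+boosted-tree pipeline from the abstract.
# MLPClassifier stands in for the TensorFlow DNN; GradientBoostingClassifier
# stands in for XGBoost. Data and hyperparameters are illustrative only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score

# Synthetic stand-in for the clinical dataset (1,284 records, 10 raw features).
X, y = make_classification(n_samples=1284, n_features=10, n_informative=6,
                           random_state=42)

# Step 1: recursive feature elimination keeps the 6 strongest predictors,
# mirroring the abstract's RFE-selected set (glucose, HbA1c, BMI, ...).
rfe = RFE(LogisticRegression(max_iter=1000), n_features_to_select=6)
X_sel = rfe.fit_transform(X, y)

X_tr, X_te, y_tr, y_te = train_test_split(
    X_sel, y, test_size=0.25, random_state=42, stratify=y)

# Step 2: the "DNN" learns a nonlinear representation; its predicted
# probability for the positive class is appended as an extra feature.
dnn = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=42)
dnn.fit(X_tr, y_tr)
X_tr_h = np.hstack([X_tr, dnn.predict_proba(X_tr)[:, [1]]])
X_te_h = np.hstack([X_te, dnn.predict_proba(X_te)[:, [1]]])

# Step 3: the boosted-tree stage makes the final prediction from the
# augmented feature set.
gbt = GradientBoostingClassifier(random_state=42)
gbt.fit(X_tr_h, y_tr)
y_pred = gbt.predict(X_te_h)

print(f"accuracy:  {accuracy_score(y_te, y_pred):.2f}")
print(f"precision: {precision_score(y_te, y_pred):.2f}")
print(f"recall:    {recall_score(y_te, y_pred):.2f}")
```

In the paper's reported setup, a SHAP explainer would additionally be fit on the tree-stage model to rank feature contributions; here the same idea could be applied to `gbt` with `shap.TreeExplainer`, which is omitted to keep the sketch self-contained.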
