Development of explainable artificial intelligence model for biomedical survival analysis
Date: 2026-01-20
Author: Somuncuoğlu, Abdullah Nuri
License: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Abstract
While machine learning and deep learning architectures offer unprecedented predictive accuracy in survival analysis, their deployment in high-stakes biomedical domains is hindered by their "black-box" nature. Current Explainable AI (XAI) techniques, particularly Local Interpretable Model-Agnostic Explanations (LIME), rely on stochastic perturbation strategies that often yield unstable and inconsistent explanations for identical instances, thereby compromising clinical trust. This thesis addresses the inherent instability of linear surrogate models by introducing a novel framework: MARS-LIME. By integrating Multivariate Adaptive Regression Splines (MARS) into the local explanation process, the proposed method effectively captures non-linear relationships and high-dimensional interactions within local decision boundaries, which traditional linear approximations fail to resolve. Furthermore, an extended variant, MARS2-LIME, is developed to explicitly model second-order variable interactions, enhancing the granularity of risk factor analysis. Rigorous empirical evaluations conducted on the NHANES I epidemiological dataset demonstrate that MARS-based approaches significantly outperform standard LIME and OptiLIME in terms of the Coefficients Stability Index (CSI) and the Variables Stability Index (VSI). The study also presents a fuzzy-logic-based visualization technique to quantify feature ranking consistency. Ultimately, this research mitigates the stochastic volatility of post-hoc explanations, providing a robust and mathematically stable foundation for interpreting complex survival models in critical healthcare applications.
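The core LIME procedure the abstract builds on — perturb an instance, score the perturbations with the black-box model, and fit a proximity-weighted local surrogate — can be sketched as follows. This is a minimal illustration only, not the thesis implementation: the surrogate here is a weighted linear fit (standard LIME), and the abstract's MARS-LIME would replace that fitting step with a MARS regressor. All names (`black_box`, `lime_explain`, the kernel width, perturbation scale) are illustrative assumptions.

```python
import numpy as np

def black_box(X):
    # Stand-in for a trained survival model's risk score
    # (illustrative non-linear function of two features).
    return np.sin(X[:, 0]) + X[:, 1] ** 2

def lime_explain(instance, predict_fn, n_samples=500, kernel_width=1.0, seed=0):
    """Perturb around `instance`, weight samples by proximity to it,
    and fit a weighted linear surrogate (standard LIME).
    MARS-LIME, per the abstract, would fit a MARS model to the same
    weighted perturbations instead of the least-squares line below."""
    rng = np.random.default_rng(seed)
    d = instance.shape[0]
    Z = instance + rng.normal(scale=0.5, size=(n_samples, d))  # perturbations
    y = predict_fn(Z)                                          # black-box scores
    dist2 = np.sum((Z - instance) ** 2, axis=1)
    w = np.exp(-dist2 / kernel_width ** 2)                     # proximity kernel
    # Weighted least squares: solve (A^T W A) beta = (A^T W) y
    A = np.hstack([np.ones((n_samples, 1)), Z])
    Aw = A * w[:, None]
    beta, *_ = np.linalg.lstsq(Aw.T @ A, Aw.T @ y, rcond=None)
    return beta[1:]  # local feature attributions (intercept dropped)

x0 = np.array([0.3, -1.2])
coefs = lime_explain(x0, black_box)
# The instability the abstract criticizes: re-running with a different
# seed yields different coefficients; stability indices such as CSI/VSI
# compare attributions across such repeated runs.
coefs_rerun = lime_explain(x0, black_box, seed=1)
```

Because the perturbations are random, `coefs` and `coefs_rerun` generally differ; a stability index quantifies how much, which is the axis on which the thesis evaluates MARS-LIME against LIME and OptiLIME.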
Subject Keywords: Survival Analysis, Explainable Artificial Intelligence, Machine Learning and Deep Learning, LIME, MARS
URI: https://hdl.handle.net/11511/118477
Collections: Graduate School of Natural and Applied Sciences, Thesis
Citation (IEEE)
A. N. Somuncuoğlu, “Development of explainable artificial intelligence model for biomedical survival analysis,” Ph.D. - Doctoral Program, Middle East Technical University, 2026.