Predictive Performance and Interpretability: Glass Box Modeling with Explainable Boosting Machine


  • Cem Özkurt, Sakarya University of Applied Sciences



Keywords: Explainable Boosting Machine, XGBoost, Glass Box Model, Black Box Model, Interpretability, Telecom Churn Prediction


This study presents a comprehensive comparative analysis of glass box modeling with the Explainable Boosting Machine (EBM) and black box modeling with XGBoost in the context of telecom churn prediction. The primary focus is on evaluating predictive performance and interpretability, recognizing the trade-off between model complexity and explainability. EBM is chosen as a representative glass box model because its intrinsic interpretability offers direct insight into the decision-making process. XGBoost, a well-known black box model, is selected for its strong predictive capabilities, which often come at the cost of reduced interpretability. The study employs a Telecom Churn dataset, and both models are trained, evaluated, and compared in terms of accuracy, precision, recall, and F1 score. Interpretability is assessed through global and local explanations: generated natively by EBM, and via LimeTabular for XGBoost. The results provide valuable insight into the balance between model performance and interpretability in the specific domain of telecom churn prediction. This exploration contributes to the ongoing discussion on appropriate model choice, considering the diverse needs of stakeholders ranging from data scientists to business analysts.
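The four evaluation metrics named above all reduce to confusion-matrix counts. As a minimal pure-Python sketch (toy labels and a hypothetical `churn_metrics` helper, not the paper's actual pipeline, which would typically rely on a standard library such as scikit-learn):

```python
from typing import Sequence

def churn_metrics(y_true: Sequence[int], y_pred: Sequence[int]) -> dict:
    """Accuracy, precision, recall, and F1 for a binary churn task.

    Labels: 1 = churned (positive class), 0 = retained.
    """
    # Confusion-matrix counts
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)

    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

# Toy example: 8 customers, 1 = churned
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
m = churn_metrics(y_true, y_pred)
```

In the toy run above, the model makes 6 correct calls out of 8 with one false positive and one false negative, so all four metrics come out to 0.75; on a real churn dataset, class imbalance typically pulls precision and recall apart, which is why the study reports all four rather than accuracy alone.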

Author Biography

Cem Özkurt, Sakarya University of Applied Sciences

Artificial Intelligence and Data Science Research and Application Center, Turkey




How to Cite

Özkurt, C. (2023). Predictive Performance and Interpretability: Glass Box Modeling with Explainable Boosting Machine. AS-Proceedings, 1(7), 1109–1115.