Elaris Computing Nexus

Visualizing Multi Class Decision Boundaries of Ensemble Tree Models for Improved Interpretability



Received On: 25 May 2025

Revised On: 02 August 2025

Accepted On: 30 August 2025

Published On: 25 September 2025

Volume 01, 2025

Pages: 157-169


Abstract

Accurate and interpretable multi-class classification remains a significant challenge in machine learning, particularly for datasets with overlapping feature distributions. Traditional ensemble methods, such as Random Forest and boosting algorithms, often face a trade-off between accuracy and interpretability: Random Forests provide stability but may retain bias, while boosting models achieve high accuracy at the expense of fragmented and less understandable decision boundaries. The Hybrid Boosted Forest (HBF) is a novel ensemble framework that integrates the diversity of Random Forests with the adaptive weighting mechanism of boosting. HBF incorporates dynamic tree depth selection based on feature heterogeneity, weighted aggregation of tree predictions, and a controlled boosting stage that emphasizes misclassified samples, resulting in robust performance and interpretable decision boundaries. Evaluation of HBF on the Iris dataset using multiple feature pairs demonstrates superior performance compared with six state-of-the-art models: Decision Tree, Random Forest, Extra Trees, AdaBoost, Gradient Boosting, and XGBoost. HBF achieves an accuracy of 98.1%, surpassing the next best model (XGBoost at 97.2%), while maintaining high interpretability (7/10) and balanced computational efficiency. Decision boundary visualizations show smoother, more structured, and more human-understandable class separations than those of the baseline models. The results confirm that HBF offers a robust, explainable, and computationally practical solution for multi-class classification, providing a promising direction for ensemble learning research that demands both performance and interpretability.
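
To make the boundary-visualization setup concrete, the following Python sketch trains three of the baseline ensemble tree models named in the abstract on a single Iris feature pair and plots their two-dimensional decision regions by classifying a dense grid of points. It relies only on standard scikit-learn estimators; the HBF model itself is not reproduced here, and the chosen models, hyperparameters, feature pair, and grid resolution are illustrative assumptions rather than the authors' implementation.

# Minimal sketch (not the authors' code): 2-D decision boundaries of ensemble
# tree baselines on one Iris feature pair. Model and grid choices are assumed.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.tree import DecisionTreeClassifier

# Use one feature pair (petal length, petal width) so the boundary is plottable.
iris = load_iris()
X, y = iris.data[:, 2:4], iris.target

models = {
    "Decision Tree": DecisionTreeClassifier(max_depth=4, random_state=0),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "Gradient Boosting": GradientBoostingClassifier(random_state=0),
}

# Dense grid over the feature space; each model colors the grid by its prediction.
xx, yy = np.meshgrid(
    np.linspace(X[:, 0].min() - 0.5, X[:, 0].max() + 0.5, 300),
    np.linspace(X[:, 1].min() - 0.5, X[:, 1].max() + 0.5, 300),
)
grid = np.c_[xx.ravel(), yy.ravel()]

fig, axes = plt.subplots(1, len(models), figsize=(5 * len(models), 4), sharey=True)
for ax, (name, model) in zip(axes, models.items()):
    model.fit(X, y)
    Z = model.predict(grid).reshape(xx.shape)
    ax.contourf(xx, yy, Z, alpha=0.3, cmap="viridis")   # predicted class regions
    ax.scatter(X[:, 0], X[:, 1], c=y, cmap="viridis", edgecolor="k", s=20)  # data
    ax.set_title(name)
    ax.set_xlabel(iris.feature_names[2])
axes[0].set_ylabel(iris.feature_names[3])
plt.tight_layout()
plt.show()

The same grid-prediction approach applies to any fitted classifier, so a hybrid ensemble such as HBF could be added to the models dictionary unchanged; smoother and more contiguous class regions in these plots correspond to the higher interpretability reported for HBF in the abstract.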

Keywords

Hybrid Boosted Forest, Multi-Class Classification, Ensemble Learning, Decision Boundary Visualization, Interpretability.

CRediT Author Statement

The author reviewed the results and approved the final version of the manuscript.

Acknowledgements

The author thanks the reviewers for taking the time and effort necessary to review the manuscript.

Funding

No funding was received to assist with the preparation of this manuscript.

Ethics Declarations

Conflict of interest

The author has no conflicts of interest to declare that are relevant to the content of this article.

Availability of Data and Materials

Data sharing is not applicable to this article as no new data were created or analysed in this study.

Author Information

Contributions

The author has read and agreed to the published version of the manuscript.

Corresponding Author

Correspondence to: Vincenzo Anselmi.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (CC BY-NC-ND 4.0), which permits copying and redistribution of the material in any medium or format for non-commercial purposes, provided that appropriate credit is given to the original author and source and that no modified or derivative versions of the original work are distributed. To view a copy of this license, visit: https://creativecommons.org/licenses/by-nc-nd/4.0/

Cite this Article

Vincenzo Anselmi, “Visualizing Multi Class Decision Boundaries of Ensemble Tree Models for Improved Interpretability”, Elaris Computing Nexus, vol. 01, pp. 157-169, 2025, doi: 10.65148/ECN/2025015.

Copyright

© 2025 Vincenzo Anselmi. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.