Abstract
Decision trees rank among the most popular classification tools and are widely used in practice for their efficiency. A Nash equilibrium-based decision tree splits node data using the Nash equilibrium concept. Boosting enhances a classifier's performance by enabling a deeper exploration of the data. This paper proposes an AdaBoost model with a log-loss optimization mechanism to improve the performance of an equilibrium-based decision tree. The two-step approach first builds equilibrium decision trees on weighted data and then determines the contribution of each classifier by optimizing the overall log-loss function. Numerical experiments illustrate the approach's performance by comparing results on synthetic and real-world data with state-of-the-art tree-based boosting methods.
Citation
@inproceedings{Lung2025LogLossOF,
  author    = {R. Lung and M. Suciu},
  booktitle = {Cybernetics and Systems},
  title     = {Log-Loss Optimization for Boosting a Nash Equilibrium Decision Tree},
  year      = {2025}
}
