GS-BART: Bayesian Additive Regression Trees with Graph-split Decision Rules
Shuren He, Huiyan Sang, Quan Zhou
Published: 2025/9/8
Abstract
Ensemble decision tree methods such as XGBoost, Random Forest, and Bayesian Additive Regression Trees (BART) have gained enormous popularity in data science for their strong performance in regression and classification tasks. In this paper, we introduce a new Bayesian graph-split additive decision tree method, GS-BART, designed to improve on axis-parallel split-based BART for dependent data with graph structure. The proposed approach encodes input feature information into candidate graph sets and employs a flexible split rule that respects the graph topology when constructing decision trees. We consider a generalized nonparametric regression model using GS-BART and design a scalable informed MCMC algorithm to sample its decision trees. The algorithm relies on a gradient-based recursive procedure over rooted directed spanning trees or chains. We illustrate the superior performance of the method over conventional ensemble tree models and Gaussian process regression models in a variety of regression and classification tasks for spatial and network data analysis.
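To make the core idea of a graph-split decision rule concrete, the following minimal Python sketch shows one way such a rule can act: removing a single edge from a spanning tree of a candidate graph partitions the observations into two connected components, in contrast to an axis-parallel rule that thresholds one feature. This is an illustrative assumption, not the authors' implementation; the function name `spanning_tree_split` and the edge-list representation are hypothetical.

```python
# Minimal sketch (hypothetical, not the GS-BART code): split observations by
# cutting one edge of a spanning tree, so each child node of the decision
# tree remains connected in the underlying graph.
from collections import defaultdict

def spanning_tree_split(tree_edges, cut_edge):
    """Partition nodes into the two connected components obtained by
    removing `cut_edge` from a spanning tree given as an edge list."""
    adj = defaultdict(set)
    for u, v in tree_edges:
        adj[u].add(v)
        adj[v].add(u)
    u0, v0 = cut_edge
    adj[u0].discard(v0)
    adj[v0].discard(u0)
    # Collect the component containing u0 by depth-first search;
    # the remaining nodes form the other child.
    left, stack = set(), [u0]
    while stack:
        node = stack.pop()
        if node not in left:
            left.add(node)
            stack.extend(adj[node] - left)
    right = {n for e in tree_edges for n in e} - left
    return left, right

# Toy example: a chain 0-1-2-3-4; cutting edge (1, 2) yields the
# graph-respecting split {0, 1} vs. {2, 3, 4}.
edges = [(0, 1), (1, 2), (2, 3), (3, 4)]
print(spanning_tree_split(edges, (1, 2)))
```

The chain example mirrors the special case mentioned in the abstract, where the recursion runs over rooted directed spanning trees or chains; every split produced this way keeps both child regions connected, which an axis-parallel threshold on node coordinates need not do.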