Bayesian Smoothed Quantile Regression

Bingqi Liu, Kangqiang Li, Tianxiao Pang

Published: August 3, 2025

Abstract

Bayesian quantile regression (BQR) based on the asymmetric Laplace distribution (ALD) has two fundamental limitations: its posterior mean yields biased quantile estimates, and the non-differentiable check loss precludes gradient-based MCMC methods. We propose Bayesian smoothed quantile regression (BSQR), a principled reformulation that constructs a novel, continuously differentiable likelihood from a kernel-smoothed check loss, simultaneously ensuring a consistent posterior by aligning the inferential target with the smoothed objective and enabling efficient Hamiltonian Monte Carlo (HMC) sampling. Our theoretical analysis establishes posterior propriety for various priors and examines the impact of kernel choice. Simulations show BSQR reduces predictive check loss by up to 50% at extreme quantiles over ALD-based methods and improves MCMC efficiency by 20-40% in effective sample size. An application to financial risk during the COVID-19 era demonstrates superior tail risk modeling. The BSQR framework offers a theoretically grounded, computationally efficient solution to longstanding challenges in BQR, with uniform and triangular kernels emerging as highly effective.
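To make the core construction concrete, here is a minimal sketch of a kernel-smoothed check loss of the kind the abstract describes, using the uniform kernel (one of the two kernels the paper highlights). The function names, the bandwidth parameter `h`, and the closed-form expression below are illustrative assumptions derived from convolving Koenker's check loss with a uniform kernel on [-h, h]; the paper's exact construction may differ in details such as kernel normalization or bandwidth schedule.

```python
import numpy as np

def check_loss(u, tau):
    """Standard check (pinball) loss: rho_tau(u) = u * (tau - 1{u < 0}).
    Non-differentiable at u = 0, which blocks gradient-based MCMC."""
    u = np.asarray(u, dtype=float)
    return u * (tau - (u < 0))

def smoothed_check_loss(u, tau, h):
    """Check loss convolved with a uniform kernel on [-h, h] (illustrative):
        rho_h(u) = (1 / (2h)) * integral_{u-h}^{u+h} rho_tau(t) dt.
    The convolution has a closed form: it equals rho_tau(u) for |u| >= h
    and a quadratic blend for |u| < h, so the result is continuously
    differentiable everywhere and admits HMC gradients."""
    u = np.asarray(u, dtype=float)
    quadratic = (tau * (u + h) ** 2 + (1 - tau) * (u - h) ** 2) / (4 * h)
    return np.where(np.abs(u) >= h, check_loss(u, tau), quadratic)
```

A smoothed pseudo-likelihood for residuals `r` would then be proportional to `exp(-smoothed_check_loss(r, tau, h).sum())`; because the loss is C^1, samplers such as HMC/NUTS can use its gradient directly, which is the computational point the abstract makes.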
