Distributed estimation based on observations from different sources has drawn attention in modern statistical learning. In practice, because data collection can be expensive or time-consuming, the sample size at each local site may be small, while the dimension of the covariates is large and may far exceed the local sample size. In this article, we focus on distributed estimation and inference for a preconceived low-dimensional parameter vector in a high-dimensional quantile regression model with small local sample sizes. Specifically, we consider data that are inherently distributed and propose two communication-efficient estimators, generalizing the decorrelated score approach to overcome the slow convergence rate of the nuisance parameter estimates and adopting a smoothing technique within multiround algorithms. Risk bounds and limiting distributions of the proposed estimators are established. Their finite-sample performance is studied through simulations, and an application to a gene expression dataset is also presented.
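To illustrate the two ingredients named in the abstract, a minimal sketch follows: a convolution-smoothed quantile loss and one communication round in which each local site sends a gradient and the central site averages them. This is a generic illustration under assumed choices (a Gaussian-kernel smoothing of the check loss, plain gradient averaging, and the hypothetical names smoothed_quantile_grad and one_communication_round), not the authors' estimator or the decorrelated score construction.

```python
# Illustrative sketch only: generic smoothed quantile loss + one multiround
# communication step. All names and the Gaussian kernel are assumptions,
# not the paper's exact procedure.
import numpy as np
from scipy.stats import norm


def smoothed_quantile_grad(X, y, beta, tau, h):
    """Gradient of the Gaussian-kernel-smoothed quantile loss at beta.

    The smoothed check loss replaces the indicator 1{y - x'beta < 0} with
    the smooth surrogate Phi((x'beta - y) / h), where Phi is the standard
    normal CDF and h > 0 is the smoothing bandwidth.
    """
    residual = y - X @ beta
    weights = norm.cdf(-residual / h) - tau        # smoothed subgradient weights
    return X.T @ weights / len(y)


def one_communication_round(local_data, beta, tau, h, step_size):
    """One multiround step: sites send gradients, the center averages and updates."""
    grads = [smoothed_quantile_grad(X, y, beta, tau, h) for X, y in local_data]
    avg_grad = np.mean(grads, axis=0)              # single round of communication
    return beta - step_size * avg_grad


# Toy usage: a few sites, each with a small local sample size n.
rng = np.random.default_rng(0)
n, p, n_sites, tau = 50, 10, 5, 0.5
beta_true = np.zeros(p)
beta_true[:3] = 1.0
local_data = []
for _ in range(n_sites):
    X = rng.standard_normal((n, p))
    y = X @ beta_true + rng.standard_normal(n)
    local_data.append((X, y))

beta = np.zeros(p)
for _ in range(200):
    beta = one_communication_round(local_data, beta, tau, h=0.5, step_size=0.5)
print(np.round(beta[:5], 2))   # estimate should be close to (1, 1, 1, 0, 0)
```

In the paper's setting the local dimension can exceed the local sample size, so this naive averaging would be combined with regularized nuisance estimation and the decorrelated score correction; the sketch only conveys the smoothing and multiround-communication pattern.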