In this letter, we present a sharp algorithmic analysis of alternating projected gradient descent, which is used to solve the covariate-adjusted precision matrix estimation problem in high-dimensional settings. By introducing a new analytical tool (generic chaining), we remove the impractical resampling assumption used in the literature. The new analysis also demonstrates that this algorithm not only enjoys a linear convergence rate in the absence of convexity, but also attains the minimax rate with an optimal order of sample complexity. Our results further reveal a time-data tradeoff in this problem. Numerical experiments are provided to verify our theoretical results.
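For readers unfamiliar with the setup, the sketch below illustrates what an alternating projected gradient descent scheme for covariate-adjusted precision matrix estimation can look like. It is a minimal illustration, not the paper's exact method: the Gaussian linear model Y = X Gamma + E with rows of E having precision Omega, the negative log-likelihood objective, the step sizes eta_g and eta_o, the sparsity levels s_gamma and s_omega, and the hard-thresholding projection are all assumptions made here for concreteness; the paper's objective, projection sets, and tuning may differ.

```python
import numpy as np

def hard_threshold(M, s):
    """Keep the s largest-magnitude entries of M, zeroing the rest."""
    flat = np.abs(M).ravel()
    if s >= flat.size:
        return M.copy()
    cutoff = np.partition(flat, -s)[-s]
    return np.where(np.abs(M) >= cutoff, M, 0.0)

def alt_pgd(X, Y, s_gamma, s_omega, eta_g=0.1, eta_o=0.1, iters=200):
    """Alternating projected gradient descent (illustrative sketch) for a
    covariate-adjusted Gaussian model Y = X @ Gamma + E, where each row of E
    is N(0, inv(Omega)).

    Objective (negative log-likelihood up to constants), with R = Y - X @ Gamma:
        f(Gamma, Omega) = tr(R @ Omega @ R.T) / (2n) - 0.5 * logdet(Omega),
    minimized subject to hard sparsity constraints on Gamma and on the
    off-diagonal of Omega, enforced by hard-thresholding projections.
    Step sizes are assumed small enough to keep Omega well conditioned;
    this sketch does not explicitly enforce positive definiteness.
    """
    n, p = X.shape
    q = Y.shape[1]
    Gamma = np.zeros((p, q))
    Omega = np.eye(q)
    for _ in range(iters):
        # Gradient step in Gamma, then project onto the sparse set.
        R = Y - X @ Gamma
        grad_G = -(X.T @ R @ Omega) / n
        Gamma = hard_threshold(Gamma - eta_g * grad_G, s_gamma)
        # Gradient step in Omega, then symmetrize and sparsify off-diagonals.
        R = Y - X @ Gamma
        grad_O = (R.T @ R) / (2 * n) - 0.5 * np.linalg.inv(Omega)
        Omega = Omega - eta_o * grad_O
        Omega = 0.5 * (Omega + Omega.T)
        off_diag = Omega - np.diag(np.diag(Omega))
        Omega = np.diag(np.diag(Omega)) + hard_threshold(off_diag, s_omega)
    return Gamma, Omega
```

The alternating structure (a projected gradient step on Gamma with Omega fixed, then on Omega with Gamma fixed) is the nonconvex iteration whose linear convergence and sample complexity the letter analyzes; the resampling assumption removed by the new analysis refers to schemes that require a fresh data batch at each iteration to decouple the iterates from the noise.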