
Estimating Bayesian Diagnostic Models with Attribute Hierarchies with the Hamiltonian-Gibbs Hybrid Sampler

Bayesian estimation of diagnostic models (B-DMs) has grown in popularity in recent years. Common estimation approaches for B-DMs include Gibbs sampling (GS), Metropolis–Hastings, and Hamiltonian Monte Carlo (HMC). The latter has been lauded for its superior performance and computational efficiency over Metropolis-based samplers. However, HMC-based samplers rely on Hamiltonian dynamics, which are defined only for continuous variables (e.g., Betancourt, 2019). Thus, HMC-based samplers cannot directly incorporate discrete/categorical variables (i.e., attribute parameters) and instead circumvent this issue by marginalizing over the categorical space at each step of the chain. Marginalization over the categorical space introduces computational complexity, particularly when the categorical variables form attribute hierarchies.

In this study, we built a synthesis of HMC and the Gibbs sampler for estimating B-DMs. The Gibbs sampler is well suited to sampling categorical parameters because only the full conditional distribution of the attributes is needed, and it can be obtained in closed form. Our approach, the Hamiltonian-Gibbs (HG) hybrid sampler, partitions the parameter space into continuous and discrete blocks and uses HMC to update the continuous parameters (i.e., item parameters) and GS to update the discrete parameters (i.e., attributes).

We examine the utility of the proposed algorithm by assessing parameter recovery across multiple attribute hierarchy configurations and by comparing the proposed sampler with Gibbs-only sampling with respect to (1) chain autocorrelation, (2) parameter recovery, and (3) posterior SDs of the parameters. We addressed our research questions via a simulation study in which we varied the following conditions: (A) sample size (250, 500), (B) attribute structure (no hierarchy, linear hierarchy, divergent hierarchy), and (C) attribute correlation (0.1, 0.7).
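The block-partitioned scheme described above can be sketched in a few lines. The following is a minimal, hypothetical Python illustration — not the authors' implementation — using a toy one-attribute logistic item model: the continuous item parameters are updated with a leapfrog HMC step, and each person's binary attribute is drawn from its closed-form full conditional (a monotonicity constraint on the main effect is assumed here to identify the attribute labels).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: one binary attribute per person, logistic item model
# (illustrative only; not the model, priors, or code from the study).
N, J = 50, 5
alpha_true = rng.integers(0, 2, N)                 # latent mastery
beta_true = np.array([-1.0, 2.0])                  # intercept, main effect
eta = beta_true[0] + beta_true[1] * alpha_true[:, None]
Y = (rng.random((N, J)) < 1 / (1 + np.exp(-eta))).astype(float)

def log_post(beta, alpha):
    """Log posterior of the continuous block (main effect constrained >= 0)."""
    if beta[1] < 0:
        return -np.inf
    eta = beta[0] + beta[1] * alpha[:, None]
    ll = np.sum(Y * eta - np.logaddexp(0.0, eta))
    return ll - 0.5 * np.sum(beta**2) / 10.0       # weak N(0, 10) prior

def grad(beta, alpha):
    """Gradient of the log posterior w.r.t. the continuous parameters."""
    eta = beta[0] + beta[1] * alpha[:, None]
    r = Y - 1 / (1 + np.exp(-eta))                 # Bernoulli residuals
    return np.array([r.sum(), (r * alpha[:, None]).sum()]) - beta / 10.0

def hmc_step(beta, alpha, eps=0.05, L=10):
    """One leapfrog HMC update of the continuous item parameters."""
    p0 = rng.standard_normal(2)
    b, p = beta.copy(), p0 + 0.5 * eps * grad(beta, alpha)
    for i in range(L):
        b = b + eps * p
        p = p + (eps if i < L - 1 else 0.5 * eps) * grad(b, alpha)
    log_a = log_post(b, alpha) - log_post(beta, alpha) - 0.5 * (p @ p - p0 @ p0)
    return b if np.log(rng.random()) < log_a else beta

def gibbs_step(beta):
    """Closed-form full-conditional (Bernoulli) update of the attributes."""
    ll0 = (Y * beta[0] - np.logaddexp(0.0, beta[0])).sum(axis=1)
    e1 = beta[0] + beta[1]
    ll1 = (Y * e1 - np.logaddexp(0.0, e1)).sum(axis=1)
    p1 = 1 / (1 + np.exp(ll0 - ll1))               # uniform prior on mastery
    return (rng.random(N) < p1).astype(int)

# Hamiltonian-Gibbs loop: alternate the two blocks each iteration.
beta, alpha = np.array([0.0, 0.5]), rng.integers(0, 2, N)
draws = []
for it in range(600):
    beta = hmc_step(beta, alpha)
    alpha = gibbs_step(beta)
    draws.append(beta.copy())
post_mean = np.mean(draws[300:], axis=0)           # discard warm-up
```

Because the discrete attributes are sampled directly rather than marginalized out, no enumeration over the attribute space is needed inside the HMC step; the real sampler extends this idea to multiple attributes, hierarchies, and larger item parameter blocks.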
Item response data were generated according to a 45-item assessment measuring three attributes. A balanced Q-matrix was used, with each item associated with a maximum of two attributes. Item intercept and main-effect parameters were randomly generated from U(−1.5, −0.3) and U(0.3, 1.5) distributions, respectively, and interaction-effect parameters from a U(0, 1) distribution. Parameter recovery of the HG sampler was evaluated via correlations between true parameter values and the corresponding estimates, bias, and root mean square error (RMSE). We compared the results from HG with Gibbs-only sampling as implemented in the JAGS package. We also compared posterior means and 95% highest posterior density intervals (HPDIs) from the HG and Gibbs-only samplers on an empirical dataset.

Overall, we found that the HG sampler performed adequately with respect to parameter recovery according to bias, RMSE, and correlations, with comparable recovery performance across the different attribute specifications. Posterior mean estimates from HG were similar to those obtained with Gibbs-only sampling; however, the 95% HPDIs of the former were generally narrower than those of the latter, indicating higher-quality sampling (see Figure 1 for results from the empirical application). In general, our preliminary results suggest that an HG sampler may be a viable alternative to existing estimation methods for B-DMs.
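For concreteness, the recovery criteria used above — bias, RMSE, and the true–estimate correlation — can be computed with a short helper. The function name and interface below are our own illustration, not code from the study:

```python
import numpy as np

def recovery_stats(true_vals, est_vals):
    """Bias, RMSE, and correlation between true parameters and estimates."""
    true_vals = np.asarray(true_vals, dtype=float)
    est_vals = np.asarray(est_vals, dtype=float)
    err = est_vals - true_vals
    return {
        "bias": float(err.mean()),                       # mean signed error
        "rmse": float(np.sqrt((err**2).mean())),         # root mean square error
        "cor": float(np.corrcoef(true_vals, est_vals)[0, 1]),
    }

# Example with made-up values: estimates shifted up by a constant 0.1,
# so bias and RMSE are both 0.1 and the correlation is 1.
stats = recovery_stats([-1.2, -0.8, 0.5, 1.1], [-1.1, -0.7, 0.6, 1.2])
```

In a simulation study these three summaries are complementary: bias detects systematic over- or under-estimation, RMSE captures total error, and the correlation reflects how well the ordering of parameter values is preserved.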

Keywords: diagnostic models; sampler; parameter recovery; Gibbs

Journal Title: Multivariate Behavioral Research
Year Published: 2023

