Augmented Reality (AR) has been applied to mechanical assembly support. However, reported research on AR assembly support systems (ARASS) has paid little attention to the interaction between operators' hands and virtual objects, especially the occlusion between them. Correct occlusion makes an ARASS more immersive and thereby improves the operator's interaction experience. To address this issue, this paper presents a bare-hand depth perception method designed to improve the interaction experience in ARASS. The method is built around an interaction scene designed for AR mechanical assembly support and consists of two components: hand segmentation and hand mesh generation. The hand segmentation step extracts the operator's hand regions from the depth image of the scene and divides each hand region into several sub-regions that provide hand information. The hand surface mesh generation step then builds hand surface meshes in 3D space from the segmentation results; these meshes are used to resolve the occlusion of the hand region in the AR scene. The results verify that the bare-hand depth perception method handles occlusion between the operator's hands and virtual objects correctly in real time and recognizes hand information within a limited workspace. The method improves the operator's depth perception of the bare hand and makes the ARASS more immersive.
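The occlusion handling the abstract describes amounts to a per-pixel depth test: a virtual object should be drawn only where it is closer to the camera than the real hand surface recovered from the depth image. The abstract does not give the implementation, so the following is a minimal illustrative sketch, assuming a registered real-scene depth map and a virtual-object depth buffer in the same camera frame (all function and variable names are hypothetical):

```python
import numpy as np

def composite_with_occlusion(camera_rgb, camera_depth, virtual_rgb, virtual_depth):
    """Illustrative per-pixel occlusion test (not the paper's exact method).

    camera_rgb / camera_depth: real camera image and its depth map (0 = no depth).
    virtual_rgb / virtual_depth: rendered virtual object and its depth buffer
        (0 = no virtual content at that pixel), registered to the camera frame.
    Returns the composited image: virtual pixels appear only where the virtual
    object is in front of the real surface (e.g. the operator's hand).
    """
    virtual_visible = (virtual_depth > 0) & (
        (camera_depth == 0) | (virtual_depth < camera_depth)
    )
    out = camera_rgb.copy()
    out[virtual_visible] = virtual_rgb[virtual_visible]
    return out
```

Where the real depth is closer (the hand in front of the virtual part), the camera pixel is kept, so the hand correctly occludes the virtual object; the paper's hand meshes would supply a cleaner, hole-free depth surface than the raw sensor depth used in this sketch.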