Synthesizing realistic human motion data using a real-time motion capture system in a controlled environment is a critical challenge. In addition, effectively manipulating existing motion data is another primary concern, and using such modified data in human motion analysis and activity recognition systems is prone to errors. This paper presents a simple yet comprehensive system to effortlessly author, edit, and validate human motion data. The system enables a novice user to interactively edit existing motion data on a humanoid model in three-dimensional space according to user-defined scenarios, and to synthesize numerous variations of the motion sequences. A modular concept of scenario-based sensed unit motion editing has been adopted to demonstrate the proposed system. We employed an efficient analytical kinematic and constraint solver to enforce the inherent body joint limits and external constraints during editing, so that complete and meaningful motion sequences are synthesized. Furthermore, we substantiated the proposed sensed unit motion editing framework through a visual validation study using Motion-Sphere, an open-source, intuitive visualization tool. Finally, we compared the resulting synthesized motion against the real-time motion capture data to verify the deviations in body-segment orientation and position accuracy.
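To make the two technical steps in the abstract more concrete, the sketch below illustrates (a) clamping edited joint rotations to anatomical limits, in the spirit of the joint-limit enforcement described above, and (b) simple orientation and position deviation metrics of the kind used when comparing synthesized motion against motion capture data. It is a minimal illustration only: the function names, per-joint limit values, Euler-angle layout, and quaternion convention are assumptions, not the paper's actual analytical solver or evaluation protocol.

```python
# Hypothetical sketch: per-joint angle limits and simple deviation metrics.
# All names, limit values, and data layouts are illustrative assumptions,
# not the paper's actual solver or data format.
import numpy as np

# Assumed per-joint Euler-angle limits in degrees: (min, max) per axis.
JOINT_LIMITS_DEG = {
    "right_elbow": {"x": (0.0, 150.0), "y": (-10.0, 10.0), "z": (-90.0, 90.0)},
    "left_knee":   {"x": (0.0, 160.0), "y": (-5.0, 5.0),   "z": (-5.0, 5.0)},
}

def clamp_joint_angles(joint_name, euler_deg):
    """Clamp an edited joint rotation (x, y, z Euler angles, degrees)
    to the joint's assumed anatomical range."""
    limits = JOINT_LIMITS_DEG[joint_name]
    return np.array([
        np.clip(euler_deg[i], *limits[axis])
        for i, axis in enumerate(("x", "y", "z"))
    ])

def orientation_deviation_deg(q_synth, q_mocap):
    """Angle (degrees) between two unit quaternions [w, x, y, z]; a stand-in
    for the body-segment orientation accuracy comparison."""
    q_synth = q_synth / np.linalg.norm(q_synth)
    q_mocap = q_mocap / np.linalg.norm(q_mocap)
    dot = abs(np.dot(q_synth, q_mocap))  # abs() handles quaternion double cover
    return np.degrees(2.0 * np.arccos(np.clip(dot, -1.0, 1.0)))

def position_deviation(p_synth, p_mocap):
    """Euclidean distance between synthesized and captured segment positions."""
    return float(np.linalg.norm(np.asarray(p_synth) - np.asarray(p_mocap)))

if __name__ == "__main__":
    edited = clamp_joint_angles("right_elbow", np.array([170.0, 0.0, 20.0]))
    print("clamped elbow angles:", edited)  # flexion limited to 150 degrees
    print("orientation deviation (deg):",
          orientation_deviation_deg(np.array([1.0, 0.0, 0.0, 0.05]),
                                    np.array([1.0, 0.0, 0.0, 0.0])))
```

In practice, the clamping step would run inside the interactive editing loop (after each user manipulation of the humanoid model), while the deviation metrics would be accumulated per frame and per body segment when validating the synthesized sequences against the motion capture recordings.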