Although many studies have attempted to detect the hand postures of a mobile device to utilize these postures as a user interface, they either require additional hardware or can differentiate only a limited number of grips, and only when there is a touch event on the mobile device's screen. In this paper, we propose a novel grip sensing system, called SmartGrip, which allows a mobile device to detect different hand postures without any additional hardware or any screen touch event. SmartGrip emits carefully designed sound signals and differentiates the propagated signals distorted by different user grips. To achieve this, we analyze how a sound signal propagates from the speaker to the microphone of a mobile device and then address three key challenges: sound structure design, volume control, and feature extraction and classification. We implement and evaluate SmartGrip on three Android mobile devices. With six representative grips, SmartGrip achieves 93.1% average accuracy across ten users in an office environment. We also demonstrate that SmartGrip operates with 83.5% to 98.3% accuracy in six different (noisy) locations. To further demonstrate the feasibility of SmartGrip as a user interface, we develop an Android application built on SmartGrip, validating its practical use.
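The abstract does not specify the probe signal, the feature set, or the classifier, so the following is only a minimal Python sketch of the general pipeline it describes: emit a sound probe, capture it through the microphone, extract spectral features from the grip-distorted recording, and classify the grip. The near-ultrasonic multi-tone probe, FFT log-magnitude features, and SVM classifier are all assumptions, and real microphone capture is replaced here by a toy per-grip channel simulation.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 48_000          # common Android audio sampling rate (assumption)
PROBE_LEN = 2048     # samples per probe emission (assumption)

def make_probe(freqs=(18_000, 19_000, 20_000)):
    """Hypothetical multi-tone probe in the near-ultrasonic band."""
    t = np.arange(PROBE_LEN) / FS
    return sum(np.sin(2 * np.pi * f * t) for f in freqs)

def extract_features(recording):
    """Log-magnitude spectrum of the received probe (assumed feature set)."""
    spectrum = np.abs(np.fft.rfft(recording * np.hanning(len(recording))))
    return np.log1p(spectrum)

# Simulated data: each grip attenuates the probe differently (toy channel
# model standing in for real speaker-to-microphone propagation).
rng = np.random.default_rng(0)
probe = make_probe()
X, y = [], []
for grip in range(6):                      # six representative grips
    attenuation = 0.5 + 0.08 * grip        # toy per-grip channel effect
    for _ in range(40):
        received = attenuation * probe + 0.05 * rng.standard_normal(PROBE_LEN)
        X.append(extract_features(received))
        y.append(grip)

clf = SVC(kernel="rbf")
scores = cross_val_score(clf, np.array(X), np.array(y), cv=5)
print(f"cross-validated accuracy: {scores.mean():.2%}")
```

On a real device, the probe would be played through the loudspeaker while the microphone records, and the classifier would be trained on labeled recordings per grip; the simulation above only illustrates how grip-dependent channel distortion makes the received signals separable.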