Attention is a state in which a person concentrates on a specific task while ignoring other perceivable information. Numerous methods of attention level detection, such as observation, self-assessment, and objective performance, have been applied, especially for labeling in supervised machine learning. However, those methods tend to be delayed, sporadic, not captured at the moment in time, and dependent on participants' cognitive ability. This study proposed a new labeling method for attention level detection using a quantitative evaluation formula based on blink rate and pupillometry. A comparison of detection error between self-assessment, observation, and objective performance was conducted. This study then investigated the effect of attention level, determined by self-assessment, on blink rate and pupillometry. The results showed that the blink rate at low attention is higher than at high attention, whereas the pupil diameter at low attention is smaller than at high attention. The effect of attention level on pupillometry and blink rate was formulated into several algorithms. The experimental results showed that the quantitative evaluation formula has a percentage error of less than 15% compared with self-assessment. Overall, these results demonstrate that the proposed method can be used for data labeling of other physiological signals such as electroencephalography (EEG), electrocardiography (ECG), and near-infrared spectroscopy (NIRS). The quantitative formula was then applied to EEG–ECG–NIRS data for attention level detection. A two-electrode wireless EEG, a wireless ECG, and a two-channel wireless NIRS were used to detect attention level during task load. The resulting system achieved an accuracy of 82.31%.
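The abstract does not give the quantitative evaluation formula itself, but the stated relationships (higher blink rate and smaller pupil diameter at low attention) can be sketched as a simple scoring rule. The function below is a minimal illustration under assumed reference values (`blink_ref`, `pupil_ref`) and an equal weighting of the two cues; it is not the authors' actual formula.

```python
def attention_score(blink_rate_per_min: float, pupil_diameter_mm: float,
                    blink_ref: float = 20.0, pupil_ref: float = 4.0) -> float:
    """Combine the two cues into a 0..1 attention score.

    Lower blink rate and larger pupil diameter -> higher score,
    matching the direction of the effects reported in the abstract.
    Reference values and weights are illustrative assumptions.
    """
    blink_term = max(0.0, 1.0 - blink_rate_per_min / (2 * blink_ref))
    pupil_term = min(1.0, pupil_diameter_mm / (2 * pupil_ref))
    return 0.5 * blink_term + 0.5 * pupil_term


def label_attention(score: float, threshold: float = 0.5) -> str:
    """Binarize the score into a 'high'/'low' attention label."""
    return "high" if score >= threshold else "low"


# Illustrative use as a labeling function for other physiological data:
# low blink rate + large pupil -> "high"; high blink rate + small pupil -> "low".
print(label_attention(attention_score(10.0, 6.0)))  # prints "high"
print(label_attention(attention_score(35.0, 2.0)))  # prints "low"
```

Labels produced this way could then serve as training targets for classifiers over EEG, ECG, or NIRS features, which is the role the abstract describes for the proposed formula.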