From smart work scheduling to optimal drug timing, there is enormous potential in translating circadian rhythms research into real-world precision medicine. Realizing this potential, however, requires the ability to accurately estimate circadian phase outside the laboratory. One approach is to predict circadian phase non-invasively using light and activity measurements together with mathematical models of the human circadian clock. Most such models take light as an input and predict its effect on the human circadian system. However, the consumer-grade wearables already owned by millions of individuals record activity rather than light, which prompts an evaluation of how accurately circadian phase can be predicted from motion alone. Here, we evaluate the ability of four different models of the human circadian clock to estimate circadian phase from data acquired by wrist-worn wearable devices. To assess generalizability, we used multiple datasets spanning populations with varying degrees of circadian disruption. Although the models we test yield similar predictions, analysis of data from 27 shift workers with high levels of circadian disruption shows that, when processed by mathematical models, activity, which is recorded by almost every wearable device, predicts circadian phase better than light levels measured by wrist-worn devices. In individuals living under normal conditions, circadian phase can typically be predicted to within 1 hour, even with data from a widely available commercial device (the Apple Watch). These results show that circadian phase can be predicted from data already collected passively by millions of individuals, with accuracy comparable to far more invasive and expensive methods.
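The abstract does not reproduce the model equations, but the general pipeline it describes can be illustrated with a minimal sketch: map minute-level wrist activity counts to a unitless driving signal, integrate a clock model forward in time, and read out a reference phase on the final day. The sketch below uses a toy phase-only oscillator, not any of the four models evaluated in the paper; the saturating activity map, the coupling constant `K_DRIVE`, and all parameter values are hypothetical placeholders chosen for illustration.

```python
"""Toy sketch of wearable-driven circadian phase prediction.

Illustrative only: a phase-only oscillator with assumed parameters,
standing in for the full limit-cycle models evaluated in the paper.
"""
import numpy as np

# --- Hypothetical constants (assumptions, not published values) ---
TAU_H = 24.2          # assumed intrinsic period of the human clock, hours
OMEGA = 2 * np.pi / TAU_H
K_DRIVE = 0.08        # assumed coupling strength of the activity drive
DT_H = 1.0 / 60.0     # integration step: one minute, in hours


def activity_to_drive(counts, half_max=200.0):
    """Saturating map from raw activity counts to a unitless drive in [0, 1).

    A stand-in for the light-response functions real models use; the
    half-saturation constant is an assumption.
    """
    counts = np.asarray(counts, dtype=float)
    return counts / (counts + half_max)


def integrate_phase(counts, phi0=0.0):
    """Euler-integrate dphi/dt = OMEGA + K_DRIVE * u(t) * sin(phi).

    `counts` is a minute-by-minute activity trace; returns the phase
    trajectory in radians, wrapped to [0, 2*pi).
    """
    drive = activity_to_drive(counts)
    phi = np.empty(len(drive) + 1)
    phi[0] = phi0
    for i, u in enumerate(drive):
        dphi = OMEGA + K_DRIVE * u * np.sin(phi[i])
        phi[i + 1] = (phi[i] + dphi * DT_H) % (2 * np.pi)
    return phi


if __name__ == "__main__":
    # Fake two weeks of minute-level activity: active 08:00-23:00, rest otherwise.
    minutes = np.arange(14 * 24 * 60)
    hour_of_day = (minutes / 60.0) % 24
    counts = np.where((hour_of_day >= 8) & (hour_of_day < 23),
                      300.0 * np.random.default_rng(0).random(minutes.size),
                      0.0)
    phi = integrate_phase(counts)
    # Report when the reference phase (phi == 0) falls on the final day,
    # a crude analogue of circadian markers such as CBTmin or DLMO.
    last_day = phi[-24 * 60:]
    crossing = np.argmin(np.abs(np.angle(np.exp(1j * last_day))))
    print(f"Reference phase occurs ~{crossing / 60.0:.1f} h into the final day")
```

In a real evaluation, the predicted marker time would be compared against a gold-standard phase measurement such as dim-light melatonin onset, which is how prediction errors like the roughly 1-hour accuracy reported above are quantified.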