Assessment of Fatigue Using Wearable Sensors: A Pilot Study - 2020 Luo et al

Sly Saint

Senior Member (Voting Rights)
Abstract
Background: Fatigue is a broad, multifactorial concept encompassing feelings of reduced physical and mental energy levels. Fatigue strongly impacts patient health-related quality of life across a huge range of conditions, yet, to date, tools available to understand fatigue are severely limited.

Methods:
After using a recurrent neural network-based algorithm to impute missing time series data from a multisensor wearable device, we compared supervised and unsupervised machine learning approaches to gain insights on the relationship between self-reported non-pathological fatigue and multimodal sensor data.
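(The abstract only describes this step as a "recurrent neural network-based algorithm", so purely as an illustration, here is a minimal sketch of one way such an imputer could work: a one-step-ahead GRU forecaster, PyTorch assumed, trained with a masked loss on the observed values, whose predictions fill the gaps. This is a generic sketch, not the authors' model.)

```python
# Generic sketch of RNN-based imputation for multivariate sensor series.
# Missing entries are zero-filled on input; only observed targets contribute to the loss.
import torch
import torch.nn as nn

class GRUImputer(nn.Module):
    def __init__(self, n_channels: int, hidden_size: int = 64):
        super().__init__()
        self.gru = nn.GRU(n_channels, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, n_channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, channels) -> one-step-ahead predictions
        h, _ = self.gru(x)
        return self.head(h)

def train_and_impute(x: torch.Tensor, mask: torch.Tensor, epochs: int = 200) -> torch.Tensor:
    """x: series with missing values zero-filled; mask: 1 where a value was observed."""
    model = GRUImputer(x.shape[-1])
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    inputs, targets, tmask = x[:, :-1], x[:, 1:], mask[:, 1:]
    for _ in range(epochs):
        opt.zero_grad()
        pred = model(inputs)
        # masked MSE: ignore positions where the target itself is missing
        loss = ((pred - targets) ** 2 * tmask).sum() / tmask.sum().clamp(min=1)
        loss.backward()
        opt.step()
    with torch.no_grad():
        pred = model(inputs)
    # keep observed values, fill missing ones with the model's prediction
    imputed = x.clone()
    imputed[:, 1:] = torch.where(tmask.bool(), targets, pred)
    return imputed

# Example with fake data: 4 recording days, 96 time steps, 6 sensor channels
x = torch.randn(4, 96, 6)
mask = (torch.rand_like(x) > 0.2).float()   # 1 where a value was actually recorded
filled = train_and_impute(x * mask, mask, epochs=50)
```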

Results:
A total of 27 healthy subjects and 405 recording days were analyzed. Recorded data included continuous multimodal wearable sensor time series on physical activity, vital signs, and other physiological parameters, and daily questionnaires on fatigue. The best results were obtained when using the causal convolutional neural network model for unsupervised representation learning of multivariate sensor data, and random forest as a classifier trained on subject-reported physical fatigue labels (weighted precision of 0.70 ± 0.03 and recall of 0.73 ± 0.03). When using manually engineered features on sensor data to train our random forest (weighted precision of 0.70 ± 0.05 and recall of 0.72 ± 0.01), both physical activity (energy expenditure, activity counts, and steps) and vital signs (heart rate, heart rate variability, and respiratory rate) were important parameters to measure. Furthermore, vital signs contributed the most as top features for predicting mental fatigue compared to physical ones. These results support the idea that fatigue is a highly multimodal concept. Analysis of clusters from sensor data highlighted a digital phenotype indicating the presence of fatigue (95% of observations) characterized by a high intensity of physical activity. Mental fatigue followed similar trends but was less predictable. Potential future directions could focus on anomaly detection assuming longer individual monitoring periods.
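(The reported numbers, weighted precision 0.70 ± 0.03 and recall 0.73 ± 0.03, come from a random forest trained on daily self-reported fatigue labels. As a rough illustration of that evaluation step only, here is a scikit-learn sketch with placeholder data and hypothetical feature names; subject-wise splits are assumed because each person contributes many recording days. It is not the authors' code or pipeline.)

```python
# Illustrative sketch: random forest on manually engineered per-day sensor features,
# scored with weighted precision and recall as reported in the abstract.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold, cross_validate

rng = np.random.default_rng(0)

# Placeholder data: 405 recording days x 6 engineered features
# (e.g. energy expenditure, activity counts, steps, heart rate, HRV, respiratory rate)
X = rng.normal(size=(405, 6))
y = rng.integers(0, 2, size=405)          # daily self-reported physical fatigue label
groups = rng.integers(0, 27, size=405)    # subject ID, so one person's days stay together

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_validate(
    clf, X, y,
    groups=groups,
    cv=GroupKFold(n_splits=5),            # subject-wise folds avoid leakage across subjects
    scoring=["precision_weighted", "recall_weighted"],
)
print("weighted precision: %.2f ± %.2f" % (scores["test_precision_weighted"].mean(),
                                           scores["test_precision_weighted"].std()))
print("weighted recall:    %.2f ± %.2f" % (scores["test_recall_weighted"].mean(),
                                           scores["test_recall_weighted"].std()))

# Feature importances give the kind of ranking the abstract refers to when it says
# both physical activity and vital-sign parameters mattered.
clf.fit(X, y)
print(dict(zip(
    ["energy_expenditure", "activity_counts", "steps", "heart_rate", "hrv", "resp_rate"],
    np.round(clf.feature_importances_, 3))))
```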

Conclusion:
Taken together, these results are the first demonstration that multimodal digital data can be used to inform, quantify, and augment subjectively captured non-pathological fatigue measures.

full text:
https://www.karger.com/Article/FullText/512166
 
I suppose it's a start. Better data might be obtained by monitoring eye movements, body movements, stance, and other such involuntary factors. My guess is that even when reading (holding font, lighting, etc. constant, and material similar), there will be a measurable difference in how long the eye lingers on a word, how quickly the person turns the page, and so forth. Maybe pupil diameter fluctuations change with fatigue levels too.

My mental lethargy increases abruptly 20 minutes after eating a meal of quickly-digested carbs (I assume due to increased cerebral TRP -> KYN conversion). I can be reading a book, and suddenly my eyes stop tracking the words properly and I lose track of the story. I expect that will show up as changes in eye movement or pupil diameter, and maybe in finger twitch rate or some other involuntary muscle measurements.
 