Idea for machine learning model to track fatigue accurately

Discussion in 'Monitoring and pacing' started by forestglip, May 20, 2024.

  1. forestglip

    forestglip Senior Member (Voting Rights)

    Messages:
    371
    Up to now, I haven't really come up with any good ways to accurately track my fatigue, anxiety, and general well-being. I've noticed that my face feels, and probably looks, a lot more sad, expressionless, or numb when I am fatigued or in a crash. So I had an idea: train a machine learning model to track fatigue using pictures of my face.

    I wrote a blog with details if anyone is interested.

    alktipping, rvallee, shak8 and 5 others like this.
  2. Peter Trewhitt

    Peter Trewhitt Senior Member (Voting Rights)

    Messages:
    3,813
    A very interesting idea. I have not yet looked at your blog discussing it, but will come back to this later.

    When I was still working, one colleague was pretty good at spotting when I should be stopping. She said there was a point when ‘my eyes glazed over’ and in my verbal responses ‘I wasn’t quite there’, and at that point she knew not to expect that we would complete the task in hand. Although subjectively I think I understand what she meant, I’m not sure how that could be translated into measurable markers. Perhaps eye movement, perhaps time lag in responding verbally.

    One thing I have noticed with my cognitive deterioration over the ME decades, rather than when going in and out of PEM, is that I am shifting from sequential logical-deductive debate to free associating on individual aspects of what the other person says. Indeed this comment is an example of that: it is not a reasoned response to @forestglip ’s suggestion of using AI facial analysis, but rather contains related ideas it triggered in me. It may be that the same changes in logical thinking occur during PEM, but I am not enough on the ball to notice them against the background noise of other symptoms worsening. Could AI be trained to notice changes in an individual’s logical processing/reasoning ability?

    Added Note - another free association is that my digit span reduces dramatically when in PEM and when in an ME crash. This is telephone-number memory: the ability to immediately repeat a string of digits, which is what we use to go from a number in a phone directory to dialling it. The norm is repeating seven digits, plus or minus two. Pre-morbidly I could happily repeat eight or nine digits (as a former psychology student I was very aware of this); now I am generally only reliable with four or five, but in PEM it can go down to one or two, so that I can only dial a written phone number one digit at a time whilst marking the current digit physically with my finger.

    I am very aware of this at present, as I am currently in a battle with my mobile provider, who are failing to transfer my service from an ancient mobile to a new smart phone. Lots of texted security codes, passwords and PINs in numerous phone calls to automated systems and unhelpful customer service operatives, as well as now having to manually copy my stored numbers from the old phone to the new one. So I am having to consciously use my digit span in ongoing PEM-inducing activity.

    It would not be too complex, I imagine, to create an app to measure your current digit span, though having to keep breaking off to do so would be disruptive and could itself contribute to triggering PEM.
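
    A minimal sketch of such a test in Python might look something like this (the sequence lengths, display time, and stopping rule are arbitrary choices, not a validated protocol):

        import random
        import time

        def digit_span_test(start_len=3, max_len=10, display_secs=2.0):
            # Show progressively longer digit strings until two failures in a
            # row at the same length; report the longest string repeated correctly.
            span = 0
            fails = 0
            length = start_len
            while length <= max_len and fails < 2:
                digits = "".join(random.choice("0123456789") for _ in range(length))
                print(f"Memorise: {digits}")
                time.sleep(display_secs)
                print("\n" * 40)  # crude screen "clear"
                answer = input("Type the digits: ").strip()
                if answer == digits:
                    span = length
                    length += 1
                    fails = 0
                else:
                    fails += 1
            print(f"Estimated digit span: {span}")

        if __name__ == "__main__":
            digit_span_test()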
     
    Last edited: May 20, 2024
    alktipping, shak8, MeSci and 3 others like this.
  3. forestglip

    forestglip Senior Member (Voting Rights)

    Messages:
    371
    The cool thing with AI is you don't need to know the exact differences that are relevant. If you give it two sets of photos (in reality hundreds or thousands each) and teach it that one set is you pre-PEM, or pre-"eyes glazed over", and the other is after, it will figure out the differences on its own and be able to tell you what state you're in from a brand new photo. Unfortunately, we're not able to see clearly what exact differences the AI learned to use in the photos - working that out is a developing field called explainable AI. Though we can currently at least make it roughly highlight the parts of an image that were most important.
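
    As a rough sketch of what that training could look like (assuming Python with PyTorch/torchvision, and photos sorted into "before" and "after" folders - the layout, network size, and training settings are illustrative guesses, not the blog's actual code):

        import torch
        from torch import nn
        from torchvision import datasets, transforms

        # Hypothetical layout: photos/before/ and photos/after/
        tfm = transforms.Compose([transforms.Resize((128, 128)), transforms.ToTensor()])
        data = datasets.ImageFolder("photos", transform=tfm)
        loader = torch.utils.data.DataLoader(data, batch_size=16, shuffle=True)

        # A small convolutional network: it learns whatever pixel-level
        # differences separate the two folders, without being told what they are.
        model = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(4),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(4),
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, 2),  # two outputs: "before" vs "after"
        )

        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.CrossEntropyLoss()
        for epoch in range(10):
            for images, labels in loader:
                opt.zero_grad()
                loss = loss_fn(model(images), labels)
                loss.backward()
                opt.step()

    Once trained, calling the model on a brand new photo gives two scores, one per state, and the larger one is its guess. Techniques like Grad-CAM can then roughly highlight which image regions drove that guess.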

    The above would require that the "eyes glazed over" state could be caught in a momentary photo. If it's something that happens across time, like eye movements, other techniques could be used, like models that use videos instead of photos.

    Sure, it's just about creatively deciding how to build a training dataset. If you want it to look at text and tell you how "logical" it is, you'd first have to show it lots of other texts, all accurately labeled as "logical" or "not logical", or possibly on a 1-to-10 scale of "logicalness".

    Or if you want it to detect whether you are feeling worse based on how you write, it'd be a similar technique to the face photo method above: you'd label each text with something representing how you feel, like a number from 1 to 10.

    Though if the changes are very gradual, it'd be tough to label well. Maybe if you had a lot of text from years ago, before you were this unwell, and a lot from these days, when you're definitely feeling significantly worse, that might work.
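
    A minimal sketch of the text version (assuming scikit-learn; the example texts and ratings are made up purely for illustration):

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import Ridge
        from sklearn.pipeline import make_pipeline

        # Hypothetical data: one writing sample per entry, each paired with
        # a 1-10 rating of how the author felt at the time.
        texts = ["Felt okay today, managed a short walk.",
                 "Can't think straight, everything is heavy."]
        ratings = [6, 2]

        # Bag-of-words features plus linear regression: crude, but enough to
        # test whether word choice carries any signal about how you feel.
        model = make_pipeline(TfidfVectorizer(), Ridge())
        model.fit(texts, ratings)

        print(model.predict(["Another foggy, heavy-limbed morning."]))

    In practice you'd want far more than two samples, plus held-out data to check the predictions aren't just memorisation.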
     
  4. Trish

    Trish Moderator Staff Member

    Messages:
    53,394
    Location:
    UK
    Last edited: May 20, 2024
    alktipping, Mij, MeSci and 3 others like this.
  5. Yann04

    Yann04 Senior Member (Voting Rights)

    Messages:
    534
    Location:
    Switzerland (Romandie)
    @forestglip

    Instead of making a machine learning model from your face and manually labelling photos, which would take large amounts of energy and time and would only be usable for you, you could use one of the many open-source emotion detection ML models. You would run the model a few hundred times while rating your fatigue, then do statistics correlating the detected emotions with your rated fatigue level. Then you could write a much simpler script to “translate” those emotions into a fatigue score.
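
    For example, a sketch of that pipeline using the open-source DeepFace library (the file layout and ratings are invented, and the exact return format of DeepFace varies a little between versions):

        import glob
        from deepface import DeepFace
        from scipy.stats import pearsonr

        # Hypothetical inputs: one selfie per check-in, plus your own 1-10
        # fatigue rating for each, in the same order as the photos.
        photos = sorted(glob.glob("selfies/*.jpg"))
        fatigue = [3, 7, 5, 8]

        # Collect the model's "happy" score for each photo (recent DeepFace
        # versions return a list of per-face results).
        happy = []
        for path in photos:
            result = DeepFace.analyze(img_path=path, actions=["emotion"],
                                      enforce_detection=False)
            happy.append(result[0]["emotion"]["happy"])

        # Does the detected emotion track the self-rated fatigue?
        r, p = pearsonr(happy, fatigue)
        print(f"correlation r={r:.2f}, p={p:.3f}")

    The same loop could collect all seven emotion outputs and correlate each against the ratings, keeping whichever combination tracks fatigue best.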

    This has the benefit of not needing to actually make or train a model yourself, while still using the technology. It also means the model will be more likely to work for different people.

    Just a sidenote: why “fatigue” and not PEM in general? I know for some people the most disabling symptom is fatigue, but that seems far from universal, and I imagine an ML model that predicts your level of PEM would be more valuable than one that predicts your level of fatigue.
     
    Last edited: May 20, 2024
  6. Yann04

    Yann04 Senior Member (Voting Rights)

    Messages:
    534
    Location:
    Switzerland (Romandie)
    Sidenote: I only had the energy to skim your blog rather than read it in full, so I’m sorry if any of my comments were redundant :).
     
  7. Yann04

    Yann04 Senior Member (Voting Rights)

    Messages:
    534
    Location:
    Switzerland (Romandie)
  8. Yann04

    Yann04 Senior Member (Voting Rights)

    Messages:
    534
    Location:
    Switzerland (Romandie)
  9. forestglip

    forestglip Senior Member (Voting Rights)

    Messages:
    371
    I considered using a general emotion detection model, but I'm worried that what it's trained to detect won't capture the right details for ME/CFS-like fatigue. Day to day, these could be extremely tiny differences in a few pixels around one area of the face, while the emotions it was trained on generally relate to changes in other parts of the face.

    Though it is worth a try - maybe just its output for "happy" would correlate well, or, as you say, some combination of all its outputs.

    The original plan was to "fine-tune" one of these general emotion detection models - basically taking advantage of what it has already learned on a similar task before training it on my face and fatigue specifically. Kind of like how teaching someone who already knows how to ride a tricycle to ride a bicycle is easier than starting them on the bike from scratch.
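
    A sketch of that fine-tuning step (using a pretrained ResNet from torchvision as a stand-in for an emotion model, with the pretrained layers frozen so only a new final layer learns - these are my substitutions for illustration, not the blog's exact plan):

        import torch
        from torch import nn
        from torchvision import datasets, models, transforms

        tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
        data = datasets.ImageFolder("photos", transform=tfm)  # before/ and after/
        loader = torch.utils.data.DataLoader(data, batch_size=16, shuffle=True)

        # Load pretrained weights: the "already rides a tricycle" part.
        model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

        # Freeze what it already knows; only the new final layer will learn.
        for p in model.parameters():
            p.requires_grad = False
        model.fc = nn.Linear(model.fc.in_features, 2)  # new head: before vs after

        opt = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
        loss_fn = nn.CrossEntropyLoss()
        for images, labels in loader:
            opt.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            opt.step()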

    Well, I'm basically talking about PEM. When I feel a little worse day to day, my fatigue, anxiety, and mood are all worse. When I have a PEM crash, they are all much worse. My hypothesis is that the facial changes from before to after a PEM crash generalize to tracking my symptoms outside a crash too, and to tracking long-term changes in how I feel.

    Also, thanks for looking into those fatigue detection models! I haven't looked yet, but they might be good enough without further training.
     
  10. forestglip

    forestglip Senior Member (Voting Rights)

    Messages:
    371
    Also, if this worked well, it could possibly be used as a PEM alerter, kind of like a dog that alerts when a seizure is about to happen.

    Maybe something like: if you are doing too much, the rate at which your expression is shifting towards "fatigued" will be high, and something could alert you when the rate of change across multiple photos exceeds a certain level.

    Or maybe a certain level of fatigue in your face in a single photo indicates that you'll crash if you keep exerting yourself, but will avoid it if you rest.
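
    A toy version of the rate-of-change alert (this assumes you already have a model that turns each photo into a fatigue score; the window and threshold are arbitrary):

        def crash_alert(scores, window=3, max_rise_per_photo=0.5):
            # scores: fatigue scores from the model, oldest to newest,
            # one per photo taken at regular intervals.
            if len(scores) < window:
                return False
            recent = scores[-window:]
            rate = (recent[-1] - recent[0]) / (window - 1)
            return rate > max_rise_per_photo

        # Example: scores from photos taken every half hour.
        if crash_alert([2.1, 2.3, 3.0, 4.2]):
            print("Fatigue rising fast - consider resting now.")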
     
  11. forestglip

    forestglip Senior Member (Voting Rights)

    Messages:
    371
    That's interesting - an EEG was the best method for detecting fatigue in drivers, at 100% accuracy. I wonder how I could get one of those...

    Also, in the blog I wrote about how making one fatigue label per day, while taking lots of photos, could confuse the model if there's a lot of variability throughout the day. Ideally, I'd take sets of photos where I know all the ones in one set are more or less tired than all those in another.

    And I was considering photos before and after PEM crashes since the change is so obvious and dramatic. But I need a lot of data and I don't want to be purposely causing crashes.

    But I realized I get a similar fatigue reaction from eating medium to large meals, especially if they're heavy in fat or protein. (Unrelatedly, I think that's a clue that there is some inability to switch away from glucose metabolism. It would also explain why a low-carb diet causes intense fatigue that, unlike before ME/CFS, doesn't improve, at least not for a long time.)

    Anyway, a "food crash" comes immediately after eating and only lasts a few hours, so I'm fine with initiating these for science.
     
    alktipping, Peter Trewhitt and Yann04 like this.
  12. Yann04

    Yann04 Senior Member (Voting Rights)

    Messages:
    534
    Location:
    Switzerland (Romandie)
    I don’t think he’s on here, but if you’re serious about making a decent project out of this you could contact Ror Preston; he might have some good ideas/feedback. He’s an ME patient and data scientist who is currently working on Visible (tracking illness progression and PEM through heart rate variability).
     
  13. forestglip

    forestglip Senior Member (Voting Rights)

    Messages:
    371
    For now it's just a vague idea. I'm not really equipped to do intense projects and I'm pretty much a beginner in these technologies. But I'll try to send him an email to see if he has any feedback. Thanks!
     
  14. Yann04

    Yann04 Senior Member (Voting Rights)

    Messages:
    534
    Location:
    Switzerland (Romandie)
    Fair enough!

    I wish I had the energy to use a laptop and train some ML models. That was a large part of the degree I was studying for before I got hit with this marvelous illness.

    Anyways, cool idea! I really hope the advancement of ML, DL, and other types of AI will help a lot in the future for pwME.
     
  15. forestglip

    forestglip Senior Member (Voting Rights)

    Messages:
    371
    Yeah, it feels very powerful and is fun. I can imagine it will be very useful in the discovery of biomarkers and treatments for all sorts of conditions.

    I wish I were super rich and could hire a team of ML experts to make some of my ideas (which I think are promising, like the one above) a reality.
     
    Trish, Peter Trewhitt and Yann04 like this.
  16. forestglip

    forestglip Senior Member (Voting Rights)

    Messages:
    371
    Just wanted to add another potential marker that, like HRV and minor changes in facial expression, is close to invisible without technology: eye movements. I bet if I had one of those devices that record eye movements every day, there would be significant patterns correlating with fatigue and other symptoms. Maybe something like fewer random movements per minute.

    Not the most convenient or cheap thing to track, but theoretically doable with consumer-grade hardware.
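
    A very rough sketch of a webcam version (assuming the open-source MediaPipe library, whose refined face mesh includes iris landmarks; counting large frame-to-frame jumps of one iris is a crude stand-in for proper saccade detection):

        import cv2
        import mediapipe as mp

        # Face mesh with iris landmarks enabled.
        mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True)
        IRIS = 468  # centre of one iris in the refined mesh (468-477 cover the irises)

        cap = cv2.VideoCapture(0)
        prev_x = None
        jumps = 0
        for _ in range(300):  # roughly ten seconds at 30 fps
            ok, frame = cap.read()
            if not ok:
                break
            result = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.multi_face_landmarks:
                x = result.multi_face_landmarks[0].landmark[IRIS].x
                if prev_x is not None and abs(x - prev_x) > 0.005:  # arbitrary threshold
                    jumps += 1
                prev_x = x
        cap.release()
        print(f"Large eye movements detected: {jumps}")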
     
    Ash, alktipping, Eleanor and 2 others like this.
  17. Sean

    Sean Moderator Staff Member

    Messages:
    7,488
    Location:
    Australia
    I would like to see both tracking and saccades properly assessed, along with a bunch of other dynamic stuff like focus, pupil dilation, blinking behaviour, lubrication/tear production and composition, etc., including any difference between the eyes (i.e. lateralisation). All properly controlled for PEM.

    We also need both long-term time frames of assessment and adequate temporal resolution (for all symptoms, not just eye-related ones). Short-term assessment and/or coarse temporal resolution is insufficient.

    This would not only reveal and better characterise the symptom profile, but also the dynamic symptom pattern over time. The interplay between that data and activity patterns is a critical component of understanding ME/CFS, including its causal relationships (and hence appropriate management and therapies). But we don't have the required data yet.

    No doubt ME/CFS is not the only condition with this gap in data.
     
