Automatic patient functionality assessment from multimodal data using deep learning techniques – Development and feasibility evaluation
Emese Sukei, Santiago de Leon Martinez, Pablo M. Olmos and Antonio Artés-Rodríguez
Wearable devices and mobile sensors enable the unobtrusive, real-time collection of abundant physiological and behavioural data. Unlike traditional in-person evaluations or ecological momentary assessment (EMA) questionnaire-based approaches, these data sources open many possibilities for remote patient monitoring. However, defining robust models is challenging because such data are noisy and frequently contain missing observations.
This work proposes a pipeline based on an attention-based Long Short-Term Memory (LSTM) neural network for predicting mobility impairment, as measured by the WHODAS 2.0 assessment, from such digital biomarkers. Furthermore, we address the missing-observation problem using hidden Markov models and incorporate information from unlabelled samples via transfer learning. We validated our approach on two wearable/mobile sensor data sets collected in the wild, together with socio-demographic information about the patients.
Our results showed that, on the WHODAS 2.0 mobility impairment prediction task, the proposed pipeline outperformed a prior baseline while also providing interpretability through attention heatmaps. Moreover, via task transfer learning on a much smaller cohort, the same model learned to accurately predict generalised anxiety severity based on GAD-7 scores.
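To illustrate the kind of architecture the abstract describes, the following is a minimal sketch of an attention-based LSTM sequence classifier in PyTorch. All names, dimensions, and the specific attention formulation (a learned per-time-step score followed by a softmax) are assumptions for illustration, not the authors' exact model; the attention weights play the role of the interpretability heatmaps mentioned above.

```python
import torch
import torch.nn as nn


class AttentionLSTMClassifier(nn.Module):
    """Hypothetical sketch: an LSTM over per-time-step sensor features,
    with additive attention pooling and a linear classification head."""

    def __init__(self, input_dim: int, hidden_dim: int, n_classes: int):
        super().__init__()
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(hidden_dim, 1)       # scores each time step
        self.out = nn.Linear(hidden_dim, n_classes)

    def forward(self, x: torch.Tensor):
        # x: (batch, time, features)
        h, _ = self.lstm(x)                        # (batch, time, hidden)
        scores = self.attn(h).squeeze(-1)          # (batch, time)
        weights = torch.softmax(scores, dim=1)     # attention "heatmap"
        context = (weights.unsqueeze(-1) * h).sum(dim=1)  # weighted summary
        return self.out(context), weights


# Example: 4 sequences of 24 time steps with 8 features each,
# classified into 3 hypothetical impairment-severity levels.
model = AttentionLSTMClassifier(input_dim=8, hidden_dim=16, n_classes=3)
logits, attn = model(torch.randn(4, 24, 8))
```

Inspecting `attn` after training would show which time steps the model attends to for each prediction, which is one plausible way to produce the attention heatmaps the abstract refers to.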