Clinical data are generated continuously, requiring machine learning models to adapt to the evolving data without catastrophically forgetting the past. In this paper, we explore addressing the needs of such sequentially generated clinical data through continual learning. Specifically, we introduce a Continual learning Bayesian Long Short-Term Memory (C-BLSTM) for learning a sequence of tasks. C-BLSTM combines architectural pruning through re-initializing redundant weights, regularization through variational inference, and memory replay through a coreset with class-balanced sampling. The C-BLSTM is demonstrated on two public healthcare datasets (MIMIC-III and the PhysioNet Challenge 2012) for in-hospital mortality prediction. In these datasets, dynamic environments are simulated based on (i) the ordinality of episodes, (ii) the cardinality of patients, and (iii) the healthcare data site. The performance results show that the C-BLSTM, and hence continual learning, is effective in learning from sequences of clinical time series.
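The memory-replay component described above maintains a coreset drawn with class-balanced sampling, which matters for mortality prediction because the positive class is typically rare. The following is a minimal sketch of such a sampler; the function and parameter names (`class_balanced_coreset`, `coreset_size`) are illustrative assumptions, not the paper's actual implementation.

```python
import random
from collections import defaultdict

def class_balanced_coreset(samples, labels, coreset_size, seed=0):
    """Sketch of class-balanced coreset sampling for memory replay.

    Draws an (approximately) equal number of examples per class so that
    rare classes (e.g. in-hospital mortality) are not crowded out of the
    replay buffer. Names and signature are hypothetical illustrations.
    """
    rng = random.Random(seed)

    # Group examples by their class label.
    by_class = defaultdict(list)
    for x, y in zip(samples, labels):
        by_class[y].append(x)

    # Allocate an equal share of the coreset budget to each class.
    per_class = coreset_size // len(by_class)

    coreset = []
    for y, xs in by_class.items():
        # Sample without replacement, capped by class size.
        k = min(per_class, len(xs))
        coreset.extend((x, y) for x in rng.sample(xs, k))
    return coreset
```

In a continual-learning loop, such a coreset from earlier tasks would be mixed into each new task's training batches so the model rehearses past data while adapting to the new distribution.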
License type:
Publisher Copyright
Funding Info:
This research/project is supported by the A*STAR Biomedical Research Council, Diabetes Clinic of the Future Programme (HBMSIAF-PP grant)
Grant Reference no. : H19/01/a0/023