Knowledge-Aware Deep Dual Networks for Text-Based Mortality Prediction

Published in ICDE, CCF A, 2019

Recommended citation: N. Liu, P. Lu, W. Zhang and J. Wang, "Knowledge-Aware Deep Dual Networks for Text-Based Mortality Prediction," 2019 IEEE 35th International Conference on Data Engineering (ICDE), 2019, pp. 1406-1417, doi: 10.1109/ICDE.2019.00127. https://ieeexplore.ieee.org/document/8731459

Mortality prediction is one of the essential tasks in medical data mining and is significant for inferring clinical outcomes. With the large number of medical notes collected from hospitals, there is an urgent need for effective models that predict mortality from these notes. In contrast to structured electronic health records, medical notes are unstructured texts written by experienced caregivers and contain more complicated information about patients, posing greater challenges for modeling. Most previous studies rely on tedious hand-crafted features or generate indirect features via statistical models such as topic modeling, which may incur information loss for later model training. Recently, deep models have been proposed to unify the stages of feature construction and model training. However, these models neglect domain concept knowledge, which is important for gaining a better understanding of medical notes. To address the above issues, we propose novel Knowledge-aware Deep Dual Networks (K-DDN) for the text-based mortality prediction task. Specifically, a simple deep dual network is first proposed to fuse the representations of medical knowledge and raw text for prediction. Afterward, we incorporate a co-attention mechanism into the basic model so that the knowledge and text representations guide each other's learning. Experimental results on two publicly available real-world datasets show that the proposed deep dual networks outperform state-of-the-art methods and that the co-attention mechanism further improves performance.
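To make the dual-network idea concrete, below is a minimal PyTorch sketch of one plausible reading of the abstract: one branch encodes the raw note text, the other encodes matched medical-concept (knowledge) embeddings, a co-attention affinity matrix lets each branch attend over the other, and the two attended summaries are fused for a binary mortality prediction. All class, layer, and parameter names here (`CoAttentionDualNet`, `vocab_size`, `concept_size`, `dim`) are hypothetical illustrations, not the authors' actual K-DDN architecture; consult the paper for the real design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CoAttentionDualNet(nn.Module):
    """Hypothetical sketch of a knowledge-aware deep dual network:
    a text branch and a knowledge branch are encoded separately,
    coupled by a co-attention matrix, then fused for prediction."""

    def __init__(self, vocab_size, concept_size, dim=128):
        super().__init__()
        self.text_emb = nn.Embedding(vocab_size, dim)
        self.know_emb = nn.Embedding(concept_size, dim)
        self.text_enc = nn.GRU(dim, dim, batch_first=True)
        self.know_enc = nn.GRU(dim, dim, batch_first=True)
        self.affinity = nn.Linear(dim, dim, bias=False)  # co-attention bilinear map
        self.classifier = nn.Linear(2 * dim, 1)

    def forward(self, text_ids, concept_ids):
        # Encode both views: (batch, seq_len, dim)
        t, _ = self.text_enc(self.text_emb(text_ids))
        k, _ = self.know_enc(self.know_emb(concept_ids))
        # Affinity matrix between text tokens and knowledge concepts: (B, Lt, Lk)
        A = torch.bmm(self.affinity(t), k.transpose(1, 2))
        # Each view attends over the other, then is mean-pooled to (B, dim)
        t_ctx = torch.bmm(F.softmax(A, dim=2), k).mean(dim=1)
        k_ctx = torch.bmm(F.softmax(A, dim=1).transpose(1, 2), t).mean(dim=1)
        # Fuse the two summaries and predict a mortality probability
        return torch.sigmoid(self.classifier(torch.cat([t_ctx, k_ctx], dim=-1)))

# Usage on toy inputs: a batch of 4 notes (50 tokens) with 10 matched concepts each
model = CoAttentionDualNet(vocab_size=5000, concept_size=300)
probs = model(torch.randint(0, 5000, (4, 50)), torch.randint(0, 300, (4, 10)))
print(probs.shape)  # torch.Size([4, 1])
```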

Download paper here
