Deep learning models such as long short-term memory (LSTM) networks are valuable classifiers for time series data like hourly clinical statistics. However, access to health data is challenging due to privacy and legal constraints. Homomorphic encryption (HE) offers a potential solution by allowing computation on encrypted confidential data, but the computational cost of encrypted operations makes full training with gradient descent infeasible, and prior work handles encrypted computation only during the inference phase of the network. To this end, we design the Collective Learning protocol, a secure protocol for sharing classified time-series data among entities in order to partially train the parameters of a binary classifier. The protocol encrypts the feature activations of each data sample using HE and trains the last layers with encrypted logistic regression. We evaluate our protocol on a benchmark LSTM network trained on the Medical Information Mart for Intensive Care (MIMIC-III) dataset. We selected the Cheon-Kim-Kim-Song (CKKS) encryption scheme for its ability to handle real numbers through approximate arithmetic. Our protocol improved the area under the precision-recall curve (AUPRC) of the in-hospital mortality prediction model. We also reimplemented the protocol using secure multi-party computation (MPC) and compared the two approaches.
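The sketch below illustrates the general idea described in the abstract: a frozen LSTM backbone produces feature activations, each entity encrypts those activations under CKKS, and only the final logistic-regression layer is trained on ciphertexts. It is a minimal illustration assuming the open-source TenSEAL library; the CKKS parameters, the extract_features() placeholder, the degree-3 polynomial sigmoid approximation, and the decryption of per-sample predictions at the key-owning party are simplifying assumptions, not the paper's exact protocol.

import numpy as np
import tenseal as ts

# --- Key-owning party: CKKS context for approximate real-number arithmetic ---
# (parameter choices are illustrative, not taken from the paper)
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[40, 21, 21, 21, 21, 21, 21, 40],
)
context.global_scale = 2 ** 21
context.generate_galois_keys()

def extract_features(x_timeseries):
    """Hypothetical frozen LSTM backbone: maps an hourly clinical time series
    to a fixed-length feature-activation vector (placeholder implementation)."""
    rng = np.random.default_rng(0)
    return rng.standard_normal(16)

# --- Data-owning entity: encrypt feature activations, never raw records ---
def encrypt_activation(x_timeseries):
    feats = extract_features(x_timeseries)
    return ts.ckks_vector(context, feats.tolist())

# --- Shared last layer: logistic regression evaluated on ciphertexts ---
weight = np.zeros(16)   # plaintext parameters of the final layer
bias = [0.0]
lr = 0.1

def encrypted_forward(enc_x):
    # w.x + b on encrypted activations, followed by a degree-3 polynomial
    # approximation of sigmoid: 0.5 + 0.197*t - 0.004*t^3
    enc_logit = enc_x.dot(weight.tolist()) + bias
    return enc_logit.polyval([0.5, 0.197, 0.0, -0.004])

# --- One simplified training step over a mini-batch of encrypted samples ---
def train_step(enc_batch, labels):
    global weight, bias
    grad_w = np.zeros_like(weight)
    grad_b = 0.0
    for enc_x, y in zip(enc_batch, labels):
        # key-owning party decrypts the prediction (a simplification)
        p = encrypted_forward(enc_x).decrypt()[0]
        residual = p - y
        # gradient contribution computed on encrypted activations
        # (ciphertext multiplied by a plaintext scalar), then decrypted
        grad_w += np.array((enc_x * float(residual)).decrypt())
        grad_b += residual
    weight -= lr * grad_w / len(labels)
    bias[0] -= lr * grad_b / len(labels)

# Example usage with toy data
enc_batch = [encrypt_activation(None) for _ in range(4)]
train_step(enc_batch, labels=[1, 0, 0, 1])

In this sketch only the backbone's activations ever leave an entity, and they leave in encrypted form; the key-owning party sees decrypted scalars rather than raw clinical records, which mirrors the division of labor the abstract describes at a high level.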
               