Motivation: The prevalence of high‐throughput experimental methods has resulted in an abundance of large‐scale molecular and functional interaction networks. The connectivity of these networks provides a rich source of information for inferring functional annotations for genes and proteins. An important challenge has been to develop methods for combining these heterogeneous networks to extract useful protein feature representations for function prediction. Most of the existing approaches for network integration use shallow models that encounter difficulty in capturing complex and highly non‐linear network structures. Thus, we propose deepNF, a network fusion method based on Multimodal Deep Autoencoders to extract high‐level features of proteins from multiple heterogeneous interaction networks.

Results: We apply this method to combine STRING networks to construct a common low‐dimensional representation containing high‐level protein features. We use separate layers for different network types in the early stages of the multimodal autoencoder, later connecting all the layers into a single bottleneck layer from which we extract features to predict protein function. We compare the cross‐validation and temporal holdout predictive performance of our method with state‐of‐the‐art methods, including the recently proposed method Mashup. Our results show that our method outperforms previous methods for both human and yeast STRING networks. We also show substantial improvement in the performance of our method in predicting Gene Ontology terms of varying type and specificity.

Availability and implementation: deepNF is freely available at: https://github.com/VGligorijevic/deepNF.

Supplementary information: Supplementary data are available at Bioinformatics online.
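To make the described architecture concrete, the sketch below illustrates a multimodal autoencoder of the kind outlined in the Results: each input network has its own encoding branch, the branches are merged into a shared bottleneck layer, and protein features are read off that bottleneck after training. This is a minimal illustration in Keras; the layer sizes, activations, and variable names are assumptions for demonstration, not the authors' exact configuration.

```python
# Minimal sketch of a multimodal autoencoder (illustrative, not the exact deepNF architecture).
from tensorflow import keras
from tensorflow.keras import layers

n_networks = 6      # assumed number of input interaction networks (e.g. STRING channels)
n_proteins = 2000   # assumed dimensionality of each network's per-protein representation

# One input and one private encoding layer per network type
inputs, encoded = [], []
for i in range(n_networks):
    x_in = keras.Input(shape=(n_proteins,), name=f"net_{i}")
    h = layers.Dense(500, activation="sigmoid")(x_in)
    inputs.append(x_in)
    encoded.append(h)

# Merge the per-network branches and compress into a single shared bottleneck
merged = layers.concatenate(encoded)
bottleneck = layers.Dense(600, activation="sigmoid", name="bottleneck")(merged)

# Mirror the encoder: expand, then reconstruct each network separately
expanded = layers.Dense(500 * n_networks, activation="sigmoid")(bottleneck)
outputs = [
    layers.Dense(n_proteins, activation="sigmoid", name=f"recon_{i}")(expanded)
    for i in range(n_networks)
]

autoencoder = keras.Model(inputs=inputs, outputs=outputs)
autoencoder.compile(optimizer="adam", loss="binary_crossentropy")

# After training the autoencoder on the network data, the low-dimensional protein
# features are taken from the bottleneck layer and passed to a downstream classifier:
encoder = keras.Model(inputs=inputs, outputs=bottleneck)
# features = encoder.predict([net_0, net_1, ..., net_5])
```

In this kind of design, the per-network branches let each input be compressed on its own terms before fusion, while the shared bottleneck forces the model to learn a joint representation across all networks, which is what is then used for function prediction.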