BACKGROUND
The methods available for pre-processing EEG data are rapidly evolving as researchers gain access to vast computational resources; however, the field currently lacks a set of standardized approaches for data characterization, efficient interactive quality-control review procedures, and large-scale automated processing that is compatible with High Performance Computing (HPC) resources.

NEW METHOD
In this paper we describe an infrastructure for the development of standardized procedures for semi- and fully automated pre-processing of EEG data. Our pipeline incorporates several methods to isolate cortical signal from noise, maintain maximal information from raw recordings, and provide comprehensive quality control and data visualization. In addition, batch-processing procedures are integrated to scale up analyses for processing hundreds or thousands of data sets using HPC clusters.

RESULTS
We demonstrate that using the EEG Integrated Platform Lossless (EEG-IP-L) pipeline's signal quality annotations yields a significant increase in data retention when applying subsequent post-processing ERP segment rejection procedures. Further, we demonstrate that the increase in data retention does not attenuate the ERP signal.

CONCLUSIONS
EEG-IP-L provides the infrastructure for an integrated platform that includes long-term data storage, minimal data manipulation with maximal signal retention, and flexibility in post-processing strategies.
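The retention effect described in the results can be illustrated with a minimal sketch. The abstract does not specify the pipeline's implementation, so the following is a hypothetical NumPy example (function names and thresholds are invented, not part of EEG-IP-L): if channel-level noise annotations are carried forward to post-processing, an amplitude-based epoch rejection step can exclude annotated channels from its test instead of rejecting every epoch contaminated by one bad channel.

```python
import numpy as np

def reject_epochs(epochs, threshold):
    """Keep epochs whose worst-channel peak-to-peak amplitude is
    at or below `threshold`. Returns a boolean keep-mask per epoch.
    `epochs` has shape (n_epochs, n_channels, n_samples)."""
    ptp = epochs.max(axis=-1) - epochs.min(axis=-1)   # per epoch, per channel
    return ptp.max(axis=-1) <= threshold              # worst channel decides

def reject_epochs_annotated(epochs, bad_channel_mask, threshold):
    """Same rejection rule, but channels flagged by upstream quality
    annotations are ignored rather than driving epoch rejection."""
    return reject_epochs(epochs[:, ~bad_channel_mask, :], threshold)

# Simulated data: 50 epochs, 8 channels, 100 samples; channel 3 is noisy.
rng = np.random.default_rng(0)
epochs = rng.normal(0.0, 10.0, size=(50, 8, 100))        # ~clean EEG scale
epochs[:, 3, :] += rng.normal(0.0, 200.0, size=(50, 100))  # heavy noise

bad_channel_mask = np.zeros(8, dtype=bool)
bad_channel_mask[3] = True                               # upstream annotation

kept_naive = reject_epochs(epochs, threshold=100.0)
kept_informed = reject_epochs_annotated(epochs, bad_channel_mask,
                                        threshold=100.0)
print(f"retained without annotations: {kept_naive.sum()}/50")
print(f"retained with annotations:    {kept_informed.sum()}/50")
```

In this toy setup the naive rule rejects nearly every epoch because the single noisy channel always exceeds the threshold, while the annotation-aware rule retains most of them; the underlying signal in the clean channels is untouched, matching the abstract's claim that higher retention need not attenuate the ERP.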