Energy consumption is an important issue for resource-constrained wireless neural recording applications with limited data bandwidth. Compressed sensing (CS) is a promising framework for addressing this challenge because it can compress data in an energy-efficient way. Recent work has shown that deep neural networks (DNNs) can serve as valuable models for CS of neural action potentials (APs). However, these models typically require impractically large datasets and computational resources for training, and they do not easily generalize to novel circumstances. Here, we propose a new CS framework, termed APGen, for the reconstruction of APs in a training-free manner. It consists of a deep generative network and an analysis sparse regularizer. We validate our method on two in vivo datasets. Even without any training, APGen outperformed model-based and data-driven methods in terms of reconstruction accuracy, computational efficiency, and robustness to AP overlap and misalignment. The computational efficiency of APGen and its ability to perform without training make it an ideal candidate for long-term, resource-constrained, and large-scale wireless neural recording. It may also promote the development of real-time, naturalistic brain-computer interfaces.
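The abstract does not give implementation details, but the described approach (an untrained deep generative network combined with an analysis sparse regularizer, fit directly to the compressed measurements) can be illustrated with a minimal sketch. The snippet below is an assumption-based, deep-image-prior-style illustration rather than the authors' code: the network size, latent dimension, Gaussian sensing matrix, first-difference analysis operator, and all hyperparameters are placeholders chosen for the demo.

```python
# Minimal sketch (not the authors' released code) of training-free CS
# reconstruction in the spirit of APGen: an untrained generator G(z; theta)
# is optimized per-snippet so that Phi @ G(z) matches the compressed
# measurements y, while an analysis (transform-domain) L1 term promotes
# sparsity. Sizes, the analysis transform, and hyperparameters are assumed.

import torch

N, M = 64, 16                       # AP snippet length, number of CS measurements
torch.manual_seed(0)

Phi = torch.randn(M, N) / M**0.5    # random Gaussian sensing matrix (assumed)
x_true = torch.zeros(N)             # synthetic "action potential" for the demo
x_true[28:36] = torch.tensor([0.2, 0.8, 1.0, 0.4, -0.6, -0.9, -0.3, -0.1])
y = Phi @ x_true                    # compressed measurements

def analysis(x):
    # First-order differences: a stand-in for the paper's analysis operator,
    # whose exact form is not specified in the abstract.
    return x[1:] - x[:-1]

# Small untrained generator mapping a fixed latent code to an AP waveform.
G = torch.nn.Sequential(
    torch.nn.Linear(16, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, N),
)
z = torch.randn(16)                 # fixed latent input (deep-image-prior style)

opt = torch.optim.Adam(G.parameters(), lr=1e-2)
lam = 0.05                          # weight of the analysis-sparsity term (assumed)

for step in range(2000):
    opt.zero_grad()
    x_hat = G(z)
    loss = torch.sum((Phi @ x_hat - y) ** 2) + lam * analysis(x_hat).abs().sum()
    loss.backward()
    opt.step()

with torch.no_grad():
    err = torch.norm(G(z) - x_true) / torch.norm(x_true)
print(f"relative reconstruction error: {err:.3f}")
```

Because the generator is optimized from scratch for each set of measurements, no training dataset is required, which mirrors the training-free property claimed for APGen; the actual method's network architecture and regularizer may differ substantially from this sketch.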
               