This paper proposes a spectrum sensing algorithm based on one-bit measurements in a cognitive radio sensor network. A likelihood ratio test (LRT) for the one-bit spectrum sensing problem is derived. Unlike prior one-bit spectrum sensing work in the literature, the signal is modeled as a discrete correlated Gaussian random process in which correlation exists only between immediately successive samples of the received signal. This model facilitates the design of a powerful detection criterion with tractable analytical performance. A one-bit spectrum sensing criterion is derived for a single sensor and then generalized to multiple sensors. The detector's performance is analyzed via closed-form formulas for the probability of false alarm and the probability of detection. The proposed one-bit LRT detector achieves performance comparable to that of non-one-bit detectors (i.e., quadratic and energy detectors) at lower computational complexity. Simulation results corroborate the theoretical findings and confirm the efficacy of the proposed detector for highly correlated signals.
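The abstract does not give the paper's exact test statistic, but the modeling assumption (correlation only between immediately successive samples) suggests one plausible one-bit detector: count sign agreements between consecutive one-bit samples. For a zero-mean Gaussian pair with correlation rho, the arcsine law gives P(signs agree) = 1/2 + arcsin(rho)/pi, so correlated signal-plus-noise raises the agreement rate above the 1/2 seen under noise only. The sketch below is an illustrative simulation under these assumptions (AR(1) signal model, sign quantization, empirically set threshold), not the paper's derived LRT.

```python
import numpy as np

def one_bit_correlation_statistic(bits):
    # Count agreements between immediately successive one-bit samples.
    # Under i.i.d. Gaussian noise, successive signs agree with probability
    # 1/2; correlation in the signal raises this probability (arcsine law:
    # P(agree) = 1/2 + arcsin(rho)/pi for a correlated Gaussian pair).
    return int(np.sum(bits[1:] == bits[:-1]))

def simulate(N=2000, rho=0.9, snr_db=0.0, trials=500, seed=0):
    # Illustrative Monte Carlo: statistic distributions under H0 (noise only)
    # and H1 (AR(1)-correlated Gaussian signal plus noise). All parameter
    # names and values here are assumptions for the sketch.
    rng = np.random.default_rng(seed)
    snr = 10 ** (snr_db / 10)
    stats_h0, stats_h1 = [], []
    for _ in range(trials):
        # AR(1) Gaussian signal with unit variance and one-lag correlation rho.
        sig = np.empty(N)
        sig[0] = rng.standard_normal()
        innov = rng.standard_normal(N) * np.sqrt(1.0 - rho**2)
        for k in range(1, N):
            sig[k] = rho * sig[k - 1] + innov[k]
        x0 = rng.standard_normal(N)                        # H0: noise only
        x1 = np.sqrt(snr) * sig + rng.standard_normal(N)   # H1: signal + noise
        stats_h0.append(one_bit_correlation_statistic(np.sign(x0)))
        stats_h1.append(one_bit_correlation_statistic(np.sign(x1)))
    return np.array(stats_h0), np.array(stats_h1)

if __name__ == "__main__":
    h0, h1 = simulate()
    # Set the threshold from the empirical H0 distribution for a target
    # false-alarm rate of 5%, then estimate the detection probability.
    thr = np.quantile(h0, 0.95)
    print("Pfa ~", np.mean(h0 > thr), " Pd ~", np.mean(h1 > thr))
```

At 0 dB SNR and rho = 0.9, the effective one-lag correlation of the received signal is about 0.45, so the agreement probability under H1 is roughly 0.65 versus 0.5 under H0, and the two statistic distributions separate cleanly even at moderate sample sizes. This mirrors the abstract's point that strong sample-to-sample correlation is what makes the one-bit detector effective.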