In this article, we consider the problem of adaptive detection of a multichannel subspace signal in the presence of constrained interference, which is assumed to be orthogonal to the signal in the whitened space. We derive the gradient test and find that it has the same form as the existing subspace-based generalized likelihood ratio test (SGLRT). In addition, we derive the statistical performance of the SGLRT in the presence of orthogonal interference under signal mismatch, i.e., when the actual signal does not lie entirely within the presumed signal subspace. Numerical examples are provided to verify the theoretical results. It is shown that both orthogonal interference and signal mismatch can degrade detection performance.
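As a rough illustration of the kind of statistic involved, the following sketch computes a Kelly-type subspace GLRT: the test snapshot is compared against the signal subspace after whitening by the sample covariance of training data. This is a generic textbook form under assumed dimensions and simulated data, not necessarily the exact detector or interference model analyzed in the article; all names (`H`, `S`, `sglrt`) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: N channels, p-dim signal subspace, K training snapshots
N, p, K = 8, 2, 32

# Hypothetical signal subspace matrix H (N x p)
H = rng.standard_normal((N, p)) + 1j * rng.standard_normal((N, p))

# Secondary (training) data -> sample covariance estimate S
Z = (rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K))) / np.sqrt(2)
S = Z @ Z.conj().T / K

# Test snapshot under H1: a signal in the subspace plus white noise
theta = rng.standard_normal(p)
x = H @ theta + (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)

def sglrt(x, H, S):
    """Kelly-type subspace GLRT statistic (generic illustrative form)."""
    Si = np.linalg.inv(S)
    G = H.conj().T @ Si @ H                 # p x p Gram matrix in whitened space
    # Energy of the whitened test snapshot captured by the whitened subspace
    num = x.conj().T @ Si @ H @ np.linalg.solve(G, H.conj().T @ Si @ x)
    den = 1.0 + x.conj().T @ Si @ x
    return float(np.real(num / den))

t = sglrt(x, H, S)
print(f"SGLRT statistic: {t:.4f}")
```

The statistic lies in [0, 1) by construction, since the projected energy in the numerator cannot exceed the total whitened energy in the denominator; signal mismatch (a component of `x` outside the column space of `H`) shrinks the numerator and thus the statistic, which is consistent with the performance degradation the article reports.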