We leverage the attention mechanism to investigate and comprehend the contribution of each input symbol in the input sequence, and of their hidden representations, to the prediction of the received symbol in a bidirectional recurrent neural network (BRNN)-based nonlinear equalizer. In this paper, we propose a novel attention-aided design of a partial BRNN-based nonlinear equalizer and evaluate it with both LSTM and GRU units in a single-channel DP-64QAM 30 Gbaud coherent optical communication system over 20 × 50 km spans of standard single-mode fiber (SSMF). Our approach maintains the Q-factor performance of the baseline equalizer while reducing complexity by ∼56.16% in the number of real multiplications required to equalize per symbol (RMpS). When compared at similar complexity, our approach outperforms the baseline by ∼0.2 dB to ∼0.25 dB at the optimal transmit power, and by ∼0.3 dB to ∼0.45 dB towards the more nonlinear region.
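The core idea of weighting each input symbol's hidden representation when predicting a received symbol can be illustrated with plain dot-product attention. The sketch below is our own minimal illustration in NumPy, not the paper's architecture: the function name, the choice of the centre-symbol state as the query, and the toy window/hidden sizes are all assumptions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_readout(hidden_states, query):
    """Dot-product attention over BRNN hidden states for one received symbol.

    hidden_states: (T, d) array, one row per input symbol in the window
                   (e.g. concatenated forward/backward BRNN states).
    query:         (d,) vector; here we use the centre-symbol hidden state.
    Returns the attention weights (each symbol's contribution) and the
    attention-weighted context vector fed to the output layer.
    """
    scores = hidden_states @ query      # (T,) similarity of each symbol to the query
    weights = softmax(scores)           # (T,) normalized contributions, sum to 1
    context = weights @ hidden_states   # (d,) weighted combination of hidden states
    return weights, context

# Toy example: a window of 5 neighbouring symbols, hidden size 4
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 4))
w, c = attention_readout(H, H[2])  # query with the centre symbol's state
```

Inspecting `w` directly is what makes the attention weights useful for interpretability: large entries identify which neighbouring symbols dominate the equalization of the centre symbol, which is what motivates pruning the BRNN to a "partial" design.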