1
Published in 2022 in IEEE Communications Letters
DOI: 10.1109/lcomm.2022.3193644
Abstract: Pre-trained Models (PTMs) have reached the state-of-the-art (SOTA) on many Natural Language Processing (NLP) tasks and Computer Vision (CV) tasks, and are called the foundation models of artificial intelligence (AI) systems. In this letter, we…
Keywords: natural redundancy; trained models; receiver design; pre trained …
2
Published in 2023 in IEEE Transactions on Communications
DOI: 10.1109/tcomm.2023.3247733
Abstract: A high-performance communication receiver requires sufficient and accurate recognition of transmitted sources. In this paper, we propose a sequential decoding algorithm for the robust reception of sources with natural redundancy (NR) over the AWGN…
Keywords: language model; natural redundancy; causal language; sequential decoding …