
PNG-Stega: Progressive Non-Autoregressive Generative Linguistic Steganography


The autoregressive model with its left-to-right generation order has been the predominant paradigm for generative linguistic steganography. However, such steganography performs poorly at semantic control and content planning, since generation is constrained by the secret message at every step. To mitigate this issue and efficiently produce high-quality steganographic texts (stegotexts), we present Progressive Non-autoregressive Generative linguistic Steganography (PNG-Stega), which encodes secret messages and extends the context to generate stegotexts through multiple rounds of insertion. Each round refines the generated steganographic sequence using the global information of the previous round, while reducing the adverse effects of steganographic encoding on text quality. Moreover, to strengthen the internal semantic dependencies of stegotexts, we use a constrained word-sequence extraction scheme to obtain keywords that initialize the skeleton of the target stegotext, then expand the keywords with insertion operations. Experimental results demonstrate that PNG-Stega outperforms the compared methods in imperceptibility and anti-steganalysis ability. In particular, PNG-Stega achieves high information-hiding efficiency, roughly double that of autoregressive methods.
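The insertion-based encoding described in the abstract can be illustrated with a minimal sketch. Note the assumptions: in PNG-Stega the candidate words for each gap come from a non-autoregressive transformer's ranked predictions, whereas here a fixed two-word candidate list (`candidates`) stands in for the model, and one secret bit selects the word inserted into each gap of the keyword skeleton.

```python
def insert_round(tokens, bits, candidates):
    """One insertion round: place one candidate word into each gap between
    tokens, consuming one secret bit per gap (requires len(bits) >= gaps).
    Repeating this on the output models PNG-Stega's multi-round refinement."""
    it = iter(bits)
    out = []
    for i, tok in enumerate(tokens):
        out.append(tok)
        if i < len(tokens) - 1:
            out.append(candidates[next(it)])  # bit value picks the filler word
    return out

def extract_round(tokens, candidates):
    """Invert insert_round: after one round, inserted words sit at odd
    positions; recover each bit from which candidate was chosen there."""
    return [candidates.index(tok) for i, tok in enumerate(tokens) if i % 2 == 1]

# Hypothetical example: keyword skeleton plus a two-bit secret.
skeleton = ["weather", "quite", "nice"]
stego = insert_round(skeleton, [1, 0], ["is", "was"])
print(stego)                               # ['weather', 'was', 'quite', 'is', 'nice']
print(extract_round(stego, ["is", "was"]))  # [1, 0]
```

The toy shows only the bit-to-word mapping; the paper's contribution lies in how the non-autoregressive model ranks candidates so that repeated rounds improve fluency rather than degrade it.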

Keywords: steganography; generative linguistic; progressive non; non autoregressive; linguistic steganography; png stega

Journal Title: IEEE Signal Processing Letters
Year Published: 2023


