Dual Distance Optimized Deep Quantization With Semantics-Preserving

Quantization has recently proven an effective technique for large-scale image retrieval, since it can encode feature vectors into compact codes. However, improving the discriminative capability of codewords while minimizing the quantization error remains a great challenge. This letter proposes Dual Distance Optimized Deep Quantization (D2ODQ) to address this issue by minimizing the Euclidean distance between samples and codewords while maximizing the minimum cosine distance between codewords. To generate an evenly distributed codebook, we derive the general solution for the upper bound of the minimum cosine distance between codewords. Moreover, a scalar-constrained semantics-preserving loss is introduced to avoid trivial quantization boundaries and to ensure that each codeword quantizes the features of only one category. Compared with state-of-the-art methods, our method achieves better performance on three benchmark datasets.
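The dual-distance idea above can be sketched as a single objective: a quantization-error term (sample-to-codeword Euclidean distance) minus a codeword-separation term (the minimum pairwise cosine distance, which the method seeks to maximize). The function below is a minimal illustrative simplification, not the paper's actual D2ODQ loss; the trade-off weight `lam` and the hard nearest-codeword assignment are assumptions for the sketch.

```python
import numpy as np

def dual_distance_loss(features, codebook, assignments, lam=0.1):
    """Illustrative dual-distance objective (hypothetical simplification):
    minimize Euclidean quantization error, maximize the minimum
    cosine distance between codewords."""
    # Quantization error: mean squared Euclidean distance from each
    # sample to its assigned codeword.
    quant_err = np.mean(np.sum((features - codebook[assignments]) ** 2, axis=1))

    # Pairwise cosine similarity between L2-normalized codewords.
    normed = codebook / np.linalg.norm(codebook, axis=1, keepdims=True)
    cos_sim = normed @ normed.T
    np.fill_diagonal(cos_sim, -1.0)  # exclude self-similarity

    # Minimum cosine distance = 1 - maximum off-diagonal similarity.
    min_cos_dist = 1.0 - cos_sim.max()

    # Subtracting the separation term rewards evenly spread codewords.
    return quant_err - lam * min_cos_dist
```

With two orthogonal codewords that exactly match the two samples, the quantization error is zero and the loss reduces to the (negated, weighted) separation term.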

Keywords: semantics; dual distance; distance optimized; optimized deep; quantization

Journal Title: IEEE Signal Processing Letters
Year Published: 2022
