
Nonlinear Discrete Hashing


In this paper, we propose a nonlinear discrete hashing approach to learn compact binary codes for scalable image search. Instead of seeking a single linear projection, as most existing hashing methods do, we pursue a multilayer network with nonlinear transformations to capture the local structure of data samples. Unlike most existing hashing methods that adopt an error-prone relaxation to learn the transformations, we directly solve the discrete optimization problem to eliminate the accumulation of quantization error. Specifically, to leverage the similarity relationships between data samples and exploit the semantic affinities of manual labels, the binary codes are learned with the objective to: 1) minimize the quantization error between the original data samples and the learned binary codes; 2) preserve the similarity relationships in the learned binary codes; 3) maximize the information content with independent bits; and 4) maximize the accuracy of the labels predicted from the binary codes. With an alternating optimization, the nonlinear transformation and the discrete quantization are jointly optimized in the hash learning framework. Experimental results on four datasets, CIFAR10, MNIST, SUN397, and ILSVRC2012, demonstrate that the proposed approach is superior to several state-of-the-art hashing methods.
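The sketch below illustrates the alternating scheme described in the abstract: one step updates a small nonlinear (two-layer) transformation to reduce the quantization error against the current binary codes, and the next step re-solves for the discrete codes with the transformation fixed. It is a minimal NumPy illustration under assumed layer sizes, synthetic data, and a plain sign-based code update; the paper's full objective additionally includes the similarity-preservation, bit-independence, and label-prediction terms.

    # Minimal sketch of alternating optimization for nonlinear discrete hashing.
    # Layer sizes, synthetic data, and the sign-based code update are assumptions
    # for illustration only, not the authors' exact formulation.
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data: n samples with d features (stand-in for image features).
    n, d, bits = 200, 32, 16
    X = rng.standard_normal((n, d))

    # Two-layer nonlinear transformation F(X) = tanh(tanh(X W1) W2).
    W1 = 0.1 * rng.standard_normal((d, 64))
    W2 = 0.1 * rng.standard_normal((64, bits))

    def forward(X, W1, W2):
        H = np.tanh(X @ W1)           # hidden nonlinear layer
        return H, np.tanh(H @ W2)     # continuous codes in (-1, 1)

    B = np.sign(forward(X, W1, W2)[1])   # initial discrete codes in {-1, +1}
    B[B == 0] = 1

    lr = 0.05
    for it in range(200):
        # Step 1: fix B, update the network to reduce ||F(X) - B||^2.
        H, Z = forward(X, W1, W2)
        G = 2.0 * (Z - B) * (1.0 - Z**2)          # gradient through output tanh
        gW2 = H.T @ G / n
        gW1 = X.T @ ((G @ W2.T) * (1.0 - H**2)) / n
        W1 -= lr * gW1
        W2 -= lr * gW2

        # Step 2: fix the network, update the discrete codes.
        # With only the quantization term, the minimizer is sign(F(X));
        # the paper's objective adds further terms to this subproblem.
        B = np.sign(forward(X, W1, W2)[1])
        B[B == 0] = 1

    _, Z = forward(X, W1, W2)
    print("mean quantization error:", float(np.mean((Z - B) ** 2)))

Here the code update has a closed form because only the quantization term is kept; with the additional similarity and label terms, the discrete subproblem is no longer a simple sign operation, which is why the paper solves it directly rather than relaxing the binary constraint.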

Keywords: data samples; nonlinear discrete; binary codes; error; hashing methods; discrete hashing

Journal Title: IEEE Transactions on Multimedia
Year Published: 2017


