In recent years, many achievements have been made in improving the performance of supervised cross-modal hashing. However, how to fully exploit the data information to achieve fine-grained retrieval remains an open issue. Most methods employ logical labels or a binary similarity matrix to supervise hash learning, discarding much useful information. From another point of view, the low expressiveness of dense hash codes severely limits their ability to preserve fine-grained data information. Motivated by this, in this paper we propose a high-dimensional sparse hashing framework for cross-modal retrieval, i.e., High-dimensional Sparse Cross-modal Hashing, HSCH for short. It leverages not only high-level semantic labels but also low-level multi-modal features to construct a fine-grained similarity. In particular, based on two well-designed rules, i.e., multi-level and prioritized, it avoids semantic conflicts. Additionally, it exploits the strong representational power of high-dimensional sparse hash codes to preserve the fine-grained similarity. It then handles the sparse and discrete constraints on the hash codes through an efficient discrete optimization algorithm, making it efficient and scalable to large-scale datasets. More importantly, the computational complexity of HSCH in the retrieval phase matches that of naive hashing methods that use dense hash codes. Moreover, to support online learning scenarios, we also extend HSCH into an online version, i.e., HSCH_on. Extensive experiments on three benchmark datasets demonstrate the superiority of our framework over state-of-the-art cross-modal hashing approaches in terms of both accuracy and efficiency.
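To give intuition for why retrieval with high-dimensional sparse codes can be as cheap as with dense codes, below is a minimal illustrative sketch (not the authors' implementation). It assumes each item activates only k of the c code dimensions and scores candidates by the number of shared active positions via an inverted index, so query cost depends on k and the posting lists rather than the full code length c.

```python
import numpy as np
from collections import defaultdict

# Hypothetical illustration only: retrieval with high-dimensional, k-sparse
# binary codes. Each database item is represented by the set of its k active
# dimensions out of c total dimensions.

def build_inverted_index(codes):
    """codes: list of sets of active dimensions, one set per database item."""
    index = defaultdict(list)
    for item_id, active_dims in enumerate(codes):
        for d in active_dims:
            index[d].append(item_id)
    return index

def retrieve(index, query_dims, num_items, top_k=5):
    """Score items by the number of active dimensions shared with the query."""
    scores = np.zeros(num_items, dtype=np.int32)
    for d in query_dims:
        for item_id in index.get(d, []):
            scores[item_id] += 1
    return np.argsort(-scores)[:top_k]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    c, k, n = 1024, 8, 1000  # code length, sparsity level, database size (illustrative values)
    db_codes = [set(rng.choice(c, size=k, replace=False)) for _ in range(n)]
    index = build_inverted_index(db_codes)
    query = db_codes[42]      # a query sharing all active bits with item 42
    print(retrieve(index, query, n))  # item 42 should rank first
```

This sketch only demonstrates the general efficiency argument for sparse codes; the actual similarity construction and optimization in HSCH are described in the full paper.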