
Unsupervised Deep Relative Neighbor Relationship Preserving Cross-Modal Hashing

Author

Listed:
  • Xiaohan Yang

    (School of Computer Science and Technology, Shandong University of Technology, Zibo 255000, China)

  • Zhen Wang

    (School of Computer Science and Technology, Shandong University of Technology, Zibo 255000, China
    Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, Changchun 130012, China)

  • Nannan Wu

    (School of Computer Science and Technology, Shandong University of Technology, Zibo 255000, China)

  • Guokun Li

    (School of Computer Science and Technology, Shandong University of Technology, Zibo 255000, China)

  • Chuang Feng

    (School of Computer Science and Technology, Shandong University of Technology, Zibo 255000, China)

  • Pingping Liu

    (Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, Changchun 130012, China)

Abstract

The image-text cross-modal retrieval task, which aims to retrieve the relevant image from a text query and vice versa, is now attracting widespread attention. To respond quickly to this large-scale task, we propose Unsupervised Deep Relative Neighbor Relationship Preserving Cross-Modal Hashing (DRNPH), which performs cross-modal retrieval in a common Hamming space and thereby gains advantages in storage and efficiency. To support nearest-neighbor search in the Hamming space, we reconstruct both the original intra- and inter-modal neighbor matrices from the binary feature vectors, so that the neighbor relationships among samples of different modalities can be computed directly from their Hamming distances. Furthermore, the cross-modal pair-wise similarity preserving constraint requires that similar sample pairs have identical Hamming distances to the anchor; similar sample pairs therefore share the same binary code and have minimal Hamming distances. Unfortunately, the pair-wise similarity preserving constraint may lead to an imbalanced-code problem. We therefore propose the cross-modal triplet relative similarity preserving constraint, which demands that the Hamming distances of similar pairs be smaller than those of dissimilar pairs, so that the samples' ranking orders in the retrieval results can be distinguished. Moreover, a large similarity margin improves the algorithm's robustness to noise. We conduct cross-modal retrieval comparison experiments and an ablation study on two public datasets, MIRFlickr and NUS-WIDE. The experimental results show that DRNPH outperforms state-of-the-art approaches in various image-text retrieval scenarios, and that all three proposed constraints are necessary and effective for boosting cross-modal retrieval performance.
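
To illustrate the triplet relative similarity preserving idea described in the abstract, the following is a minimal sketch (not the authors' released code) of a margin-based triplet loss on tanh-relaxed binary codes, assuming PyTorch. For exact ±1 codes of length k, the Hamming distance is recovered from the inner product as (k − b1·b2)/2. All function names, tensor shapes, index choices, and the margin value are illustrative assumptions.

    # Minimal sketch of a cross-modal triplet relative similarity constraint
    # on relaxed binary codes (hypothetical names and shapes, not the paper's code).
    import torch

    def hamming_from_codes(b1, b2):
        """Approximate Hamming distance between relaxed codes in [-1, 1].
        For exact +/-1 codes, (k - b1 @ b2.T) / 2 equals the Hamming distance."""
        k = b1.size(1)
        return (k - b1 @ b2.t()) / 2

    def triplet_relative_loss(img_codes, txt_codes, pos_idx, neg_idx, margin=2.0):
        """For each image anchor i, push the Hamming distance to its similar text
        (pos_idx[i]) below the distance to a dissimilar text (neg_idx[i]) by `margin`."""
        d = hamming_from_codes(img_codes, txt_codes)   # (n, n) cross-modal distances
        idx = torch.arange(img_codes.size(0))
        d_pos = d[idx, pos_idx]                        # distances of similar pairs
        d_neg = d[idx, neg_idx]                        # distances of dissimilar pairs
        return torch.clamp(d_pos - d_neg + margin, min=0).mean()

    if __name__ == "__main__":
        torch.manual_seed(0)
        # Toy relaxed codes: 4 samples per modality with 16-bit codes (illustrative sizes).
        img = torch.tanh(torch.randn(4, 16))
        txt = torch.tanh(torch.randn(4, 16))
        pos = torch.tensor([0, 1, 2, 3])   # each image's matching text
        neg = torch.tensor([1, 2, 3, 0])   # a mismatched text for each image
        print(triplet_relative_loss(img, txt, pos, neg).item())

The margin term in the hinge corresponds to the "large similarity margin" the abstract credits with improving noise robustness: dissimilar pairs must be separated from similar pairs by at least that many bits before the loss reaches zero.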

Suggested Citation

  • Xiaohan Yang & Zhen Wang & Nannan Wu & Guokun Li & Chuang Feng & Pingping Liu, 2022. "Unsupervised Deep Relative Neighbor Relationship Preserving Cross-Modal Hashing," Mathematics, MDPI, vol. 10(15), pages 1-17, July.
  • Handle: RePEc:gam:jmathe:v:10:y:2022:i:15:p:2644-:d:874042

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/10/15/2644/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/10/15/2644/
    Download Restriction: no

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:10:y:2022:i:15:p:2644-:d:874042. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.