Similarity space projection for web image search and annotation
- Tao Qin
- Tie-Yan Liu
- Lei Zhang
- Wei-Ying Ma
MIR '05: Proceedings of the 7th ACM SIGMM International Workshop on Multimedia Information Retrieval
Published by ACM
Web image search has been explored and developed in academic and commercial settings for over a decade. To measure the similarity between Web images and user queries, most existing Web image search systems convert an image to textual keywords by analyzing the available textual information (such as surrounding text and the image filename), with or without leveraging visual features (such as color, texture, and shape). In this way, existing systems transform “Web images” into the “query (text)” space so that the relevance of images to the query can be compared. In this paper, we present a novel solution to Web image search: similarity space projection (SSP). This algorithm treats images and queries as two heterogeneous object peers and projects them into a third Euclidean “similarity space” in which their similarity can be measured directly. The projection rule guarantees that, in the new space, relevant images stay close to the corresponding query while irrelevant ones are kept away from it. Experiments on real-world Web image collections showed that the proposed algorithm significantly outperformed traditional information retrieval models (such as the vector space model) for image search. Beyond Web image search, we demonstrate that the algorithm can also be applied to the image annotation scenario with promising performance. Thus, the algorithm unifies Web image search and image annotation into a single framework.
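As a rough illustration of the idea summarized above, the sketch below (not taken from the paper) learns two linear projections, one for image features and one for query features, into a shared Euclidean space, pulling relevant image/query pairs together and pushing irrelevant pairs apart with a hinge-style objective. The matrices `P` and `Q`, the margin loss, the synthetic data, and all hyperparameters are assumptions made for illustration only; the paper's actual SSP formulation may differ.

```python
# Illustrative sketch of projecting two heterogeneous object types (images
# and queries) into a common "similarity space". NOT the paper's exact
# algorithm: the projections P/Q, the hinge-style loss, and the data are
# assumptions for demonstration purposes.
import numpy as np

rng = np.random.default_rng(0)

d_img, d_qry, d_sim = 64, 300, 16   # image dim, query dim, shared-space dim
n_img, n_qry = 200, 20

# Synthetic feature vectors for images and (textual) queries.
X_img = rng.normal(size=(n_img, d_img))
X_qry = rng.normal(size=(n_qry, d_qry))

# Relevance labels: rel[i, j] = 1 if image i is relevant to query j, else 0.
rel = (rng.random((n_img, n_qry)) < 0.1).astype(float)

# Linear projections mapping both object types into the shared space.
P = rng.normal(scale=0.01, size=(d_img, d_sim))
Q = rng.normal(scale=0.01, size=(d_qry, d_sim))

margin, lr = 1.0, 1e-3
for step in range(200):
    Z_img = X_img @ P            # images in the similarity space
    Z_qry = X_qry @ Q            # queries in the similarity space

    # Squared Euclidean distance between every image/query pair.
    diff = Z_img[:, None, :] - Z_qry[None, :, :]
    dist = (diff ** 2).sum(axis=-1)

    # Hinge-style objective: pull relevant pairs together, push
    # irrelevant pairs at least `margin` apart.
    loss = (rel * dist + (1 - rel) * np.maximum(0.0, margin - dist)).mean()

    # Gradients w.r.t. the pairwise distances, then w.r.t. P and Q.
    g_dist = (rel - (1 - rel) * (dist < margin)) / rel.size
    g_diff = 2 * g_dist[:, :, None] * diff
    P -= lr * (X_img.T @ g_diff.sum(axis=1))
    Q -= lr * (-X_qry.T @ g_diff.sum(axis=0))

# Rank images for query 0 by distance in the shared space (smaller = more relevant).
scores = ((X_img @ P - (X_qry @ Q)[0]) ** 2).sum(axis=1)
print(f"Final loss: {loss:.4f}")
print("Top-5 images for query 0:", np.argsort(scores)[:5])
```

Because both images and queries live in the same space after projection, the same distance-based ranking can serve retrieval (rank images for a query) and annotation (rank query terms for an image), which is the sense in which the two tasks share one framework.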