Article

Word-embedding-based query expansion: Incorporating Deep Averaging Networks in Arabic document retrieval

Journal

JOURNAL OF INFORMATION SCIENCE
Volume 49, Issue 5, Pages 1168-1186

Publisher

SAGE PUBLICATIONS LTD
DOI: 10.1177/01655515211040659

Keywords

Automatic query expansion; Deep Averaging Networks; information retrieval; word embedding


This article introduces the Deep Averaging Networks (DANs) architecture for query expansion, which passes the average of the query terms' word-embedding vectors through multiple neural network layers to represent the overall meaning of the query. The potential of DANs for Arabic document retrieval is explored, and the approach outperforms baseline methods when incorporated into specific expansion strategies.
One of the main issues associated with search engines is the query-document vocabulary mismatch problem, a long-standing problem in Information Retrieval (IR). This problem occurs when a user query does not match the content of stored documents, and it affects most search tasks. Automatic query expansion (AQE) is one of the most common approaches used to address this problem. Various AQE techniques have been proposed; these mainly involve finding synonyms or related words for the query terms. Word embedding (WE) is one of the methods currently receiving significant attention. Most existing AQE techniques focus on expanding the individual query terms rather than the entire query during the expansion process, and this can lead to query drift if poor expansion terms are selected. In this article, we introduce Deep Averaging Networks (DANs), an architecture that feeds the average of the WE vectors produced by the Word2Vec toolkit for the terms in a query through several linear neural network layers. This average vector is assumed to represent the meaning of the query as a whole and can be used to find expansion terms that are relevant to the complete query. We explore the potential of DANs for AQE in Arabic document retrieval. We experiment with using DANs for AQE in the classic probabilistic BM25 model as well as for two recent expansion strategies: the Embedding-Based Query Expansion approach (EQE1) and the Prospect-Guided Query Expansion Strategy (V2Q). Although DANs did not improve all outcomes when used in the BM25 model, the approach outperformed all baselines when incorporated into the EQE1 and V2Q expansion strategies.
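The expansion pipeline the abstract describes — average the query terms' embedding vectors, pass the result through linear layers, then rank candidate vocabulary terms by similarity to the transformed query vector — can be sketched roughly as follows. The toy embeddings, layer sizes, and the tanh nonlinearity here are illustrative assumptions, not the paper's trained Word2Vec model or DAN weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 50-d "Word2Vec-style" embeddings (stand-ins; the paper trains
# real Word2Vec vectors on an Arabic corpus).
vocab = ["health", "disease", "hospital", "doctor", "medicine",
         "car", "road", "engine", "driver", "fuel"]
emb = {w: rng.normal(size=50) for w in vocab}

# Hypothetical DAN weights: two linear layers, standing in for the
# "several linear neural network layers" the abstract mentions.
W1 = rng.normal(size=(50, 50)) * 0.1
W2 = rng.normal(size=(50, 50)) * 0.1

def dan_query_vector(query_terms):
    """Average the terms' embeddings, then apply the layers."""
    avg = np.mean([emb[t] for t in query_terms], axis=0)
    h = np.tanh(avg @ W1)  # nonlinearity between layers (an assumption)
    return h @ W2

def expansion_terms(query_terms, k=3):
    """Rank remaining vocabulary by cosine similarity to the query vector."""
    q = dan_query_vector(query_terms)
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    scored = [(w, cos(q, emb[w])) for w in vocab if w not in query_terms]
    return [w for w, _ in sorted(scored, key=lambda x: -x[1])[:k]]

print(expansion_terms(["doctor", "hospital"]))
```

The key design point is that similarity is computed against a single vector for the whole query, rather than per term, which is what is meant to guard against query drift from a single poorly chosen expansion term.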

