  CEQE: Contextualized Embeddings for Query Expansion

Naseri, S., Dalton, J., Yates, A., & Allan, J. (2021). CEQE: Contextualized Embeddings for Query Expansion. Retrieved from https://arxiv.org/abs/2103.05256.

Files

Name: arXiv:2103.05256.pdf (Preprint, 189KB)
Description: File downloaded from arXiv at 2021-10-26 11:37
OA-Status: -
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
Copyright Date: -
Copyright Info: -

Creators

Naseri, Shahrzad (1), Author
Dalton, Jeffrey (1), Author
Yates, Andrew (2), Author
Allan, James (1), Author
Affiliations:
(1) External Organizations, ou_persistent22
(2) Databases and Information Systems, MPI for Informatics, Max Planck Society, ou_24018

Content

Free keywords: Computer Science, Information Retrieval, cs.IR
Abstract: In this work we leverage recent advances in context-sensitive language models
to improve the task of query expansion. Contextualized word representation
models, such as ELMo and BERT, are rapidly replacing static embedding models.
We propose a new model, Contextualized Embeddings for Query Expansion (CEQE),
that utilizes query-focused contextualized embedding vectors. We study the
behavior of contextual representations generated for query expansion in ad-hoc
document retrieval. We conduct our experiments on probabilistic retrieval
models as well as in combination with neural ranking models. We evaluate CEQE
on two standard TREC collections: Robust and Deep Learning. We find that CEQE
outperforms static embedding-based expansion methods on multiple collections
(by up to 18% on Robust and 31% on Deep Learning on average precision) and also
improves over proven probabilistic pseudo-relevance feedback (PRF) models. We
further find that multiple passes of expansion and reranking result in
continued gains in effectiveness with CEQE-based approaches outperforming other
approaches. The final model incorporating neural and CEQE-based expansion score
achieves gains of up to 5% in P@20 and 2% in AP on Robust over the
state-of-the-art transformer-based re-ranking model, Birch.
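The abstract describes scoring candidate expansion terms by the similarity of their contextualized embeddings to a query-focused representation. A minimal sketch of that idea follows, with hand-made toy vectors standing in for real BERT outputs; the mean-pooled query vector, the cosine-similarity scoring, and all term names are illustrative assumptions, not the paper's exact formulation:

```python
from math import sqrt

def cosine(u, v):
    # Cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = sqrt(sum(a * a for a in u))
    nv = sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def mean_pool(vectors):
    # Average the per-token query vectors into one query representation.
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def rank_expansion_terms(query_vecs, candidate_vecs, k=2):
    """Score each candidate term by cosine similarity between its
    contextualized vector and the mean-pooled query vector, and
    return the top-k terms (a simplification of query-focused scoring)."""
    q = mean_pool(query_vecs)
    scored = sorted(candidate_vecs.items(),
                    key=lambda kv: cosine(q, kv[1]), reverse=True)
    return scored[:k]

# Toy contextual vectors; in CEQE these would come from a contextualized
# model (e.g. BERT) run over pseudo-relevant feedback documents.
query_vecs = [[0.9, 0.1, 0.0], [0.8, 0.2, 0.1]]
candidates = {
    "reranking": [0.85, 0.15, 0.05],
    "embedding": [0.7, 0.3, 0.1],
    "weather":   [0.0, 0.1, 0.9],
}
print(rank_expansion_terms(query_vecs, candidates, k=2))
```

On-topic candidates score near the query vector and are selected; the unrelated term is ranked out. The real model additionally weights terms using the retrieval model, which this toy ranking omits.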

Details

Language(s): eng - English
 Dates: 2021-03-09 (online), 2021
 Publication Status: Published online
 Pages: 15 p.
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
 Identifiers: arXiv: 2103.05256
BibTex Citekey: Naseri_2103.05256
URI: https://arxiv.org/abs/2103.05256
 Degree: -
