
Released

Conference Paper

Pretrained Transformers for Text Ranking: BERT and Beyond

MPS-Authors

Yates, Andrew
Databases and Information Systems, MPI for Informatics, Max Planck Society;

External Resource
No external resources are shared
Fulltext (public)

3404835.3462812.pdf
(Publisher version), 870KB

Supplementary Material (public)
There is no public supplementary material available
Citation

Yates, A., Nogueira, R., & Lin, J. (2021). Pretrained Transformers for Text Ranking: BERT and Beyond. In F. Diaz, C. Shah, T. Suel, P. Castells, R. Jones, T. Sakai, et al. (Eds.), SIGIR '21 (pp. 2666-2668). New York, NY: ACM. doi:10.1145/3404835.3462812.


Cite as: http://hdl.handle.net/21.11116/0000-0009-6674-2
Abstract
There is no abstract available