
Released

Paper

CompMix: A Benchmark for Heterogeneous Question Answering

MPS-Authors

Christmann, Philipp
Databases and Information Systems, MPI for Informatics, Max Planck Society;


Saha Roy, Rishiraj
Databases and Information Systems, MPI for Informatics, Max Planck Society;


Weikum, Gerhard
Databases and Information Systems, MPI for Informatics, Max Planck Society;

Fulltext (public)

arXiv:2306.12235.pdf
(Preprint), 2MB

Citation

Christmann, P., Saha Roy, R., & Weikum, G. (2023). CompMix: A Benchmark for Heterogeneous Question Answering. Retrieved from https://arxiv.org/abs/2306.12235.


Cite as: https://hdl.handle.net/21.11116/0000-000D-579D-1
Abstract
Fact-centric question answering (QA) often requires access to multiple,
heterogeneous information sources. By jointly considering several sources, such
as a knowledge base (KB), a text collection, and tables from the web, QA systems
can enhance their answer coverage and confidence. However, existing QA
benchmarks are mostly constructed with a single source of knowledge in mind.
This limits the ability of these benchmarks to fairly evaluate QA systems that
can tap into more than one information repository. To bridge this gap, we
release CompMix, a crowdsourced QA benchmark which naturally demands the
integration of a mixture of input sources. CompMix has a total of 9,410
questions, and features several complex intents like joins and temporal
conditions. Evaluation of a range of QA systems on CompMix highlights the need
for further research on leveraging information from heterogeneous sources.