
Item Details


Released

Report

CompMix: A Benchmark for Heterogeneous Question Answering

MPS-Authors

Christmann, Philipp
Databases and Information Systems, MPI for Informatics, Max Planck Society;

Saha Roy, Rishiraj
Databases and Information Systems, MPI for Informatics, Max Planck Society;

Weikum, Gerhard
Databases and Information Systems, MPI for Informatics, Max Planck Society;

Fulltext (public)

arXiv:2306.12235.pdf
(Preprint), 2MB

Citation

Christmann, P., Saha Roy, R., & Weikum, G. (2023). CompMix: A Benchmark for Heterogeneous Question Answering. Retrieved from https://arxiv.org/abs/2306.12235.


Cite as: https://hdl.handle.net/21.11116/0000-000D-579D-1
Abstract
Fact-centric question answering (QA) often requires access to multiple, heterogeneous information sources. By jointly considering several sources, such as a knowledge base (KB), a text collection, and tables from the web, QA systems can enhance their answer coverage and confidence. However, existing QA benchmarks are mostly constructed with a single source of knowledge in mind, which limits their ability to fairly evaluate QA systems that can tap into more than one information repository. To bridge this gap, we release CompMix, a crowdsourced QA benchmark that naturally demands the integration of a mixture of input sources. CompMix has a total of 9,410 questions and features several complex intents such as joins and temporal conditions. Evaluation of a range of QA systems on CompMix highlights the need for further research on leveraging information from heterogeneous sources.
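As a rough illustration of how a benchmark of this kind might be consumed, the sketch below loads a hypothetical JSON export of CompMix questions and scores a stub QA system with a crude exact-match metric. The file name (compmix_test.json), the field names (question, answers), and the string-matching rule are assumptions for illustration only; they are not taken from the paper or the released data.

    import json
    from typing import Callable, Iterable

    def normalize(text: str) -> str:
        """Lowercase and strip surrounding whitespace for a crude string match."""
        return text.strip().lower()

    def exact_match_accuracy(
        questions: Iterable[dict],
        answer_fn: Callable[[str], str],
    ) -> float:
        """Fraction of questions whose predicted answer matches any gold answer string."""
        hits, total = 0, 0
        for item in questions:
            predicted = normalize(answer_fn(item["question"]))
            gold = {normalize(a) for a in item["answers"]}
            hits += int(predicted in gold)
            total += 1
        return hits / total if total else 0.0

    if __name__ == "__main__":
        # Hypothetical export: a JSON list of {"question": ..., "answers": [...]} records.
        with open("compmix_test.json", encoding="utf-8") as f:
            data = json.load(f)
        # Stub "QA system" that always returns the same string, just to exercise the loop.
        accuracy = exact_match_accuracy(data, lambda q: "unknown")
        print(f"Exact-match accuracy over {len(data)} questions: {accuracy:.3f}")

A real evaluation would need the official data format and answer-matching protocol defined by the benchmark authors; this snippet only shows the general shape of such a loop.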