Data Paper

BioASQ-QA: A manually curated corpus for Biomedical Question Answering

Journal

SCIENTIFIC DATA
Volume 10, Issue 1

Publisher

NATURE PORTFOLIO
DOI: 10.1038/s41597-023-02068-4


Abstract

The BioASQ question answering (QA) benchmark dataset contains questions in English, along with gold-standard (reference) answers and related material. The dataset has been designed to reflect the real information needs of biomedical experts and is therefore more realistic and challenging than most existing datasets. Furthermore, unlike most previous QA benchmarks, which contain only exact answers, the BioASQ-QA dataset also includes ideal answers (in effect, summaries), which are particularly useful for research on multi-document summarization. The dataset combines structured and unstructured data. The materials linked with each question comprise documents and snippets, which are useful for Information Retrieval and Passage Retrieval experiments, as well as concepts that are useful in concept-to-text Natural Language Generation. Researchers working on paraphrasing and textual entailment can also measure the degree to which their methods improve the performance of biomedical QA systems. Finally, the dataset is continuously extended as the BioASQ challenge continues to run and new data are generated.
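The abstract describes each question as bundling an exact answer, an ideal (summary) answer, and linked documents, snippets, and concepts. The sketch below shows how such a record might be inspected in Python; the JSON field names and the sample question are illustrative assumptions based on this description, not the official BioASQ schema.

```python
import json

# Hypothetical BioASQ-style record. Field names ("body", "type",
# "exact_answer", "ideal_answer", "documents", "snippets", "concepts")
# are assumptions for illustration, not the verified release format.
sample = json.loads("""
{
  "questions": [
    {
      "body": "Is protein X secreted?",
      "type": "yesno",
      "documents": ["PUBMED-URL-PLACEHOLDER"],
      "snippets": [{"text": "Protein X is a secreted protein ..."}],
      "concepts": ["CONCEPT-ID-PLACEHOLDER"],
      "exact_answer": "yes",
      "ideal_answer": "Yes, protein X is a secreted protein."
    }
  ]
}
""")

# Walk the records and print the fields a QA or summarization
# experiment would typically consume.
for q in sample["questions"]:
    print(q["type"], "-", q["body"])
    print("exact answer:", q.get("exact_answer"))
    print("ideal answer:", q.get("ideal_answer"))
    print("linked documents:", len(q.get("documents", [])))
    print("snippets:", len(q.get("snippets", [])))
```

A retrieval experiment would use the `documents` and `snippets` fields as relevance judgments, while summarization work would target the `ideal_answer` text.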

