# SciQ

### Paper

Title: `Crowdsourcing Multiple Choice Science Questions`

Abstract: https://aclanthology.org/W17-4413.pdf

The SciQ dataset contains 13,679 crowdsourced science exam questions about Physics,
Chemistry and Biology, among others. The questions are in multiple-choice format
with 4 answer options each. For the majority of the questions, an additional paragraph
with supporting evidence for the correct answer is provided.

Homepage: https://allenai.org/data/sciq
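
The dataset is hosted on the Hugging Face Hub. As a quick reference, the sketch below loads it with the `datasets` library and prints one record; the repository id `allenai/sciq` and the field names follow the dataset card and are worth double-checking against your local copy.

```python
# Minimal sketch (assumes the `datasets` library and the `allenai/sciq`
# Hub repository): load SciQ and inspect a single validation record.
from datasets import load_dataset

sciq = load_dataset("allenai/sciq")   # splits: train / validation / test
example = sciq["validation"][0]

print(example["question"])            # question text
print(example["correct_answer"])      # gold answer option
print(example["distractor1"],         # the three incorrect options
      example["distractor2"],
      example["distractor3"])
print(example["support"])             # supporting paragraph (may be empty)
```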

### Citation

```
@inproceedings{Welbl2017CrowdsourcingMC,
    title={Crowdsourcing Multiple Choice Science Questions},
    author={Johannes Welbl and Nelson F. Liu and Matt Gardner},
    booktitle={NUT@EMNLP},
    year={2017}
}
```

### Groups and Tasks

#### Groups

* Not part of a group yet.

#### Tasks

* `sciq`
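
To run the task, one option is the harness's Python entry point; a minimal sketch is below. The model (`EleutherAI/pythia-160m`) is only an illustrative placeholder, and the exact keys of the returned results dict may vary across harness versions.

```python
# Minimal sketch (assumes lm-evaluation-harness is installed): evaluate
# a Hugging Face model on `sciq` via the simple_evaluate entry point.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=EleutherAI/pythia-160m",  # placeholder model
    tasks=["sciq"],
    num_fewshot=0,
)
print(results["results"]["sciq"])  # accuracy-style metrics for the task
```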

### Checklist

For adding novel benchmarks/datasets to the library:
* [ ] Is the task an existing benchmark in the literature?
  * [ ] Have you referenced the original paper that introduced the task?
  * [ ] If yes, does the original paper provide a reference implementation? If so, have you checked against the reference implementation and documented how to run such a test?

If other tasks on this dataset are already supported:
* [ ] Is the "Main" variant of this task clearly denoted?
* [ ] Have you provided a short sentence in a README on what each new variant adds / evaluates?
* [ ] Have you noted which, if any, published evaluation setups are matched by this variant?