---
license: mit
---
# Dataset Card for "popqa-tp"
### Dataset Summary
PopQA-TP (PopQA Templated Paraphrases) is a dataset derived from PopQA (https://huggingface.co/datasets/akariasai/PopQA), created for the paper "Predicting Question-Answering Performance of Large Language Models through Semantic Consistency". PopQA-TP paraphrases each question in PopQA using several manually created templates specific to that question's category. The paper investigates the relationship between the semantic consistency of the answers a model generates for each question's paraphrases and the accuracy (correctness) of the answer it generates for the original question, where correctness is evaluated by string match against one of the ground-truth answers. PopQA-TP can be used as a benchmark for evaluating the semantic consistency of LLMs in factoid question answering (QA).
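As a rough illustration of the accuracy criterion described above, the sketch below scores a generated answer by case-insensitive string match against the gold answers. The helper name and exact matching rules are assumptions for illustration, not the paper's implementation.
```python
def is_correct(generated: str, possible_answers: list[str]) -> bool:
    """Hypothetical helper: count a generation as correct if any gold
    answer appears (case-insensitively) in the generated text. The
    paper's exact matching rules may differ."""
    generated = generated.lower()
    return any(ans.lower() in generated for ans in possible_answers)

# Example: the gold answer "Honolulu" appears in the generation.
print(is_correct("He was born in Honolulu, Hawaii.", ["Honolulu", "Hawaii"]))  # True
```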
### Data Instances
#### popqa-tp
- **Size of downloaded dataset file:** 15.4 MB
### Data Fields
#### popqa-tp
- `paraphrase` (string): paraphrase of question from PopQA.
- `prop` (string): relationship type category of question.
- `template_id` (integer): integer ID of the paraphrase template used to create `paraphrase`. Value of 0 indicates it is the original question form from PopQA.
- `possible_answers` (list of strings): a list of the gold answers.
- `id` (integer): original ID of the question in PopQA.
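The following is a minimal sketch of how these fields fit together, assuming the dataset is loaded with the `datasets` library; the Hub repository id and split handling below are placeholders, not confirmed values.
```python
from collections import defaultdict
from datasets import load_dataset

# "popqa-tp" is a placeholder repository id; replace with the actual Hub path.
dataset = load_dataset("popqa-tp")
split = dataset[next(iter(dataset))]  # use the first available split

# Group the original question (template_id == 0) and its paraphrases by PopQA id.
groups = defaultdict(list)
for row in split:
    groups[row["id"]].append(row)

qid, rows = next(iter(groups.items()))
original = next(r for r in rows if r["template_id"] == 0)
paraphrases = [r["paraphrase"] for r in rows if r["template_id"] != 0]
print(qid, original["paraphrase"], paraphrases[:2], original["possible_answers"])
```
Semantic consistency can then be measured over a model's answers to each group of paraphrases, while correctness is checked against `possible_answers` for the original question.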
### Citation Information
```
@inproceedings{rabinovich2023predicting,
    title = "Predicting Question-Answering Performance of Large Language Models through Semantic Consistency",
    author = "Rabinovich, Ella and Ackerman, Samuel and Raz, Orna and Farchi, Eitan and Anaby-Tavor, Ateret",
    booktitle = "Proceedings of the 3rd Generation, Evaluation \& Metrics (GEM) Workshop at the 2023 Conference on Empirical Methods in Natural Language Processing",
    publisher = "Association for Computational Linguistics",
    year = "2023",
}
```