This is the project repository of our ASE22 paper: Natural Test Generation for Precise Testing of Question Answering Software


QAQA: Natural Test Generation for Precise Testing of Question Answering Software

QAQA is a QA software testing technique that generates natural test inputs and achieves precise testing. This repo is the artifact for the paper Natural Test Generation for Precise Testing of Question Answering Software, which has been accepted by ASE'22.

Reproducibility

Environment build

  1. Download all datasets from this link and put them in the root directory.
  2. Build the SLAHAN environment following the SLAHAN tutorial. (You can skip this step: the intermediate results have already been saved in this repo.)
  3. Install all Python packages that QAQA depends on, with `python=3.8.3`:

```shell
pip install -r requirements.txt
```

     Then install the NLTK data packages:

```python
import nltk
nltk.download('averaged_perceptron_tagger')
nltk.download('punkt')
```

  4. Unzip the file benepar_en3.zip in 3rd_models:

```shell
cd 3rd_models
unzip benepar_en3.zip
```
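The NLTK downloads in step 3 can be made idempotent. The helper below is a sketch (not part of the QAQA codebase) that fetches the two resources only when they are missing; the lookup paths are the standard NLTK data locations for these packages.

```python
# Hypothetical helper (not part of QAQA): fetch the NLTK resources
# listed above only if they are not already installed.

# Map download name -> path probed via nltk.data.find().
REQUIRED_NLTK_RESOURCES = {
    "averaged_perceptron_tagger": "taggers/averaged_perceptron_tagger",
    "punkt": "tokenizers/punkt",
}

def ensure_nltk_resources():
    # Import lazily so the module can be inspected without NLTK installed.
    import nltk
    for name, path in REQUIRED_NLTK_RESOURCES.items():
        try:
            nltk.data.find(path)  # raises LookupError when the data is missing
        except LookupError:
            nltk.download(name)
```

Calling `ensure_nltk_resources()` once before running QAQA avoids re-downloading data on repeated setups.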

Run QAQA

  1. Run SLAHAN to compress each seed question into a short question. The compressed questions are saved under datasets/compress.

  2. Run QAQA to generate all new test cases and detect bugs. The results are saved under QAQA/results/:

```shell
cd script/
python run.py project_name  # valid project_name values are 'boolq', 'squad2', and 'narrative'
```
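To drive run.py for several subjects in one go, a thin wrapper like the sketch below can help. It assumes only the interface shown above (script/run.py taking one of the three project names); the wrapper itself is hypothetical and not part of this repository.

```python
import subprocess

# Project names accepted by script/run.py, per the README.
VALID_PROJECTS = ("boolq", "squad2", "narrative")

def build_command(project_name):
    """Validate the project name and build the run.py invocation."""
    if project_name not in VALID_PROJECTS:
        raise ValueError(
            f"unknown project {project_name!r}; expected one of {VALID_PROJECTS}"
        )
    return ["python", "run.py", project_name]

def run_all(projects=VALID_PROJECTS):
    """Run QAQA for each project; results land in QAQA/results/."""
    for name in projects:
        subprocess.run(build_command(name), cwd="script", check=True)
```

`check=True` makes the wrapper stop at the first failing subject instead of silently continuing.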

Manual Labeling Results

The manual labeling results for false positives and naturalness are placed in the directory labeling_results.


Citation

Please cite our paper if this work is useful to you.

```bibtex
@inproceedings{shen2022natural,
  title={Natural Test Generation for Precise Testing of Question Answering Software},
  author={Shen, Qingchao and Chen, Junjie and Zhang, Jie and Wang, Haoyu and Liu, Shuang and Tian, Menghan},
  booktitle={Proceedings of the IEEE/ACM International Conference on Automated Software Engineering (ASE)},
  year={2022}
}
```
