# Text Style Transfer Benchmark

This benchmark is built for neural text style transfer. We are still collecting relevant results and papers. If you would like your paper added to the benchmark, contact us at szha2609 [at] uni.sydney.edu.au or ykshi.1991 [at] foxmail.com and we will update the tables as soon as possible.

## Yelp Dataset

Metric groups — content preservation: WMD, BLEU, B-P, B-R, B-F1; naturalness: N-A, N-C, N-D; transfer intensity: ACCU, EMD.

| Model | WMD | BLEU | B-P | B-R | B-F1 | N-A | N-C | N-D | ACCU | EMD |
|---|---|---|---|---|---|---|---|---|---|---|
| GTAE (shi2021gtae) | 0.1027 | 64.83 | 0.6991 | 0.7303 | 0.7149 | 0.6178 | 0.9272 | 0.6644 | 0.8870 | 0.8505 |
| Language-Discriminator (yang2018) | 0.1014 | 63.77 | 0.7292 | 0.7329 | 0.7314 | 0.5886 | 0.9074 | 0.6389 | 0.8940 | 0.8559 |
| Struct (tian2018) | 0.1224 | 61.98 | 0.7174 | 0.7220 | 0.7200 | 0.6205 | 0.9261 | 0.7002 | 0.8960 | 0.8574 |
| StyleTrans-multi (dai2019) | 0.1536 | 63.08 | 0.7145 | 0.7203 | 0.7175 | 0.6133 | 0.9102 | 0.6909 | 0.8730 | 0.8316 |
| DualRL (luo2019) | 0.1692 | 59.01 | 0.7125 | 0.6988 | 0.7057 | 0.5517 | 0.8996 | 0.6768 | 0.9050 | 0.8675 |
| Texar (hu2018) | 0.1921 | 57.82 | -- | -- | -- | 0.6934 | 0.9373 | 0.7066 | 0.8850 | 0.8429 |
| StyleTrans-cond (dai2019) | 0.2223 | 53.28 | 0.6205 | 0.6475 | 0.6341 | 0.6312 | 0.9109 | 0.6654 | 0.9290 | 0.8815 |
| UnsuperMT (zhang2018) | 0.2450 | 46.25 | 0.6060 | 0.6206 | 0.6134 | 0.5755 | 0.9040 | 0.6625 | 0.9770 | 0.9372 |
| UnpairedRL (xu2018) | 0.3122 | 46.09 | 0.4504 | 0.4709 | 0.4612 | 0.7136 | 0.9035 | 0.6493 | 0.5340 | 0.4989 |
| DAR_Template (li2018) | 0.4156 | 57.10 | 0.4970 | 0.5406 | 0.5185 | 0.6370 | 0.8984 | 0.6299 | 0.8410 | 0.7948 |
| DAR_DeleteOnly (li2018) | 0.4538 | 34.53 | 0.4158 | 0.4823 | 0.4490 | 0.6345 | 0.9072 | 0.5511 | 0.8750 | 0.8297 |
| DAR_DeleteRetrieve (li2018) | 0.4605 | 36.72 | 0.4268 | 0.4799 | 0.4534 | 0.6564 | 0.9359 | 0.5620 | 0.9010 | 0.8550 |
| CAAE (shen2017) | 0.5130 | 20.74 | 0.3585 | 0.3825 | 0.3710 | 0.4139 | 0.7006 | 0.5999 | 0.7490 | 0.7029 |
| IMaT (jin2019) | 0.5571 | 16.92 | 0.4750 | 0.4249 | 0.4501 | 0.4878 | 0.8407 | 0.6691 | 0.8710 | 0.8198 |
| Multi_Decoder (fu2018) | 0.5799 | 24.91 | 0.3117 | 0.3315 | 0.3223 | 0.4829 | 0.8394 | 0.6365 | 0.6810 | 0.6340 |
| FineGrained-0.7 (luo2019) | 0.6239 | 11.36 | 0.4023 | 0.3404 | 0.3717 | 0.3665 | 0.7125 | 0.5332 | 0.3960 | 0.3621 |
| FineGrained-0.9 (luo2019) | 0.6251 | 11.07 | 0.4030 | 0.3389 | 0.3713 | 0.3668 | 0.7148 | 0.5231 | 0.4180 | 0.3926 |
| FineGrained-0.5 (luo2019) | 0.6252 | 11.72 | 0.3994 | 0.3436 | 0.3718 | 0.3608 | 0.7254 | 0.5395 | 0.3280 | 0.2985 |
| BackTranslation (prabhumoye2018) | 0.7566 | 2.81 | 0.2405 | 0.2024 | 0.2220 | 0.3686 | 0.5392 | 0.4754 | 0.9500 | 0.9117 |
| Style_Emb (fu2018) | 0.8796 | 3.24 | 0.0166 | 0.0673 | 0.0429 | 0.5788 | 0.9075 | 0.6450 | 0.4490 | 0.4119 |
| DAR_RetrieveOnly (li2018) | 0.8990 | 2.62 | 0.1368 | 0.1818 | 0.1598 | 0.8067 | 0.9717 | 0.7211 | 0.9610 | 0.9010 |
| ARAE (zhao2018) | 0.9047 | 5.95 | 0.1680 | 0.1478 | 0.1584 | 0.4476 | 0.8120 | 0.6969 | 0.8278 | 0.7880 |
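The content-preservation BLEU above is computed between each transferred sentence and a reference (the source sentence or a human rewrite). The exact implementation the benchmark uses is not specified here; as an illustrative sketch only, a minimal smoothed sentence-level BLEU looks like this:

```python
import math
from collections import Counter

def ngram_counts(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def sentence_bleu(reference, hypothesis, max_n=4):
    """Sentence-level BLEU with add-one smoothing on each n-gram precision."""
    ref, hyp = reference.split(), hypothesis.split()
    log_prec = 0.0
    for n in range(1, max_n + 1):
        ref_ng, hyp_ng = ngram_counts(ref, n), ngram_counts(hyp, n)
        overlap = sum((ref_ng & hyp_ng).values())          # clipped n-gram matches
        total = max(sum(hyp_ng.values()), 1)
        log_prec += math.log((overlap + 1) / (total + 1))  # add-one smoothing
    # Brevity penalty discourages trivially short outputs.
    bp = 1.0 if len(hyp) >= len(ref) else math.exp(1 - len(ref) / max(len(hyp), 1))
    return bp * math.exp(log_prec / max_n)
```

Published numbers are normally produced with a standard tool (e.g. multi-bleu.perl or NLTK's corpus BLEU) rather than a hand-rolled variant like this one, so treat this as a sketch of the idea, not the benchmark's scoring script.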

## Political Slant Dataset

Metric groups — content preservation: WMD, BLEU, B-P, B-R, B-F1; naturalness: N-A, N-C, N-D; transfer intensity: ACCU, EMD.

| Model | WMD | BLEU | B-P | B-R | B-F1 | N-A | N-C | N-D | ACCU | EMD |
|---|---|---|---|---|---|---|---|---|---|---|
| GTAE (shi2021gtae) | 0.1506 | 65.61 | 0.6577 | 0.6706 | 0.6640 | 0.3310 | 0.7852 | 0.7318 | 0.900 | 0.8990 |
| CAAE (shen2017) | 0.4968 | 15.68 | 0.3217 | 0.3240 | 0.3230 | 0.2715 | 0.7052 | 0.7370 | 0.828 | 0.8259 |
| DAR_DeleteOnly (li2018) | 0.5000 | 30.76 | 0.3295 | 0.3932 | 0.3605 | 0.3155 | 0.8534 | 0.6490 | 0.958 | 0.9565 |
| DAR (li2018) | 0.5109 | 35.48 | 0.3352 | 0.3966 | 0.3649 | 0.3190 | 0.8472 | 0.7081 | 0.977 | 0.9747 |
| DAR_RetrieveOnly (li2018) | 0.7771 | 10.14 | 0.1590 | 0.1840 | 0.1709 | 0.3219 | 0.7854 | 0.7271 | 0.998 | 0.9960 |
| ARAE (zhao2018) | 1.0347 | 2.95 | 0.0203 | 0.0117 | 0.0158 | 0.3092 | 0.7763 | 0.7333 | 0.944 | 0.9412 |

## Paper-news Title Dataset

Metric groups — content preservation: WMD, BLEU, B-P, B-R, B-F1; naturalness: N-A, N-C, N-D; transfer intensity: ACCU, EMD.

| Model | WMD | BLEU | B-P | B-R | B-F1 | N-A | N-C | N-D | ACCU | EMD |
|---|---|---|---|---|---|---|---|---|---|---|
| GTAE (shi2021gtae) | 0.5161 | 19.67 | 0.1923 | 0.2134 | 0.2034 | 0.0949 | 0.4181 | 0.4567 | 0.956 | 0.9492 |
| DAR_DeleteOnly (li2018) | 0.8413 | 4.75 | 0.0939 | 0.0412 | 0.0677 | 0.3912 | 0.8374 | 0.4495 | 0.881 | 0.8687 |
| DAR (li2018) | 0.8567 | 5.04 | 0.0249 | 0.0212 | 0.0234 | 0.2462 | 0.7387 | 0.4625 | 0.933 | 0.9234 |
| CAAE (shen2017) | 0.9226 | 0.82 | 0.0067 | -0.0099 | -0.0008 | 0.2167 | 0.5627 | 0.4422 | 0.972 | 0.9612 |
| DAR_RetrieveOnly (li2018) | 0.9842 | 0.37 | -0.0383 | -0.0362 | -0.0365 | 0.1490 | 0.5701 | 0.4261 | 0.995 | 0.9856 |
| ARAE (zhao2018) | 1.0253 | 0.00 | -0.0447 | -0.0539 | -0.0486 | 0.2318 | 0.6061 | 0.4765 | 0.989 | 0.9782 |
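Transfer intensity is reported both as style-classifier accuracy (ACCU) and as an Earth Mover's Distance (EMD) between score distributions. For one-dimensional distributions (e.g. a style classifier's posterior on each transferred sentence versus the target-style corpus), EMD has a simple closed form. The snippet below is an illustrative sketch under a 1-D, equal-sample-count assumption, not the benchmark's actual implementation:

```python
def emd_1d(xs, ys):
    """Earth Mover's Distance (Wasserstein-1) between two equal-size 1-D
    empirical distributions: the mean absolute difference of the sorted
    (optimally matched) samples."""
    if len(xs) != len(ys):
        raise ValueError("this sketch assumes equal sample counts")
    xs, ys = sorted(xs), sorted(ys)
    return sum(abs(x - y) for x, y in zip(xs, ys)) / len(xs)
```

Intuitively, if the classifier scores of transferred sentences cluster near the target-style distribution, little "mass" needs to move and the EMD is small; the tables report it rescaled so that larger values indicate stronger transfer.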

## Relevant Citations

@article{shi2021gtae,
  title={GTAE: Graph-Transformer based Auto-Encoders for Linguistic-Constrained Text Style Transfer},
  author={Shi, Yukai and Zhang, Sen and Zhou, Chenxing and Liang, Xiaodan and Yang, Xiaojun and Lin, Liang},
  journal={ACM Transactions on Intelligent Systems and Technology},
  year={2021}
}
@article{dai2019,
  title={Style transformer: Unpaired text style transfer without disentangled latent representation},
  author={Dai, Ning and Liang, Jianze and Qiu, Xipeng and Huang, Xuanjing},
  journal={arXiv preprint arXiv:1905.05621},
  year={2019}
}

@article{luo2019,
  title={A dual reinforcement learning framework for unsupervised text style transfer},
  author={Luo, Fuli and Li, Peng and Zhou, Jie and Yang, Pengcheng and Chang, Baobao and Sui, Zhifang and Sun, Xu},
  journal={arXiv preprint arXiv:1905.10060},
  year={2019}
}

@article{zhang2018,
  title={Style Transfer as Unsupervised Machine Translation},
  author={Zhang, Zhirui and Ren, Shuo and Liu, Shujie and Wang, Jianyong and Chen, Peng and Li, Mu and Zhou, Ming and Chen, Enhong},
  journal={arXiv preprint arXiv:1808.07894},
  year={2018}
}

@article{xu2018,
  title={Unpaired Sentiment-to-Sentiment Translation: A Cycled Reinforcement Learning Approach},
  author={Xu, Jingjing and Sun, Xu and Zeng, Qi and Ren, Xuancheng and Zhang, Xiaodong and Wang, Houfeng and Li, Wenjie},
  journal={arXiv preprint arXiv:1805.05181},
  year={2018}
}

@inproceedings{li2018,
  title={Delete, retrieve, generate: A simple approach to sentiment and style transfer},
  author={Li, Juncen and Jia, Robin and He, He and Liang, Percy},
  booktitle={NAACL HLT 2018: 16th Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies},
  volume={1},
  pages={1865--1874},
  year={2018}
}

@inproceedings{shen2017,
  title={Style transfer from non-parallel text by cross-alignment},
  author={Shen, Tianxiao and Lei, Tao and Barzilay, Regina and Jaakkola, Tommi S.},
  booktitle={Advances in Neural Information Processing Systems},
  pages={6830--6841},
  year={2017}
}

@article{jin2019,
  title={IMaT: Unsupervised Text Attribute Transfer via Iterative Matching and Translation},
  author={Jin, Zhijing and Jin, Di and Mueller, Jonas and Matthews, Nicholas and Santus, Enrico},
  journal={arXiv preprint arXiv:1901.11333},
  year={2019}
}

@inproceedings{fu2018,
  title={Style transfer in text: Exploration and evaluation},
  author={Fu, Zhenxin and Tan, Xiaoye and Peng, Nanyun and Zhao, Dongyan and Yan, Rui},
  booktitle={AAAI Conference on Artificial Intelligence},
  pages={663--670},
  year={2018}
}


@inproceedings{prabhumoye2018,
  title={Style transfer through back-translation},
  author={Prabhumoye, Shrimai and Tsvetkov, Yulia and Salakhutdinov, Ruslan and Black, Alan W},
  booktitle={Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics},
  volume={1},
  pages={866--876},
  year={2018}
}

@inproceedings{zhao2018,
  title={Adversarially regularized autoencoders},
  author={Zhao, Junbo Jake and Kim, Yoon and Zhang, Kelly and Rush, Alexander M. and LeCun, Yann},
  booktitle={ICML 2018: Thirty-fifth International Conference on Machine Learning},
  pages={9405--9420},
  year={2018}
}

@article{hu2018,
  title={Texar: A modularized, versatile, and extensible toolkit for text generation},
  author={Hu, Zhiting and Shi, Haoran and Tan, Bowen and Wang, Wentao and Yang, Zichao and Zhao, Tiancheng and He, Junxian and Qin, Lianhui and Wang, Di and Ma, Xuezhe and others},
  journal={arXiv preprint arXiv:1809.00794},
  year={2018}
}

@inproceedings{yang2018,
  title={Unsupervised text style transfer using language models as discriminators},
  author={Yang, Zichao and Hu, Zhiting and Dyer, Chris and Xing, Eric P and Berg-Kirkpatrick, Taylor},
  booktitle={Advances in Neural Information Processing Systems},
  pages={7287--7298},
  year={2018}
}

@article{tian2018,
  title={Structured content preservation for unsupervised text style transfer},
  author={Tian, Youzhi and Hu, Zhiting and Yu, Zhou},
  journal={arXiv preprint arXiv:1810.06526},
  year={2018}
}