
Commit 14e43c6

Update README.md

1 parent d75e8bd · commit 14e43c6

File tree

1 file changed: +2 additions, -1 deletion

README.md

Lines changed: 2 additions & 1 deletion
@@ -5,6 +5,8 @@ This is the code for the EMNLP-IJCNLP paper [EDA: Easy Data Augmentation techniq
 
 A blog post that explains EDA is [[here]](https://medium.com/@jason.20/these-are-the-easiest-data-augmentation-techniques-in-natural-language-processing-you-can-think-of-88e393fd610).
 
+Update: find an external implementation of EDA in Chinese [[here]](https://github.com/zhanlaoban/EDA_NLP_for_Chinese).
+
 By [Jason Wei](https://jasonwei20.github.io/research/) and Kai Zou.
 
 Note: **Do not** email me with questions, as I will not reply. Instead, open an issue.
@@ -76,7 +78,6 @@ If you use EDA in your paper, please cite us:
     publisher = "Association for Computational Linguistics",
     url = "https://www.aclweb.org/anthology/D19-1670",
     pages = "6383--6389",
-    abstract = "We present EDA: easy data augmentation techniques for boosting performance on text classification tasks. EDA consists of four simple but powerful operations: synonym replacement, random insertion, random swap, and random deletion. On five text classification tasks, we show that EDA improves performance for both convolutional and recurrent neural networks. EDA demonstrates particularly strong results for smaller datasets; on average, across five datasets, training with EDA while using only 50{\%} of the available training set achieved the same accuracy as normal training with all available data. We also performed extensive ablation studies and suggest parameters for practical use.",
 }
 ```
 
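The abstract in the BibTeX entry above describes EDA's four operations: synonym replacement, random insertion, random swap, and random deletion. For readers skimming the diff, here is a minimal dependency-free sketch of two of those operations, random swap and random deletion. The function names are illustrative, not the repo's actual code, and synonym replacement/insertion are omitted because they require a thesaurus such as WordNet.

```python
import random

def random_swap(words, n=1, rng=random):
    # EDA "random swap": swap the positions of two randomly chosen words, n times.
    words = words[:]  # copy so the caller's list is untouched
    for _ in range(n):
        if len(words) < 2:
            break
        i, j = rng.sample(range(len(words)), 2)
        words[i], words[j] = words[j], words[i]
    return words

def random_deletion(words, p=0.1, rng=random):
    # EDA "random deletion": drop each word with probability p,
    # but always keep at least one word so the sentence never vanishes.
    kept = [w for w in words if rng.random() > p]
    return kept if kept else [rng.choice(words)]
```

A swap preserves the multiset of words, and deletion only ever removes words, which is why both are cheap label-preserving augmentations for text classification.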
