Iterative Mask Filling: An Effective Text Augmentation Method Using Masked Language Modeling

(2401.01830)
Published Jan 3, 2024 in cs.CL, cs.AI, and cs.LG

Abstract

Data augmentation is an effective technique for improving the performance of machine learning models. However, it has not been explored as extensively in NLP as it has in computer vision. In this paper, we propose a novel text augmentation method that leverages the Fill-Mask feature of the transformer-based BERT model. Our method involves iteratively masking words in a sentence and replacing them with language model predictions. We have tested our proposed method on various NLP tasks and found it to be effective in many cases. Our results are presented along with a comparison to existing augmentation methods. Experimental results show that our proposed method significantly improves performance, especially on topic classification datasets.
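The abstract describes the core idea: repeatedly mask a word, let a masked language model predict a replacement, and move on to the next word. Below is a minimal sketch of that idea using Hugging Face's fill-mask pipeline with bert-base-uncased. It is an illustration of the general technique, not the authors' implementation; the model choice, whole-word splitting on whitespace, and taking the single top prediction are assumptions made here for brevity.

```python
# Minimal sketch of iterative mask filling for text augmentation.
# NOTE: this is an illustrative approximation, not the paper's exact method.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")
MASK = fill_mask.tokenizer.mask_token  # "[MASK]" for BERT

def iterative_mask_fill(sentence: str) -> str:
    """Mask each word in turn and replace it with the model's top prediction."""
    words = sentence.split()  # naive whitespace tokenization (assumption)
    for i in range(len(words)):
        masked = " ".join(words[:i] + [MASK] + words[i + 1:])
        top = fill_mask(masked, top_k=1)[0]        # best candidate for the mask
        words[i] = top["token_str"].strip()        # substitute predicted word
    return " ".join(words)

# Example: produce an augmented variant of a training sentence.
print(iterative_mask_fill("the movie was surprisingly good and well acted"))
```

In practice an augmentation pipeline would typically sample from the top-k candidates rather than always taking the first one, and might mask only a fraction of the words, so that augmented sentences stay diverse without drifting too far from the original label.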
