Open Access System for Information Sharing


Article
Cited 1 time in Web of Science · Cited 3 times in Scopus

Self-feeding training method for semi-supervised grammatical error correction (SCIE, SCOPUS)

Title
Self-feeding training method for semi-supervised grammatical error correction
Authors
Kwon, Soonchoul; Lee, Gary Geunbae
Date Issued
2023-01
Publisher
Academic Press
Abstract
© 2022 Elsevier Ltd. Grammatical error correction (GEC) has been successful with deep and complex neural machine translation models, but the annotated data to train such models are scarce. We propose a novel self-feeding training method that generates incorrect sentences from freely available correct sentences. The proposed training method can generate appropriate wrong sentences from unlabeled sentences, using a data generation model trained as an autoencoder. It can also add artificial noise to correct sentences to automatically generate incorrect sentences. We show that GEC models trained with the self-feeding training method are successful without extra annotated data or deeper neural network-based models, achieving an F0.5 score of 0.5982 on the CoNLL-2014 Shared Task test data with a transformer model. The results also show that fully unlabeled training is possible for data-scarce domains and languages.
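The artificial-noise idea mentioned in the abstract can be illustrated with a minimal sketch: perturb words in a correct sentence to produce a synthetic (incorrect, correct) training pair. Note this is only a toy noising function for illustration; the paper's actual method uses a learned autoencoder-based generator, and the operations and probability below are assumptions, not the authors' implementation.

```python
import random


def add_noise(sentence, p=0.3, seed=None):
    """Produce a synthetic 'incorrect' sentence from a correct one.

    With probability p, each word is dropped, duplicated, or swapped with
    its right neighbor -- crude stand-ins for omission, repetition, and
    word-order errors. Illustrative only (hypothetical noise scheme).
    """
    rng = random.Random(seed)
    words = sentence.split()
    out = []
    i = 0
    while i < len(words):
        if rng.random() < p:
            op = rng.choice(["drop", "dup", "swap"])
            if op == "drop":
                i += 1          # omit the word entirely
                continue
            elif op == "dup":
                out.extend([words[i], words[i]])  # repeat the word
            elif op == "swap" and i + 1 < len(words):
                out.extend([words[i + 1], words[i]])  # reorder a pair
                i += 2
                continue
            else:
                out.append(words[i])  # swap not possible at last word
        else:
            out.append(words[i])
        i += 1
    return " ".join(out)


correct = "She has been studying English for three years"
noisy = add_noise(correct, p=0.3, seed=0)
pair = (noisy, correct)  # (source, target) example for training a GEC model
```

Such pairs can be mixed with the autoencoder-generated sentences described in the abstract to train a GEC model without additional human annotation.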
URI
https://oasis.postech.ac.kr/handle/2014.oak/115280
DOI
10.1016/j.csl.2022.101435
ISSN
0885-2308
Article Type
Article
Citation
Computer Speech and Language, vol. 77, 2023-01
Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
