05/12/2020

Chinese Grammatical Correction Using BERT-based Pre-trained Model

Hongfei Wang, Michiki Kurosawa, Satoru Katsumata, Mamoru Komachi

Abstract: In recent years, pre-trained models have been extensively studied, and several downstream tasks have benefited from their use. In this study, we verify the effectiveness of two methods that incorporate a pre-trained model into an encoder-decoder model on the Chinese grammatical error correction task. We also analyze the error types and conclude that sentence-level errors remain to be addressed.
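The two methods in the abstract build a Transformer encoder-decoder around pre-trained BERT weights; in the GEC literature this is commonly realized either by initializing the encoder-decoder with BERT weights or by fusing BERT representations into each layer via additional attention. As a rough illustration of the simpler warm-start variant, the sketch below uses the Hugging Face `transformers` library; the `bert-base-chinese` checkpoint and the toy sentence pair are illustrative assumptions, not the authors' exact setup.

```python
# Minimal sketch (not the paper's exact implementation) of incorporating
# a pre-trained BERT into an encoder-decoder model for Chinese GEC.
from transformers import BertTokenizer, EncoderDecoderModel

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")

# Warm-start both the encoder and the decoder from Chinese BERT;
# the decoder's cross-attention weights are newly initialized.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-chinese", "bert-base-chinese"
)
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id

# One training step on an (erroneous, corrected) sentence pair:
# "他明天去了学校。" (tense error) -> "他明天去学校。"
src = tokenizer("他明天去了学校。", return_tensors="pt")
tgt = tokenizer("他明天去学校。", return_tensors="pt")
outputs = model(
    input_ids=src.input_ids,
    attention_mask=src.attention_mask,
    labels=tgt.input_ids,  # cross-entropy over the corrected sentence
)
outputs.loss.backward()  # fine-tune all weights, BERT included
```

Fine-tuning the whole model this way lets the correction task reshape the pre-trained representations; the alternative fusion-style method instead keeps BERT as a separate feature extractor attended to by every encoder and decoder layer.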

[Embedded video: talk for this paper, presented at the AACL 2020 virtual conference.]
