4.6 Article

Reusing Monolingual Pre-Trained Models by Cross-Connecting Seq2seq Models for Machine Translation

Related References

Article (Computer Science, Artificial Intelligence)
Leveraging Pre-trained Checkpoints for Sequence Generation Tasks
Sascha Rothe et al.
Transactions of the Association for Computational Linguistics (2020)

Article (Computer Science, Artificial Intelligence)
Multilingual Denoising Pre-training for Neural Machine Translation
Yinhan Liu et al.
Transactions of the Association for Computational Linguistics (2020)

Article (Computer Science, Artificial Intelligence)
SpanBERT: Improving Pre-training by Representing and Predicting Spans
Mandar Joshi et al.
Transactions of the Association for Computational Linguistics (2020)

Article (Chemistry, Multidisciplinary)
Image-To-Image Translation Using a Cross-Domain Auto-Encoder and Decoder
Jaechang Yoo et al.
Applied Sciences-Basel (2019)