[1] ABUDUKELIMU H,LIU Y,CHEN X,et al.Learning distributed representations of Uyghur words and morphemes [M]//Chinese Computational Linguistics and Natural Language Processing Based on Naturally Annotated Big Data.Cham:Springer,2015:202-211.
[2] ABUDUKELIMU H,LIU Y,SUN M S.Performance comparison of neural machine translation systems in Uyghur-Chinese translation [J].Journal of Tsinghua University (Science and Technology),2017,57(8):878-883.(in Chinese)
[3] SENNRICH R,HADDOW B,BIRCH A.Improving neural machine translation models with monolingual data [EB/OL].[2021-02-15].https://arxiv.org/abs/1511.06709.
[4] PONCELAS A,SHTERIONOV D,WAY A,et al.Investigating backtranslation in neural machine translation [EB/OL].[2021-02-15].https://arxiv.org/abs/1804.06189.
[5] FADAEE M,MONZ C.Back-translation sampling by targeting difficult words in neural machine translation [EB/OL].[2021-02-15].https://arxiv.org/abs/1808.09006.
[6] HOANG V C D,KOEHN P,HAFFARI G,et al.Iterative back-translation for neural machine translation [C]//Proceedings of the 2nd Workshop on Neural Machine Translation and Generation.Melbourne,Australia:[s.n.],2018:18-24.
[7] COTTERELL R,KREUTZER J.Explaining and generalizing back-translation through wake-sleep [EB/OL].[2021-02-15].https://arxiv.org/abs/1806.04402.
[8] PONCELAS A,POPOVIC M,SHTERIONOV D,et al.Combining SMT and NMT back-translated data for efficient NMT [EB/OL].[2021-02-15].https://arxiv.org/abs/1909.03750.
[9] LUO G X,YANG Y T,DONG R,et al.A joint back-translation and transfer learning method for low-resource neural machine translation [J/OL].[2020-02-15].https://doaj.org/article/3ee33e9edcd04597aa86ce4dd19ea6e7.
[10] ZHOU Z H.Ensemble methods:foundations and algorithms [M].Boca Raton:CRC Press,2012.
[11] BREIMAN L.Bagging predictors [J].Machine Learning,1996,24(2):123-140.
[12] LI R,ZHANG J R,MAO L.Weak classifier selection and integration algorithm based on AdaBoost [J].Journal of Lanzhou University of Technology,2012,38(2):87-90.(in Chinese)
[13] VASWANI A,SHAZEER N,PARMAR N,et al.Attention is all you need [EB/OL].[2021-02-15].https://arxiv.org/abs/1706.03762.
[14] SENNRICH R,HADDOW B,BIRCH A.Edinburgh neural machine translation systems for WMT 16 [EB/OL].[2021-02-15].https://arxiv.org/abs/1606.02891.
[15] SENNRICH R,BIRCH A,CURREY A,et al.The University of Edinburgh's neural MT systems for WMT17 [EB/OL].[2021-02-15].https://arxiv.org/abs/1708.00726.
[16] LI B,WANG Q,XIAO T,et al.Analysis of ensemble learning methods for neural machine translation [J].Journal of Chinese Information Processing,2019,33(3):42-51.(in Chinese)
[17] ZHANG X L,LI X,YANG Y T,et al.Analysis of a bidirectional reranking model for Uyghur-Chinese neural machine translation [J].Acta Scientiarum Naturalium Universitatis Pekinensis,2020,56(1):31-38.(in Chinese)
[18] WANG Y,WU L,XIA Y,et al.Transductive ensemble learning for neural machine translation [C]//Proceedings of the AAAI Conference on Artificial Intelligence.New York:[s.n.],2020:6291-6298.
[19] SUTSKEVER I,VINYALS O,LE Q V.Sequence to sequence learning with neural networks [EB/OL].[2021-02-15].https://arxiv.org/abs/1409.3215.
[20] BAHDANAU D,CHO K,BENGIO Y.Neural machine translation by jointly learning to align and translate [EB/OL].[2021-02-15].https://arxiv.org/abs/1409.0473v4.
[21] GEHRING J,AULI M,GRANGIER D,et al.Convolutional sequence to sequence learning [C]//International Conference on Machine Learning.[S.l.]:PMLR,2017:1243-1252.
[22] BA J L,KIROS J R,HINTON G E.Layer normalization [EB/OL].[2021-02-15].https://arxiv.org/abs/1607.06450.
[23] OCH F J,GILDEA D,KHUDANPUR S,et al.A smorgasbord of features for statistical machine translation [C]//Proceedings of the Human Language Technology Conference of the North American Chapter of the Association for Computational Linguistics.Boston:[s.n.],2004:161-168.
[24] IMAMURA K,SUMITA E.Ensemble and reranking:Using multiple models in the NICT-2 neural machine translation system at WAT2017 [C]//Proceedings of the 4th Workshop on Asian Translation (WAT2017).Taipei,China:[s.n.],2017:127-134.
[25] JOHNSON R W.An introduction to the bootstrap [J].Teaching Statistics,2010,23(2):49-54.
[26] SENNRICH R,HADDOW B,BIRCH A.Neural machine translation of rare words with subword units [EB/OL].[2021-02-15].https://arxiv.org/abs/1508.07909.
[27] ZHANG J J,ZONG C Q.Exploiting source-side monolingual data in neural machine translation [C]//Proceedings of EMNLP 2016.Austin,Texas:[s.n.],2016:1535-1545.
[28] KINGMA D P,BA J.Adam:A method for stochastic optimization [EB/OL].[2021-02-15].https://arxiv.org/abs/1412.6980.
[29] GAL Y,GHAHRAMANI Z.A theoretically grounded application of dropout in recurrent neural networks [J].Advances in Neural Information Processing Systems,2016,29:1019-1027.
[30] PAPINENI K,ROUKOS S,WARD T,et al.BLEU:a method for automatic evaluation of machine translation [C]//Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics.Philadelphia:[s.n.],2002:311-318.
[31] ZHANG Y,VOGEL S,WAIBEL A.Interpreting BLEU/NIST scores:How much improvement do we need to have a better system [C/OL].[2021-02-15].https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.368.1133&rep=rep1&type=pdf.