- paper: Song, K., Tan, X., Qin, T., Lu, J., & Liu, T.-Y. (2019). MASS: Masked Sequence to Sequence Pre-training for Language Generation. ICML 2019.
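The core recipe in MASS is to mask a contiguous fragment of the source sentence on the encoder side and train the decoder to reconstruct exactly that fragment. A minimal sketch of this data construction (the mask ratio, span sampling, and special token below are simplified assumptions, not the paper's exact settings):

```python
import random

MASK = "[MASK]"

def mass_example(tokens, mask_ratio=0.5):
    # Pick a contiguous span covering roughly mask_ratio of the sentence.
    span_len = max(1, int(len(tokens) * mask_ratio))
    start = random.randint(0, len(tokens) - span_len)
    end = start + span_len
    # The encoder sees the sentence with the span replaced by [MASK] tokens.
    encoder_input = tokens[:start] + [MASK] * span_len + tokens[end:]
    # The decoder reconstructs the span; its input is the span shifted right (teacher forcing).
    decoder_target = tokens[start:end]
    decoder_input = [MASK] + decoder_target[:-1]
    return encoder_input, decoder_input, decoder_target

enc_in, dec_in, dec_out = mass_example("the quick brown fox jumps over the lazy dog".split())
print(enc_in, dec_in, dec_out, sep="\n")
```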
- paper: Dong, L., Yang, N., Wang, W., Wei, F., Liu, X., Wang, Y., et al. (2019). Unified Language Model Pre-training for Natural Language Understanding and Generation. NeurIPS 2019.
- code: https://github.com/microsoft/unilm
- note: including UniLM v1/v2, MiniLM, LayoutLM, and s2s-ft.
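UniLM covers understanding and generation with one Transformer by switching only the self-attention mask; for seq2seq fine-tuning (what s2s-ft targets), source tokens attend bidirectionally among themselves while target tokens attend to the full source plus their own left context. A rough sketch of that mask, with names and the boolean convention chosen here for illustration rather than taken from the repository:

```python
import torch

def seq2seq_attention_mask(src_len: int, tgt_len: int) -> torch.Tensor:
    """Rows are query positions, columns are key positions; True = may attend."""
    total = src_len + tgt_len
    mask = torch.zeros(total, total, dtype=torch.bool)
    mask[:, :src_len] = True                            # every position sees the source segment
    mask[src_len:, src_len:] = torch.ones(tgt_len, tgt_len).tril().bool()  # target: left-to-right only
    return mask                                         # source positions never see the target

print(seq2seq_attention_mask(3, 2).int())
```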
- paper: Lewis, M., Liu, Y., Goyal, N., Ghazvininejad, M., Mohamed, A., Levy, O., Stoyanov, V., & Zettlemoyer, L. (2019). BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension.
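BART checkpoints are most easily tried through the Hugging Face transformers library, which is not part of this list; a minimal summarization sketch under that assumption (the checkpoint name and generation settings are just examples):

```python
# pip install transformers torch
from transformers import BartForConditionalGeneration, BartTokenizer

name = "facebook/bart-large-cnn"  # a public BART checkpoint fine-tuned for summarization
tokenizer = BartTokenizer.from_pretrained(name)
model = BartForConditionalGeneration.from_pretrained(name)

article = ("BART is pre-trained by corrupting text with an arbitrary noising function "
           "and learning a model to reconstruct the original text.")
inputs = tokenizer(article, return_tensors="pt", truncation=True)
summary_ids = model.generate(inputs["input_ids"], num_beams=4, max_length=60)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```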
- paper: Raffel, C., Shazeer, N., Roberts, A., Lee, K., Narang, S., Matena, M., et al. (2019). Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer.
- code: https://github.com/google-research/text-to-text-transfer-transformer
- blog:
- "18 Institutions Batch-Refresh SOTA! T5 is all you need!" | 李rumor, 2022-01-19
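T5's central idea is that every task becomes text-in, text-out once a task prefix is prepended. A minimal sketch, again assuming the Hugging Face transformers library rather than the linked codebase (checkpoint and prefixes are examples):

```python
# pip install transformers sentencepiece torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

name = "t5-small"
tokenizer = T5Tokenizer.from_pretrained(name)
model = T5ForConditionalGeneration.from_pretrained(name)

# The task is selected purely by the text prefix.
prompts = [
    "translate English to German: The house is wonderful.",
    "summarize: T5 casts every NLP problem as feeding in text and producing text.",
]
for prompt in prompts:
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    output_ids = model.generate(input_ids, max_length=50)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```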
- link: https://github.com/google/seq2seq
- author: Google
- note: a general-purpose encoder-decoder framework for TensorFlow.
- link: https://github.com/IBM/pytorch-seq2seq
- author: IBM
- note: an open-source framework for seq2seq models in PyTorch.
- link: https://github.com/bentrevett/pytorch-seq2seq
- author: Ben Trevett
- note: tutorials on implementing several sequence-to-sequence (seq2seq) models with PyTorch and torchtext.
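In the same spirit as those tutorials, a compressed GRU encoder-decoder with teacher forcing; class names, dimensions, and the toy data are illustrative, not the tutorials' own code:

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):                       # src: (batch, src_len)
        _, hidden = self.rnn(self.embed(src))
        return hidden                             # (1, batch, hid_dim): the context vector

class Decoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, tgt, hidden):               # tgt: (batch, tgt_len)
        output, hidden = self.rnn(self.embed(tgt), hidden)
        return self.out(output), hidden           # logits: (batch, tgt_len, vocab)

# One training step on toy data: feed the target shifted by one and score the next tokens.
src_vocab, tgt_vocab = 100, 120
encoder, decoder = Encoder(src_vocab), Decoder(tgt_vocab)
src = torch.randint(0, src_vocab, (2, 7))
tgt = torch.randint(0, tgt_vocab, (2, 5))
logits, _ = decoder(tgt[:, :-1], encoder(src))
loss = nn.CrossEntropyLoss()(logits.reshape(-1, tgt_vocab), tgt[:, 1:].reshape(-1))
```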
- link: https://github.com/ematvey/tensorflow-seq2seq-tutorials
- author: Matvey Ezhov
- note: dynamic seq2seq in TensorFlow, step by step.
- link: https://github.com/920232796/bert_seq2seq
- author: Zhaohu Xing
- note: a lightweight PyTorch framework for UniLM-style (BERT-based) seq2seq, with several examples.
- link: https://github.com/CLUEbenchmark/CLGE
- author: CLUE benchmark
- note: Chinese Language Generation Evaluation.
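CLGE defines its own tasks and scoring scripts; for orientation only, a generic character-level BLEU for Chinese generation (using NLTK, which is not part of the benchmark) looks like this:

```python
# pip install nltk
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

def char_bleu(reference: str, hypothesis: str) -> float:
    # Character-level n-grams sidestep the ambiguity of Chinese word segmentation.
    smooth = SmoothingFunction().method1
    return sentence_bleu([list(reference)], list(hypothesis), smoothing_function=smooth)

print(char_bleu("今天天气很好", "今天天气不错"))
```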
- link: https://github.com/bytedance/lightseq
- author: Bytedance Inc.
- note: a high-performance library for sequence processing and generation.
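LightSeq speeds up the inference-time decoding loop itself; for orientation, the plain greedy version of that loop looks roughly like this, where `step` is a made-up placeholder for any autoregressive decoder and not LightSeq's API:

```python
import torch

def greedy_decode(step, bos_id: int, eos_id: int, max_len: int = 32) -> list:
    """step(prefix_ids) -> logits over the vocabulary for the next token."""
    ids = [bos_id]
    for _ in range(max_len):
        logits = step(torch.tensor(ids).unsqueeze(0))   # (1, vocab_size)
        next_id = int(logits.argmax(dim=-1))
        ids.append(next_id)
        if next_id == eos_id:
            break
    return ids

def dummy_step(prefix):
    # Emits token 3 twice, then token 2 (our eos id), just to exercise the loop.
    if prefix.shape[1] < 3:
        return torch.tensor([[0.0, 0.1, 0.2, 1.0]])
    return torch.tensor([[0.0, 0.0, 1.0, 0.0]])

print(greedy_decode(dummy_step, bos_id=1, eos_id=2))   # -> [1, 3, 3, 2]
```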