
Sequence/Sentence-oriented Operation

MASS

  • paper: Song, K., Tan, X., Qin, T., Lu, J., & Liu, T.-Y. (2019). MASS: Masked Sequence to Sequence Pre-training for Language Generation. ICML 2019.
  • note: the masked-span objective is sketched below.
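
MASS masks one contiguous span of the source sentence; the encoder sees the masked input, and the decoder is trained to regenerate exactly the masked fragment. Here is a minimal sketch over whitespace tokens (real MASS operates on subword ids; `mass_example` and the literal `[MASK]` string are illustrative, not from the paper's codebase):

```python
import random

MASK = "[MASK]"

def mass_example(tokens, mask_ratio=0.5, seed=0):
    """Build one MASS-style training pair: a contiguous span of the
    source is masked in the encoder input, and the decoder is trained
    to regenerate exactly that span."""
    rng = random.Random(seed)
    span_len = max(1, int(len(tokens) * mask_ratio))
    start = rng.randrange(len(tokens) - span_len + 1)
    enc_input = tokens[:start] + [MASK] * span_len + tokens[start + span_len:]
    dec_target = tokens[start:start + span_len]
    # Decoder input is the target shifted right, so step t predicts token t.
    dec_input = [MASK] + dec_target[:-1]
    return enc_input, dec_input, dec_target

enc, dec_in, dec_out = mass_example("the quick brown fox jumps over the lazy dog".split())
print(enc)      # source with one contiguous masked span
print(dec_out)  # the span the decoder must generate
```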

UniLM

  • paper: Dong, L., Yang, N., Wang, W., Wei, F., Liu, X., Wang, Y., et al. (2019). Unified Language Model Pre-training for Natural Language Understanding and Generation. NeurIPS 2019.
  • code: https://github.com/microsoft/unilm
  • note: the repository also hosts UniLM v1/v2, MiniLM, LayoutLM, and s2s-ft; the seq2seq attention mask that distinguishes UniLM is sketched below.
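
UniLM realizes bidirectional, left-to-right, and sequence-to-sequence LM objectives with a single Transformer by switching self-attention masks. Below is a minimal sketch of the seq2seq mask; the function name and the 0/1 convention are illustrative, not taken from the released code:

```python
import numpy as np

def unilm_seq2seq_mask(src_len, tgt_len):
    """Self-attention mask for the seq2seq objective: 1 = may attend,
    0 = blocked. Source positions attend bidirectionally within the
    source; target positions attend to the full source plus the target
    prefix up to and including themselves."""
    n = src_len + tgt_len
    mask = np.zeros((n, n), dtype=int)
    mask[:, :src_len] = 1            # every position sees the whole source
    for i in range(src_len, n):
        mask[i, src_len:i + 1] = 1   # causal (left-to-right) over the target
    # Source rows keep zeros in target columns: the source never sees the target.
    return mask

print(unilm_seq2seq_mask(src_len=3, tgt_len=4))
```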

BART

  • paper: Lewis, M., Liu, Y., Goyal, N., Ghazvininejad, M., Mohamed, A., Levy, O., Stoyanov, V., & Zettlemoyer, L. (2019). BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension.
  • note: the text-infilling noise function is sketched below.
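
BART corrupts the input with several noising functions; text infilling, where spans with Poisson-distributed lengths (λ = 3, about 30% of tokens in the paper) are each replaced by a single mask token, is the one the paper reports as most effective. A rough token-level sketch (the span-start rate is illustrative, and the paper's zero-length spans are simplified away):

```python
import numpy as np

MASK = "[MASK]"

def text_infilling(tokens, mask_ratio=0.3, poisson_lam=3.0, seed=0):
    """BART-style text infilling: spans with Poisson(lam)-distributed
    lengths are each replaced by a SINGLE mask token; the model is
    trained to reconstruct the original sequence from the noised one."""
    rng = np.random.default_rng(seed)
    budget = int(len(tokens) * mask_ratio)   # roughly this many tokens to mask
    noised, i, masked = [], 0, 0
    while i < len(tokens):
        if masked < budget and rng.random() < 0.15:  # 0.15: illustrative start rate
            span = max(1, int(rng.poisson(poisson_lam)))
            noised.append(MASK)              # the whole span collapses into one token
            i += span
            masked += span
        else:
            noised.append(tokens[i])
            i += 1
    return noised  # training pair is (noised, tokens)

src = "the quick brown fox jumps over the lazy dog again and again".split()
print(text_infilling(src))
```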

T5

  • paper: Raffel, C., Shazeer, N., Roberts, A., Lee, K., Narang, S., Matena, M., et al. (2020). Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer. JMLR.
  • code: https://github.com/google-research/text-to-text-transfer-transformer
  • note: the span-corruption input/target format is sketched below.
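
T5 pre-trains with span corruption: dropped spans in the input are replaced by unique sentinel tokens, and the target spells out each sentinel followed by the tokens it replaced. A minimal sketch with hand-picked spans (the real pipeline samples them; the `<extra_id_*>` sentinels match the released vocabulary, while the function name is illustrative):

```python
def t5_span_corruption(tokens, spans):
    """T5-style span corruption. `spans` is a list of (start, length)
    pairs, assumed sorted and non-overlapping. Each span is replaced by
    a unique sentinel in the input; the target lists each sentinel
    followed by the tokens it replaced."""
    inp, tgt, i = [], [], 0
    for k, (start, length) in enumerate(spans):
        sentinel = f"<extra_id_{k}>"
        inp.extend(tokens[i:start])
        inp.append(sentinel)
        tgt.append(sentinel)
        tgt.extend(tokens[start:start + length])
        i = start + length
    inp.extend(tokens[i:])
    tgt.append(f"<extra_id_{len(spans)}>")   # final sentinel terminates the target
    return inp, tgt

toks = "thank you for inviting me to your party last week".split()
inp, tgt = t5_span_corruption(toks, [(3, 2), (8, 1)])
print(inp)  # thank you for <extra_id_0> to your party <extra_id_1> week
print(tgt)  # <extra_id_0> inviting me <extra_id_1> last <extra_id_2>
```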

seq2seq

pytorch-seq2seq

tensorflow-seq2seq-tutorials

bert_seq2seq

CLGE

LightSeq