    Repositories list

    • Orion-MoE

      Public
      Orion-MoE8x7B is a generative sparse mixture-of-experts (MoE) large language model with 8×7 billion parameters, trained from scratch on a multilingual corpus of about 5 trillion tokens covering Chinese, English, Japanese, Korean, and more. It performs well on mainstream public benchmarks.
      Apache License 2.0
      Updated Nov 27, 2024
    • Orion

      Public
      Orion-14B is a family of models that includes a 14B multilingual foundation LLM and a series of derivative models: a chat model, a long-context model, a quantized model, a RAG fine-tuned model, and an Agent fine-tuned model.
      Python
      Apache License 2.0
      Updated Jun 3, 2024
    • Quick start of local vLLM inference service
      Dockerfile
      Apache License 2.0
      Updated May 26, 2024
    • OrionStar-Yi-34B-Chat is an open-source Chinese-English chat model, fine-tuned by OrionStar from the open-source Yi-34B model on 150K+ high-quality corpus samples.
      Python
      Apache License 2.0
      Updated Apr 9, 2024