Repositories list
4 repositories
Orion-MoE
Public
Orion-MoE8x7B is a generative large language model with a sparse mixture-of-experts architecture (8×7B parameters), trained from scratch on a multilingual corpus of about 5 trillion tokens covering Chinese, English, Japanese, Korean, and more. It performs well on mainstream public benchmarks.

Orion
Public
Orion-14B is a family of models that includes a 14B-parameter multilingual foundation LLM and a series of derivative models: a chat model, a long-context model, a quantized model, a RAG fine-tuned model, and an Agent fine-tuned model.

vllm_server
Public

OrionStar-Yi-34B-Chat
Public