Adding Grok-2 to chat model setup #3935

Merged 3 commits on Feb 2, 2025

docs/docs/chat/model-setup.mdx (15 additions, 0 deletions)
@@ -99,6 +99,21 @@ If you prefer to use a model from [OpenAI](../customize/model-providers/top-leve
]
```

### Grok-2 from xAI

If you prefer to use a model from [xAI](../customize/model-providers/top-level/xAI.md), then we recommend Grok-2.

```json title="config.json"
"models": [
{
"title": "Grok-2",
"provider": "xAI",
"model": "grok-2-latest",
"apiKey": "[XAI_API_KEY]"
}
]
```

### Gemini 1.5 Pro from Google

If you prefer to use a model from [Google](../customize/model-providers/top-level/gemini.md), then we recommend Gemini 1.5 Pro.
docs/docs/customize/model-types/chat.md (1 addition, 0 deletions)
@@ -14,5 +14,6 @@ If you have the ability to use any model, we recommend [Claude 3.5 Sonnet](../mo
Otherwise, some of the next best options are:

- [GPT-4o](../model-providers/top-level/openai.md)
- [Grok-2](../model-providers/top-level/xAI.md)
- [Gemini 1.5 Pro](../model-providers/top-level/gemini.md)
- [Llama3.1 405B](../tutorials/llama3.1.md)
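
For example, a minimal sketch of wiring one of these options into `config.json`, following the same pattern as the Grok-2 block above. The `openai` provider slug is listed in the reference below; the `gpt-4o` model name and the API key placeholder are assumptions shown only for illustration:

```json title="config.json"
"models": [
  {
    "title": "GPT-4o",
    "provider": "openai",
    "model": "gpt-4o",
    "apiKey": "[OPENAI_API_KEY]"
  }
]
```
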
docs/docs/reference.md (2 additions, 2 deletions)
@@ -16,9 +16,9 @@ Each model has specific configuration options tailored to its provider and funct
**Properties:**

- `title` (**required**): The title to assign to your model, shown in dropdowns, etc.
- `provider` (**required**): The provider of the model, which determines the type and interaction method. Options inclued `openai`, `ollama`, etc., see intelliJ suggestions.
- `provider` (**required**): The provider of the model, which determines the type and interaction method. Options include `openai`, `ollama`, `xAI`, etc.; see IntelliJ suggestions.
- `model` (**required**): The name of the model, used for prompt template auto-detection. Use `AUTODETECT` special name to get all available models.
- `apiKey`: API key required by providers like OpenAI, Anthropic, and Cohere.
- `apiKey`: API key required by providers like OpenAI, Anthropic, Cohere, and xAI.
- `apiBase`: The base URL of the LLM API.
- `contextLength`: Maximum context length of the model, typically in tokens (default: 2048).
- `maxStopWords`: Maximum number of stop words allowed, to avoid API errors with extensive lists.
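
Putting these properties together, a minimal sketch of a single `models` entry, reusing the Grok-2 values from the chat setup section above. The explicit `contextLength` of 2048 simply restates the documented default for illustration, and the API key is a placeholder:

```json title="config.json"
"models": [
  {
    "title": "Grok-2",
    "provider": "xAI",
    "model": "grok-2-latest",
    "apiKey": "[XAI_API_KEY]",
    "contextLength": 2048
  }
]
```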