Commit 331e856

mistralai[minor]: Add llms entrypoint, update chat model integration (#5603)
* mistralai[minor]: Add llms entrypoint
* chore: lint files
* cr
* docs
* docs nit
* docs nit
* update mistral-large models to mistral-large-latest
* fix tools
* add codestral chat model tests
* chore: lint files
* cr
1 parent 64bf268 commit 331e856

14 files changed (+844, -82 lines)

docs/core_docs/docs/integrations/chat/mistral.mdx

+6-11
@@ -7,21 +7,16 @@ import CodeBlock from "@theme/CodeBlock";
 # ChatMistralAI
 
 [Mistral AI](https://mistral.ai/) is a research organization and hosting platform for LLMs.
-They're most known for their family of 7B models ([`mistral7b` // `mistral-tiny`](https://mistral.ai/news/announcing-mistral-7b/), [`mixtral8x7b` // `mistral-small`](https://mistral.ai/news/mixtral-of-experts/)).
-
 The LangChain implementation of Mistral's models uses their hosted generation API, making it easier to access their models without needing to run them locally.
 
-## Models
-
-Mistral's API offers access to two of their open source, and proprietary models:
+:::tip
+Want to run Mistral's models locally? Check out our [Ollama integration](/docs/integrations/chat/ollama).
+:::
 
-- `open-mistral-7b` (aka `mistral-tiny-2312`)
-- `open-mixtral-8x7b` (aka `mistral-small-2312`)
-- `mistral-small-latest` (aka `mistral-small-2402`) (default)
-- `mistral-medium-latest` (aka `mistral-medium-2312`)
-- `mistral-large-latest` (aka `mistral-large-2402`)
+## Models
 
-See [this page](https://docs.mistral.ai/guides/model-selection/) for an up to date list.
+Mistral's API offers access to both their open source and proprietary models.
+See [this page](https://docs.mistral.ai/getting-started/models/) for an up-to-date list.
 
 ## Setup
 
@@ -0,0 +1,160 @@
+{
+  "cells": [
+    {
+      "cell_type": "markdown",
+      "metadata": {},
+      "source": [
+        "# MistralAI\n",
+        "\n",
+        "```{=mdx}\n",
+        ":::tip\n",
+        "Want to run Mistral's models locally? Check out our [Ollama integration](/docs/integrations/chat/ollama).\n",
+        ":::\n",
+        "```\n",
+        "\n",
+        "Here's how you can initialize a `MistralAI` LLM instance:\n",
+        "\n",
+        "```{=mdx}\n",
+        "import IntegrationInstallTooltip from \"@mdx_components/integration_install_tooltip.mdx\";\n",
+        "import Npm2Yarn from \"@theme/Npm2Yarn\";\n",
+        "\n",
+        "<IntegrationInstallTooltip></IntegrationInstallTooltip>\n",
+        "\n",
+        "<Npm2Yarn>\n",
+        "  @langchain/mistralai\n",
+        "</Npm2Yarn>\n",
+        "```\n"
+      ]
+    },
+    {
+      "cell_type": "code",
+      "execution_count": 3,
+      "metadata": {},
+      "outputs": [
+        {
+          "name": "stdout",
+          "output_type": "stream",
+          "text": [
+            "\n",
+            "console.log('hello world');\n",
+            "```\n",
+            "This will output 'hello world' to the console.\n"
+          ]
+        }
+      ],
+      "source": [
+        "import { MistralAI } from \"@langchain/mistralai\";\n",
+        "\n",
+        "const model = new MistralAI({\n",
+        "  model: \"codestral-latest\", // Defaults to \"codestral-latest\" if no model provided.\n",
+        "  temperature: 0,\n",
+        "  apiKey: \"YOUR-API-KEY\", // In Node.js defaults to process.env.MISTRAL_API_KEY\n",
+        "});\n",
+        "const res = await model.invoke(\n",
+        "  \"You can print 'hello world' to the console in javascript like this:\\n```javascript\"\n",
+        ");\n",
+        "console.log(res);"
+      ]
+    },
+    {
+      "cell_type": "markdown",
+      "metadata": {},
+      "source": [
+        "Since the Mistral LLM is a completions model, it also allows you to insert a `suffix` into the prompt. Suffixes can be passed via the call options when invoking a model like so:"
+      ]
+    },
+    {
+      "cell_type": "code",
+      "execution_count": 4,
+      "metadata": {},
+      "outputs": [
+        {
+          "name": "stdout",
+          "output_type": "stream",
+          "text": [
+            "\n",
+            "console.log('hello world');\n",
+            "```\n"
+          ]
+        }
+      ],
+      "source": [
+        "const res = await model.invoke(\n",
+        "  \"You can print 'hello world' to the console in javascript like this:\\n```javascript\", {\n",
+        "    suffix: \"```\"\n",
+        "  }\n",
+        ");\n",
+        "console.log(res);"
+      ]
+    },
+    {
+      "cell_type": "markdown",
+      "metadata": {},
+      "source": [
+        "As seen in the first example, the model generated the requested `console.log('hello world')` code snippet, but also included extra unwanted text. By adding a suffix, we can constrain the model to only complete the prompt up to the suffix (in this case, three backticks). This allows us to easily parse the completion and extract only the desired response without the suffix using a custom output parser."
+      ]
+    },
+    {
+      "cell_type": "code",
+      "execution_count": 1,
+      "metadata": {},
+      "outputs": [
+        {
+          "name": "stdout",
+          "output_type": "stream",
+          "text": [
+            "\n",
+            "console.log('hello world');\n",
+            "\n"
+          ]
+        }
+      ],
+      "source": [
+        "import { MistralAI } from \"@langchain/mistralai\";\n",
+        "\n",
+        "const model = new MistralAI({\n",
+        "  model: \"codestral-latest\",\n",
+        "  temperature: 0,\n",
+        "  apiKey: \"YOUR-API-KEY\",\n",
+        "});\n",
+        "\n",
+        "const suffix = \"```\";\n",
+        "\n",
+        "const customOutputParser = (input: string) => {\n",
+        "  if (input.includes(suffix)) {\n",
+        "    return input.split(suffix)[0];\n",
+        "  }\n",
+        "  throw new Error(\"Input does not contain suffix.\");\n",
+        "};\n",
+        "\n",
+        "const res = await model.invoke(\n",
+        "  \"You can print 'hello world' to the console in javascript like this:\\n```javascript\", {\n",
+        "    suffix,\n",
+        "  }\n",
+        ");\n",
+        "\n",
+        "console.log(customOutputParser(res));"
+      ]
+    }
+  ],
+  "metadata": {
+    "kernelspec": {
+      "display_name": "TypeScript",
+      "language": "typescript",
+      "name": "tslab"
+    },
+    "language_info": {
+      "codemirror_mode": {
+        "mode": "typescript",
+        "name": "javascript",
+        "typescript": true
+      },
+      "file_extension": ".ts",
+      "mimetype": "text/typescript",
+      "name": "typescript",
+      "version": "3.7.2"
+    }
+  },
+  "nbformat": 4,
+  "nbformat_minor": 2
+}

examples/src/models/chat/chat_mistralai_tools.ts

+1-1
@@ -33,7 +33,7 @@ class CalculatorTool extends StructuredTool {
 
 const model = new ChatMistralAI({
   apiKey: process.env.MISTRAL_API_KEY,
-  model: "mistral-large",
+  model: "mistral-large-latest",
 });
 
 // Bind the tool to the model

examples/src/models/chat/chat_mistralai_wsa.ts

+1-1
@@ -14,7 +14,7 @@ const calculatorSchema = z
 
 const model = new ChatMistralAI({
   apiKey: process.env.MISTRAL_API_KEY,
-  model: "mistral-large",
+  model: "mistral-large-latest",
 });
 
 // Pass the schema and tool name to the withStructuredOutput method

examples/src/models/chat/chat_mistralai_wsa_json.ts

+1-1
@@ -21,7 +21,7 @@ const calculatorJsonSchema = {
 
 const model = new ChatMistralAI({
   apiKey: process.env.MISTRAL_API_KEY,
-  model: "mistral-large",
+  model: "mistral-large-latest",
 });
 
 // Pass the schema and tool name to the withStructuredOutput method

libs/langchain-mistralai/package.json

+1-1
@@ -41,7 +41,7 @@
   "license": "MIT",
   "dependencies": {
     "@langchain/core": ">0.1.56 <0.3.0",
-    "@mistralai/mistralai": "^0.1.3",
+    "@mistralai/mistralai": "^0.4.0",
     "uuid": "^9.0.0",
     "zod": "^3.22.4",
     "zod-to-json-schema": "^3.22.4"
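The dependency bump above moves the package onto the 0.4.x line of the Mistral SDK. For a project that also depends on the SDK directly, a matching upgrade (assuming npm as the package manager) might look like:

```shell
# Hypothetical upgrade command; the package name and caret range
# are taken from the package.json diff above.
npm install @mistralai/mistralai@^0.4.0
```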
