Commit 1058368

docs[patch]: Add onlyWsa prop to chat model tabs, other nits (#5405)
* docs[patch]: Add onlyWsa prop to chat model tabs, other nits
* fix link
* chore: lint files
* cr
1 parent: b414e39

3 files changed: +43, -14 lines

docs/core_docs/docs/how_to/structured_output.ipynb (+2, -6)
@@ -29,20 +29,16 @@
 "\n",
 "## The `.withStructuredOutput()` method\n",
 "\n",
-"There are several strategies that models can use under the hood. For some of the most popular model providers, including [OpenAI](/docs/integrations/platforms/openai/), [Anthropic](/docs/integrations/platforms/anthropic/), and [Mistral](/docs/integrations/providers/mistralai/), LangChain implements a common interface that abstracts away these strategies called `.withStructuredOutput`.\n",
+"There are several strategies that models can use under the hood. For some of the most popular model providers, including [Anthropic](/docs/integrations/platforms/anthropic/), [Google VertexAI](/docs/integrations/platforms/google/), [Mistral](/docs/integrations/providers/mistralai/), and [OpenAI](/docs/integrations/platforms/openai/) LangChain implements a common interface that abstracts away these strategies called `.withStructuredOutput`.\n",
 "\n",
 "By invoking this method (and passing in [JSON schema](https://json-schema.org/) or a [Zod schema](https://zod.dev/)) the model will add whatever model parameters + output parsers are necessary to get back structured output matching the requested schema. If the model supports more than one way to do this (e.g., function calling vs JSON mode) - you can configure which method to use by passing into that method.\n",
 "\n",
-"You can find the [current list of models that support this method here](/docs/integrations/chat/).\n",
-"\n",
 "Let's look at some examples of this in action! We'll use Zod to create a simple response schema.\n",
 "\n",
 "```{=mdx}\n",
 "import ChatModelTabs from \"@theme/ChatModelTabs\";\n",
 "\n",
-"<ChatModelTabs\n",
-" customVarName=\"model\"\n",
-"/>\n",
+"<ChatModelTabs onlyWsa={true} />\n",
 "```"
 ]
 },
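For readers following the notebook prose above: a minimal sketch of what calling `.withStructuredOutput()` with a Zod schema looks like in LangChain.js (the schema, model name, and prompt are illustrative and not part of this commit):

    import { ChatOpenAI } from "@langchain/openai";
    import { z } from "zod";

    // Describe the structured response we want back.
    const joke = z.object({
      setup: z.string().describe("The setup of the joke"),
      punchline: z.string().describe("The punchline of the joke"),
    });

    const model = new ChatOpenAI({ model: "gpt-3.5-turbo", temperature: 0 });

    // Bind the schema; the provider-specific strategy (tool calling, JSON mode, ...)
    // is chosen under the hood by withStructuredOutput.
    const structuredModel = model.withStructuredOutput(joke);

    // The result is a plain object matching the schema, not a raw chat message.
    const result = await structuredModel.invoke("Tell me a joke about cats");
    console.log(result); // { setup: "...", punchline: "..." }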
@@ -1,3 +1,3 @@
 :::tip
-See [this section for general instructions on installing integration packages](/docs/get_started/installation#installing-integration-packages).
+See [this section for general instructions on installing integration packages](/docs/how_to/installation#installing-integration-packages).
 :::

docs/core_docs/src/theme/ChatModelTabs.js (+40, -7)
@@ -25,23 +25,34 @@ function InstallationInfo({ children }) {
 }
 
 const DEFAULTS = {
-  openaiParams: `{\n model: "gpt-3.5-turbo-0125",\n temperature: 0\n}`,
+  openaiParams: `{\n model: "gpt-3.5-turbo",\n temperature: 0\n}`,
   anthropicParams: `{\n model: "claude-3-sonnet-20240229",\n temperature: 0\n}`,
   fireworksParams: `{\n model: "accounts/fireworks/models/firefunction-v1",\n temperature: 0\n}`,
   mistralParams: `{\n model: "mistral-large-latest",\n temperature: 0\n}`,
+  groqParams: `{\n model: "mixtral-8x7b-32768",\n temperature: 0\n}`,
+  vertexParams: `{\n model: "gemini-1.5-pro",\n temperature: 0\n}`,
 };
 
+const MODELS_WSA = ["openai", "anthropic", "mistral", "groq", "vertex"];
+
 /**
  * @typedef {Object} ChatModelTabsProps - Component props.
- * @property {string} [openaiParams] - Parameters for OpenAI chat model. Defaults to `"{\n model: "gpt-3.5-turbo-0125",\n temperature: 0\n}"`
+ * @property {string} [openaiParams] - Parameters for OpenAI chat model. Defaults to `"{\n model: "gpt-3.5-turbo",\n temperature: 0\n}"`
  * @property {string} [anthropicParams] - Parameters for Anthropic chat model. Defaults to `"{\n model: "claude-3-sonnet-20240229",\n temperature: 0\n}"`
  * @property {string} [fireworksParams] - Parameters for Fireworks chat model. Defaults to `"{\n model: "accounts/fireworks/models/firefunction-v1",\n temperature: 0\n}"`
  * @property {string} [mistralParams] - Parameters for Mistral chat model. Defaults to `"{\n model: "mistral-large-latest",\n temperature: 0\n}"`
+ * @property {string} [groqParams] - Parameters for Groq chat model. Defaults to `"{\n model: "mixtral-8x7b-32768",\n temperature: 0\n}"`
+ * @property {string} [vertexParams] - Parameters for Google VertexAI chat model. Defaults to `"{\n model: "gemini-1.5-pro",\n temperature: 0\n}"`
+ *
  * @property {boolean} [hideOpenai] - Whether or not to hide OpenAI chat model.
  * @property {boolean} [hideAnthropic] - Whether or not to hide Anthropic chat model.
  * @property {boolean} [hideFireworks] - Whether or not to hide Fireworks chat model.
  * @property {boolean} [hideMistral] - Whether or not to hide Mistral chat model.
+ * @property {boolean} [hideGroq] - Whether or not to hide Groq chat model.
+ * @property {boolean} [hideVertex] - Whether or not to hide Google VertexAI chat model.
+ *
  * @property {string} [customVarName] - Custom variable name for the model. Defaults to `"model"`.
+ * @property {boolean} [onlyWsa] - Only display models which have `withStructuredOutput` implemented.
  */
 
 /**
@@ -56,49 +67,71 @@ export default function ChatModelTabs(props) {
   const anthropicParams = props.anthropicParams ?? DEFAULTS.anthropicParams;
   const fireworksParams = props.fireworksParams ?? DEFAULTS.fireworksParams;
   const mistralParams = props.mistralParams ?? DEFAULTS.mistralParams;
+  const groqParams = props.groqParams ?? DEFAULTS.groqParams;
+  const vertexParams = props.vertexParams ?? DEFAULTS.vertexParams;
   const providers = props.providers ?? [
     "openai",
     "anthropic",
     "fireworks",
     "mistral",
+    "groq",
+    "vertex",
   ];
 
   const tabs = {
     openai: {
-      value: "OpenAI",
+      value: "openai",
       label: "OpenAI",
       default: true,
       text: `import { ChatOpenAI } from "@langchain/openai";\n\nconst ${llmVarName} = new ChatOpenAI(${openaiParams});`,
       envs: `OPENAI_API_KEY=your-api-key`,
       dependencies: "@langchain/openai",
     },
     anthropic: {
-      value: "Anthropic",
+      value: "anthropic",
       label: "Anthropic",
       default: false,
       text: `import { ChatAnthropic } from "@langchain/anthropic";\n\nconst ${llmVarName} = new ChatAnthropic(${anthropicParams});`,
       envs: `ANTHROPIC_API_KEY=your-api-key`,
       dependencies: "@langchain/anthropic",
     },
     fireworks: {
-      value: "FireworksAI",
+      value: "fireworks",
       label: "FireworksAI",
       default: false,
       text: `import { ChatFireworks } from "@langchain/community/chat_models/fireworks";\n\nconst ${llmVarName} = new ChatFireworks(${fireworksParams});`,
       envs: `FIREWORKS_API_KEY=your-api-key`,
       dependencies: "@langchain/community",
     },
     mistral: {
-      value: "MistralAI",
+      value: "mistral",
       label: "MistralAI",
       default: false,
       text: `import { ChatMistralAI } from "@langchain/mistralai";\n\nconst ${llmVarName} = new ChatMistralAI(${mistralParams});`,
       envs: `MISTRAL_API_KEY=your-api-key`,
       dependencies: "@langchain/mistralai",
     },
+    groq: {
+      value: "groq",
+      label: "Groq",
+      default: false,
+      text: `import { ChatGroq } from "@langchain/groq";\n\nconst ${llmVarName} = new ChatGroq(${groqParams});`,
+      envs: `GROQ_API_KEY=your-api-key`,
+      dependencies: "@langchain/groq",
+    },
+    vertex: {
+      value: "vertex",
+      label: "VertexAI",
+      default: false,
+      text: `import { ChatVertexAI } from "@langchain/google-vertexai";\n\nconst ${llmVarName} = new ChatVertexAI(${vertexParams});`,
+      envs: `GOOGLE_APPLICATION_CREDENTIALS=credentials.json`,
+      dependencies: "@langchain/google-vertexai",
+    },
   };
 
-  const displayedTabs = providers.map((provider) => tabs[provider]);
+  const displayedTabs = (props.onlyWsa ? MODELS_WSA : providers).map(
+    (provider) => tabs[provider]
+  );
 
   return (
     <div>
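Taken together, a docs page can now opt into showing only the providers that implement `.withStructuredOutput()` (the hard-coded `MODELS_WSA` list) instead of the full `providers` list. A minimal usage sketch in an MDX docs page (the page itself is illustrative, not part of this commit):

    import ChatModelTabs from "@theme/ChatModelTabs";

    {/* Default: one tab per entry in `providers`
        (openai, anthropic, fireworks, mistral, groq, vertex). */}
    <ChatModelTabs customVarName="model" />

    {/* onlyWsa: only the MODELS_WSA providers are rendered
        (openai, anthropic, mistral, groq, vertex), so FireworksAI is omitted. */}
    <ChatModelTabs onlyWsa={true} />

Note that the filter is driven by `MODELS_WSA` rather than by the per-provider `hideX` props, so pages that need the structured-output subset do not have to list providers individually.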
