docs[patch]: Add pointers from LLM pages to chat model pages #5719

Merged: 1 commit, Jun 10, 2024
6 changes: 6 additions & 0 deletions docs/core_docs/docs/integrations/llms/bedrock.mdx
@@ -1,5 +1,11 @@
# Bedrock

:::caution
You are currently on a page documenting the use of Amazon Bedrock models as [text completion models](/docs/concepts/#llms). Many popular models available on Bedrock are [chat completion models](/docs/concepts/#chat-models).

You may be looking for [this page instead](/docs/integrations/chat/bedrock/).
:::

> [Amazon Bedrock](https://aws.amazon.com/bedrock/) is a fully managed service that makes Foundation Models (FMs)
> from leading AI startups and Amazon available via an API. You can choose from a wide range of FMs to find the model that is best suited for your use case.
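
As a rough sketch of the text completion usage this page covers, using the `Bedrock` class imported elsewhere in this PR (the model ID and region below are placeholder assumptions, not recommendations from the docs):

```typescript
import { Bedrock } from "@langchain/community/llms/bedrock";

// Placeholder model ID and region; AWS credentials are expected to come
// from the standard environment variables or local AWS configuration.
const model = new Bedrock({
  model: "anthropic.claude-v2",
  region: "us-east-1",
});

// Text completion models take a plain string prompt and return a string.
const completion = await model.invoke("Tell me a joke about bears.");
console.log(completion);
```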

6 changes: 6 additions & 0 deletions docs/core_docs/docs/integrations/llms/cohere.mdx
@@ -1,5 +1,11 @@
# Cohere

:::caution
You are currently on a page documenting the use of Cohere models as [text completion models](/docs/concepts/#llms). Many popular models available on Cohere are [chat completion models](/docs/concepts/#chat-models).

You may be looking for [this page instead](/docs/integrations/chat/cohere/).
:::

import CodeBlock from "@theme/CodeBlock";

LangChain.js supports Cohere LLMs. Here's an example:
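
A minimal sketch of that usage, assuming the `Cohere` text completion export from the `@langchain/cohere` package (the field names here are assumptions, not taken from this page):

```typescript
import { Cohere } from "@langchain/cohere";

// Assumes COHERE_API_KEY is set in the environment.
const model = new Cohere({ maxTokens: 20 });

// Plain string in, plain string out: the text completion interface.
const res = await model.invoke("Tell me a joke about llamas.");
console.log(res);
```
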
6 changes: 6 additions & 0 deletions docs/core_docs/docs/integrations/llms/fireworks.mdx
@@ -6,6 +6,12 @@ import CodeBlock from "@theme/CodeBlock";

# Fireworks

:::caution
You are currently on a page documenting the use of Fireworks models as [text completion models](/docs/concepts/#llms). Many popular models available on Fireworks are [chat completion models](/docs/concepts/#chat-models).

You may be looking for [this page instead](/docs/integrations/chat/fireworks/).
:::

import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";

<IntegrationInstallTooltip></IntegrationInstallTooltip>
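
A hedged sketch of calling Fireworks as a text completion model (the model name and field names below are placeholder assumptions):

```typescript
import { Fireworks } from "@langchain/community/llms/fireworks";

// Assumes FIREWORKS_API_KEY is set; the model name is a placeholder.
const model = new Fireworks({
  modelName: "accounts/fireworks/models/llama-v3-8b-instruct",
});

const res = await model.invoke("Say hello in French.");
console.log(res);
```
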
8 changes: 7 additions & 1 deletion docs/core_docs/docs/integrations/llms/google_vertex_ai.mdx
@@ -1,6 +1,12 @@
# Google Vertex AI

Langchain.js supports two different authentication methods based on whether
:::caution
You are currently on a page documenting the use of Google Vertex models as [text completion models](/docs/concepts/#llms). Many popular models available on Google Vertex are [chat completion models](/docs/concepts/#chat-models).

You may be looking for [this page instead](/docs/integrations/chat/google_vertex_ai/).
:::

LangChain.js supports two different authentication methods based on whether
you're running in a Node.js environment or a web environment.
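
As a rough illustration of that split, assuming the `VertexAI` text completion export from the `@langchain/google-vertexai` (Node.js) and `@langchain/google-vertexai-web` (web) packages; treat the package names, constructor fields, and environment variables here as assumptions rather than exact setup instructions:

```typescript
// Node.js: credentials are typically picked up from the
// GOOGLE_APPLICATION_CREDENTIALS environment variable.
import { VertexAI } from "@langchain/google-vertexai";

// Web environments would instead import from the web entrypoint and pass
// credentials explicitly, e.g. via GOOGLE_VERTEX_AI_WEB_CREDENTIALS:
// import { VertexAI } from "@langchain/google-vertexai-web";

const model = new VertexAI({ model: "gemini-1.0-pro" });
const res = await model.invoke("What is Vertex AI?");
console.log(res);
```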

## Setup
8 changes: 7 additions & 1 deletion docs/core_docs/docs/integrations/llms/ollama.mdx
@@ -1,6 +1,12 @@
# Ollama

[Ollama](https://ollama.ai/) allows you to run open-source large language models, such as Llama 2, locally.
:::caution
You are currently on a page documenting the use of Ollama models as [text completion models](/docs/concepts/#llms). Many popular models available on Ollama are [chat completion models](/docs/concepts/#chat-models).

You may be looking for [this page instead](/docs/integrations/chat/ollama/).
:::

[Ollama](https://ollama.ai/) allows you to run open-source large language models, such as Llama 3, locally.

Ollama bundles model weights, configuration, and data into a single package, defined by a Modelfile. It optimizes setup and configuration details, including GPU usage.
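
A minimal sketch of using Ollama as a text completion model, assuming a local Ollama server with a pulled model (the model name is a placeholder):

```typescript
import { Ollama } from "@langchain/community/llms/ollama";

// Assumes `ollama serve` is running locally and `llama3` has been pulled.
const model = new Ollama({
  baseUrl: "http://localhost:11434",
  model: "llama3",
});

const res = await model.invoke("Why is the sky blue?");
console.log(res);
```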

6 changes: 6 additions & 0 deletions docs/core_docs/docs/integrations/llms/togetherai.mdx
@@ -2,6 +2,12 @@ import CodeBlock from "@theme/CodeBlock";

# Together AI

:::caution
You are currently on a page documenting the use of Together AI models as [text completion models](/docs/concepts/#llms). Many popular models available on Together AI are [chat completion models](/docs/concepts/#chat-models).

You may be looking for [this page instead](/docs/integrations/chat/togetherai/).
:::

Here's an example of calling a Together AI model as an LLM:

import TogetherAI from "@examples/models/llm/togetherai.ts";
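
A minimal sketch of such a call, with the model name and field names as placeholder assumptions (not drawn from the collapsed example above):

```typescript
import { TogetherAI } from "@langchain/community/llms/togetherai";

// Assumes TOGETHER_AI_API_KEY is set; the model name is a placeholder.
const model = new TogetherAI({
  modelName: "mistralai/Mixtral-8x7B-Instruct-v0.1",
});

const res = await model.invoke("What is a large language model?");
console.log(res);
```
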
20 changes: 15 additions & 5 deletions docs/core_docs/docs/integrations/platforms/aws.mdx
@@ -4,7 +4,17 @@ keywords: [bedrock]

# AWS

All functionality related to [Amazon AWS](https://aws.amazon.com/) platform
All functionality related to the [Amazon AWS](https://aws.amazon.com/) platform.

## Chat Models

### Bedrock

See a [usage example](/docs/integrations/chat/bedrock).

```typescript
import { BedrockChat } from "@langchain/community/chat_models/bedrock";
```
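
A brief usage sketch to contrast the chat interface with the text completion class below (the model ID and region are placeholder assumptions):

```typescript
import { BedrockChat } from "@langchain/community/chat_models/bedrock";
import { HumanMessage } from "@langchain/core/messages";

// Placeholder model ID and region; credentials come from the AWS environment.
const chatModel = new BedrockChat({
  model: "anthropic.claude-3-sonnet-20240229-v1:0",
  region: "us-east-1",
});

// Chat models take a list of messages rather than a plain string prompt.
const response = await chatModel.invoke([
  new HumanMessage("What is the capital of France?"),
]);
console.log(response.content);
```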

## LLMs

@@ -13,7 +23,7 @@ All functionality related to [Amazon AWS](https://aws.amazon.com/) platform
See a [usage example](/docs/integrations/llms/bedrock).

```typescript
import { Bedrock } from "langchain/llms/bedrock";
import { Bedrock } from "@langchain/community/llms/bedrock";
```

### SageMaker Endpoint
@@ -28,7 +38,7 @@ See a [usage example](/docs/integrations/llms/aws_sagemaker).
import {
SagemakerEndpoint,
SageMakerLLMContentHandler,
} from "langchain/llms/sagemaker_endpoint";
} from "@langchain/community/llms/sagemaker_endpoint";
```

## Text Embedding Models
@@ -38,7 +48,7 @@ import {
See a [usage example](/docs/integrations/text_embedding/bedrock).

```typescript
import { BedrockEmbeddings } from "langchain/embeddings/bedrock";
import { BedrockEmbeddings } from "@langchain/community/embeddings/bedrock";
```
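
A short usage sketch for the embeddings class above (the model ID and region are placeholder assumptions):

```typescript
import { BedrockEmbeddings } from "@langchain/community/embeddings/bedrock";

// Placeholder model ID and region; credentials come from the AWS environment.
const embeddings = new BedrockEmbeddings({
  region: "us-east-1",
  model: "amazon.titan-embed-text-v1",
});

// embedQuery returns a numeric vector for a single piece of text.
const vector = await embeddings.embedQuery("LangChain is neat.");
console.log(vector.length);
```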

## Document loaders
@@ -55,7 +65,7 @@ npm install @aws-sdk/client-s3
```

```typescript
import { S3Loader } from "langchain/document_loaders/web/s3";
import { S3Loader } from "@langchain/community/document_loaders/web/s3";
```

## Memory