
Error TS2415: Class 'ChatOpenAI<CallOptions>' incorrectly extends base class 'BaseChatModel<CallOptions, AIMessageChunk>' #5675


Closed
5 tasks done
johnnyoshika opened this issue Jun 5, 2024 · 6 comments · Fixed by #5689
Labels
auto:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments

@johnnyoshika

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain.js documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain.js rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

Compiling a project that uses ChatOpenAI from @langchain/openai during a Docker build results in an error. To reproduce:

package.json:

  "scripts": {
    "build": "tsc"
  }

Dockerfile:

FROM node:18

WORKDIR /usr/src/app

ENV PORT=8080

# Copy the package manifest first so npm install can resolve dependencies
COPY package*.json ./
RUN npm install

COPY . .
RUN npm run build

CMD ["npm", "start"]
Build command:

docker build --no-cache -t container-name .

Error Message and Stack Trace (if applicable)

node_modules/@langchain/openai/dist/chat_models.d.ts(68,22): error TS2415: Class 'ChatOpenAI<CallOptions>' incorrectly extends base class 'BaseChatModel<CallOptions, AIMessageChunk>'.
'getLsParams' is protected in type 'ChatOpenAI<CallOptions>' but public in type 'BaseChatModel<CallOptions, AIMessageChunk>'.
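The rule behind TS2415 is that an override may widen a member's visibility but never narrow it. A minimal stand-alone sketch (these are simplified stand-in classes, not the real LangChain types) illustrates it:

```typescript
// Stand-in for the base class in @langchain/core, where the member is public.
class BaseChatModel {
  public getLsParams(): string {
    return "base";
  }
}

// Stand-in for the subclass in @langchain/openai. Declaring getLsParams as
// `protected` here would reproduce the error:
//   TS2415: Class 'ChatOpenAI' incorrectly extends base class 'BaseChatModel'.
//   'getLsParams' is protected in type 'ChatOpenAI' but public in type 'BaseChatModel'.
// Keeping it public (matching the base) compiles cleanly.
class ChatOpenAI extends BaseChatModel {
  public getLsParams(): string {
    return "openai";
  }
}

const model: BaseChatModel = new ChatOpenAI();
console.log(model.getLsParams()); // "openai"
```

This is why a version mismatch between the two packages (one compiled against a `protected` declaration, the other against a `public` one) surfaces as a compile error in downstream projects.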

Description

I'm trying to compile a project during a Docker build and it fails with the error listed above.

I have to enable "skipLibCheck": true in my tsconfig.json to work around this problem.

Note that running the same command (npm run build, which runs tsc) directly on macOS or Windows does not result in this error. It seems to occur only during docker build.

System Info

Node: 18
[email protected]
[email protected]
@langchain/[email protected]

Running docker build --no-cache -t container-name . command in macOS Terminal.

@dosubot dosubot bot added the auto:bug Related to a bug, vulnerability, unexpected error with an existing feature label Jun 5, 2024

dosubot bot commented Jun 5, 2024

To resolve the TypeScript compilation error related to ChatOpenAI from @langchain/openai during a Docker build, you can try the following steps:

  1. Ensure all dependencies are installed correctly: Docker builds start from a clean image, so dependencies installed on your host are not available inside it. Make sure npm install runs inside the Docker build, and that package*.json is copied in before it.

  2. Check TypeScript configuration: Ensure that your tsconfig.json is correctly configured and includes all necessary paths and settings. You can refer to the tsconfig.cjs.json used in @langchain/openai for guidance:

    {
      "extends": "./tsconfig.json",
      "compilerOptions": {
        "module": "commonjs",
        "declaration": false
      },
      "exclude": [
        "node_modules",
        "dist",
        "docs",
        "**/tests"
      ]
    }
  3. Install missing types: Sometimes, TypeScript might be missing type definitions. Ensure that you have installed all necessary type definitions, especially for @langchain/openai.

  4. Use a specific version of dependencies: There might be differences in how dependencies are resolved in Docker compared to macOS or Windows. Pin the versions of your dependencies in package.json. For example, ensure you are using compatible versions as specified in @langchain/openai:

    "dependencies": {
      "@langchain/core": ">=0.2.5 <0.3.0",
      "js-tiktoken": "^1.0.12",
      "openai": "^4.41.1",
      "zod": "^3.22.4",
      "zod-to-json-schema": "^3.22.3"
    },
    "devDependencies": {
      "@azure/identity": "^4.2.0",
      "@jest/globals": "^29.5.0",
      "@langchain/scripts": "~0.0.14",
      "@langchain/standard-tests": "workspace:*",
      "@swc/core": "^1.3.90",
      "@swc/jest": "^0.2.29",
      "dpdm": "^3.12.0",
      "eslint": "^8.33.0",
      "eslint-config-airbnb-base": "^15.0.0",
      "eslint-config-prettier": "^8.6.0",
      "eslint-plugin-import": "^2.27.5",
      "eslint-plugin-jest": "^27.6.0",
      "eslint-plugin-no-instanceof": "^1.0.1",
      "eslint-plugin-prettier": "^4.2.1",
      "jest": "^29.5.0",
      "jest-environment-node": "^29.6.4",
      "prettier": "^2.8.3",
      "release-it": "^15.10.1",
      "rimraf": "^5.0.1",
      "typescript": "~5.1.6"
    }
  5. Check for environment variables: The ChatOpenAI class requires certain environment variables to be set, such as OPENAI_API_KEY for OpenAI or AZURE_OPENAI_API_KEY, AZURE_OPENAI_API_INSTANCE_NAME, AZURE_OPENAI_API_DEPLOYMENT_NAME, and AZURE_OPENAI_API_VERSION for Azure. Ensure these are correctly set in your Docker environment.
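The environment-variable check in step 5 can be made fail-fast at startup. The helper below is a hypothetical sketch (requireEnv is not part of LangChain) that throws immediately when required variables are missing, instead of failing later at the first API call:

```typescript
// Hypothetical startup helper (not a LangChain API): verify that all required
// environment variables are set, and fail with one clear message if not.
function requireEnv(names: string[]): void {
  const missing = names.filter((name) => !process.env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing required environment variables: ${missing.join(", ")}`);
  }
}

// For plain OpenAI only OPENAI_API_KEY is needed; for Azure, check the four
// AZURE_OPENAI_* variables listed above instead.
try {
  requireEnv(["OPENAI_API_KEY"]);
} catch (err) {
  console.error((err as Error).message);
  // In a real startup script you would exit here, e.g. process.exit(1).
}
```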

Here is an example of how you might configure your Dockerfile to ensure all dependencies are installed correctly:

FROM node:18

# Create app directory
WORKDIR /usr/src/app

# Install app dependencies
COPY package*.json ./

RUN npm install

# Bundle app source
COPY . .

# Build the TypeScript code
RUN npm run build

# If you have a start script
CMD [ "npm", "start" ]

And ensure your tsconfig.json includes necessary configurations:

{
  "compilerOptions": {
    "target": "ES6",
    "module": "commonjs",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "outDir": "./dist"
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "**/*.spec.ts"]
}

Additionally, ensure that your environment variables are correctly set in your Docker environment. You can do this by adding the following to your Dockerfile:

# Set environment variables
ENV OPENAI_API_KEY=<your_openai_api_key>
ENV AZURE_OPENAI_API_KEY=<your_azure_openai_api_key>
ENV AZURE_OPENAI_API_INSTANCE_NAME=<your_azure_instance_name>
ENV AZURE_OPENAI_API_DEPLOYMENT_NAME=<your_azure_deployment_name>
ENV AZURE_OPENAI_API_VERSION=<your_azure_api_version>

If the issue persists, you might need to provide more specific error messages or logs to diagnose further. Additionally, ensure that the ChatOpenAI class and its dependencies are correctly imported and used as per the documentation:

import { ChatOpenAI } from "@langchain/openai";

// Example usage
const model = new ChatOpenAI({
  temperature: 0.9,
  model: "ft:gpt-3.5-turbo-0613:{ORG_NAME}::{MODEL_ID}",
});

const message = await model.invoke("Hi there!");
console.log(message);

Make sure that all required environment variables and configurations are correctly set up as described in the ChatOpenAI class documentation [1][2][3][4][5].


@johnnyoshika
Author

johnnyoshika commented Jun 5, 2024

I created this repo to reproduce the bug in 4 easy steps. It turns out that this problem occurs even outside of the docker build step: https://github.com/johnnyoshika/langchain-ts2415

Demo:

demo.mov

@viswak-cn

viswak-cn commented Jun 6, 2024

I can confirm that I am facing a related issue when I am trying to do something like the following:

import { createToolCallingAgent } from 'langchain/agents'
import { ChatOpenAI } from '@langchain/openai'

// ...

this.client = new ChatOpenAI({
    model: 'gpt-4o',
    temperature: 0,
})

// ...

const agent = createToolCallingAgent({
    llm: this.client,  // <---- The point where the error occurs (even my TS linter points it out)
    tools,
    prompt,
})

I get the following error:

error TS2322: Type 'ChatOpenAI<ChatOpenAICallOptions>' is not assignable to type 'BaseChatModel<BaseLanguageModelCallOptions, BaseMessageChunk>'.
  Property 'getLsParams' is protected in type 'ChatOpenAI<ChatOpenAICallOptions>' but public in type 'BaseChatModel<BaseLanguageModelCallOptions, BaseMessageChunk>'
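This TS2322 is the assignability side of the same visibility rule: an instance whose member is protected cannot be assigned to a base-typed slot that declares the member public, since callers of the base type could invoke it. A minimal stand-alone sketch (simplified stand-in types, not the real LangChain classes or createToolCallingAgent signature):

```typescript
// Stand-in for the base type expected by the agent factory.
class BaseChatModel {
  public getLsParams(): string {
    return "params";
  }
}

// Stand-in for the model class. With `protected getLsParams()` here, the
// assignment below would fail:
//   TS2322: Type 'ChatOpenAI' is not assignable to type 'BaseChatModel'.
class ChatOpenAI extends BaseChatModel {
  public getLsParams(): string {
    return "openai-params";
  }
}

// Mirrors the shape of the createToolCallingAgent({ llm, ... }) call site.
interface AgentArgs {
  llm: BaseChatModel;
}

const args: AgentArgs = { llm: new ChatOpenAI() };
console.log(args.llm.getLsParams()); // "openai-params"
```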

@bracesproul
Member

Sorry you were getting this error! If you upgrade to the latest version of @langchain/openai (0.1.2), it should be resolved. The issue was that I updated all instances of getLsParams to be public instead of protected, but only cut a release for @langchain/core, so the released code for openai and the other partner packages was incompatible with the released code in @langchain/core.

We're taking steps to add tests for this, so this type of issue can be caught in the future.

@johnnyoshika
Author

@bracesproul thanks for the quick fix!

@AumeshMisra
Contributor

Hi @bracesproul, I am seeing a similar issue with: #5736 even with the latest packages.
