TypeError: chatMessage._getType is not a function #1573

Closed
riccardolinares opened this issue Jun 7, 2023 · 11 comments
@riccardolinares

I'm following the YT tutorial, but after the first message is sent correctly, the second one gives me this error:

error TypeError: chatMessage._getType is not a function
    at file:///Users/riccardolinares/Projects/gpt4-pdf-chatbot-langchain/node_modules/langchain/dist/chains/conversational_retrieval_chain.js:67:33
    at Array.map (<anonymous>)
    at ConversationalRetrievalQAChain.getChatHistoryString (file:///Users/riccardolinares/Projects/gpt4-pdf-chatbot-langchain/node_modules/langchain/dist/chains/conversational_retrieval_chain.js:66:18)
    at ConversationalRetrievalQAChain._call (file:///Users/riccardolinares/Projects/gpt4-pdf-chatbot-langchain/node_modules/langchain/dist/chains/conversational_retrieval_chain.js:90:60)
    at ConversationalRetrievalQAChain.call (file:///Users/riccardolinares/Projects/gpt4-pdf-chatbot-langchain/node_modules/langchain/dist/chains/base.js:65:39)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async handler (webpack-internal:///(api)/./pages/api/chat.ts:45:26)
    at async Object.apiResolver (/Users/riccardolinares/Projects/gpt4-pdf-chatbot-langchain/node_modules/next/dist/server/api-utils/node.js:372:9)
    at async DevServer.runApi (/Users/riccardolinares/Projects/gpt4-pdf-chatbot-langchain/node_modules/next/dist/server/next-server.js:513:9)
    at async Object.fn (/Users/riccardolinares/Projects/gpt4-pdf-chatbot-langchain/node_modules/next/dist/server/next-server.js:815:35)
    at async Router.execute (/Users/riccardolinares/Projects/gpt4-pdf-chatbot-langchain/node_modules/next/dist/server/router.js:243:32)
    at async DevServer.runImpl (/Users/riccardolinares/Projects/gpt4-pdf-chatbot-langchain/node_modules/next/dist/server/base-server.js:432:29)
    at async DevServer.run (/Users/riccardolinares/Projects/gpt4-pdf-chatbot-langchain/node_modules/next/dist/server/dev/next-dev-server.js:814:20)
    at async DevServer.handleRequestImpl (/Users/riccardolinares/Projects/gpt4-pdf-chatbot-langchain/node_modules/next/dist/server/base-server.js:375:20)
    at async /Users/riccardolinares/Projects/gpt4-pdf-chatbot-langchain/node_modules/next/dist/server/base-server.js:157:99

Version:
"langchain": "^0.0.91",

@borel commented Jun 7, 2023

Hey @riccardolinares, how did you manage chat history?

Maybe you mismatched custom and default chat memory history.

@riccardolinares (Author) commented Jun 7, 2023

I am working with Next.js, and I created an API Route that takes the question and the history from the body of the request.

api/chat:

const { question, history } = req.body;

const response = await chain.call({
  question,
  chat_history: history || [],
});

On the page this is my code:

index.tsx

const [messageState, setMessageState] = useState<{
   messages: Message[];
   pending?: string;
   history: [string, string][];
   pendingSourceDocs?: Document[];
 }>({
   messages: [
     {
       message: 'Hi, what would you like to learn about this document?',
       type: 'apiMessage',
     },
   ],
   history: [],
 });

 const { messages, history } = messageState;

and if the response from the API Route is OK, I update the messageState as follows:

try {
  const response = await fetch('/api/chat', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      question,
      history,
    }),
  });
  const data = await response.json();
  console.log('data', data);

  if (data.error) {
    setError(data.error);
  } else {
    setMessageState((state) => ({
      ...state,
      messages: [
        ...state.messages,
        {
          type: 'apiMessage',
          message: data.text,
          sourceDocs: data.sourceDocuments,
        },
      ],
      history: [...state.history, [question, data.text]],
    }));
  }
} catch (error) {
  setError('Something went wrong while fetching the response.');
}

I am not sure if that is clear enough... let me know if not.

@borel commented Jun 7, 2023

Hey @riccardolinares, what kind of chain are you using?

    const response = await chain.call({
      question: question,
      chat_history: history || [],
    });

I'm not sure it expects chat_history to be passed directly like that.

@riccardolinares (Author)

These are my chain and model:

const model = new OpenAI({
    temperature: 0, // increase temperature to get more creative answers
    modelName: 'gpt-3.5-turbo', //change this to gpt-4 if you have access
  });

  const chain = ConversationalRetrievalQAChain.fromLLM(
    model,
    vectorstore.asRetriever(),
    {
      qaTemplate: QA_PROMPT,
      questionGeneratorTemplate: CONDENSE_PROMPT,
      returnSourceDocuments: true, //The number of source documents returned is 4 by default
    },
  );

By the way, it seems like the _getType() function is not found at all :/

@bryceamacker

I'm also getting this issue. I believe it's a new breaking change, but I'm not sure where it was introduced.

@borel commented Jun 8, 2023

Hi @bryceamacker @riccardolinares

The documentation is pretty bad for this case.

There are two solutions for managing custom history with ConversationalRetrievalQAChain:

  1. You can use chat_history as a string and build it up by concatenation, like this:
    const chatHistory = question + res.text;

  2. You can use an array as chat_history, but it has to contain the message classes langchain requires ( https://js.langchain.com/docs/modules/schema/chat-messages ), like this:
    const chatHistory = [new HumanChatMessage("Is Karim Benzema a great player?"), new AIChatMessage("Yes, he is!")];
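The two options above can be sketched together as follows. This is a minimal sketch: the stand-in HumanChatMessage/AIChatMessage classes only mimic langchain's (real code would import them from langchain), and the helper names are assumptions, not from the repo:

```typescript
// Stand-ins mimicking langchain's message classes, so the sketch is
// self-contained; in real code, import HumanChatMessage/AIChatMessage
// from langchain instead.
class HumanChatMessage {
  constructor(public text: string) {}
  _getType() { return 'human'; }
}
class AIChatMessage {
  constructor(public text: string) {}
  _getType() { return 'ai'; }
}

// Option 1: flatten the [question, answer] pairs kept in React state
// into a single chat_history string.
function historyAsString(pairs: [string, string][]): string {
  return pairs.map(([q, a]) => `Human: ${q}\nAssistant: ${a}`).join('\n');
}

// Option 2: build an array of typed message objects, which is the shape
// whose _getType() the chain calls internally.
function historyAsMessages(pairs: [string, string][]) {
  return pairs.flatMap(([q, a]) => [new HumanChatMessage(q), new AIChatMessage(a)]);
}
```

Passing raw string pairs (option neither) is what triggers the `_getType is not a function` error, since the chain calls `_getType()` on each history entry.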

@siddarthvader (Contributor)

This works, thanks for the help @borel

@riccardolinares (Author)

I was using an array of strings to manage the chat history. Converting the array into a single string solved my problem! :)

Thanks!
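For reference, the string-based fix can be sketched like this (a minimal sketch; `flattenHistory` is a hypothetical helper, not from the repo):

```typescript
// Hypothetical helper (not from the repo): flatten the [question, answer]
// pairs kept in React state into the single chat_history string that the
// string-based approach expects.
function flattenHistory(history: [string, string][]): string {
  return history
    .map(([question, answer]) => `${question}\n${answer}`)
    .join('\n');
}

// The API route would then pass a string instead of the raw array:
// chain.call({ question, chat_history: flattenHistory(history) });
```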

@davideuler

I came across this issue too after upgrading langchain to 0.0.90, and I fixed the issue in my fork.

https://github.com/davideuler/gpt4-pdf-chatbot-langchain-chromadb

@esponges

> I came across this issue too after upgrading langchain to 0.0.90, and I fixed the issue in my fork.

> https://github.com/davideuler/gpt4-pdf-chatbot-langchain-chromadb

This is a clean approach, using @borel's second strategy. Thanks!

@kirandash

This worked for me (with the newer @langchain/core packages):

import { AIMessage, HumanMessage } from "@langchain/core/messages";
chatHistory.push(new HumanMessage(initialPrompt), new AIMessage(response.text));


7 participants