
List index out of range _parse_chat_history_gemini #859


Open
jpaodev opened this issue Apr 14, 2025 · 2 comments
jpaodev commented Apr 14, 2025

I'm getting the following error in https://github.com/langchain-ai/langchain-google/blob/main/libs/vertexai/langchain_google_vertexai/chat_models.py:

    input = context.run(step.invoke, input, config)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/users/user/project/.venv/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 5440, in invoke
    return self.bound.invoke(
           ^^^^^^^^^^^^^^^^^^
  File "/users/user/project/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 331, in invoke
    self.generate_prompt(
  File "/users/user/project/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 894, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/users/user/project/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 719, in generate
    self._generate_with_cache(
  File "/users/user/project/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 960, in _generate_with_cache
    result = self._generate(
             ^^^^^^^^^^^^^^^
  File "/users/user/project/.venv/lib/python3.11/site-packages/langchain_google_vertexai/chat_models.py", line 1374, in _generate
    return self._generate_gemini(
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/users/user/project/.venv/lib/python3.11/site-packages/langchain_google_vertexai/chat_models.py", line 1609, in _generate_gemini
    request = self._prepare_request_gemini(messages=messages, stop=stop, **kwargs)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/users/user/project/.venv/lib/python3.11/site-packages/langchain_google_vertexai/chat_models.py", line 1464, in _prepare_request_gemini
    system_instruction, contents = _parse_chat_history_gemini(
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/users/user/project/.venv/lib/python3.11/site-packages/langchain_google_vertexai/chat_models.py", line 432, in _parse_chat_history_gemini
    content = parsed_content[0]
              ~~~~~~~~~~~~~~^^^
IndexError: list index out of range
During task with name 'agent' and id '843160a3-cf5c-da4f-13e9-db8b673e916b'

A safety check should be in place, e.g. parsed_content[0] if parsed_content else {"content": ""}
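A minimal sketch of the failing pattern and the proposed guard. The function names and the part-conversion logic below are illustrative stand-ins, not the actual library code; the point is only that an empty parsed list makes `parsed_content[0]` raise, while the suggested fallback does not:

```python
def parse_parts(raw_parts):
    # Hypothetical stand-in for the part-conversion step inside
    # _parse_chat_history_gemini: empty message content yields no parts.
    return [{"text": p} for p in raw_parts if p]


def unguarded_first(raw_parts):
    parsed_content = parse_parts(raw_parts)
    return parsed_content[0]  # raises IndexError when the list is empty


def guarded_first(raw_parts):
    parsed_content = parse_parts(raw_parts)
    # The safety check proposed in this issue:
    return parsed_content[0] if parsed_content else {"content": ""}


print(guarded_first([]))       # falls back to {'content': ''}
print(guarded_first(["hi"]))   # normal path: first parsed part
```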

@langcarl langcarl bot added the investigate label Apr 14, 2025
lkuligin (Collaborator) commented:
Could you share a reproducible snippet, please?


jpaodev commented Apr 15, 2025

@lkuligin

from datetime import datetime, timezone

from langchain_core.prompts import ChatPromptTemplate
# ....

from typing import Annotated, Any, Dict, List, Union
from typing_extensions import TypedDict

from langgraph.graph import StateGraph, START
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition
from app.llm.base import llm_handler, llm
# ... (app-specific helpers elided: TestHandler, TestHandler2,
#      get_repo_agent_tools, CostCallbackHandler)
from langgraph.prebuilt import create_react_agent

class State(TypedDict):
    messages: Annotated[list, add_messages]
    selected_tools: list[str]
    tree: dict[str, Any]

class RepoAgent:
    def __init__(self, handler: Union[TestHandler, TestHandler2], branch: str | None = None, system_message: str | None = None):
        self.handler = handler
        self.branch = branch
        self.system_message = system_message
        self.agent = self.get_repo_agent()
        
    def get_repo_agent(self):
        if not llm:
            raise ValueError("LLM not initialized")
        
        builder = StateGraph(State)

        tools = get_repo_agent_tools(self.handler, self.branch)
        if not self.handler.project:
            raise ValueError("No Project has been provided.")
        
        def fetch_tree(state: State):
            return state
        
        def agent(state: State):
            # Map tool IDs to actual tools
            # based on the state's selected_tools list.
            if not llm:
                raise ValueError("LLM not initialized")
            selected_tools = tools
            # Bind the selected tools to the LLM for the current interaction.
            llm_with_tools = llm.bind_tools(selected_tools)
            # Invoke the LLM with the current messages and return the updated message list.
            return {
                "messages": [
                    llm_with_tools.invoke(
                        state["messages"],
                        config={
                            "callbacks": [
                                CostCallbackHandler(self.handler.usage_information)
                            ]
                        },
                    )
                ]
            }

        pb_react_agent = create_react_agent(llm, tools=tools, prompt=self.system_message)
        builder.add_node("agent", agent)  # note: never wired into the graph's edges

        builder.add_edge(START, "fetch_tree")
        builder.add_node("fetch_tree", fetch_tree)
        builder.add_node("pb_agent", pb_react_agent)
        builder.add_edge("fetch_tree", "pb_agent")
        
        graph = builder.compile()
        return graph

    def get_agent(self):
        return self.agent

This was the example code snippet that was failing for me, thanks a lot!
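For context, the traceback points at an empty parsed-parts list, which can occur when a history entry (for example, a tool-call-only assistant turn) has empty text content. Until a fix lands, one possible workaround is to drop empty-content entries before invoking the model. This is a stdlib-only sketch; the dict message shape is illustrative (real LangChain messages are objects with a `.content` attribute), and `filter_empty_messages` is a hypothetical helper, not a library function:

```python
def filter_empty_messages(messages):
    # Hypothetical workaround: drop history entries whose content is
    # empty or whitespace-only before handing them to the chat model.
    return [m for m in messages if (m.get("content") or "").strip()]


history = [
    {"role": "user", "content": "List the repo tree"},
    {"role": "assistant", "content": ""},  # e.g. a tool-call-only turn
    {"role": "user", "content": "Now summarize it"},
]

print(filter_empty_messages(history))  # keeps only the two non-empty turns
```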
