Structured Output #895


Open
r0path opened this issue Apr 29, 2025 · 6 comments

Comments


r0path commented Apr 29, 2025

The native Google SDK / API supports structured output, but LangChain's implementation relies on an optional tool call. Can we have it use the API's actual structured output (as supported by the API) instead of optional tool output? The current implementation makes LangChain / Vertex AI unusable for production.
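For reference, the API-native mechanism being requested here is a generation config with a response MIME type and response schema. A minimal sketch of such a request body (field names follow the public Gemini `generateContent` REST docs as best I recall — `responseMimeType` / `responseSchema` — so treat them as an assumption to verify):

```python
import json

# Hypothetical generateContent request body using the API's native structured
# output. The schema uses the OpenAPI-style type names the Vertex/Gemini
# Schema accepts ("OBJECT", "STRING"); this is a sketch, not a library call.
payload = {
    "contents": [{"parts": [{"text": "Tell me a joke"}]}],
    "generationConfig": {
        "responseMimeType": "application/json",
        "responseSchema": {
            "type": "OBJECT",
            "properties": {
                "setup": {"type": "STRING"},
                "punchline": {"type": "STRING"},
            },
            "required": ["setup", "punchline"],
        },
    },
}
print(json.dumps(payload, indent=2))
```

With this config the backend constrains decoding to the schema, rather than merely offering the model a tool it may or may not call.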

lkuligin (Collaborator) commented May 7, 2025

Could you elaborate a little bit more, please?


rek7 commented May 7, 2025

@lkuligin It seems (from the code and documentation) that structured output in LangChain is only done via an optional tool call (i.e. a tool whose arguments are the output schema) instead of the forced structured output the native Google GenAI SDK / API provides. This is bad primarily because it gives the model the option not to respond in the requested format, which happens a lot in my testing.
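To illustrate the failure mode described above — with an optional tool, nothing stops the model from replying in prose, and that reply then fails to parse — here is a stdlib-only sketch in which both replies are made up for illustration:

```python
import json

def parse_structured(reply: str):
    """Return the parsed object if the reply is valid JSON, else None.

    With optional tool calling, the model may skip the tool entirely and
    answer in prose, which simply fails to parse as structured output.
    """
    try:
        return json.loads(reply)
    except json.JSONDecodeError:
        return None

# Invented replies: one prose (tool skipped), one schema-conforming.
prose_reply = "Sure! Here's a joke: why did the chicken cross the road?"
forced_reply = (
    '{"setup": "Why did the chicken cross the road?", '
    '"punchline": "To get to the other side."}'
)

assert parse_structured(prose_reply) is None
assert parse_structured(forced_reply)["punchline"] == "To get to the other side."
```

Backend-enforced structured output rules out the first case by construction, which is why it matters for production pipelines.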

windkit (Contributor) commented May 8, 2025

@rek7 Can you provide example code where the output is not as instructed? So far I have used ChatGoogleGenerativeAI with Gemini 2.0 and 2.5 models and a Pydantic class with no issues. It would be great if you could provide a minimal reproducible example so we can hunt the issue down.

jmorenobrasero commented

@r0path @rek7 Have you tried using "json_mode" as the method? Like this:

from langchain_google_vertexai import ChatVertexAI
from pydantic import BaseModel, Field


class Joke(BaseModel):
    setup: str = Field(description="Question to set up a joke")
    punchline: str = Field(description="Answer to resolve the joke")


structured_model = ChatVertexAI(model_name="gemini-2.0-flash").with_structured_output(
    Joke, method="json_mode"
)

Nayphilim commented

> @r0path @rek7 Have you tried to use "json_mode" as method? Like this: […]

I have tried this and found that it does not work for more complex models that include Optional, Union, etc., due to some kind of incompatibility with protobuf. Here's the error I get:

Traceback (most recent call last):
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/proto/marshal/rules/message.py", line 36, in to_proto
    return self._descriptor(**value)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: Parameter to CopyFrom() must be instance of same class: expected <class 'Schema'> got <class 'dict'>.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/proto/marshal/rules/message.py", line 36, in to_proto
    return self._descriptor(**value)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
ValueError: Protocol message Schema has no "anyOf" field.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/nathan/Documents/opentrace/insight-agent/src/insight_agent/agents/base_agent/base.py", line 465, in _generate_tool_timeline_event
    analysis = await structured_llm.ainvoke(prompt_messages)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3075, in ainvoke
    input = await coro_with_context(part(), context, create_task=True)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 5429, in ainvoke
    return await self.bound.ainvoke(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 392, in ainvoke
    llm_result = await self.agenerate_prompt(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 958, in agenerate_prompt
    return await self.agenerate(
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 916, in agenerate
    raise exceptions[0]
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 1084, in _agenerate_with_cache
    result = await self._agenerate(
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/langchain_google_vertexai/chat_models.py", line 1797, in _agenerate
    return await self._agenerate_gemini(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/langchain_google_vertexai/chat_models.py", line 1670, in _agenerate_gemini
    request=self._prepare_request_gemini(
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/langchain_google_vertexai/chat_models.py", line 1521, in _prepare_request_gemini
    generation_config = self._generation_config_gemini(
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/langchain_google_vertexai/chat_models.py", line 1448, in _generation_config_gemini
    return GenerationConfig(
           ^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/proto/message.py", line 728, in __init__
    pb_value = marshal.to_proto(pb_type, value)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/proto/marshal/marshal.py", line 235, in to_proto
    pb_value = self.get_rule(proto_type=proto_type).to_proto(value)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/proto/marshal/rules/message.py", line 46, in to_proto
    return self._wrapper(value)._pb
           ^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/proto/message.py", line 728, in __init__
    pb_value = marshal.to_proto(pb_type, value)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/proto/marshal/marshal.py", line 233, in to_proto
    return {k: self.to_proto(recursive_type, v) for k, v in value.items()}
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/proto/marshal/marshal.py", line 233, in <dictcomp>
    return {k: self.to_proto(recursive_type, v) for k, v in value.items()}
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/proto/marshal/marshal.py", line 235, in to_proto
    pb_value = self.get_rule(proto_type=proto_type).to_proto(value)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/proto/marshal/rules/message.py", line 46, in to_proto
    return self._wrapper(value)._pb
           ^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/opentrace/insight-agent/.venv/lib/python3.11/site-packages/proto/message.py", line 724, in __init__
    raise ValueError(
ValueError: Unknown field for Schema: anyOf
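The `anyOf` in that error comes from Pydantic v2, which represents `Optional[str]` as `anyOf: [string, null]` in its JSON schema, while the Vertex AI `Schema` proto has no `anyOf` field (it models optionality with a `nullable` flag instead). A hedged workaround sketch — the `flatten_optional` helper is illustrative and not part of any library, and the schema fragment is hand-copied rather than generated:

```python
# JSON-schema fragment of the kind Pydantic v2 emits for an Optional[str]
# field (hand-written here to keep the sketch stdlib-only):
field_schema = {"anyOf": [{"type": "string"}, {"type": "null"}], "title": "Note"}

def flatten_optional(schema: dict) -> dict:
    """Rewrite `anyOf: [X, null]` as X plus `nullable: true`.

    Hypothetical helper: the Vertex AI Schema proto rejects `anyOf` (hence
    the ValueError above), but it does appear to accept a `nullable` flag.
    """
    any_of = schema.get("anyOf")
    if any_of and {"type": "null"} in any_of:
        rest = [s for s in any_of if s != {"type": "null"}]
        if len(rest) == 1:  # only handle the simple Optional[X] case
            out = {k: v for k, v in schema.items() if k != "anyOf"}
            out.update(rest[0])
            out["nullable"] = True
            return out
    return schema

assert flatten_optional(field_schema) == {"title": "Note", "type": "string", "nullable": True}
```

A post-processing pass like this over the generated schema (or avoiding `Optional`/`Union` fields in models passed to `with_structured_output`) may work around the crash until the conversion is fixed upstream.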

Nayphilim commented

Is anyone else experiencing issues with structured output not making use of the field description annotations?
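If the field descriptions do seem to be dropped, one pragmatic fallback is to inline them into the prompt yourself. A stdlib-only sketch — the `schema_hints` helper is hypothetical, and the schema dict mirrors what Pydantic would emit for the `Joke` model above:

```python
def schema_hints(schema: dict) -> str:
    """Render a schema's property descriptions as prompt text — a fallback
    for when the backend appears to ignore `description` in the schema."""
    lines = [
        f"- {name}: {prop.get('description', 'no description')}"
        for name, prop in schema.get("properties", {}).items()
    ]
    return "Respond with JSON fields:\n" + "\n".join(lines)

# Hand-written stand-in for Joke.model_json_schema()["properties"] etc.
joke_schema = {
    "properties": {
        "setup": {"type": "string", "description": "Question to set up a joke"},
        "punchline": {"type": "string", "description": "Answer to resolve the joke"},
    }
}
print(schema_hints(joke_schema))
```

Prepending this to the system prompt keeps the descriptions in play regardless of whether the schema-level annotations reach the model.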
