We apply additional instrumentation to gather prompts, completions, and token usage, but we don't yet capture the model used to make the request.
The model name is stored as a private attribute on the OpenAIResponsesModel class, so it's hard to access from our instrumentation.
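As a stopgap, we could read the private attribute directly. A minimal sketch follows; the attribute names tried here ("_model_name", "_model", "model") are assumptions rather than the library's documented API, and private attributes may change between releases.

```python
from typing import Any


def extract_model_name(responses_model: Any) -> str | None:
    """Best-effort read of the model name from an OpenAIResponsesModel instance."""
    # Attribute names are guesses; return the first string-valued match, else None.
    for attr in ("_model_name", "_model", "model"):
        value = getattr(responses_model, attr, None)
        if isinstance(value, str):
            return value
    return None
```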
We may need to move our instrumentation up to the OpenAI client level to extract all of the default parameters; we should be able to get model params (temperature, top_p, etc.) from there too. A rough sketch of that approach is below.
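A minimal sketch of client-level instrumentation, assuming we wrap the client's responses.create method: the model and any explicitly passed parameters are captured alongside the prompts and completions we already record. record_request is a hypothetical placeholder for our existing instrumentation hook, and this only sees parameters supplied on the call, not SDK-side defaults.

```python
import functools

from openai import OpenAI


def record_request(payload: dict) -> None:
    # Placeholder for our existing instrumentation hook.
    print("instrumented request:", payload)


def instrument_client(client: OpenAI) -> OpenAI:
    """Wrap responses.create on this client instance to capture request params."""
    original_create = client.responses.create

    @functools.wraps(original_create)
    def create_with_capture(*args, **kwargs):
        # Capture the model and common sampling params before delegating.
        record_request(
            {
                "model": kwargs.get("model"),
                "temperature": kwargs.get("temperature"),
                "top_p": kwargs.get("top_p"),
            }
        )
        return original_create(*args, **kwargs)

    # Shadow the bound method on this instance only.
    client.responses.create = create_with_capture
    return client
```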