AI agents need configurable settings for LLM parameters (e.g., temperature, thinking time), but it's unclear where to manage them—globally, per model, per agent, or per request.
Currently, the Agent class can set them programmatically.
Possible Solutions (Briefly):
Hierarchical Settings: Global → Model → Agent → Chat → Request.
Model-Specific Properties: The model should expose its modifiable parameters to users; these can then be set per agent in the configuration.
Chat UI Controls: A settings icon in the chat to modify LLM settings at request time.
Enhance request settings handling by reusing existing preferences, introducing scope management, and adding client-specific settings.
Tasks
Reuse current request settings preferences to ensure consistency.
Introduce a scope object with three optional properties:
modelId
providerId
agentId
Ensure proper merging of scopes in the following priority order:
Agent > Model > Provider
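The scope object and merge order above could be sketched as follows. This is a hypothetical illustration, not the actual implementation: the names `SettingsScope`, `ScopedSettings`, and `mergeSettings` are assumptions.

```typescript
// Hypothetical sketch of scoped settings with merge priority
// Agent > Model > Provider. All names here are assumptions.
interface SettingsScope {
  modelId?: string;
  providerId?: string;
  agentId?: string;
}

interface ScopedSettings {
  scope: SettingsScope;
  settings: Record<string, unknown>; // e.g. { temperature: 0.2 }
}

// Merge entries so higher-priority scopes win: entries are sorted from
// lowest to highest priority, then spread left to right, so later
// (higher-priority) values overwrite earlier ones.
function mergeSettings(entries: ScopedSettings[]): Record<string, unknown> {
  const priority = (e: ScopedSettings): number =>
    e.scope.agentId ? 3 : e.scope.modelId ? 2 : e.scope.providerId ? 1 : 0;
  return [...entries]
    .sort((a, b) => priority(a) - priority(b))
    .reduce((acc, e) => ({ ...acc, ...e.settings }), {});
}
```

For example, an agent-scoped `temperature` would override a provider-scoped one, while provider-scoped keys not set at the agent level would still apply.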
Implement client settings to retain:
Tool calls
Thinking state
Introduce request settings for LLM-specific properties to allow fine-tuned configuration.
Chat UI should use a chat scope with request settings, where the chat scope takes precedence.
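The chat-scope precedence could look like the sketch below; `resolveRequestSettings` and its parameter names are assumptions, not an existing API.

```typescript
// Hypothetical sketch: request settings resolved from the agent's merged
// settings, with chat-scope settings (e.g. set via the chat UI settings
// icon) taking precedence. Names are assumptions.
type RequestSettings = Record<string, unknown>;

function resolveRequestSettings(
  agentSettings: RequestSettings,
  chatSettings: RequestSettings
): RequestSettings {
  // Chat scope is spread last, so its values override the agent's.
  return { ...agentSettings, ...chatSettings };
}
```

A temperature adjusted in the chat UI would thus override the agent's configured value for that request, while untouched settings fall through from the agent scope.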
@planger, you had local changes on how request settings are handled; we would love to have your input on our suggestion, as well as some insight into your planned implementation.