[Theia AI] Add LLMSettings to UI #15100

Closed
Tracked by #15405

eneufeld opened this issue Mar 3, 2025 · 3 comments
Comments

@eneufeld
Contributor

eneufeld commented Mar 3, 2025

Feature Description:

AI agents need configurable settings for LLM parameters (e.g., temperature, thinking time), but it's unclear where to manage them: globally, per model, per agent, or per request.
Currently, the Agent class can set them programmatically.

Possible Solutions (Briefly):
Hierarchical Settings: Global → Model → Agent → Chat → Request (see the sketch after this list).
Model-Specific Properties: The model should expose its modifiable parameters to users; these can then be set per agent in the configuration.
Chat UI Controls: A settings icon in chat to modify LLM settings at request time.
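
As a rough illustration of the hierarchical option, here is a minimal TypeScript sketch; the names (`LlmSettings`, `resolveSettings`) and parameters are hypothetical and not part of the current Theia AI API:

```typescript
// Hypothetical sketch: each level contributes partial LLM settings and more
// specific levels override more general ones
// (Global → Model → Agent → Chat → Request).

interface LlmSettings {
    temperature?: number;
    // further parameters such as thinking time, max tokens, etc.
    [key: string]: unknown;
}

function resolveSettings(
    global: LlmSettings,
    model?: LlmSettings,
    agent?: LlmSettings,
    chat?: LlmSettings,
    request?: LlmSettings
): LlmSettings {
    // later spreads win, so the request level is the most specific one
    return { ...global, ...model, ...agent, ...chat, ...request };
}

// Example: a global default temperature, overridden per agent and per request.
const effective = resolveSettings(
    { temperature: 0.7 },
    undefined,
    { temperature: 0.2 },
    undefined,
    { temperature: 0 }
);
// effective.temperature === 0
```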

@eneufeld mentioned this issue Mar 3, 2025
@eneufeld
Contributor Author

eneufeld commented Mar 6, 2025

Improve Request Settings and Scope Management

Summary

Enhance request settings handling by reusing existing preferences, introducing scope management, and adding client-specific settings.

Tasks

  • Reuse current request settings preferences to ensure consistency.
  • Introduce a scope object with three optional properties:
    • modelId
    • providerId
    • agentId
  • Ensure proper merging of scopes in the following priority order:
    • Agent > Model > Provider
  • Implement client settings to retain:
    • Tool calls
    • Thinking state
  • Introduce request settings for LLM-specific properties to allow fine-tuned configuration.
  • Chat UI should use a chat scope with request settings, where the chat scope takes precedence (see the sketch after this list).
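
To make the scope proposal above more concrete, here is a rough TypeScript sketch of how matching and merging could work; all names (`RequestSettingScope`, `RequestSetting`, `mergeRequestSettings`) are assumptions for illustration, not the actual Theia AI API:

```typescript
// Hypothetical sketch of scope-aware request settings, merged in the proposed
// priority order: Provider < Model < Agent < Chat.

interface RequestSettingScope {
    modelId?: string;
    providerId?: string;
    agentId?: string;
}

interface RequestSetting {
    scope?: RequestSettingScope;
    // LLM-specific properties, e.g. { temperature: 0.2, max_tokens: 1024 }
    requestSettings?: Record<string, unknown>;
    // client settings, e.g. whether to retain tool calls / thinking state
    clientSettings?: { keepToolCalls?: boolean; keepThinking?: boolean };
}

function mergeRequestSettings(
    settings: RequestSetting[],
    modelId: string,
    providerId?: string,
    agentId?: string,
    chatSettings?: Record<string, unknown>
): Record<string, unknown> {
    const byScope = (key: keyof RequestSettingScope, id?: string) =>
        settings.filter(s => id !== undefined && s.scope?.[key] === id);

    // later entries override earlier ones: Provider < Model < Agent
    const ordered = [
        ...byScope('providerId', providerId),
        ...byScope('modelId', modelId),
        ...byScope('agentId', agentId)
    ];
    const merged = ordered.reduce<Record<string, unknown>>(
        (acc, s) => ({ ...acc, ...s.requestSettings }),
        {}
    );
    // the chat scope takes precedence over everything persisted in preferences
    return { ...merged, ...chatSettings };
}
```

Settings without any scope could additionally be treated as global defaults that every level overrides; that case is left out here to keep the sketch short.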

@planger you had local changes to how request settings are handled; we would love your input on our suggestion, as well as some insight into your planned implementation.

@planger
Contributor

planger commented Mar 7, 2025

@eneufeld Looks good to me, thanks!
I've opened the PR for the changes to how requests are now centralized; see #15146.

@sdirix sdirix added the TheiaAI label Mar 20, 2025
@eneufeld
Contributor Author

Fixed via #15092
