Fix LiteLLMModel API key usage in CLI #788


Merged · 1 commit into huggingface:main on Feb 25, 2025

Conversation

@ghost ghost commented Feb 25, 2025

This PR fixes LiteLLMModel API key handling in the CLI. Previously, when run without --api-key, the CLI defaulted to OPENAI_API_KEY even for non-OpenAI models, which blocked LiteLLM's built-in, provider-based API key detection.

I've tested the fix with:

export OPENAI_API_KEY="sk-openai-key"
export ANTHROPIC_API_KEY="sk-anthropic-key"

smolagent "What is the 42nd decimal digit of π?" --model-type "LiteLLMModel" --model-id "anthropic/claude-3-7-sonnet-latest"

Since the load_model function is called by both the smolagent and webagent commands, this fix applies to both.
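The intended behavior can be sketched as follows. This is a minimal illustration, not the actual smolagents diff; the helper name `resolve_api_key` and its signature are assumptions made for the example. The key idea is to stop injecting OPENAI_API_KEY for LiteLLM models so LiteLLM can pick the provider-specific environment variable itself:

```python
import os

def resolve_api_key(cli_api_key, model_type):
    """Return an explicit API key only when the user supplied one.

    Hypothetical sketch of the fix: for LiteLLMModel, return None when
    --api-key is absent, so LiteLLM can detect the right provider key
    (e.g. ANTHROPIC_API_KEY for anthropic/* models) on its own.
    """
    if cli_api_key:  # user passed --api-key explicitly: always honor it
        return cli_api_key
    if model_type == "LiteLLMModel":
        return None  # defer to LiteLLM's per-provider key detection
    # other model types keep the old OPENAI_API_KEY default
    return os.getenv("OPENAI_API_KEY")
```

With this logic, the anthropic/claude-3-7-sonnet-latest example above picks up ANTHROPIC_API_KEY via LiteLLM rather than being handed the OpenAI key.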

Closes #787

@albertvillanova albertvillanova left a comment


Thanks! Good catch!

@albertvillanova albertvillanova merged commit 1c49ae2 into huggingface:main Feb 25, 2025
3 checks passed
@ghost ghost deleted the fix-litellm-api-key branch February 25, 2025 13:04
Successfully merging this pull request may close these issues.

[BUG] smolagent CLI uses OPENAI_API_KEY for LiteLLMModel when no API key is provided