Refs:
One possible design:
llm chat -T 'Datasette("https://private.datasette.io/content", key="private-datasette")'
This would not require additional code in core; it could be a convention that plugins know to use for looking things up in LLM's keys.
(I originally had secret="name-of-key" but using key= is more consistent with how things are named already.)
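To make the proposed convention concrete, here is a hedged sketch of how a tool plugin might resolve `key="private-datasette"` against LLM's stored keys. The `Datasette` class, the `resolve_key()` helper, and the in-memory `keys_json` store are all illustrative stand-ins (a real plugin would call `llm.get_key()` against LLM's actual `keys.json`), not the plugin's actual implementation.

```python
import json

def resolve_key(explicit_key, key_alias, keys_json="{}"):
    # An explicit key, if supplied, always wins over an alias lookup.
    if explicit_key is not None:
        return explicit_key
    # Otherwise treat the value as an alias into the stored keys.
    return json.loads(keys_json).get(key_alias)

class Datasette:
    # Hypothetical tool plugin: key= names an alias, not a literal secret.
    def __init__(self, url, key=None, keys_json="{}"):
        self.url = url
        self.token = resolve_key(None, key, keys_json)

stored = json.dumps({"private-datasette": "dts-example-token"})
tool = Datasette("https://private.datasette.io/content",
                 key="private-datasette", keys_json=stored)
print(tool.token)  # dts-example-token
```

The appeal of the convention is that the secret itself never appears on the command line; only the alias does.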
The llm.get_key() method is currently undocumented and confusing to use:
```python
>>> import llm
>>> llm.get_key("hi")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: get_key() missing 1 required positional argument: 'key_alias'
>>> help(llm.get_key)
>>> llm.get_key(key_alias="hi")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: get_key() missing 1 required positional argument: 'explicit_key'
>>> help(llm.get_key)
>>> llm.get_key(None, key_alias="hi")
>>> llm.get_key(None, key_alias="openai")
'sk-proj...
```
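To see why this signature trips people up, here is a toy stand-in with the positional order inferred from the tracebacks above (`explicit_key` first, then `key_alias`); the fallback logic and the in-memory key store are assumptions for illustration, not the library's actual code.

```python
import os

def get_key(explicit_key, key_alias, env_var=None):
    """Toy stand-in: positional order mirrors the tracebacks above."""
    if explicit_key is not None:
        # The first positional argument is a literal key, not an alias,
        # which is why get_key("hi") fails in surprising ways: a bare
        # string lands in explicit_key, and key_alias is still required.
        return explicit_key
    stored = {"openai": "sk-proj-example"}  # stands in for LLM's keys.json
    if key_alias in stored:
        return stored[key_alias]
    if env_var:
        return os.environ.get(env_var)
    return None

print(get_key(None, key_alias="openai"))  # sk-proj-example
print(get_key("sk-literal", "openai"))    # sk-literal
```

The confusion comes from having to pass `None` explicitly just to reach the alias lookup, which is exactly what the transcript shows.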
llm.get_key() is now a documented utility, closes #1094
7eb8acb
Refs #1093, simonw/llm-tools-datasette#2
Now that llm.get_key() is documented I'm going to suggest people use that.
6850cbd