Hello! My company installed an Open WebUI instance that I would like to use. I do not have access to the Ollama instance running in the background. Does anyone define a function like …
Answered by karthink on May 23, 2025 · 2 comments
-
I don't use Open WebUI so I can't test this, but from the docs it looks like something like this should work:

    (gptel-make-openai "OpenWebUI"
      :host "localhost:3000"
      :protocol "http"
      :endpoint "/api/chat/completions"
      :stream t
      :models '(llama3.1))

    (setq gptel-backend (gptel-get-backend "OpenWebUI")
          gptel-model 'llama3.1)
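A side note, in case the instance above requires authentication: Open WebUI can issue per-user API keys, and `gptel-make-openai` accepts a `:key` argument for that. A minimal sketch, assuming your instance uses such a key (the key string below is a placeholder, not a real credential):

```emacs-lisp
;; Sketch: same backend definition as above, plus an API key.
;; "sk-..." is a placeholder -- substitute your own Open WebUI key,
;; or pass a zero-argument function that returns it.
(gptel-make-openai "OpenWebUI"
  :host "localhost:3000"
  :protocol "http"
  :endpoint "/api/chat/completions"
  :stream t
  :key "sk-..."                 ; or a function returning the key
  :models '(llama3.1))
```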
Answer selected by aburlot
-
Works like a charm. I missed the :stream t as well.