Define a backend for open webui #849

Closed · Answered by karthink
aburlot asked this question in Q&A

I don't use Open WebUI, so I can't test this, but from the docs it looks like something like this should work:

(gptel-make-openai "OpenWebUI"
  :host "localhost:3000"
  :protocol "http"
  :endpoint "/api/chat/completions"
  :stream t
  :models '(llama3.1))
  
(setq gptel-backend (gptel-get-backend "OpenWebUI")
      gptel-model 'llama3.1)
  • Adjust the list of models as required.
  • I don't know if Open WebUI supports streaming, so you may need to remove :stream t. If your instance requires an API key, see the sketch below.
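
If your Open WebUI instance requires an API key (I believe it can generate one from its user settings, but check its docs), you should be able to pass it with :key. A minimal sketch, with a placeholder key string:

(gptel-make-openai "OpenWebUI"
  :host "localhost:3000"
  :protocol "http"
  :endpoint "/api/chat/completions"
  :stream t
  :key "owui-placeholder"   ;placeholder, replace with your Open WebUI API key
  :models '(llama3.1))

:key also accepts a zero-argument function that returns the key, if you'd rather not hard-code it in your config.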

Answer selected by aburlot