feat: Add litellm and configurable service loading #74

Open
wants to merge 1 commit into main
Conversation

thesteganos

This commit introduces LiteLLM as a new optional service and provides a mechanism to selectively load Docker services.

Key changes:

- Added `litellm` service to `docker-compose.yml` (see the compose sketch after this list).
    - Uses `ghcr.io/berriai/litellm:main-stable` image.
    - Exposes port 4000 internally.
    - Mounts a configuration file from `./litellm/config.yaml`.
    - Depends on `ollama` and `redis`.
- Created `litellm/config.yaml` with a default configuration to connect to a local Ollama model (see the config sketch after this list).
- Updated `Caddyfile` to include a reverse proxy for `litellm` (defaulting to port 8009); see the Caddy sketch after this list.
- Modified `start_services.py` (see the Python sketch after this list):
    - Added a `--services` command-line argument to specify a comma-separated list of services to start (e.g., `litellm,ollama,open-webui`).
    - If `--services` is not provided, all services are started by default.
    - Service dependencies are automatically resolved and started. For example, selecting `open-webui` will also start `ollama`. Selecting `litellm` will start `ollama` and `redis`.
    - The `supabase` services are always started.
    - The `caddy` service is automatically started if any other application services are selected.
- Updated `README.md` to:
    - Document the new `--services` flag and its usage.
    - List available services.
    - Explain how to configure `litellm` using `litellm/config.yaml`.
    - Provide details on accessing `litellm` via Caddy.
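To make the compose change concrete, the `litellm` service entry could look roughly like the sketch below. The image, the internal port 4000, the mounted `./litellm/config.yaml`, and the `ollama`/`redis` dependencies come from the description above; the container-side config path and the `command` flag are assumptions about how the LiteLLM image is commonly run, not the exact diff.

```yaml
# Sketch only, not the exact change in this PR.
litellm:
  image: ghcr.io/berriai/litellm:main-stable
  expose:
    - "4000"                                   # reachable only inside the compose network
  volumes:
    - ./litellm/config.yaml:/app/config.yaml   # container-side path is an assumption
  command: ["--config", "/app/config.yaml"]    # assumed way of pointing LiteLLM at the config
  depends_on:
    - ollama
    - redis
```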
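A minimal `litellm/config.yaml` that routes requests to a local Ollama model might look like the following; the `model_name` alias and the concrete Ollama model are placeholders, and `http://ollama:11434` assumes the compose service name plus Ollama's default port.

```yaml
# Minimal sketch of litellm/config.yaml (model names are placeholders).
model_list:
  - model_name: local-llama              # alias exposed by the LiteLLM proxy
    litellm_params:
      model: ollama/llama3               # any model already pulled into the local Ollama
      api_base: http://ollama:11434      # Ollama service inside the compose network
```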
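On the Caddy side, exposing LiteLLM on the default port 8009 can be as small as the block below; the real `Caddyfile` may use hostname variables instead of a bare port.

```
# Sketch: expose LiteLLM through Caddy on port 8009.
:8009 {
    reverse_proxy litellm:4000
}
```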
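Finally, a minimal sketch of how the `--services` flag and dependency resolution could be wired up in `start_services.py`. Only the flag name, the comma-separated format, the example dependencies, the always-on Supabase stack, and the automatic `caddy` selection come from the description above; the function names, the service list, and the exact `docker compose` invocations (including the Supabase compose path) are illustrative assumptions.

```python
# Illustrative sketch of the --services handling; names and compose paths
# are assumptions, not the actual start_services.py implementation.
import argparse
import subprocess

# Dependencies implied by the PR description (illustrative subset).
SERVICE_DEPENDENCIES = {
    "open-webui": ["ollama"],
    "litellm": ["ollama", "redis"],
}

# Assumed full service list, used when --services is omitted.
ALL_SERVICES = ["ollama", "open-webui", "litellm", "redis", "caddy"]


def resolve_services(requested):
    """Expand the requested services with their transitive dependencies."""
    selected = set(requested)
    stack = list(requested)
    while stack:
        for dep in SERVICE_DEPENDENCIES.get(stack.pop(), []):
            if dep not in selected:
                selected.add(dep)
                stack.append(dep)
    if selected:
        selected.add("caddy")  # caddy starts whenever any app service is selected
    return sorted(selected)


def main():
    parser = argparse.ArgumentParser(description="Start the local AI stack")
    parser.add_argument(
        "--services",
        help="Comma-separated list of services to start (default: all services)",
    )
    args = parser.parse_args()

    if args.services:
        services = resolve_services([s.strip() for s in args.services.split(",")])
    else:
        services = list(ALL_SERVICES)

    # Supabase is always started (compose file path is an assumption here).
    subprocess.run(
        ["docker", "compose", "-f", "supabase/docker/docker-compose.yml", "up", "-d"],
        check=True,
    )
    # Start only the selected services from the main docker-compose.yml.
    subprocess.run(["docker", "compose", "up", "-d", *services], check=True)


if __name__ == "__main__":
    main()
```

With something like this in place, a selective start would look like `python start_services.py --services litellm,open-webui`, while omitting the flag starts everything.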
leex279 (Collaborator) commented May 31, 2025

Thx, I will take a look after the other PR (#72) is merged.

We are also working on a UI Configurator for the services, so we need to look at whether we can combine your efforts or how it makes sense to include them.


thesteganos (Author) commented

Looks cool. How can I help?

leex279 (Collaborator) commented Jun 7, 2025

@thesteganos sorry for the late response; I was sick and then busy last week. I started working on this today and took your PR as the base/idea. I like the custom_services approach, but not directly in the command; instead, the webapp now saves it in a JSON file.

WIP branch in my repo: https://github.com/leex279/local-ai-packaged-dev/tree/feature/webapp-custom-services-integration


Feel free to bring in ideas or help out. At the moment I am trying to get the environment variables configurator to work properly as well, and then I need to clean up some stuff.

coleam00 added the enhancement label on Jun 10, 2025