### Description
When the `.env` file is configured for DeepSeek (or any other OpenAI-compatible provider) using

```env
LLM_PROVIDER=deepseek
DEEPSEEK_API_KEY=<redacted>
DEEPSEEK_BASE_URL=https://api.deepseek.com/v1
```

the running containers still try to call `https://api.openai.com/v1`. Because the key is not an OpenAI key, every LLM request fails with:

```
Incorrect API key provided … You can find your API key at https://platform.openai.com/account/api-keys.
```
It looks like the code only checks `OPENAI_API_KEY` / `OPENAI_BASE_URL` and silently ignores `LLM_PROVIDER`, `DEEPSEEK_API_KEY`, and `DEEPSEEK_BASE_URL`.
### To Reproduce
1. Clone the repo (commit `12f9e9…`, tip of `main` as of 2025-04-30).
2. Create `.env` exactly as above (no `OPENAI_*` variables).
3. Run `docker compose up -d`.
4. Execute:
   ```sh
   curl -s http://localhost:3002/v1/extract \
     -H 'Content-Type: application/json' \
     -d '{"urls":["https://firecrawl.dev"],"prompt":"What is the page title?"}'
   ```
5. The API responds with `{"success":false,"error":"Internal server error"}`.
6. Check `docker logs firecrawl-api-1` → repeated 401 errors hitting `api.openai.com`.
### Expected behavior

`LLM_PROVIDER=deepseek` (or `togetherai`, `groq`, etc.) should make the worker:

- route calls to `DEEPSEEK_BASE_URL`;
- sign them with `DEEPSEEK_API_KEY`;
- never hit `api.openai.com` unless `LLM_PROVIDER=openai`.
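The expected mapping could be as simple as deriving the variable names from `LLM_PROVIDER` at startup. The sketch below is hypothetical (`resolveProviderConfig` and its shape are my own, not Firecrawl's actual code); it only illustrates the `<PROVIDER>_API_KEY` / `<PROVIDER>_BASE_URL` convention described above:

```typescript
// Hypothetical sketch of the env-var mapping the worker could apply.
// Variable-name convention (LLM_PROVIDER -> <PROVIDER>_API_KEY /
// <PROVIDER>_BASE_URL) mirrors this issue; it is an assumption, not
// Firecrawl's real implementation.
interface ProviderConfig {
  apiKey: string;
  baseUrl: string;
}

function resolveProviderConfig(
  env: Record<string, string | undefined>
): ProviderConfig {
  // Default to OpenAI for backward compatibility.
  const provider = (env.LLM_PROVIDER ?? "openai").toUpperCase();

  const apiKey = env[`${provider}_API_KEY`];
  const baseUrl =
    env[`${provider}_BASE_URL`] ??
    (provider === "OPENAI" ? "https://api.openai.com/v1" : undefined);

  if (!apiKey || !baseUrl) {
    throw new Error(
      `LLM_PROVIDER=${provider.toLowerCase()} requires ` +
        `${provider}_API_KEY and ${provider}_BASE_URL to be set`
    );
  }
  return { apiKey, baseUrl };
}
```

With the `.env` from this report, `resolveProviderConfig(process.env)` would return the DeepSeek base URL and key, and the OpenAI endpoint would only ever be used when `LLM_PROVIDER=openai` (or is unset).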
### Environment

| Component | Version |
| --- | --- |
| Firecrawl docker image | `firecrawl-api:latest` (`12f9e9`) |
| OS (host) | macOS 14 (Apple Silicon) |
| Docker | 26.1 |
| Node (inside container) | 20.x |
### Additional Context
Manually setting both `OPENAI_API_KEY` and `OPENAI_BASE_URL` to the DeepSeek values works, so it’s purely an env-var mapping issue.
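For anyone hitting the same 401s, that workaround amounts to the following `.env` (substitute your own key; these are the DeepSeek values from this report):

```env
# Workaround until LLM_PROVIDER is honored: point the OPENAI_* variables
# at the OpenAI-compatible provider directly.
OPENAI_API_KEY=<your DeepSeek key>
OPENAI_BASE_URL=https://api.deepseek.com/v1
```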
If `LLM_PROVIDER` is meant only for the (soon-to-be) Cloud version, the OSS README might need clarification; otherwise, supporting the mapping in the OSS code would be ideal.