Some documentation updates for consistency #7674

Merged (1 commit, Apr 2, 2025)

19 changes: 10 additions & 9 deletions docs/modules/usage/installation.mdx
@@ -73,15 +73,16 @@ docker run -it --rm --pull=always \

You'll find OpenHands running at http://localhost:3000!

You can also [connect OpenHands to your local filesystem](https://docs.all-hands.dev/modules/usage/runtimes#connecting-to-your-filesystem),
You can also [connect OpenHands to your local filesystem](https://docs.all-hands.dev/modules/usage/runtimes/docker#connecting-to-your-filesystem),
run OpenHands in a scriptable [headless mode](https://docs.all-hands.dev/modules/usage/how-to/headless-mode),
interact with it via a [friendly CLI](https://docs.all-hands.dev/modules/usage/how-to/cli-mode),
or run it on tagged issues with [a GitHub action](https://docs.all-hands.dev/modules/usage/how-to/github-action).

## Setup

Upon launching OpenHands, you'll see a Settings page. You **must** select an `LLM Provider` and `LLM Model` and enter a corresponding `API Key`.
These can be changed at any time by selecting the `Settings` button (gear icon) in the UI.
After launching OpenHands, you **must** select an `LLM Provider` and `LLM Model` and enter a corresponding `API Key`.
This can be done during the initial settings popup or by selecting the `Settings`
button (gear icon) in the UI.

If the required model does not exist in the list, you can toggle `Advanced` options and manually enter it with the correct prefix
in the `Custom Model` text box.
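
As an illustration of the "correct prefix" convention mentioned above, the identifiers below follow the provider/model pattern used throughout these docs. They are examples only, and the `LLM_MODEL` variable name is used purely for illustration:

```bash
# Illustrative, correctly prefixed model identifiers (check your provider's docs for exact names).
LLM_MODEL="anthropic/claude-3-7-sonnet-20250219"   # an Anthropic model from the verified list
LLM_MODEL="openai/gpt-4o"                          # OpenAI (the bare "gpt-4o" form also appears in these docs)
LLM_MODEL="azure/<deployment-name>"                # Azure OpenAI, prefixed with azure/ and your deployment name
```
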
@@ -93,17 +94,17 @@ OpenHands requires an API key to access most language models. Here's how to get

#### Anthropic (Claude)

1. [Create an Anthropic account](https://console.anthropic.com/)
2. [Generate an API key](https://console.anthropic.com/settings/keys)
3. [Set up billing](https://console.anthropic.com/settings/billing)
1. [Create an Anthropic account](https://console.anthropic.com/).
2. [Generate an API key](https://console.anthropic.com/settings/keys).
3. [Set up billing](https://console.anthropic.com/settings/billing).

Consider setting usage limits to control costs.

#### OpenAI

1. [Create an OpenAI account](https://platform.openai.com/)
2. [Generate an API key](https://platform.openai.com/api-keys)
3. [Set up billing](https://platform.openai.com/account/billing/overview)
1. [Create an OpenAI account](https://platform.openai.com/).
2. [Generate an API key](https://platform.openai.com/api-keys).
3. [Set up billing](https://platform.openai.com/account/billing/overview).

Now you're ready to [get started with OpenHands](./getting-started).
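
For scripted setups, the key can also be supplied through the environment instead of being pasted into the UI. A minimal sketch; the `LLM_API_KEY` variable name is an assumption drawn from the headless-mode documentation and should be verified there:

```bash
# Keep the key out of shell history and scripts by exporting it once per session.
export LLM_API_KEY="<your-anthropic-or-openai-key>"

# It can then be forwarded to the container by adding `-e LLM_API_KEY` to the
# docker run command shown earlier in this page.
```
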

2 changes: 1 addition & 1 deletion docs/modules/usage/llms/azure-llms.md
@@ -25,7 +25,7 @@ You will need your ChatGPT deployment name which can be found on the deployments
<deployment-name> below.
:::

1. Enable `Advanced` options
1. Enable `Advanced` options.
2. Set the following:
- `Custom Model` to azure/<deployment-name>
- `Base URL` to your Azure API Base URL (e.g. `https://example-endpoint.openai.azure.com`)
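
To make the Azure steps above concrete, here is a hedged sketch of how the pieces fit together; the `LLM_API_VERSION` variable name and the placeholder values are assumptions and should be checked against the rest of azure-llms.md:

```bash
# Sketch only: launch-time environment for an Azure OpenAI deployment (variable name is an assumption).
export LLM_API_VERSION="<api-version>"

# Then, in the OpenHands UI with `Advanced` options enabled, set:
#   Custom Model -> azure/<deployment-name>
#   Base URL     -> https://example-endpoint.openai.azure.com
#   API Key      -> your Azure API key
```
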
6 changes: 3 additions & 3 deletions docs/modules/usage/llms/llms.md
@@ -9,9 +9,9 @@ recommendations for model selection. Our latest benchmarking results can be foun

Based on these findings and community feedback, the following models have been verified to work reasonably well with OpenHands:

- anthropic/claude-3-5-sonnet-20241022 (recommended)
- anthropic/claude-3-5-haiku-20241022
- anthropic/claude-3-7-sonnet-20250219 (recommended)
- deepseek/deepseek-chat
- OpenHands LM
- gpt-4o

:::warning
@@ -56,10 +56,10 @@ We have a few guides for running OpenHands with specific model providers:
- [Azure](llms/azure-llms)
- [Google](llms/google-llms)
- [Groq](llms/groq)
- [Local LLMs with SGLang or vLLM](llms/../local-llms.md)
- [LiteLLM Proxy](llms/litellm-proxy)
- [OpenAI](llms/openai-llms)
- [OpenRouter](llms/openrouter)
- [Local LLMs with SGLang or vLLM](llms/../local-llms.md)

### API retries and rate limits

2 changes: 1 addition & 1 deletion docs/modules/usage/llms/local-llms.md
@@ -64,7 +64,7 @@ Ensure `config.toml` exists by running `make setup-config` which will create one

```
[core]
workspace_base="./workspace"
workspace_base="/path/to/your/workspace"

[llm]
embedding_model="local"
18 changes: 8 additions & 10 deletions docs/modules/usage/runtimes-index.md
@@ -3,22 +3,20 @@
A Runtime is an environment where the OpenHands agent can edit files and run
commands.

By default, OpenHands uses a Docker-based runtime, running on your local computer.
By default, OpenHands uses a [Docker-based runtime](./runtimes/docker), running on your local computer.
This means you only have to pay for the LLM you're using, and your code is only ever sent to the LLM.

We also support "remote" runtimes, which are typically managed by third-parties.
They can make setup a bit simpler and more scalable, especially
if you're running many OpenHands conversations in parallel (e.g. to do evaluation).
We also support other runtimes, which are typically managed by third-parties.

Additionally, we provide a "local" runtime that runs directly on your machine without Docker,
Additionally, we provide a [Local Runtime](./runtimes/local) that runs directly on your machine without Docker,
which can be useful in controlled environments like CI pipelines.

## Available Runtimes

OpenHands supports several different runtime environments:

- [Docker Runtime](./runtimes/docker.md) - The default runtime that uses Docker containers for isolation (recommended for most users)
- [OpenHands Remote Runtime](./runtimes/remote.md) - Cloud-based runtime for parallel execution (beta)
- [Modal Runtime](./runtimes/modal.md) - Runtime provided by our partners at Modal
- [Daytona Runtime](./runtimes/daytona.md) - Runtime provided by Daytona
- [Local Runtime](./runtimes/local.md) - Direct execution on your local machine without Docker
- [Docker Runtime](./runtimes/docker.md) - The default runtime that uses Docker containers for isolation (recommended for most users).
- [OpenHands Remote Runtime](./runtimes/remote.md) - Cloud-based runtime for parallel execution (beta).
- [Modal Runtime](./runtimes/modal.md) - Runtime provided by our partners at Modal.
- [Daytona Runtime](./runtimes/daytona.md) - Runtime provided by Daytona.
- [Local Runtime](./runtimes/local.md) - Direct execution on your local machine without Docker.
20 changes: 9 additions & 11 deletions docs/modules/usage/runtimes/docker.md
@@ -8,7 +8,7 @@ that contains our Runtime server, as well as some basic utilities for Python and
You can also [build your own runtime image](../how-to/custom-sandbox-guide).

## Connecting to Your filesystem
One useful feature here is the ability to connect to your local filesystem. To mount your filesystem into the runtime:
A useful feature is the ability to connect to your local filesystem. To mount your filesystem into the runtime:
1. Set `WORKSPACE_BASE`:

```bash
@@ -40,20 +40,20 @@ but seems to work well on most systems.

## Hardened Docker Installation

When deploying OpenHands in environments where security is a priority, you should consider implementing a hardened Docker configuration. This section provides recommendations for securing your OpenHands Docker deployment beyond the default configuration.
When deploying OpenHands in environments where security is a priority, you should consider implementing a hardened
Docker configuration. This section provides recommendations for securing your OpenHands Docker deployment beyond the default configuration.

### Security Considerations

The default Docker configuration in the README is designed for ease of use on a local development machine. If you're running on a public network (e.g. airport WiFi),
you should implement additional security measures.
The default Docker configuration in the README is designed for ease of use on a local development machine. If you're
running on a public network (e.g. airport WiFi), you should implement additional security measures.

### Network Binding Security

By default, OpenHands binds to all network interfaces (`0.0.0.0`), which can expose your instance to all networks the host is connected to. For a more secure setup:
By default, OpenHands binds to all network interfaces (`0.0.0.0`), which can expose your instance to all networks the
host is connected to. For a more secure setup:

1. **Restrict Network Binding**:

Use the `runtime_binding_address` configuration to restrict which network interfaces OpenHands listens on:
1. **Restrict Network Binding**: Use the `runtime_binding_address` configuration to restrict which network interfaces OpenHands listens on:

```bash
docker run # ...
@@ -63,9 +63,7 @@

This configuration ensures OpenHands only listens on the loopback interface (`127.0.0.1`), making it accessible only from the local machine.

2. **Secure Port Binding**:

Modify the `-p` flag to bind only to localhost instead of all interfaces:
2. **Secure Port Binding**: Modify the `-p` flag to bind only to localhost instead of all interfaces:

```bash
docker run # ... \
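
    # Putting the two measures above together, a hedged sketch of a hardened launch.
    # SANDBOX_RUNTIME_BINDING_ADDRESS is an assumed spelling of the runtime_binding_address
    # option; verify the exact name against your configuration reference before relying on it.
    docker run # ... \
        -e SANDBOX_RUNTIME_BINDING_ADDRESS=127.0.0.1 \
        -p 127.0.0.1:3000:3000 \
        # ... remaining options from the standard run command
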
16 changes: 9 additions & 7 deletions docs/modules/usage/runtimes/local.md
@@ -1,23 +1,26 @@
# Local Runtime

The Local Runtime allows the OpenHands agent to execute actions directly on your local machine without using Docker. This runtime is primarily intended for controlled environments like CI pipelines or testing scenarios where Docker is not available.
The Local Runtime allows the OpenHands agent to execute actions directly on your local machine without using Docker.
This runtime is primarily intended for controlled environments like CI pipelines or testing scenarios where Docker is not available.

:::caution
**Security Warning**: The Local Runtime runs without any sandbox isolation. The agent can directly access and modify files on your machine. Only use this runtime in controlled environments or when you fully understand the security implications.
**Security Warning**: The Local Runtime runs without any sandbox isolation. The agent can directly access and modify
files on your machine. Only use this runtime in controlled environments or when you fully understand the security implications.
:::

## Prerequisites

Before using the Local Runtime, ensure that:

1. You have followed the [Development setup instructions](https://github.com/All-Hands-AI/OpenHands/blob/main/Development.md).
1. You can run OpenHands using the [Development workflow](https://github.com/All-Hands-AI/OpenHands/blob/main/Development.md).
2. tmux is available on your system.

## Configuration

To use the Local Runtime, besides required configurations like the model, API key, you'll need to set the following options via environment variables or the [config.toml file](https://github.com/All-Hands-AI/OpenHands/blob/main/config.template.toml) when starting OpenHands:
To use the Local Runtime, besides required configurations like the LLM provider, model and API key, you'll need to set
the following options via environment variables or the [config.toml file](https://github.com/All-Hands-AI/OpenHands/blob/main/config.template.toml) when starting OpenHands:

- Via environment variables:
Via environment variables:

```bash
# Required
@@ -27,7 +30,7 @@ export RUNTIME=local
export WORKSPACE_BASE=/path/to/your/workspace
```

- Via `config.toml`:
Via `config.toml`:

```toml
[core]
@@ -59,4 +62,3 @@ The Local Runtime is particularly useful for:
- CI/CD pipelines where Docker is not available.
- Testing and development of OpenHands itself.
- Environments where container usage is restricted.
- Scenarios where direct file system access is required.
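
As a consolidated view of the Configuration section above, here is a minimal sketch of the environment for a Local Runtime session; the two LLM variable names are assumptions (see config.template.toml for the authoritative list):

```bash
# Required for the Local Runtime (as shown in the Configuration section).
export RUNTIME=local
export WORKSPACE_BASE=/path/to/your/workspace

# LLM settings normally provided via the UI or config.toml; the variable names are assumptions.
export LLM_MODEL="anthropic/claude-3-7-sonnet-20250219"
export LLM_API_KEY="<your-api-key>"
```
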
2 changes: 1 addition & 1 deletion docs/modules/usage/runtimes/modal.md
@@ -9,5 +9,5 @@ You'll then need to set the following environment variables when starting OpenHa
docker run # ...
-e RUNTIME=modal \
-e MODAL_API_TOKEN_ID="your-id" \
-e MODAL_API_TOKEN_SECRET="your-secret" \
-e MODAL_API_TOKEN_SECRET="modal-api-key" \
```
9 changes: 6 additions & 3 deletions docs/modules/usage/runtimes/remote.md
@@ -1,6 +1,9 @@
# OpenHands Remote Runtime

OpenHands Remote Runtime is currently in beta (read [here](https://runtime.all-hands.dev/) for more details), it allows you to launch runtimes in parallel in the cloud.
Fill out [this form](https://docs.google.com/forms/d/e/1FAIpQLSckVz_JFwg2_mOxNZjCtr7aoBFI2Mwdan3f75J_TrdMS1JV2g/viewform) to apply if you want to try this out!
:::note
This runtime is specifically designed for agent evaluation purposes only through the
[OpenHands evaluation harness](https://github.com/All-Hands-AI/OpenHands/tree/main/evaluation). It should not be used to launch production OpenHands applications.
:::

NOTE: This runtime is specifically designed for agent evaluation purposes only through [OpenHands evaluation harness](https://github.com/All-Hands-AI/OpenHands/tree/main/evaluation). It should not be used to launch production OpenHands applications.
OpenHands Remote Runtime is currently in beta (read [here](https://runtime.all-hands.dev/) for more details), it allows you to launch runtimes
in parallel in the cloud. Fill out [this form](https://docs.google.com/forms/d/e/1FAIpQLSckVz_JFwg2_mOxNZjCtr7aoBFI2Mwdan3f75J_TrdMS1JV2g/viewform) to apply if you want to try this out!
10 changes: 5 additions & 5 deletions docs/sidebars.ts
@@ -141,6 +141,11 @@ const sidebars: SidebarsConfig = {
label: 'Groq',
id: 'usage/llms/groq',
},
{
type: 'doc',
label: 'Local LLMs with SGLang or vLLM',
id: 'usage/llms/local-llms',
},
{
type: 'doc',
label: 'LiteLLM Proxy',
@@ -156,11 +161,6 @@
label: 'OpenRouter',
id: 'usage/llms/openrouter',
},
{
type: 'doc',
label: 'Local LLMs with SGLang or vLLM',
id: 'usage/llms/local-llms',
},
],
},
],