# feat(mcp): introduce MCP server #1094
# Wren MCP Server

The **Wren MCP Server** is a **Model Context Protocol (MCP) server** that provides tools for interacting with **Wren Engine** to facilitate AI agent integration.

## Requirements

Before setting up the Wren MCP Server, ensure you have the following dependency installed:

- **[uv](https://docs.astral.sh/uv/getting-started/installation/#installing-uv)** - A fast and efficient Python package manager.

## Environment Variables

The server requires the following environment variables to be set:

| Variable | Description |
|----------|-------------|
| `WREN_URL` | The URL of the **Wren Ibis server**. |
| `CONNECTION_INFO_FILE` | The path to the **required connection info file**. |
| `MDL_PATH` | The path to the **MDL file**. |
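For example, the variables can be exported in the shell before launching the server (all values below are placeholders, not defaults shipped with Wren):

```shell
# Placeholder values -- adjust the URL and paths to your environment
export WREN_URL="http://localhost:8000"
export CONNECTION_INFO_FILE="etc/connection.json"
export MDL_PATH="etc/mdl.json"
```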
### Connection Info

The following JSON is an example connection info for a Postgres data source. You can find the required fields for each data source in the [source code](https://github.com/Canner/wren-engine/blob/4ac283ee0754b12a8c3b0a6f13b32c935fcb7b0d/ibis-server/app/model/__init__.py#L75).

```json
{
    "host": "localhost",
    "port": "5432",
    "user": "test",
    "password": "test",
    "database": "test"
}
```
### The `dataSource` field is required

In the MDL, the `dataSource` field is required to indicate which data source should be connected.
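For illustration, a minimal MDL header with the field set might look like the snippet below; the `"postgres"` value and the exact surrounding keys are assumptions for this sketch, not taken from the Wren source.

```json
{
    "catalog": "wren",
    "schema": "public",
    "dataSource": "postgres",
    "models": []
}
```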
### `.env` File Support

The Wren MCP Server supports a `.env` file for easier environment configuration. You can define all the required environment variables in this file.
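A hypothetical `.env` might look like the following (the URL and paths are placeholders):

```
WREN_URL=http://localhost:8000
CONNECTION_INFO_FILE=etc/connection.json
MDL_PATH=etc/mdl.json
```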
---
## Installation & Usage

### 1. Set Up the Python Environment

Use the `uv` command to create a virtual environment and activate it:
```
> uv venv
Using CPython 3.11.11
Creating virtual environment at: .venv
Activate with: source .venv/bin/activate
> source .venv/bin/activate
> uv run app/wren.py
Loaded MDL etc/mdl.json
Loaded connection info etc/pg_conneciton.json
```
You should see that the MDL and connection info are loaded. Then, you can press `Ctrl + C` to terminate the process.
### 2. Start Wren Engine and Ibis Server

- If you **already have a running Wren Engine**, ensure that `WREN_URL` is correctly set to point to your server.
- If you **don't have a running engine**, you can start one using Docker:

```sh
cd docker
docker compose up
```
### 3. Set Environment Variables

Make sure all required environment variables are properly configured, either in your system or within a `.env` file.
### 4. Configure the MCP Server

Create a configuration file with the following structure:

```json
{
    "mcpServers": {
        "wren": {
            "command": "uv",
            "args": [
                "--directory",
                "/ABSOLUTE/PATH/TO/PARENT/FOLDER/wren-engine/mcp-server",
                "run",
                "app/wren.py"
            ],
            "autoApprove": [],
            "disabled": false
        }
    }
}
```
#### Notes
- You **may need to provide the full path** to the `uv` executable in the `"command"` field. You can find it using:
  - **macOS/Linux**: `which uv`
  - **Windows**: `where uv`
- Ensure that the **absolute path** to the MCP server directory is used in the configuration.
- For more details, refer to the [MCP Server Guide](https://modelcontextprotocol.io/quickstart/server#test-with-commands).
### 5. Choose an AI Agent That Supports MCP

The following AI agents are compatible with the Wren MCP Server and can load the MCP configuration:

- **[Claude Desktop](https://modelcontextprotocol.io/quickstart/user)**
- **[Cline](https://docs.cline.bot/mcp-servers/mcp-quickstart)**
### 6. Check That Wren Engine Is Connected

You can ask the AI agent to perform a health check for Wren Engine.

### 7. Start the Conversation

Now you can start asking questions through your AI agent and interact with Wren Engine.
||
--- | ||
|
||
## Additional Resources | ||
|
||
- **Wren Engine Documentation**: [Wren AI](https://getwren.ai/) | ||
- **MCP Protocol Guide**: [Model Context Protocol](https://modelcontextprotocol.io/) |
---
from pydantic import BaseModel
from pydantic.fields import Field


class Column(BaseModel):
    name: str
    type: str
    expression: str | None = None
    isCalculated: bool = False
    relationship: str | None = None
    description: str | None = None


class TableReference(BaseModel):
    catalog: str | None = None
    # "schema" clashes with a BaseModel attribute, so expose it as mdl_schema
    mdl_schema: str | None = Field(alias="schema", default=None)
    table: str


class Model(BaseModel):
    name: str
    tableReference: TableReference
    columns: list[Column]
    primaryKey: str | None = None
    description: str | None = None


class Relationship(BaseModel):
    name: str
    models: list[str]
    join_type: str
    join_condition: str


class View(BaseModel):
    name: str
    statement: str
    description: str | None = None


class Manifest(BaseModel):
    catalog: str = "wren"
    mdl_schema: str = Field(alias="schema", default="public")
    models: list[Model]
    relationships: list[Relationship]
    views: list[View]
    description: str | None = None


class TableColumns(BaseModel):
    table_name: str
    column_names: list[str] | None = None
---
import base64
from typing import Any

import orjson


def dict_to_base64_string(data: dict[str, Any]) -> str:
    """Serialize a dict to JSON and base64-encode the result."""
    return base64.b64encode(orjson.dumps(data)).decode("utf-8")


def json_to_base64_string(data: str) -> str:
    """Base64-encode an already-serialized JSON string."""
    return base64.b64encode(data.encode("utf-8")).decode("utf-8")
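The helpers above depend on `orjson`; their round-trip behavior can be sketched with only the standard library (`json` standing in for `orjson`, and `dict_to_b64` as an illustrative stand-in name):

```python
import base64
import json  # stdlib stand-in for orjson used above


def dict_to_b64(data: dict) -> str:
    # Same shape as dict_to_base64_string: dict -> JSON bytes -> base64 str
    return base64.b64encode(json.dumps(data).encode("utf-8")).decode("utf-8")


connection_info = {"host": "localhost", "port": "5432", "user": "test"}
encoded = dict_to_b64(connection_info)

# Decoding reverses both steps and recovers the original dict
assert json.loads(base64.b64decode(encoded)) == connection_info
```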