Commit e64bc82
committed
Fix build, rewrite in google dev style, rearrange structure, add some more details and such
1 parent cf07fc2

File tree

2 files changed: +109 −107 lines changed


docs/docs/guides/integrations/mcp.md

+107 −105 lines changed
# Model Context Protocol (MCP) and Weave

<a target="_blank" href="https://colab.research.google.com/drive/174VzXlU5Qcgvjt4OoIWN-guTxJcOefAh?usp=sharing">
  <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>

The Model Context Protocol (MCP) is a standardized communication protocol that enables AI applications to exchange information with large language models (LLMs). Similar to universal connectors that transformed hardware compatibility, MCP provides an interface for LLMs to access various data sources and interact with external tools, all without requiring custom integrations for each new service.

The Weave integration lets you trace activity between your MCP client and MCP server. It gives you detailed visibility into tool calls, resource access, and prompt generation across MCP-based systems.

## How it works

:::important
Currently, the integration captures client-side and server-side operations separately, but does not provide end-to-end visibility into their interaction. There is an ongoing proposal to add OpenTelemetry trace support to MCP to enable end-to-end observability. For more information, see [GitHub discussion #269](https://github.com/modelcontextprotocol/modelcontextprotocol/discussions/269).
:::

The Weave integration automatically traces key components of the Model Context Protocol (MCP) by patching core methods with the [`weave.op()`](../tracking/ops.md) decorator. Specifically, it patches methods in the [`mcp.server.fastmcp.FastMCP`](https://github.com/modelcontextprotocol/python-sdk/blob/b4c7db6a50a5c88bae1db5c1f7fba44d16eebc6e/src/mcp/server/fastmcp/server.py#L109) and [`mcp.ClientSession`](https://github.com/modelcontextprotocol/python-sdk/blob/b4c7db6a50a5c88bae1db5c1f7fba44d16eebc6e/src/mcp/client/session.py#L84) classes.

Through this integration, Weave traces the following MCP components:

- [Tools](https://modelcontextprotocol.io/docs/concepts/tools)
- [Resources](https://modelcontextprotocol.io/docs/concepts/resources)
- [Prompts](https://modelcontextprotocol.io/docs/concepts/prompts)

[![mcp_trace_timeline.png](imgs/mcp/mcp_trace_timeline.png)](https://wandb.ai/ayut/mcp_example/weave/traces?filter=%7B%22opVersionRefs%22%3A%5B%22weave%3A%2F%2F%2Fayut%2Fmcp_example%2Fop%2Frun_client%3A*%22%5D%7D&peekPath=%2Fayut%2Fmcp_example%2Fcalls%2F01966bbe-cc5e-7012-b45f-bf10617d8c1e%3FhideTraceTree%3D0)
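To make the patching approach concrete, here is a minimal, stdlib-only sketch of decorator-based method patching. The `op` decorator, `trace_log` list, and `FakeServer` class below are illustrative stand-ins, not the real `weave.op()` or `FastMCP` implementations:

```python
import functools

# Illustrative stand-in for weave.op(): record each call, then run the
# original implementation. (The real decorator reports traces to Weave.)
trace_log = []

def op(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        trace_log.append((fn.__name__, args[1:], kwargs))  # skip `self`
        return fn(*args, **kwargs)
    return wrapper

class FakeServer:
    """Hypothetical stand-in for FastMCP, for illustration only."""

    def call_tool(self, name: str, a: int, b: int) -> int:
        return a + b

# The integration patches methods in place, roughly like this:
FakeServer.call_tool = op(FakeServer.call_tool)

server = FakeServer()
result = server.call_tool("add", 2, 3)
print(result)     # 5
print(trace_log)  # [('call_tool', ('add', 2, 3), {})]
```

Because the wrapper preserves the original function's behavior, patched servers and clients run unchanged while every call is recorded.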

## Use the integration

The Weave integration works with both the MCP server and client. Once installed, you can enable tracing with just two additional lines of code: one to import `weave`, and another to initialize it.

### Prerequisites

Before you begin, install the required packages:

```bash
pip install -qq "mcp[cli]" weave
```

### Configuration

The MCP integration can be configured through environment variables:

- `MCP_TRACE_LIST_OPERATIONS`: Set to `true` to trace list operations (`list_tools`, `list_resources`, and so on).

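For example, to enable list-operation tracing for the processes you launch from a shell session:

```shell
# Enable tracing of list operations (list_tools, list_resources, and so on)
# for any process started from this shell session.
export MCP_TRACE_LIST_OPERATIONS=true

# Then run your MCP client/server as usual, for example:
# python example_client.py example_server.py
```
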
### Server-side integration

To trace an MCP server, add two lines to your existing `FastMCP` setup: one to import Weave and one to initialize the client. Once added, tool, resource, and prompt operations are automatically traced.

```python
# Import Weave (required for tracing)
import weave
from mcp.server.fastmcp import FastMCP

# Initialize Weave with your project name
weave_client = weave.init("my-project")

# Set up the MCP server
mcp = FastMCP("Demo")

# Define a tool (this call will be traced)
@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

# Define a resource (this call will be traced)
@mcp.resource("greeting://{name}")
def get_greeting(name: str) -> str:
    """Return a personalized greeting."""
    return f"Hello, {name}!"

# Define a prompt (this call will be traced)
@mcp.prompt()
def review_code(code: str) -> str:
    """Return a prompt for reviewing code."""
    return f"Please review this code:\n\n{code}"

# Start the server
mcp.run(transport="stdio")
```

### Client-side integration

On the client side, tracing also requires just two changes: import Weave and initialize it. All tool calls, resource accesses, and prompt requests are then traced automatically.

```python
# Import Weave (required for tracing)
import weave
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Initialize Weave with your project name
weave_client = weave.init("my-project")

# Set up and run the MCP client
async with stdio_client(server_params) as (read, write):
    async with ClientSession(read, write) as session:
        # Initialize the session
        await session.initialize()

        # Call a tool (this will be traced)
        ...

        # Fetch a prompt (this will be traced)
        prompt = await session.get_prompt("review_code", arguments={"code": "print('Hello')"})
```

## Tutorial: `mcp_demo` example

The [`mcp_demo`](https://github.com/wandb/weave/tree/master/examples/mcp_demo) example demonstrates the integration between the Model Context Protocol (MCP) and Weave for tracing. It shows how to instrument both the client and server components to capture detailed traces of their interactions.

### Run the example

1. Clone the `weave` repository and navigate to the `mcp_demo` example:

   ```bash
   git clone https://github.com/wandb/weave
   cd weave/examples/mcp_demo
   ```

   The example includes two main files:

   - `example_server.py`: A demo MCP server built with `FastMCP`. It defines tools, resources, and prompts.
   - `example_client.py`: A client that connects to the server and interacts with its components.

2. Install the required dependencies:

   ```bash
   pip install "mcp[cli]" weave
   ```

3. Run the demo:

   ```bash
   python example_client.py example_server.py
   ```

   This command launches both the client and server. The client starts an interactive CLI where you can test various features.

### Client CLI commands

The client interface supports the following commands:

| Command | Description |
|-------------------------|-----------------------------------------|
| `tools` | List available tools |
| `resources` | List available resources |
| `prompts` | List available prompts |
| `add <a> <b>` | Add two numbers |
| `bmi <weight> <height>` | Calculate body mass index (BMI) |
| `weather <city>` | Get weather data for a city |
| `greeting <name>` | Get a personalized greeting |
| `user <id>` | Retrieve a user profile |
| `config` | Fetch the app configuration |
| `code-review <code>` | Generate a code review prompt |
| `debug <error>` | Generate a debugging prompt |
| `demo` | Run all available features in sequence, producing a full trace timeline of interactions in the Weave UI |
| `q` | Quit the session |

### Understanding the example

The `example_server.py` server defines the following:

- _Tools_: Functions such as `add()`, `calculate_bmi()`, and `fetch_weather()`
- _Resources_: Endpoints such as `greeting://{name}`, `config://app`, and `users://{id}/profile`
- _Prompts_: Templates such as `review_code()` and `debug_error()`

All server-side operations are automatically traced by Weave when you initialize the client with `weave.init()`.

The `example_client.py` client demonstrates how to:

- Connect to an MCP server
- Discover available tools, resources, and prompts
- Call tools with parameters
- Read from resource URIs
- Generate prompts with arguments

Weave traces all client-side calls to provide a complete view of interactions between the client and server.
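To illustrate how a resource URI template such as `greeting://{name}` can map to a handler, here is a minimal stdlib-only sketch. This is not the FastMCP routing implementation, just the general idea:

```python
import re

# Convert a URI template like "greeting://{name}" into a regex with a
# named capture group per placeholder.
def template_to_regex(template: str) -> str:
    return "^" + re.sub(r"\{(\w+)\}", r"(?P<\1>[^/]+)", template) + "$"

def get_greeting(name: str) -> str:
    return f"Hello, {name}!"

# A tiny routing table: URI template -> handler.
routes = {"greeting://{name}": get_greeting}

def read_resource(uri: str) -> str:
    for template, handler in routes.items():
        match = re.match(template_to_regex(template), uri)
        if match:
            return handler(**match.groupdict())
    raise KeyError(f"No resource matches {uri!r}")

print(read_resource("greeting://Alice"))  # Hello, Alice!
```

Because the handler's parameters come from the matched URI, each resource read carries its arguments explicitly, which is what makes these calls useful to trace.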

## FAQ

### Why is MCP tracing needed?

As an LLM application developer, you fall into one of three categories:

- _MCP server-side developer_: You want to expose multiple tools, resources, and prompts to the MCP client. You may be exposing your existing application's tools and resources, or you may have built agents, possibly multiple agents orchestrated by an orchestrator agent.

- _MCP client-side developer_: You want to plug your client-side application into multiple MCP servers. A core part of your client-side logic is making LLM calls to decide which tool to call or which resource to fetch.

- _MCP server and client developer_: You are developing both the server and the client.

If you fall into either of the first two categories, you want to know when each tool is called, what the execution flow looks like, the token count, and the latency of the different components in your server- or client-side logic.

If you are developing both the server and the client, the ability to see a unified trace timeline can help you iterate quickly on both server- and client-side logic.

In any case, an observability layer allows you to:

- Quickly iterate through your application
- Audit the workflow or execution logic
- Identify bottlenecks

docs/sidebars.ts

+2 −2 lines changed

```ts
      collapsible: true,
      collapsed: false,
      label: "Protocols",
      link: { type: "doc", id: "guides/integrations/index" },
      items: [
        { type: "doc", id: "guides/integrations/mcp", label: "MCP" },
      ],
    },
  ],
```
