The Model Context Protocol (MCP) is a standardized communication protocol that enables AI applications to exchange information with large language models (LLMs). Similar to universal connectors that transformed hardware compatibility, MCP provides an interface for LLMs to access various data sources and interact with external tools, all without requiring custom integrations for each new service.

The Weave integration lets you trace activity between your MCP client and MCP server. It gives you detailed visibility into tool calls, resource access, and prompt generation across MCP-based systems.

## How it works
:::important
Currently, the integration captures client-side and server-side operations separately, but does not provide end-to-end visibility into their interaction. There's an ongoing proposal to add OpenTelemetry trace support to MCP to enable end-to-end observability. For more information, see [GitHub discussion #269](https://github.com/modelcontextprotocol/modelcontextprotocol/discussions/269).
:::

The Weave integration automatically traces key components of the Model Context Protocol (MCP) by patching core methods with the [`weave.op()`](../tracking/ops.md) decorator. Specifically, it patches methods in the [`mcp.server.fastmcp.FastMCP`](https://github.com/modelcontextprotocol/python-sdk/blob/b4c7db6a50a5c88bae1db5c1f7fba44d16eebc6e/src/mcp/server/fastmcp/server.py#L109) and [`mcp.ClientSession`](https://github.com/modelcontextprotocol/python-sdk/blob/b4c7db6a50a5c88bae1db5c1f7fba44d16eebc6e/src/mcp/client/session.py#L84) classes.
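
To make this concrete, here is a rough, hypothetical sketch of the kind of wrapping the integration performs for you. The exact set of patched methods is an implementation detail and may differ.

```python
import weave
from mcp.server.fastmcp import FastMCP

weave.init("my-project")

# Conceptual sketch only: wrapping a FastMCP method in weave.op() records each
# invocation as a trace. The integration applies this kind of wrapping
# automatically when Weave is initialized; you do not write it yourself.
FastMCP.call_tool = weave.op()(FastMCP.call_tool)
```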

Through this integration, Weave traces the following MCP components:

- [**Tools**](https://modelcontextprotocol.io/docs/concepts/tools)
- [**Resources**](https://modelcontextprotocol.io/docs/concepts/resources)
- [**Prompts**](https://modelcontextprotocol.io/docs/concepts/prompts)

The Weave integration works with both the MCP server and client. Once installed, you can enable tracing with just two additional lines of code: one to import `weave`, and another to initialize it.

### Prerequisites

Before you begin, install the required packages:

```bash
pip install -qq mcp[cli] weave
```

### Configuration

The MCP integration can be configured through environment variables:

- `MCP_TRACE_LIST_OPERATIONS`: Set to `true` to trace list operations (`list_tools`, `list_resources`, etc.)
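
For example, here is a minimal sketch of setting this flag from Python before initializing Weave (exporting `MCP_TRACE_LIST_OPERATIONS=true` in your shell before launching the process works just as well). The assumption here is that the variable must be present before `weave.init()` runs:

```python
import os

# Assumption: the flag is read when the integration is set up, so it must be
# in the environment before weave.init() is called.
os.environ["MCP_TRACE_LIST_OPERATIONS"] = "true"

import weave

weave_client = weave.init("my-project")
```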

### Server-side integration

To trace an MCP server, add two lines to your existing `FastMCP` setup: one to import Weave and one to initialize the client. Once added, tool, resource, and prompt operations are automatically traced.

```python
# Import Weave (required for tracing)
import weave
from mcp.server.fastmcp import FastMCP

# Initialize Weave with your project name
weave_client = weave.init("my-project")

# Set up the MCP server
mcp = FastMCP("Demo")

# Define a tool (this call will be traced)
@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

# Define a resource (this call will be traced)
@mcp.resource("greeting://{name}")
def get_greeting(name: str) -> str:
    """Return a personalized greeting."""
    return f"Hello, {name}!"

# Define a prompt (this call will be traced)
@mcp.prompt()
def review_code(code: str) -> str:
    """Return a prompt for reviewing code."""
    return f"Please review this code:\n\n{code}"

# Start the server
mcp.run(transport="stdio")
```

### Client-side integration

On the client side, tracing also requires just two changes: import Weave and initialize it. All tool calls, resource accesses, and prompt requests will be traced automatically.

```python
# Import Weave (required for tracing)
import weave
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Initialize Weave with your project name
weave_client = weave.init("my-project")

# Configure how to launch the MCP server (illustrative; adjust for your setup)
server_params = StdioServerParameters(command="python", args=["example_server.py"])

# Set up and run the MCP client
async with stdio_client(server_params) as (read, write):
    async with ClientSession(read, write) as session:
        # Initialize the session
        await session.initialize()

        # Call a tool (this will be traced)
        result = await session.call_tool("add", arguments={"a": 1, "b": 2})
```

## Tutorial: `mcp_demo` example

The [`mcp_demo`](https://github.com/wandb/weave/tree/master/examples/mcp_demo) example demonstrates an integration between the Model Context Protocol (MCP) and Weave for tracing. It showcases how to instrument both the client and server components to capture detailed traces of their interactions.

### Run the example

1. Clone the `weave` repository and navigate to the `mcp_demo` example:

   ```bash
   git clone https://github.com/wandb/weave
   cd weave/examples/mcp_demo
   ```

   The example includes two main files:

   - `example_server.py`: A demo MCP server built with `FastMCP`. It defines tools, resources, and prompts.
   - `example_client.py`: A client that connects to the server and interacts with its components.

2. Install the required dependencies manually:

   ```bash
   pip install mcp[cli] weave
   ```

3. Run the demo:

   ```bash
   python example_client.py example_server.py
   ```

   This command launches both the client and server. The client starts an interactive CLI where you can test various features.

### Client CLI commands

The client interface supports the following commands:

| Command | Description |
| ------- | ----------- |
|`tools`| List all available tools |
|`resources`| List all available resources |
|`prompts`| List all available prompts |
|`add <a> <b>`| Add two numbers |
|`bmi <weight> <height>`| Calculate Body Mass Index |
|`weather <city>`| Get weather data for a city |
|`greeting <name>`| Get a personalized greeting |
|`user <id>`| Retrieve a user profile |
|`config`| Fetch app configuration |
|`code-review <code>`| Generate a code review prompt |
|`debug <error>`| Generate a debugging prompt |
|`demo`| Run a full demo of all available features. This will run each feature in sequence and produce a full trace timeline of interactions in the Weave UI. |
|`q`| Quit the session |

### Understanding the example

The `example_server.py` server defines the following:

- _Tools_: Functions such as `add()`, `calculate_bmi()`, `fetch_weather()`
- _Resources_: Endpoints like `greeting://{name}`, `config://app`, `users://{id}/profile`
- _Prompts_: Templates like `review_code()` and `debug_error()`

All server-side operations are automatically traced by Weave when you initialize the client with `weave.init()`.
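
For instance, a tool such as `calculate_bmi()` is just a decorated function on the server. The following is a hypothetical sketch of what it might look like, assuming the `mcp = FastMCP("Demo")` server object shown earlier; the actual implementation in the repository may differ:

```python
@mcp.tool()
def calculate_bmi(weight_kg: float, height_m: float) -> float:
    """Compute Body Mass Index from weight in kilograms and height in meters."""
    return weight_kg / (height_m**2)
```

Because the integration patches `FastMCP` tool handling, each call to this tool shows up in Weave with its inputs (`weight_kg`, `height_m`) and its return value.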

The `example_client.py` client demonstrates how to:

- Discover available tools, resources, and prompts
- Call tools with parameters
- Read from resource URIs
- Generate prompts with arguments

Weave traces all client-side calls to provide a complete view of interactions between the client and server.
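
To illustrate, the client-side steps above map roughly onto `ClientSession` calls like the following. This is a hedged sketch rather than the repository's exact client code; the tool, resource, and prompt names are taken from the demo server described above.

```python
import asyncio

import weave
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from pydantic import AnyUrl


async def run_demo() -> None:
    # Launch the demo server over stdio (the command and path are illustrative).
    server_params = StdioServerParameters(command="python", args=["example_server.py"])
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discovery calls (traced only when MCP_TRACE_LIST_OPERATIONS=true).
            tools = await session.list_tools()
            resources = await session.list_resources()
            prompts = await session.list_prompts()

            # A tool call, a resource read, and a prompt request (all traced).
            result = await session.call_tool("add", arguments={"a": 2, "b": 3})
            greeting = await session.read_resource(AnyUrl("greeting://Alice"))
            review = await session.get_prompt("review_code", arguments={"code": "print('hi')"})


weave_client = weave.init("my-project")
asyncio.run(run_demo())
```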

## FAQ

### Why is MCP tracing needed?

As an LLM application developer, you fall into one of three categories:

- _MCP server-side developer_: You want to expose multiple tools, resources, and prompts to the MCP client. You expose your existing application's tools, resources, etc., or you have built agents or have multiple agents orchestrated by an orchestrator agent.

- _MCP client-side developer_: You want to plug your client-side application into multiple MCP servers. A core part of your client-side logic is making LLM calls to decide which tool to call or which resource to fetch.

- _MCP server and client developer_: You are developing both the server and the client.

If you fall into either of the first two categories, you want to know when each tool is called, what the execution flow looks like, the token count, and latency of different components in your server or client-side logic.

If you are developing both the server and client, the ability to see a unified trace timeline can help you quickly iterate through both server and client-side logic.

In any case, an observability layer allows you to:

- Quickly iterate through your application
- Audit the workflow or execution logic
- Identify bottlenecks