Use `superserve.serve_mcp()` to deploy MCP (Model Context Protocol) servers as scalable HTTP endpoints via Ray Serve.
Quick Start
Create a new MCP server at `mcp_servers/weather/server.py`:
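The sketch below is a minimal example, assuming the MCP Python SDK's `FastMCP` class and a direct `from superserve import serve_mcp` import; the `get_forecast` tool is purely illustrative, while `stateless_http=True` is required by `serve_mcp` (see the options table below):

```python
# mcp_servers/weather/server.py
# Sketch only: the FastMCP import path and the get_forecast tool are
# illustrative assumptions; stateless_http=True is required by serve_mcp.
from mcp.server.fastmcp import FastMCP

from superserve import serve_mcp

mcp = FastMCP("weather", stateless_http=True)


@mcp.tool()
def get_forecast(city: str) -> str:
    """Return a short, canned forecast for the given city."""
    return f"It is sunny in {city}."


# Register the MCP server with superserve / Ray Serve.
serve_mcp(mcp)
```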
Run `superserve up` to deploy it; the server is then available at http://localhost:8000/weather/mcp.
Adding Tools
Define tools using FastMCP's `@mcp.tool()` decorator:
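The sketch below shows one way this can look, reusing the same assumed `FastMCP` import as above; the `convert_temperature` tool and its signature are illustrative:

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather", stateless_http=True)


@mcp.tool()
def convert_temperature(celsius: float) -> float:
    """Convert a temperature from Celsius to Fahrenheit."""
    return celsius * 9 / 5 + 32
```

FastMCP derives the tool's input schema from the type hints and uses the docstring as its description, so keep both accurate.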
Using MCP Servers with Agents
Here’s an example using Pydantic AI. When you run `superserve up`, both the MCP server and the agent start on the same port, and the agent can then use tools from the MCP server.
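A minimal sketch, assuming a recent Pydantic AI release with streamable-HTTP MCP support; the model name, system prompt, and question are illustrative, and only the endpoint URL comes from the weather server above:

```python
import asyncio

from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStreamableHTTP

# Point the agent at the weather MCP server deployed above.
weather_server = MCPServerStreamableHTTP("http://localhost:8000/weather/mcp")

agent = Agent(
    "openai:gpt-4o",  # illustrative model choice
    toolsets=[weather_server],
    system_prompt="Answer weather questions using the available tools.",
)


async def main() -> None:
    # The context manager opens the MCP connection for the duration of the run.
    async with agent:
        result = await agent.run("What is the forecast for Paris?")
    print(result.output)


if __name__ == "__main__":
    asyncio.run(main())
```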
`serve_mcp` Options
| Option | Type | Default | Description |
|---|---|---|---|
| `mcp_server` | `FastMCP` | required | FastMCP instance (must use `stateless_http=True`) |
| `name` | `str` | `None` | Server name (inferred from directory if not set) |
| `num_cpus` | `float` | `1` | CPU cores per replica |
| `num_gpus` | `float` | `0` | GPUs per replica |
| `memory` | `str` | `"512MB"` | Memory per replica |
| `replicas` | `int` | `1` | Number of replicas |
| `route_prefix` | `str` | `/{name}` | URL prefix (endpoint: `/{name}/mcp`) |
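As a hedged illustration of these options, the call below uses arbitrary values and assumes `serve_mcp` can be imported directly from `superserve`:

```python
from mcp.server.fastmcp import FastMCP

from superserve import serve_mcp

mcp = FastMCP("weather", stateless_http=True)

serve_mcp(
    mcp,
    name="weather",           # otherwise inferred from the directory name
    num_cpus=0.5,             # fraction of a CPU core per replica
    memory="1GB",             # memory per replica
    replicas=2,               # run two replicas behind the endpoint
    route_prefix="/weather",  # tools served at /weather/mcp
)
```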