
Using Model Context Protocol (MCP) with Rig

The Model Context Protocol (MCP) by Anthropic provides a standardized way for AI agents to dynamically discover, register, and invoke external tools through a common interface. This enables tools to be treated as first-class citizens in LLM-based workflows and allows for seamless integration between agents and external capabilities.

What is MCP?

MCP is a protocol that enables language models and agents to:

  • Discover available tools from a remote server.
  • Initialize and declare their capabilities.
  • Invoke registered tools and receive responses.

It defines both client and server roles. The server hosts tools and responds to invocations, while the client discovers and uses those tools.

Getting started

Creating an MCP client requires the rmcp crate. Run the following command in your terminal to install it with the client features enabled:

cargo add rmcp -F client,macros,transport-streamable-http-client-reqwest,\
    transport-streamable-http-server

You will also need the rmcp feature enabled on the rig-core crate. If you haven’t done so already, run the following command to add it to your project:

cargo add rig-core -F rmcp

Initializing an MCP client

To connect to the server and fetch tool metadata, create a client using the rmcp crate. The complete, unabridged example code can be found in the GitHub repo.

use rmcp::{
    model::{ClientCapabilities, ClientInfo, Implementation},
    ServiceExt,
};

// Connect to the MCP server over streamable HTTP.
let transport =
    rmcp::transport::StreamableHttpClientTransport::from_uri("http://localhost:8080");

// Declare who we are and what we support, then perform the MCP handshake.
let client_info = ClientInfo {
    protocol_version: Default::default(),
    capabilities: ClientCapabilities::default(),
    client_info: Implementation {
        name: "rig-core".to_string(),
        version: "0.23.0".to_string(),
    },
};

let client = client_info.serve(transport).await.inspect_err(|e| {
    tracing::error!("client error: {:?}", e);
})?;
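
Streamable HTTP is only one transport. Many MCP servers ship as executables that communicate over stdio; with rmcp’s transport-child-process feature enabled, the same handshake can be run against a spawned process. A minimal sketch, assuming the uvx-installed mcp-server-git server:

use rmcp::{transport::TokioChildProcess, ServiceExt};
use tokio::process::Command;

// Spawn the server as a child process and speak MCP over its stdin/stdout.
let mut cmd = Command::new("uvx");
cmd.arg("mcp-server-git");

// `()` acts as a no-op client handler with default client info.
let client = ().serve(TokioChildProcess::new(cmd)?).await?;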

Listing MCP tools

Once initialized, the client can list tools available on the MCP server:

use rmcp::model::Tool;

let tools: Vec<Tool> = client.list_tools(Default::default()).await?.tools;
 
println!("Tools: {:?}", tools);
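
Tools can also be invoked directly from the client, without an agent in the loop. A sketch, assuming the server exposes the add tool built later in this guide:

use rmcp::model::CallToolRequestParam;
use serde_json::json;

// Call the `add` tool by name; arguments are a JSON object matching
// the tool's input schema.
let result = client
    .call_tool(CallToolRequestParam {
        name: "add".into(),
        arguments: json!({ "a": 10.0, "b": 10.0 }).as_object().cloned(),
    })
    .await?;

println!("Result: {result:?}");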

Using MCP tools

Now that you’ve retrieved the tool list from the MCP server, you can pass it into a Rig agent:

use rig::{completion::Prompt, providers};

// Build an agent whose toolset is the MCP server's tool list.
let completion_model = providers::openai::Client::from_env();

let agent = completion_model
    .agent("gpt-4o")
    .rmcp_tools(tools, client.peer().to_owned())
    .build();

let response = agent.prompt("Add 10 + 10").await?;
tracing::info!("Agent response: {:?}", response);
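
If a single prompt needs more than one tool call, Rig’s prompt builder can be given extra agent-to-tool round trips. A sketch, assuming the multi_turn option on the prompt request:

// Allow up to 5 tool-calling turns before the agent must answer.
let response = agent
    .prompt("Add 10 + 10, then multiply the result by 3")
    .multi_turn(5)
    .await?;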

Creating Your Own MCP Server

Building your own MCP server with rmcp lets you expose custom tools that AI agents can discover and use. If you keep the server and client in a single program, you can spawn a Tokio task for the MCP server and then connect to it from the client in the main task, as sketched below.
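
A minimal sketch of that single-program pattern, where start_server is a hypothetical helper wrapping the server setup shown in “Running the Server” below:

// `start_server` is a hypothetical helper that binds the MCP server
// to the given address (see "Running the Server" below).
tokio::spawn(async move {
    if let Err(e) = start_server("127.0.0.1:8080").await {
        eprintln!("server error: {e:?}");
    }
});

// Give the server a moment to bind before the client connects.
tokio::time::sleep(std::time::Duration::from_millis(100)).await;

let transport =
    rmcp::transport::StreamableHttpClientTransport::from_uri("http://localhost:8080");
// ...continue with the client initialization shown earlier.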

Installing Dependencies

To create an MCP server, ensure you have the server features enabled. In the rmcp examples, tool arguments derive serde and schemars traits and the streamable HTTP service is mounted on an axum router, so serde and axum are needed alongside rmcp and tokio:

cargo add rmcp -F server,macros,transport-streamable-http-server
cargo add tokio -F full
cargo add serde -F derive
cargo add axum

Defining Your Server

Create a server struct and register the tools you want to expose with rmcp’s #[tool_router] and #[tool] macros. The module paths below follow the rmcp examples and may shift slightly between versions:

use rmcp::{
    handler::server::{router::tool::ToolRouter, tool::Parameters},
    model::{ServerCapabilities, ServerInfo},
    schemars, tool, tool_handler, tool_router, ServerHandler,
};

// Tool arguments; the derived JSON schema is what clients see in the tool list.
#[derive(serde::Deserialize, schemars::JsonSchema)]
struct BinaryOp {
    a: f64,
    b: f64,
}

struct CalculatorServer {
    tool_router: ToolRouter<Self>,
}

#[tool_router]
impl CalculatorServer {
    fn new() -> Self {
        Self { tool_router: Self::tool_router() }
    }

    #[tool(description = "Add two numbers together")]
    async fn add(&self, Parameters(BinaryOp { a, b }): Parameters<BinaryOp>) -> String {
        (a + b).to_string()
    }

    #[tool(description = "Multiply two numbers")]
    async fn multiply(&self, Parameters(BinaryOp { a, b }): Parameters<BinaryOp>) -> String {
        (a * b).to_string()
    }
}

// Wires the router into the MCP ServerHandler and advertises tool support.
#[tool_handler]
impl ServerHandler for CalculatorServer {
    fn get_info(&self) -> ServerInfo {
        ServerInfo {
            capabilities: ServerCapabilities::builder().enable_tools().build(),
            ..Default::default()
        }
    }
}
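
Tool functions can return plain strings, as above, or a Result for richer payloads and structured MCP errors. A sketch of a third tool that could sit inside the same #[tool_router] impl block (the error and result constructors follow the rmcp examples and are assumptions here):

use rmcp::{
    model::{CallToolResult, Content},
    ErrorData as McpError,
};

#[tool(description = "Divide two numbers")]
async fn divide(
    &self,
    Parameters(BinaryOp { a, b }): Parameters<BinaryOp>,
) -> Result<CallToolResult, McpError> {
    if b == 0.0 {
        // Report invalid arguments through a structured MCP error.
        return Err(McpError::invalid_params("division by zero", None));
    }
    Ok(CallToolResult::success(vec![Content::text((a / b).to_string())]))
}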

Running the Server

Once you’ve defined your server, start it on a specific port. The rmcp examples serve the streamable HTTP transport through an axum router; here the service is mounted at the root path so the address matches the client URI used earlier:

use rmcp::transport::streamable_http_server::{
    session::local::LocalSessionManager, StreamableHttpService,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // A fresh CalculatorServer is created for each client session.
    let service = StreamableHttpService::new(
        || Ok(CalculatorServer::new()),
        LocalSessionManager::default().into(),
        Default::default(),
    );

    let router = axum::Router::new().fallback_service(service);
    let listener = tokio::net::TcpListener::bind("127.0.0.1:8080").await?;
    axum::serve(listener, router).await?;

    Ok(())
}

Your MCP server is now running and ready to accept connections from MCP clients! The server will automatically handle tool discovery, capability negotiation, and tool invocations according to the MCP protocol.

Next Steps

With your server running on http://localhost:8080, you can now connect to it using the MCP client code shown earlier in this guide. The tools you’ve defined will be automatically discovered and made available to your AI agents.