A complete guide to Model Context Protocol (MCP) + Hands-on

We have all been exchanging messages with LLMs for a while now. Let's start with what LLMs are missing. If we ask a simple question, “What's the weather like in London today?”, we get an honest reply saying that the model doesn’t have real-time access to data. One way to mitigate this problem is to turn LLMs into agents.

So what are agents?

Agents are LLMs equipped with tools and resources that multiply the LLM's capabilities and bridge the knowledge gap between the user’s questions and the LLM’s context.

The agents need to communicate efficiently with these external tools and resources to make the most out of them. They also need an array of tools depending on the application. For example, a weather agent may need real-time access to weather data but a travel assistant may need to query a database, check flight timings, etc. The possibilities and tools are endless.

The endless possibilities bring several vendors to the market. For example, Google Drive and email services are provided by Google, while Slack's messaging service is obviously provided by Slack. This means that all vendors need to speak a universal language rather than each trying to establish their own.

In other words, companies need to follow standards. Standards ease communication between different systems and make it seamless.

Standards are not new in communication between systems. One of the now well-established and famous ones is the REST API. Whenever we construct APIs, we follow this standard. But with LLMs being fairly new, protocols are yet to be well established.

One of the earliest protocols was developed by Anthropic and named the Model Context Protocol (MCP). It is a protocol for providing context to LLMs, hence the name.

So What is MCP in simple terms?

MCP can be seen as a layer between the LLM and tools/resources: a translator that converts the languages of the different tools into a unified language that the client speaks. The client here is an app or product that runs an LLM inside it.

Cursor, Windsurf, or a simple chat application are all examples of clients. In a client-server architecture, the client needs to communicate with a server, and the MCP server is what speaks that unified language to any client. So, any tool or resource provider (DBs, Slack, Google Drive, etc.) needs to build an MCP server to make its service accessible to LLM agents. Once the MCP server is built for your tool, your tool is out there in the wild for any LLM to make use of.

So, with that motivation, let's build an MCP server, and use it with Claude's desktop client.

Hands-on MCP Server

Now that we have understood the basics, let's dive into developing an MCP server. Though it can be done in several languages such as Python, Node, Java, Kotlin, and C#, I will stick with Python here.

Problem Statement

One of the problems with LLMs as coding assistants is that they do not have the latest documentation for coding libraries. For example, Claude 3.7 Sonnet has a training cutoff in 2024 and does not have access to any documentation published after that.

Let's say I want a PyTorch coding assistant. If I ask Claude, “What is the .. ”, it answers by saying that PyTorch does not have the function. Yet the function was introduced in version 2.6.0, as can be seen from the release notes here. Let's address this problem by building a PyTorch documentation MCP server.

Below are the simple steps to develop the PyTorch documentation MCP server, though it applies to developing any MCP server in Python.

  • Start by installing uv, which is emerging as a better alternative to pip, poetry, and the like.
curl -LsSf https://astral.sh/uv/install.sh | sh
  • Initialize the MCP server project 
uv init mcp-py-example
  • Open the project with VS Code, Cursor, or any other Python IDE of your choice
cd mcp-py-example
code .
  • In the VS Code editor, open the terminal, create a virtual environment, and activate it
uv venv
source .venv/bin/activate
  • Now install the dependencies, which are httpx and, of course, mcp. Note the installation with uv and not with pip (the brackets are quoted so the shell does not interpret them)
uv add "mcp[cli]" httpx

We are now ready to build the MCP server. Let's start writing the code.

Then, we can create an MCP server by importing the FastMCP class from mcp.server.fastmcp and creating the mcp variable as an instance of that class. The most important bit is to decorate the tool function with @mcp.tool(), and we are done!

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("docs")

@mcp.tool()
def get_documentation(query: str, library: str) -> str:
    """
    Search the latest documentation of any provided library using the query.
    Currently only supports PyTorch.

    Args:
        query: The query to search for (e.g. "what does torch.randn do?")
        library: The library to search in (e.g. "pytorch")

    Returns:
        Text from the latest documentation
    """
    if library != "pytorch":
        raise ValueError(f"Library {library} is not supported by this tool")

    # Restrict the web search to the library's documentation site
    query = f"site:{documentation_url} {query}"
    results = search(query)

    if len(results["organic"]) == 0:
        return f"No search results for the query {query}"

    text = ""
    for result in results["organic"]:
        text += data_from_url(result["link"])
    return text

The above is the simplest implementation of an MCP server. The next important bit is to call mcp.run() in your code. For example, in the same file, I would add:

if __name__ == "__main__":
    mcp.run(transport="stdio")

The code above is indeed missing a few pieces: the documentation_url constant and the search() and data_from_url() functions. The entire code is available for free on GitHub here.

With that, we pretty much wrap up creating an MCP server. The server can be readily integrated with a client such as the Claude desktop app.
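For the Claude desktop integration, the client discovers stdio servers through its claude_desktop_config.json file (on macOS, under ~/Library/Application Support/Claude/). A minimal entry could look like the sketch below; the absolute project path is a placeholder, and main.py assumes the server code lives in the file uv generated at init time.

```json
{
  "mcpServers": {
    "docs": {
      "command": "uv",
      "args": [
        "--directory",
        "/absolute/path/to/mcp-py-example",
        "run",
        "main.py"
      ]
    }
  }
}
```

After saving the file and restarting Claude desktop, the get_documentation tool should appear in the client's tool list.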

Visual Explanation

If you wish to watch me code the full MCP server and integrate it with the Claude desktop app, please have a look at the video below. It also covers how to debug MCP servers, which can be quite tricky:

Conclusion

We are only scratching the surface of what is possible with MCP servers. Almost all tools used by developers and businesses today are getting MCP server integration to avoid missing out on the LLM wave. 

Hope that was useful!