
Introduction to Anthropic's Model Context Protocol (MCP) with Example Implementations

The article presents Anthropic’s open‑source Model Context Protocol (MCP) – a client‑server framework that standardizes how large language models securely access resources, prompts, and tools (the “HTTP of AI”) – and demonstrates its use through a hot‑fix scraper and a dynamic chatbot that discovers and invokes tools via JSON‑formatted calls.


The article introduces Anthropic's open‑source Model Context Protocol (MCP), a protocol designed to enable seamless integration between large language models (LLMs) and external data sources or tools. MCP aims to become the "HTTP of AI", providing a standardized way for LLMs to access real‑time data, invoke tools, and exchange resources securely.

Basic Concepts

MCP follows a client‑server architecture. The host (client) is the LLM application that initiates the connection, while the server provides context, tools, and prompts. The protocol includes built‑in permission controls so that data owners retain control over who can access their data.

Architecture

The host is the LLM application (e.g., Claude for Desktop).

The client runs inside the host and maintains a 1:1 connection with the server, handling protocol communication.

The server supplies resources, tools, and prompts, and keeps API keys private, improving security.

Resources

{
  uri: string;           // Unique identifier for the resource
  name: string;          // Human-readable name
  description?: string;  // Optional description
  mimeType?: string;     // Optional MIME type
}
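A concrete resource descriptor matching this shape might look like the following (the URI, name, and description are illustrative examples, not taken from the article):

```python
# Hypothetical resource descriptor matching the MCP resource shape above;
# the URI, name, and description are illustrative, not from a real server.
resource = {
    "uri": "file:///logs/app.log",                  # unique identifier
    "name": "Application Logs",                     # human-readable name
    "description": "Rolling log file for the app",  # optional description
    "mimeType": "text/plain",                       # optional MIME type
}
```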

Prompts

{
  name: string;          // Unique identifier for the prompt
  description?: string;  // Human-readable description
  arguments?: [          // Optional list of arguments
    {
      name: string;          // Argument identifier
      description?: string;  // Argument description
      required?: boolean;    // Whether argument is required
    }
  ]
}
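A prompt definition following this shape could look like the sketch below (the prompt name and argument are invented for illustration):

```python
# Hypothetical prompt definition matching the MCP prompt shape above;
# the name, description, and argument are illustrative.
prompt = {
    "name": "summarize-hotfix",
    "description": "Summarize a game hotfix announcement",
    "arguments": [
        # One required argument carrying the raw text to summarize
        {"name": "text", "description": "Raw hotfix text", "required": True},
    ],
}
```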

Tools

{
  name: string;          // Unique identifier for the tool
  description?: string;  // Human-readable description
  inputSchema: {         // JSON Schema for the tool's parameters
    type: "object",
    properties: { ... }  // Tool-specific parameters
  }
}
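For instance, the hotfix-fetching tool built in Example 1 takes no parameters, so its definition in this shape might look like this (the description wording is assumed):

```python
# How a parameterless tool such as find_poe2_hotfix could be described
# in the MCP tool shape above; the description text is illustrative.
tool = {
    "name": "find_poe2_hotfix",
    "description": "Fetch the latest Path of Exile 2 hotfix notes",
    "inputSchema": {
        "type": "object",
        "properties": {},  # no tool-specific parameters
    },
}
```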

Sampling Flow

{
  messages: [
    {
      role: "user" | "assistant",
      content: {
        type: "text" | "image",
        text?: string,
        data?: string,       // base64 encoded
        mimeType?: string
      }
    }
  ],
  modelPreferences?: { … },
  systemPrompt?: string,
  includeContext?: "none" | "thisServer" | "allServers",
  temperature?: number,
  maxTokens: number,
  stopSequences?: string[],
  metadata?: Record<string, unknown>
}
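A sampling request in this shape might be populated as follows; the message text and parameter values are illustrative (note that maxTokens is the only required field besides messages):

```python
# Hypothetical sampling request matching the shape above; the message
# text and parameter values are illustrative.
sampling_request = {
    "messages": [
        {
            "role": "user",
            "content": {"type": "text", "text": "Summarize the latest hotfix"},
        }
    ],
    "systemPrompt": "You are a concise assistant.",
    "includeContext": "thisServer",  # include context from the calling server
    "temperature": 0.2,
    "maxTokens": 512,                # required field
}
```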

Example 1 – Building a Tool to Fetch the Latest Path of Exile 2 Hotfix

1. Import the FastMCP class and define the target URL.

from typing import Any

import httpx
from bs4 import BeautifulSoup
from mcp.server.fastmcp import FastMCP

# Initialize FastMCP server
mcp = FastMCP("Path of Exile 2 hotfix")
target_url = "https://www.pathofexile.com/forum/view-forum/2212"

2. Implement the core crawling function.

async def poe2_hotfix(url: str):
    headers = {
        'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.3'
    }
    async with httpx.AsyncClient() as client:
        try:
            response = await client.get(url, headers=headers, timeout=30.0)
            soup = BeautifulSoup(response.text, 'html.parser')
            table = soup.find('table')
            result_text = ""
            if table:
                for row in table.find_all('tr'):
                    cells = row.find_all('td')
                    if cells:
                        for cell in cells:
                            result_text += cell.get_text(strip=True) + '\n'
                        result_text += '-' * 50 + '\n'  # separator between rows
            else:
                print('Table element not found')
            return result_text
        except Exception:
            return None

3. Register the tool with the @mcp.tool() decorator.

@mcp.tool()
async def find_poe2_hotfix() -> str:
    """Fetch the latest Path of Exile 2 hotfix notes from the official forum."""
    hotfix_data = await poe2_hotfix(target_url)
    if not hotfix_data:
        return "Unable to find any hotfix on the official forum"
    return hotfix_data

4. Run the server.

if __name__ == "__main__":
    # Initialize and run the server over standard input/output
    mcp.run(transport='stdio')

5. Test the tool with MCP Inspector.

pip install "mcp[cli]" httpx beautifulsoup4
mcp dev server.py

After the server starts, the new tool appears in the MCP Inspector UI. Executing the tool returns the parsed hotfix information, which the LLM can then incorporate into its answer.

Example 2 – Simple Chatbot Using Dynamic Tool Discovery

The chatbot first discovers all available tools from connected MCP servers:

all_tools = []
for server in self.servers:
    tools = await server.list_tools()
    all_tools.extend(tools)
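The discovered tools then need to be rendered into the tools description string that the system prompt below interpolates. A minimal sketch, assuming each tool object exposes `name` and `description` attributes (the exact attribute names are an assumption, so a stand-in dataclass is used here):

```python
from dataclasses import dataclass

@dataclass
class ToolInfo:
    """Stand-in for a discovered tool; real MCP tool objects may differ."""
    name: str
    description: str

def format_tools(tools) -> str:
    """Render the discovered tools as a bullet list for the system prompt."""
    return "\n".join(f"- {t.name}: {t.description}" for t in tools)

tools_description = format_tools([
    ToolInfo("find_poe2_hotfix", "Fetch the latest Path of Exile 2 hotfix notes"),
])
```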

It then builds a system prompt that lists the tools and instructs the model to output a strict JSON object when a tool call is required:

system_message = (
    "You are a helpful assistant with access to these tools:\n\n"
    f"{tools_description}\n"
    "Choose the appropriate tool based on the user's question. If no tool is needed, reply directly.\n\n"
    "IMPORTANT: When you need to use a tool, you must ONLY respond with the exact JSON object format below, nothing else:\n"
    "{\n"
    "    \"tool\": \"tool-name\",\n"
    "    \"arguments\": {\n"
    "        \"argument-name\": \"value\"\n"
    "    }\n"
    "}\n\n"
    "After receiving a tool's response:\n"
    "1. Transform the raw data into a natural, conversational response\n"
    "2. Keep responses concise but informative\n"
    "3. Focus on the most relevant information\n"
    "4. Use appropriate context from the user's question\n"
    "5. Avoid simply repeating the raw data\n"
    "Please use only the tools that are explicitly defined above."
)

The model then either calls a tool (returning the JSON) or replies directly. The tool result is fed back to the model for final rendering.
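Deciding whether a reply is a tool call comes down to attempting to parse it as the strict JSON object the prompt demands. A minimal sketch of that dispatch check (the helper name is an assumption, not from the article):

```python
import json

def try_tool_call(llm_output: str):
    """Return (tool_name, arguments) if the reply is a JSON tool call, else None."""
    try:
        payload = json.loads(llm_output)
    except json.JSONDecodeError:
        return None  # plain conversational reply, no tool requested
    if isinstance(payload, dict) and "tool" in payload:
        return payload["tool"], payload.get("arguments", {})
    return None
```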

Overall, the article provides a complete walkthrough—from protocol fundamentals to concrete Python implementations—demonstrating how MCP can standardize LLM‑tool interactions, improve security, and accelerate AI application development.

Tags: Python, MCP, Open-source, LLM integration, Model Context Protocol, AI protocols, Tool Calling
Written by Tencent Cloud Developer

Official Tencent Cloud community account that brings together developers, shares practical tech insights, and fosters an influential tech exchange community.