
Function Calling and Model Context Protocol (MCP): Bridging Large Language Models with Real‑World Systems

The article reviews the shortcomings of traditional large language models, explains how function calling extends LLMs beyond pure text, introduces the Model Context Protocol (MCP) as a standardized USB‑C‑like interface for AI tools, and demonstrates a Python MCP example that integrates LLMs with Tencent Advertising APIs.

Tencent Cloud Developer

Since the release of ChatGPT in late 2022, large language models (LLMs) have become a cornerstone of natural language processing, yet they suffer from two major drawbacks: a static knowledge cutoff, which prevents them from providing up‑to‑date information, and hallucinations that arise from purely statistical text generation.

Function calling, introduced by OpenAI in June 2023 for its GPT‑3.5‑Turbo and GPT‑4 models, allows an LLM to request that the host application invoke external functions or services, thereby supplying fresh context and reducing factual hallucinations. The article details the workflow: the user query and the available tool definitions are packaged into the prompt context; the LLM decides which tool to invoke and returns the function name and JSON arguments; the host executes the call against the real API; and the result is fed back to the model, which composes the final answer.
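The round trip described above can be sketched without calling a real model. The snippet below shows the three data shapes involved, using plain dicts in OpenAI's function‑calling format; the weather tool, its arguments, and the simulated model reply are invented for illustration:

```python
import json

# 1. The host advertises its tools in OpenAI's function-calling schema.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool for illustration
        "description": "Look up current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# 2. A simulated model reply: instead of text, it asks for a tool call.
assistant_reply = {
    "role": "assistant",
    "tool_calls": [{
        "id": "call_1",
        "type": "function",
        "function": {"name": "get_weather", "arguments": '{"city": "Shenzhen"}'},
    }],
}

# 3. The host executes the named function with the model-provided arguments.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"  # stand-in for a real API call

call = assistant_reply["tool_calls"][0]
args = json.loads(call["function"]["arguments"])
result = get_weather(**args)

# 4. The result goes back to the model as a "tool" message for the final answer.
tool_message = {"role": "tool", "tool_call_id": call["id"], "content": result}
print(tool_message["content"])
```

Note that the model never executes anything itself; it only emits the function name and arguments, and the host stays in full control of the actual call.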

To address the lack of a unified tool definition, the Model Context Protocol (MCP) was open‑sourced by Anthropic in November 2024. MCP is described as the "USB‑C" of AI, providing a standard JSON‑RPC 2.0 based communication layer that can operate over STDIO or Server‑Sent Events. The protocol defines three message types (request, response, notification) and a client‑server architecture consisting of MCP Hosts, MCP Clients, MCP Servers, local data sources, and remote services.
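The three JSON‑RPC 2.0 message types can be illustrated as plain dicts. The method names below (tools/list, notifications/initialized) come from the MCP specification; the payload values are illustrative:

```python
import json

# Request: carries an id and expects a matching response.
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# Response: echoes the request id and carries either "result" or "error".
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"tools": [{"name": "get_creative_templates"}]},
}

# Notification: no id, so no response is expected (fire-and-forget).
notification = {"jsonrpc": "2.0", "method": "notifications/initialized"}

for msg in (request, response, notification):
    print(json.dumps(msg))
```

The absence of an `id` field is what distinguishes a notification from a request, which is why a server knows not to reply to it.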

The article then presents a concrete MCP use‑case in the Tencent advertising scenario. A lightweight Python MCP server is built to expose the "Get supported creative templates" API. The raw JSON response is translated into a model‑friendly format, and the tool is registered with the MCP SDK using a decorator:

@mcp.tool(name="get_creative_templates", description="Retrieve supported creative templates for a given ad group ID")
async def get_creative_templates(adgroup_id: int) -> str:
    # business logic here: call the advertising API and format the response
    ...
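The "translate the raw JSON into a model‑friendly format" step might look like the following sketch. The response fields (`data`, `list`, `template_id`, and so on) are invented for illustration; the real Tencent Advertising API schema differs:

```python
def format_templates(raw: dict) -> str:
    """Flatten a raw API response into compact text an LLM can read."""
    items = raw.get("data", {}).get("list", [])
    if not items:
        return "No creative templates found."
    lines = [
        f"- template {t['template_id']}: {t['template_name']} "
        f"({t['width']}x{t['height']})"
        for t in items
    ]
    return "Supported creative templates:\n" + "\n".join(lines)

# Example raw response with one template entry (fields are hypothetical).
raw = {"data": {"list": [
    {"template_id": 711, "template_name": "Single image",
     "width": 1280, "height": 720},
]}}
print(format_templates(raw))
```

Flattening nested JSON into short labeled lines like this keeps token usage down and gives the model an unambiguous structure to quote from.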

To bridge MCP tools with OpenAI‑style function calling, a conversion helper is provided:

def convert_tool_to_function(tool):
    """Convert an MCP tool object into an OpenAI function description"""
    properties = tool.inputSchema.get('properties', {})
    parameters = {
        "type": "object",
        "properties": {
            name: {
                "type": prop.get('type', 'string'),
                "description": prop.get('description', '')
            } for name, prop in properties.items()
        },
        "required": tool.inputSchema.get('required', [])
    }
    return {
        "type": "function",
        "function": {
            "name": tool.name,
            "description": tool.description,
            "parameters": parameters
        }
    }
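A quick check of the helper with a stand‑in tool object: in a real host the object comes from the MCP client's tool listing, but `SimpleNamespace` is enough to mimic its attribute layout (the helper is repeated so the snippet runs standalone):

```python
from types import SimpleNamespace

def convert_tool_to_function(tool):
    """Convert an MCP tool object into an OpenAI function description."""
    properties = tool.inputSchema.get('properties', {})
    parameters = {
        "type": "object",
        "properties": {
            name: {"type": prop.get('type', 'string'),
                   "description": prop.get('description', '')}
            for name, prop in properties.items()
        },
        "required": tool.inputSchema.get('required', []),
    }
    return {"type": "function",
            "function": {"name": tool.name,
                         "description": tool.description,
                         "parameters": parameters}}

# Stand-in for the object an MCP client returns when listing tools.
tool = SimpleNamespace(
    name="get_creative_templates",
    description="Retrieve supported creative templates for a given ad group ID",
    inputSchema={
        "type": "object",
        "properties": {"adgroup_id": {"type": "integer",
                                      "description": "Ad group ID"}},
        "required": ["adgroup_id"],
    },
)

fn = convert_tool_to_function(tool)
print(fn["function"]["name"])
print(fn["function"]["parameters"]["required"])
```

Because the MCP `inputSchema` is already JSON Schema, the conversion is mostly a matter of re‑nesting fields under the `function` key that OpenAI's API expects.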

During inference, the LLM outputs a tool_calls field; the host maps the called function back to the corresponding MCP tool, invokes it via the MCP client, and feeds the result back to the model. The article shows sample JSON‑RPC request/response definitions and a short loop that repeatedly processes tool calls.
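That dispatch loop might look like the sketch below. The `call_mcp_tool` coroutine stands in for a real MCP client session, and the simulated `tool_calls` payload is invented; only the message shapes match what the article describes:

```python
import asyncio
import json

# Placeholder for a real MCP client: maps tool names to local coroutine logic.
async def call_mcp_tool(name: str, arguments: dict) -> str:
    tools = {
        "get_creative_templates":
            lambda a: f"templates for adgroup {a['adgroup_id']}",
    }
    return tools[name](arguments)

async def run_tool_calls(tool_calls: list) -> list:
    """Map each model tool_call back to an MCP tool; collect tool messages."""
    messages = []
    for call in tool_calls:
        args = json.loads(call["function"]["arguments"])
        result = await call_mcp_tool(call["function"]["name"], args)
        messages.append({"role": "tool",
                         "tool_call_id": call["id"],
                         "content": result})
    return messages

# Simulated model output containing one tool call.
tool_calls = [{"id": "call_1", "type": "function",
               "function": {"name": "get_creative_templates",
                            "arguments": '{"adgroup_id": 123}'}}]
messages = asyncio.run(run_tool_calls(tool_calls))
print(messages[0]["content"])
```

In the full loop, these tool messages are appended to the conversation and the model is queried again, repeating until it returns a plain text answer instead of more tool calls.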

Finally, the author notes that MCP is not a competitor to function calling but a complementary protocol that decouples tool implementation from the LLM, enabling reusable, language‑agnostic services. Open‑source MCP implementations, SDKs for Python, TypeScript, Java, and Kotlin, and real‑world integrations such as Google Ads and Claude Agents are highlighted as evidence of a rapidly maturing ecosystem.

Python, LLM, MCP, API, Function Calling, Model Context Protocol, AI integration
Written by Tencent Cloud Developer

Official Tencent Cloud community account that brings together developers, shares practical tech insights, and fosters an influential tech exchange community.