OpenAI Adds MCP Support to Agents SDK, Advancing Standardized AI Workflows
OpenAI has updated its Agents SDK to support Anthropic's Model Context Protocol (MCP), enabling developers to connect AI agents to diverse data sources and tools through a standardized interface. The update ships with official documentation, example code, and built-in caching and tracing features to streamline AI workflow integration.
OpenAI announced a major update to its Agents SDK, now supporting the Model Context Protocol (MCP) originally introduced by Anthropic, allowing AI models to retrieve data from business tools, software, databases, and development environments.
The integration lets developers create bidirectional connections between data sources and AI applications such as chatbots, and many companies (e.g., Block, Apollo, Replit) have already adopted MCP since its open‑source release in November.
With over 1,000 community‑built MCP servers available, the protocol’s network effect is growing, making it increasingly valuable as more tools become MCP‑compatible.
The MCP specification defines two types of servers: stdio servers, which run as local subprocesses, and HTTP-over-SSE servers, which run remotely and are reached via a URL. The SDK connects to them through the MCPServerStdio and MCPServerSse classes, respectively.
async with MCPServerStdio(
    params={
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-filesystem", samples_dir],
    }
) as server:
    tools = await server.list_tools()

The Agents SDK automatically calls list_tools() on each MCP server during an agent's execution, exposing the available tools to the LLM, and invokes call_tool() when the LLM selects a specific tool.
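To make the stdio transport concrete, here is a minimal stdlib-only sketch of what a class like MCPServerStdio does under the hood: spawn the server as a subprocess and exchange newline-delimited JSON-RPC messages over stdin/stdout. The tiny inline "server" is hypothetical and only answers tools/list; a real MCP server also handles initialization, tool calls, and error responses.

```python
import json
import subprocess
import sys

# Hypothetical one-method MCP "server" for illustration only: it reads
# JSON-RPC requests line by line and answers tools/list with one tool.
SERVER_CODE = """
import json, sys
for line in sys.stdin:
    req = json.loads(line)
    if req.get("method") == "tools/list":
        resp = {"jsonrpc": "2.0", "id": req["id"],
                "result": {"tools": [{"name": "read_file"}]}}
        print(json.dumps(resp), flush=True)
"""

# Client side: launch the server as a local subprocess (the stdio transport).
proc = subprocess.Popen(
    [sys.executable, "-c", SERVER_CODE],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
)

# Send a tools/list request and read the reply, one JSON object per line.
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
proc.stdin.write(json.dumps(request) + "\n")
proc.stdin.flush()
response = json.loads(proc.stdout.readline())
tool_names = [t["name"] for t in response["result"]["tools"]]
print(tool_names)  # ['read_file']

proc.stdin.close()
proc.wait()
```

The HTTP-over-SSE variant follows the same request/response shape, only carried over a network connection instead of pipes.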
agent = Agent(
    name="Assistant",
    instructions="Use the tools to achieve the task",
    mcp_servers=[mcp_server_1, mcp_server_2]
)

To reduce the performance overhead of repeatedly fetching tool lists, developers can enable caching by setting cache_tools_list=True in the server constructor, and manually clear the cache with invalidate_tools_cache() when necessary.
The SDK also includes built‑in tracing that captures all MCP‑related operations, such as tool‑list requests and function‑call details.
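The idea behind that tracing can be sketched in plain Python: each MCP operation is recorded as a span with the operation name, its arguments, and its result. The span schema and operation names below are assumptions for illustration, not the SDK's actual trace format:

```python
# Accumulates one span per traced MCP operation.
trace = []

def traced(op_name):
    """Decorator that records each call to the wrapped function as a span."""
    def decorator(fn):
        def wrapper(**kwargs):
            result = fn(**kwargs)
            trace.append({"op": op_name, "args": kwargs, "result": result})
            return result
        return wrapper
    return decorator

# Hypothetical stand-ins for the SDK's MCP operations.
@traced("mcp.list_tools")
def list_tools():
    return ["read_file"]

@traced("mcp.call_tool")
def call_tool(name, arguments):
    return f"called {name}"

list_tools()
call_tool(name="read_file", arguments={"path": "README.md"})
print([span["op"] for span in trace])
# ['mcp.list_tools', 'mcp.call_tool']
```

In the real SDK this happens automatically, so developers get visibility into which tools were listed and invoked without instrumenting anything themselves.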
Overall, MCP’s standardized interface acts like a "USB‑C port" for AI models, simplifying integration with external resources, lowering development costs, and enhancing context awareness in AI workflows.
OpenAI’s adoption of MCP signals a move toward more interoperable, efficient AI agents, with future plans to extend MCP support to the ChatGPT desktop client and further accelerate AI‑driven productivity.
DataFunTalk