Building a Web MCP Client and Server with CopilotKit, LangChain, and Next.js
This article walks through the overall design, implementation, and troubleshooting of a web‑based MCP client and server using CopilotKit, LangChain, Next.js, and related TypeScript libraries, covering setup, core code, environment configuration, and deployment steps.
Introduction
MCP (Model Context Protocol) is an open protocol introduced by Anthropic that standardizes interfaces for seamless integration of large language models (LLMs) with external data sources and tools, similar to a USB plug‑and‑play architecture.
Concept Overview
MCP Hosts – applications (Claude Desktop, IDEs, AI apps) that want to access data or tools via MCP.
MCP Clients – maintain one-to-one connections with an MCP Server, analogous to a database client.
MCP Servers – implement specific functionality based on the MCP protocol.
Local Data Sources – data accessed locally.
Remote Services – external services accessed via APIs.
The goal is to build a Web version of the MCP Client and an MCP Server.
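Under the hood, MCP messages are JSON-RPC 2.0. As a rough sketch of the wire format, a client asking a server to invoke a tool sends a `tools/call` request shaped like the object below (the tool name is taken from the server described later in this article; the argument values are hypothetical):

```typescript
// MCP rides on JSON-RPC 2.0: every request carries jsonrpc, id, method,
// and params. This is the shape of a client-to-server tool invocation.
const toolCallRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "get-model-topic-detail",        // a tool exposed by the server
    arguments: { topic: "/plant/line1/temperature" }, // hypothetical input
  },
};

// Serialized, this is what travels over the stdio or HTTP transport
console.log(JSON.stringify(toolCallRequest));
```

The server replies with a matching JSON-RPC response whose `result` carries the tool output; the transport (stdio, HTTP) only moves these messages around.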
Technology Stack
System requirements: Node.js >= 18 (tested with v20). Core dependencies: CopilotKit and LangChain, together with their ecosystems.
CopilotKit – React UI and infrastructure for AI copilots.
LangChain.js & LangGraph – building agents.
langchainjs-mcp-adapters – lightweight wrapper for MCP compatibility.
modelcontextprotocol/typescript-sdk – MCP TypeScript SDK.
open-mcp-client – open‑source MCP client from CopilotKit.
mcp-server-supos – a usable MCP Server implementation.
Client
The client UI consists of a left panel for managing the MCP Server and a right panel for the chatbot.
Technical Solution
Note: This client is based on the open‑source open-mcp-client and has been customized.
The codebase is split into two parts:
/agent – LangGraph agent (originally Python, rewritten in JavaScript as /agent-js).
/app – Front‑end application built with Next.js and CopilotKit.
Agent Core Code (agent.js)
```js
import { ChatOpenAI } from "@langchain/openai";
import { MultiServerMCPClient } from "langchainjs-mcp-adapters";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { Command, END } from "@langchain/langgraph";

// Model definition: the API key is read from the agent state
const model = new ChatOpenAI({ temperature: 0, model: "gpt-4o", apiKey: state.apiKey });

// Create the MCP client and connect to every configured server
const client = new MultiServerMCPClient(newMcpConfig);
await client.initializeConnections();
const tools = client.getTools();

// Build a ReAct agent over the MCP tools and run it on the conversation
const agent = createReactAgent({ llm: model, tools });
const response = await agent.invoke({ messages: state.messages });
return new Command({ goto: END, update: { messages: response.messages } });
```

Model Helper (model.js)
```js
import { ChatOpenAI } from "@langchain/openai";
import { ChatAnthropic } from "@langchain/anthropic";
import { ChatMistralAI } from "@langchain/mistralai";

function getModel(state) {
  // A MODEL environment variable overrides the model chosen in the UI state
  const model = process.env.MODEL || state.model;
  if (state.modelSdk === "openai") return new ChatOpenAI({ temperature: 0, model, apiKey: state.apiKey });
  if (state.modelSdk === "anthropic") return new ChatAnthropic({ temperature: 0, modelName: model, apiKey: state.apiKey });
  if (state.modelSdk === "mistralai") return new ChatMistralAI({ temperature: 0, modelName: model, apiKey: state.apiKey });
  throw new Error("Invalid model specified");
}

export { getModel };
```

State Definition (state.js)
```ts
// Import paths may vary slightly by package version
import { Annotation } from "@langchain/langgraph";
import { CopilotKitStateAnnotation } from "@copilotkit/sdk-js/langgraph";

export const AgentStateAnnotation = Annotation.Root({
  model: Annotation<string>,
  modelSdk: Annotation<string>,
  apiKey: Annotation<string>,
  mcp_config: Annotation<Connection>,
  // Merge in CopilotKit's base state (messages, actions, etc.)
  ...CopilotKitStateAnnotation.spec,
});
export type AgentState = typeof AgentStateAnnotation.State;
```

Build and Run
Define langgraph.json with the agent name (e.g., sample_agent ).
Add a .env file in /agent-js with required keys (e.g., LANGSMITH_API_KEY , OPENAI_API_KEY ).
Use the LangGraph CLI to develop and run the agent; a single script can start the front end and the agent together:

```json
"scripts": {
  "dev": "pnpx concurrently \"pnpm dev-frontend\" \"pnpm dev-agent-js\""
}
```
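A minimal langgraph.json for the JS agent might look like the following; the graph path and export name are assumptions to adapt to your project layout:

```json
{
  "node_version": "20",
  "dependencies": ["."],
  "graphs": {
    "sample_agent": "./src/agent.js:graph"
  },
  "env": ".env"
}
```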
Server
The server is built with the MCP TypeScript SDK. It provides tools for querying SupOS APIs and streams real‑time MQTT data to the client.
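Whatever a tool does internally (query an API, read cached MQTT data), it hands results back to the client as a list of typed content blocks, per the MCP specification. A minimal sketch of wrapping fetched data into that shape, with a made-up topic payload:

```typescript
// An MCP tool result is a list of typed content blocks; text blocks
// carry a plain string, so structured data is usually JSON-stringified.
type TextContent = { type: "text"; text: string };
type ToolResult = { content: TextContent[] };

// Hypothetical tail of a handler for a tool like "get-model-topic-detail":
// after fetching the data, wrap it as a single text content block.
function toToolResult(data: unknown): ToolResult {
  return { content: [{ type: "text", text: JSON.stringify(data) }] };
}

const result = toToolResult({ topic: "/plant/line1/temperature", value: 23.5 });
console.log(result.content[0].type); // "text"
```

The SDK's tool handlers return exactly this structure, so the MCP client can render or feed the text back to the LLM without knowing anything about supOS or MQTT.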
Core Server Code (index.ts)
```ts
#!/usr/bin/env node
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import fetch from "node-fetch";
import { z } from "zod";
import fs from "fs";
import mqtt from "mqtt";

// Environment validation: fail fast if the supOS API endpoint is missing
if (!process.env.SUPOS_API_URL) {
  console.error("SUPOS_API_URL not set");
  process.exit(1);
}

// ...createMcpServer, define tools "get-model-topic-detail" and "get-all-topic-realtime-data"
// ...runServer with stdio transport
```

Utility Functions (utils.ts)
```ts
import fs from "fs";
import path from "path";

// Resolve a cache-file path relative to the project root, creating the
// directory if it does not already exist
export function createFilePath(filedir = ".cache", filename = "all_topic_realdata.json") {
  const rootPath = process.cwd();
  const filePath = path.resolve(rootPath, filedir, filename);
  const dirPath = path.dirname(filePath);
  if (!fs.existsSync(dirPath)) fs.mkdirSync(dirPath, { recursive: true });
  return filePath;
}

// Read a file, returning an error message string instead of throwing
export function readFileSync(filePath, options) {
  try {
    return fs.readFileSync(filePath, options);
  } catch (err) {
    return `Error reading file: ${err}`;
  }
}
```

How to Use
Client: Any MCP-compatible client (e.g., Claude for Desktop, the VSCode Cline plugin) can connect to the server.
Server: Besides the official examples, many platforms (e.g., mcp.so, Glama) host MCP servers.
Examples
Integrate the web client with a todoist-mcp-server instance.
Use the server with Claude or VSCode Cline by configuring the proper npx or node execution path.
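For Claude for Desktop, the server is registered in claude_desktop_config.json; a sketch of such an entry follows, where the server key, command, and env values are assumptions to adapt to your installation:

```json
{
  "mcpServers": {
    "supos": {
      "command": "npx",
      "args": ["-y", "mcp-server-supos"],
      "env": {
        "SUPOS_API_URL": "https://example.com"
      }
    }
  }
}
```

On Windows in particular, the full path to npx or node may be needed in the command field, which is the cross-platform pitfall mentioned in the conclusion.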
Conclusion
The article shares practical experience building an MCP client and server, highlights common pitfalls (ESM vs CJS, environment variable handling, cross‑platform command paths), and demonstrates that a full‑stack solution can be assembled using CopilotKit, LangChain, and the MCP TypeScript SDK.
Rare Earth Juejin Tech Community