
A Beginner-friendly Overview of LLMs, Transformers, Prompts, Function Calling, MCP and Agents

This article provides a concise, easy-to-understand introduction to large language models, the transformer architecture, prompt engineering, temperature settings, function calling, the Model Context Protocol (MCP), agent communication (A2A), and future AI programming trends, using simple analogies and illustrative examples.

Tencent Technical Engineering

The author aims to explain core AI concepts—LLM, Transformer, Prompt, Function calling, MCP, Agent, and A2A—in the simplest possible way, acknowledging that some details may be approximate.

1. LLM (Large Language Model)

An LLM works like a text-completion game: given a sequence of tokens (roughly, words or word fragments), it repeatedly predicts the most likely next token.

Example: completing "Xiao Ming ate ice cream, and as a result => stomach ache" demonstrates the full pipeline: the model encodes the input, generates hidden Q/K/V vectors for each token, computes attention scores, applies softmax, and finally selects the most probable next token.

2. Transformer (Self‑Attention)

Each token is represented by three vectors: Query (Q), Key (K) and Value (V). The attention weight for a token is calculated by the dot‑product of its Q with all previous K vectors, followed by a softmax normalization.

Weighted V vectors are summed to form a context vector, which is then used to predict the next token.

For example, suppose the current token's query vector is Q_current = [0.8, 0.2]. Its dot products with the keys of the previous tokens yield raw attention scores, e.g., 0.53 for "Xiao Ming", 0.54 for "ate", 0.27 for "ice cream", and 0.37 for "result"; softmax then converts these scores into normalized weights that sum to 1.
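The attention step above can be sketched in a few lines of plain Python. The 2-D Q/K/V vectors below are made up for illustration (only Q_current matches the example above); real models use vectors with hundreds or thousands of dimensions.

```python
import math

def softmax(scores):
    # Subtract the max before exponentiating, for numerical stability.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical 2-D key/value vectors for the four earlier tokens.
tokens = ["Xiao Ming", "ate", "ice cream", "result"]
K = [[0.9, 0.1], [0.7, 0.6], [0.2, 0.8], [0.4, 0.4]]
V = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5], [0.2, 0.8]]
Q_current = [0.8, 0.2]  # query vector of the position being predicted

# Dot-product attention scores of the current query against every key.
scores = [sum(q * k for q, k in zip(Q_current, key)) for key in K]
weights = softmax(scores)  # normalized attention weights, sum to 1

# Context vector = attention-weighted sum of the value vectors;
# this is what the model uses to predict the next token.
context = [sum(w * v[d] for w, v in zip(weights, V)) for d in range(2)]
print(weights, context)
```

The context vector blends information from every previous token in proportion to how relevant each one is to the current position.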

3. Prompt (Prompt Engineering)

Prompts guide the model’s behavior, such as "You are a ...". The system role in the API request is the true prompt, while user messages are the actual conversation input.
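In practice, that separation of roles shows up directly in the request body. Below is a minimal sketch following the common OpenAI-style message schema; the persona text and helper name are invented for illustration.

```python
# The "system" message carries the prompt that shapes the model's behavior;
# "user" messages carry the actual conversation input.
def build_messages(system_prompt, user_input, history=None):
    messages = [{"role": "system", "content": system_prompt}]
    messages.extend(history or [])  # prior turns, if any
    messages.append({"role": "user", "content": user_input})
    return messages

msgs = build_messages(
    "You are a polite customer-service assistant for an online store.",
    "Where is my order?",
)
print(msgs)
```

Keeping the system prompt fixed while only the user messages change is what lets one model play many different roles.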

4. Understanding the API

The API accepts parameters like temperature (controls randomness) and tools (function calling). Different temperature values are suitable for code generation (≈0), data extraction (≈1), and creative writing (≈1.5).
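Temperature works by rescaling the model's raw next-token scores (logits) before softmax. A minimal sketch with made-up logits:

```python
import math

def softmax_with_temperature(logits, temperature):
    # Dividing logits by the temperature sharpens (T < 1) or flattens
    # (T > 1) the resulting probability distribution.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]  # hypothetical scores for three candidate tokens
low = softmax_with_temperature(logits, 0.2)   # near-deterministic
high = softmax_with_temperature(logits, 1.5)  # flatter, more varied sampling
print(low, high)
```

At low temperature almost all probability mass lands on the top candidate (good for code generation); at high temperature the distribution flattens, so sampling produces more varied, "creative" output.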

5. Function Calling

Function calling lets the model request external tools. The workflow is: (1) declare tools and send user input, (2) model returns a tool name and arguments, (3) developer calls the tool, (4) tool result is fed back to the model, (5) final answer is produced.

Example final answer returned to the user: "The current temperature in Paris is 14°C (57.2°F)."
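The five-step loop can be sketched end to end. Everything here is a stand-in: `fake_model` simulates the LLM's decisions, and `get_weather` with its schema is an invented tool, not a real API.

```python
import json

# Step 1: declare the tools the model may call (schema is illustrative).
tools = [{
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {"city": "string"},
}]

def get_weather(city):
    # Stand-in for a real weather API call.
    return {"city": city, "temp_c": 14}

def fake_model(messages, tools):
    # Stand-in for the LLM; a real model decides on its own whether to
    # answer directly or to request a tool call.
    last = messages[-1]
    if last["role"] == "user":
        # Step 2: the model returns a tool name plus JSON arguments.
        return {"tool_call": {"name": "get_weather",
                              "arguments": json.dumps({"city": "Paris"})}}
    # Step 5: with the tool result in context, produce the final answer.
    result = json.loads(last["content"])
    return {"content": f"The current temperature in {result['city']} is {result['temp_c']}°C."}

messages = [{"role": "user", "content": "What's the weather in Paris?"}]
reply = fake_model(messages, tools)
call = reply["tool_call"]
# Steps 3-4: the developer runs the requested tool and feeds the result back.
result = globals()[call["name"]](**json.loads(call["arguments"]))
messages.append({"role": "tool", "content": json.dumps(result)})
final = fake_model(messages, tools)
print(final["content"])  # → The current temperature in Paris is 14°C.
```

The key point is that the model never executes anything itself: it only names a tool and its arguments, and the developer's code performs the call.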

6. Agent (Intelligent Agent)

An agent combines an LLM with task planning, memory, and tool usage. Example: an e‑commerce customer‑service agent that uses a query_order tool to fetch order status.
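A toy version of that customer-service agent is sketched below. The keyword-matching "planner", the `query_order` tool, and the order data are all invented stand-ins; in a real agent the LLM itself would choose the tool and compose the reply.

```python
ORDERS = {"12345": "shipped"}  # fake order database

def query_order(order_id):
    return ORDERS.get(order_id, "not found")

class SupportAgent:
    def __init__(self):
        self.memory = []  # conversation and tool-result history

    def handle(self, user_input):
        self.memory.append(("user", user_input))
        # Planning step: here a crude keyword match stands in for the LLM.
        if "order" in user_input:
            order_id = "".join(ch for ch in user_input if ch.isdigit())
            status = query_order(order_id)      # tool usage
            self.memory.append(("tool", status))
            reply = f"Order {order_id} is {status}."
        else:
            reply = "How can I help you today?"
        self.memory.append(("agent", reply))
        return reply

agent = SupportAgent()
answer = agent.handle("Where is order 12345?")
print(answer)  # → Order 12345 is shipped.
```

Even in this toy form, the three ingredients named above are visible: a planning step, a memory of past turns, and a tool call.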

7. MCP (Model Context Protocol)

MCP standardizes tool integration, allowing agents to register tools via a server (ListTools, CallTool) without hard‑coding calls.
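The two primitives named above can be mimicked in plain Python. This is only a sketch of the protocol's shape: a real MCP server speaks JSON-RPC over stdio or HTTP via an MCP SDK, and the class and tool below are invented.

```python
class ToolServer:
    """Toy stand-in for an MCP server exposing ListTools and CallTool."""

    def __init__(self):
        self._tools = {}

    def register(self, name, description, fn):
        self._tools[name] = {"description": description, "fn": fn}

    def list_tools(self):
        # ListTools: the agent discovers available tools at runtime
        # instead of hard-coding their names and signatures.
        return [{"name": n, "description": t["description"]}
                for n, t in self._tools.items()]

    def call_tool(self, name, **kwargs):
        # CallTool: invoke a registered tool by name.
        return self._tools[name]["fn"](**kwargs)

server = ToolServer()
server.register("query_order", "Look up an order's status",
                lambda order_id: "shipped" if order_id == "12345" else "unknown")
print([t["name"] for t in server.list_tools()])  # → ['query_order']
print(server.call_tool("query_order", order_id="12345"))  # → shipped
```

Because discovery and invocation go through one standard interface, any MCP-compatible agent can use any MCP-compatible tool without bespoke glue code.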

8. A2A (Agent‑to‑Agent Communication)

A2A extends the plug‑and‑play idea of MCP to agents themselves, so that one agent can discover and invoke another agent's capabilities.
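The idea can be illustrated with a shared registry in which each agent advertises its capabilities. The registry, agent names, and capabilities below are all invented; the real A2A protocol defines a much richer message format for discovery and delegation.

```python
registry = {}  # shared directory: agent name -> agent instance

class Agent:
    def __init__(self, name, capabilities):
        self.name = name
        self.capabilities = capabilities  # capability name -> function
        registry[name] = self             # advertise on creation

    def ask(self, agent_name, capability, *args):
        # Discover a peer in the registry and invoke one of its
        # capabilities by name, with no hard-coded dependency on it.
        return registry[agent_name].capabilities[capability](*args)

translator = Agent("translator", {"translate": lambda text: text.upper()})
support = Agent("support", {})
print(support.ask("translator", "translate", "hello"))  # → HELLO
```

The support agent knows nothing about the translator beyond its registered name, which is what makes the composition plug-and-play.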

9. Future Outlook

AI will reshape software development; routine coding may be automated, creating a new role of AI‑programming engineers who focus on integrating deterministic tools via protocols like MCP.

Written by

Tencent Technical Engineering

Official account of Tencent Technology. A platform for publishing and analyzing Tencent's technological innovations and cutting-edge developments.

0 followers
Reader feedback

How this landed with the community

login Sign in to like

Rate this article

Was this worth your time?

Sign in to rate
Discussion

0 Comments

Thoughtful readers leave field notes, pushback, and hard-won operational detail here.