
Overview of RPC and How to Build a Custom RPC Framework Using Netty

This article explains the fundamentals and key characteristics of Remote Procedure Call (RPC), outlines common use cases and popular frameworks, and then walks through the design and implementation of a simple custom RPC system—including a bespoke binary protocol, client‑side proxy generation, serialization, Netty‑based encoding/decoding, server‑side request handling, and result delivery—using Java and Netty.

JD Tech

Remote Procedure Call (RPC) is a communication technique that lets a program request services from a remote computer without dealing with low‑level network details, and it is widely used in distributed systems to simplify client‑server interactions.

Key characteristics of RPC include transparency (remote calls look like local method invocations), a client‑server model, serialization/deserialization of request parameters, support for synchronous and asynchronous calls, error handling for network failures, and the ability to work over various protocols such as TCP.

Typical application scenarios are micro‑service communication, mobile‑backend interactions, cross‑platform service calls, public API services, big‑data processing frameworks (e.g., Hadoop, Spark), cloud‑native environments, and cross‑data‑center service invocation.

Common RPC frameworks include JD‑RPC, gRPC, Dubbo, JSON‑RPC, and Apache Thrift.

Implementing a custom RPC framework starts with defining a binary protocol consisting of a 16‑byte header (magic value, message type, status, request ID, body size) followed by a body that carries the serialized payload.
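The header layout above can be sketched as a small constants class. The field names, offsets, and the magic value below are illustrative assumptions for this article's protocol, not the framework's actual constants; only the field order and sizes come from the encoder shown later.

```java
// Sketch of the 16-byte header described above. Offsets mirror the order the
// encoder writes the fields; the 0xBABE magic value is an assumption.
public final class ProtocolHeader {
    public static final short MAGIC = (short) 0xBABE; // marks a valid frame

    // Offsets within the fixed-size header
    public static final int MAGIC_OFFSET  = 0;  // short (2 bytes): magic value
    public static final int SIGN_OFFSET   = 2;  // byte  (1 byte): serializer code + message type
    public static final int STATUS_OFFSET = 3;  // byte  (1 byte): status, 0 on requests
    public static final int ID_OFFSET     = 4;  // long  (8 bytes): request ID
    public static final int LENGTH_OFFSET = 12; // int   (4 bytes): body size
    public static final int HEADER_SIZE   = 16; // total fixed header length

    private ProtocolHeader() {}
}
```

A fixed-size header like this lets the receiver read exactly 16 bytes, learn the body size, and then read exactly that many more bytes, which is what makes framing over TCP straightforward.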

The client side uses a dynamic proxy factory to generate a stub that hides serialization, network transmission, and error handling from the caller. Example proxy creation code:

// Interface method
ShoppingCart shopping(String pin);

// Create proxy instance
IShoppingCartService serviceProxy = ProxyFactory.factory(IShoppingCartService.class)
    .setSerializerType(SerializerType.JDK) // use JDK native serialization
    .newProxyInstance();

// Invoke as if it were a local call
ShoppingCart result = serviceProxy.shopping("userPin");
log.info("result={}", JSONObject.toJSONString(result));

The ProxyFactory builds metadata, selects a caller strategy, and returns a proxy generated with ByteBuddy:

public class ProxyFactory<I> {
    // ... omitted fields ...
    public I newProxyInstance() {
        // Metadata identifying the remote service (group, provider name, version)
        ServiceData serviceData = new ServiceData(group, providerName, version != null ? version : "1.0.0");
        Caller caller = newCaller().timeoutMillis(timeoutMillis);
        Strategy strategy = StrategyConfigContext.of(this.strategy, retries);
        Object handler;
        switch (invokeType) {
            case "syncCall":
                handler = new SyncCaller(serviceData, caller);
                break;
            case "asyncCall":
                handler = new AsyncCaller(client.appName(), serviceData, caller, strategy);
                break;
            default:
                throw new RuntimeException("Unknown invoke type: " + invokeType);
        }
        // Generate the proxy (ByteBuddy by default) backed by the chosen handler
        return ProxyEnum.getDefault().newProxy(interfaceClass, handler);
    }
    // ... omitted ...
}
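The article's `ProxyEnum.getDefault().newProxy(interfaceClass, handler)` uses ByteBuddy; the same idea can be sketched with the JDK's built-in dynamic proxies. The class and method names below are assumptions, and the handler here just fabricates a result where the real one would serialize and ship the invocation.

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

// Minimal stand-in for ProxyEnum.getDefault().newProxy(...) using
// java.lang.reflect.Proxy instead of ByteBuddy: every call on the returned
// stub is routed to the InvocationHandler (SyncCaller/AsyncCaller in the article).
public final class JdkProxies {
    @SuppressWarnings("unchecked")
    public static <I> I newProxy(Class<I> interfaceClass, InvocationHandler handler) {
        return (I) Proxy.newProxyInstance(
                interfaceClass.getClassLoader(),
                new Class<?>[]{interfaceClass},
                handler);
    }
}
```

JDK proxies only work for interfaces, which is why frameworks that must also proxy concrete classes (or want faster dispatch) reach for ByteBuddy or similar bytecode generators.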

The SyncCaller builds a StarGateRequest, selects a channel via load balancing, and sends the request, returning either the result (synchronous) or a Future (asynchronous):

public Object syncCall(@Origin Method method, @AllArguments @RuntimeType Object[] args) throws Throwable {
    StarGateRequest request = createRequest(method.getName(), args);
    Invoker invoker = new FastFailInvoker();
    Future<?> future = invoker.invoke(request, method.getReturnType());
    if (sync) {
        return future.getResult();
    } else {
        return future;
    }
}

Serialization (default JDK) converts the request object to a byte array:

public <T> byte[] writeObject(T obj) {
    ByteArrayOutputStream buf = OutputStreams.getByteArrayOutputStream();
    try (ObjectOutputStream output = new ObjectOutputStream(buf)) {
        output.writeObject(obj);
        output.flush();
        return buf.toByteArray();
    } catch (IOException e) {
        ThrowUtil.throwException(e);
    } finally {
        OutputStreams.resetBuf(buf);
    }
    return null;
}
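The decoder side needs the mirror-image operation. The article does not show it, so the following is a sketch of what a JDK-based `readObject` could look like; the class name and error handling are assumptions.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.ObjectInputStream;

// Counterpart to writeObject above: rebuild an object from the byte array
// carried in the frame body using JDK native deserialization.
public final class JdkSerializer {
    public static <T> T readObject(byte[] bytes, Class<T> clazz) {
        try (ObjectInputStream input =
                     new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            // Cast through the expected class so callers get a typed result
            return clazz.cast(input.readObject());
        } catch (IOException | ClassNotFoundException e) {
            throw new IllegalStateException("deserialization failed", e);
        }
    }
}
```

Pairing `writeObject`/`readObject` behind a `Serializer` interface keyed by the serializer code in the header is what lets client and server agree on the wire format per request.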

The Netty encoder (StarGateEncoder) writes the header fields and the serialized body to the channel:

private void doEncodeRequest(RequestPayload request, ByteBuf out) {
    byte sign = StarGateProtocolHeader.toSign(request.serializerCode(), StarGateProtocolHeader.REQUEST);
    long invokeId = request.invokeId();
    byte[] bytes = request.bytes();
    int length = bytes.length;
    out.writeShort(StarGateProtocolHeader.Head) // magic value (2 bytes)
       .writeByte(sign)                         // serializer code + message type (1 byte)
       .writeByte(0x00)                         // status, unused on requests (1 byte)
       .writeLong(invokeId)                     // request ID (8 bytes)
       .writeInt(length)                        // body size (4 bytes)
       .writeBytes(bytes);                      // serialized body
}
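Decoding reverses this field by field. The real decoder would extend Netty's ByteToMessageDecoder and guard against partially received frames; the java.nio sketch below only shows the read order, and all names in it are assumptions.

```java
import java.nio.ByteBuffer;

// Protocol-level sketch of what a decoder must undo: consume the 16-byte
// header written by doEncodeRequest, then slice out exactly `length` body bytes.
public final class FrameDecoder {
    public static final class Frame {
        public final byte sign;      // serializer code + message type
        public final byte status;    // 0x00 on requests
        public final long invokeId;  // request ID
        public final byte[] body;    // serialized payload
        Frame(byte sign, byte status, long invokeId, byte[] body) {
            this.sign = sign;
            this.status = status;
            this.invokeId = invokeId;
            this.body = body;
        }
    }

    public static Frame decode(ByteBuffer in) {
        short magic = in.getShort(); // 2 bytes: should match the protocol magic
        byte sign = in.get();        // 1 byte
        byte status = in.get();      // 1 byte
        long invokeId = in.getLong();// 8 bytes
        int length = in.getInt();    // 4 bytes: how many body bytes follow
        byte[] body = new byte[length];
        in.get(body);
        return new Frame(sign, status, invokeId, body);
    }
}
```

Because the body length is in the header, a Netty pipeline could alternatively use LengthFieldBasedFrameDecoder to split frames before this logic runs.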

On the server side, a decoder (StarGateDecoder) parses the incoming bytes back into a RequestPayload, which the ServiceHandler then deserializes and processes via reflection:

public void channelRead(ChannelHandlerContext ctx, Object msg) throws Exception {
    if (msg instanceof RequestPayload) {
        StarGateRequest request = new StarGateRequest((RequestPayload) msg);
        byte code = request.serializerCode();
        Serializer serializer = SerializerFactory.getSerializer(code);
        Message message = serializer.readObject(request.bytes(), Message.class);
        process(message);
    } else {
        ReferenceCountUtil.release(msg);
    }
}

The process method loads the service implementation class, matches the method signature, invokes it via reflection, and wraps the result in a ResultWrapper which is then serialized and sent back to the client:

ResultWrapper result = new ResultWrapper();
result.setResult(realResult);
byte code = request.serializerCode();
Serializer serializer = SerializerFactory.getSerializer(code);
Response response = new Response(request.invokeId());
response.bytes(code, serializer.writeObject(result));
response.status(Status.OK.value());
channel.writeAndFlush(response).addListener(future -> {
    if (future.isSuccess()) {
        log.info("response sent successfully");
    } else {
        log.error("failed to send response", future.cause());
    }
});
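The reflective dispatch that produces `realResult` can be sketched as follows. The registry shape and method names are assumptions; the article's process method also matches parameter types, whereas this sketch matches only by name and argument count.

```java
import java.lang.reflect.Method;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of server-side dispatch: look up the registered service object,
// find a method matching the request, and invoke it with the deserialized args.
public final class ServiceDispatcher {
    private final Map<String, Object> services = new ConcurrentHashMap<>();

    public void register(String serviceName, Object impl) {
        services.put(serviceName, impl);
    }

    public Object dispatch(String serviceName, String methodName, Object[] args) throws Exception {
        Object service = services.get(serviceName);
        if (service == null) {
            throw new IllegalStateException("no provider for " + serviceName);
        }
        for (Method m : service.getClass().getMethods()) {
            if (m.getName().equals(methodName)
                    && m.getParameterCount() == (args == null ? 0 : args.length)) {
                return m.invoke(service, args); // becomes the "realResult" wrapped above
            }
        }
        throw new NoSuchMethodException(serviceName + "#" + methodName);
    }
}
```

Production frameworks usually cache the resolved Method (or generate bytecode stubs) rather than scanning `getMethods()` on every call.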

The client’s inbound handler (consumerHandler) receives the ResponseMessage, deserializes the Result, matches the original Future by invoke ID, and completes the call, allowing synchronous callers to obtain the result or asynchronous callers to handle it via callbacks.
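The invoke-ID matching described above can be sketched with a shared map of pending calls, using CompletableFuture as a stand-in for the framework's Future type. The class and method names are assumptions.

```java
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of pending-call bookkeeping: the caller parks a future under the
// request's invokeId before sending; the inbound handler completes it when
// the matching response arrives.
public final class PendingCalls {
    private static final Map<Long, CompletableFuture<Object>> PENDING = new ConcurrentHashMap<>();

    // Called by the caller just before writing the request to the channel.
    public static CompletableFuture<Object> park(long invokeId) {
        CompletableFuture<Object> future = new CompletableFuture<>();
        PENDING.put(invokeId, future);
        return future;
    }

    // Called by the client's inbound handler when a response frame arrives.
    public static void complete(long invokeId, Object result) {
        CompletableFuture<Object> future = PENDING.remove(invokeId);
        if (future != null) {
            future.complete(result); // wakes a blocked sync caller, or fires async callbacks
        }
    }
}
```

A real implementation would also complete the future exceptionally on timeout or channel failure so synchronous callers never block forever.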

In summary, the article provides a step‑by‑step guide to building a minimal RPC system on top of Netty, covering protocol design, dynamic proxy generation, serialization, Netty encoder/decoder implementation, server‑side request processing, and client‑side result handling, while noting that production‑grade RPC frameworks require additional features such as load balancing, circuit breaking, rate limiting, and service discovery.

Tags: distributed systems, Java, Backend Development, RPC, Netty, Custom Protocol
Written by

JD Tech

Official JD technology sharing platform. All the cutting‑edge JD tech, innovative insights, and open‑source solutions you’re looking for, all in one place.
