
Edge Computing Platform IPES: Architecture, Components, and Application Scenarios at iQIYI

iQIYI’s IPES platform unifies cloud‑edge collaboration: it manages millions of heterogeneous devices, deploys Docker, native, and function‑as‑a‑service workloads, schedules tasks, and provides logging, storage, and messaging services. These capabilities power edge video caching, live‑stream mirroring, conferencing, and ad‑violation detection, cutting latency and bandwidth costs, with full Kubernetes compatibility planned.

iQIYI Technical Product Team

With the rapid development of cloud‑native technologies and the commercial rollout of 5G, edge computing has emerged as a key trend. This article presents iQIYI’s experience in building an edge‑computing service platform (IPES) and its edge‑caching applications, focusing on two goals: improving video playback experience and reducing bandwidth costs.

1. Edge Computing Overview

Edge computing has been highlighted by Gartner as a top technology trend for three consecutive years. The MEC (Multi‑Access Edge Computing) reference architecture expands the concept beyond mobile networks to include a broader set of access networks. Edge locations are classified by proximity to the end user: device edge, mobile edge, and cloud edge. In China, operators and cloud vendors mainly target the mobile and cloud edges.

Container‑based cloud‑native technologies and 5G provide the infrastructure needed for large‑scale edge deployments. Edge and cloud complement each other: edge nodes offload storage, AI, and big‑data processing from the cloud, reducing latency and cloud‑side pressure.

2. The IPES Edge Computing Platform

Since early 2019, iQIYI’s HCDN team has designed and implemented IPES (Intelligent Platform for Edge Service) to manage massive fleets of heterogeneous devices and deliver PaaS capabilities close to users.

Key design principles:

Support massive heterogeneous devices.

Enable cloud‑edge collaboration.

Allow isolation of native programs on devices that cannot run containers.

A. Node Management

Device onboarding.

Node status reporting.

Application status reporting.
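The article does not specify the wire format of these status reports. As a rough sketch, an edge agent's periodic heartbeat might bundle node identity, platform details, and per‑application state into a single JSON payload (all field names here are illustrative, not the actual IPES protocol):

```python
import json
import platform
import time

def build_heartbeat(node_id: str, apps: dict) -> str:
    """Assemble an illustrative node/application status report.

    `apps` maps application name -> {"version": ..., "state": ...},
    mirroring the per-app status reporting described above.
    """
    return json.dumps({
        "node_id": node_id,
        "ts": int(time.time()),          # report timestamp
        "arch": platform.machine(),      # x86 / ARM / MIPS, etc.
        "os": platform.system(),         # Linux / Windows, etc.
        "apps": apps,
    })
```

The cloud-side Data Manager (described later) would parse such reports to keep an up-to-date view of every node.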

B. Application Management (supports gray release, version dependencies, resource limits, auto‑scaling)

Docker application deployment.

Native application deployment.

Function‑as‑a‑service deployment.

Online synchronization of desired cloud state.

Offline autonomous operation.
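The article does not detail how gray release decides which devices receive a new version. One common approach, sketched below under that assumption, is deterministic hash‑based bucketing: each device hashes into a stable bucket, so the cohort only grows as the rollout percentage is raised (the function name is hypothetical, not part of IPES):

```python
import hashlib

def in_gray_cohort(device_id: str, app: str, rollout_percent: int) -> bool:
    """Deterministically assign a device to a gray-release cohort.

    The same device always lands in the same bucket for a given app,
    so raising rollout_percent only ever adds devices to the cohort,
    never reshuffles it.
    """
    digest = hashlib.sha256(f"{app}:{device_id}".encode()).digest()
    bucket = int.from_bytes(digest[:2], "big") % 100  # bucket in 0..99
    return bucket < rollout_percent
```

A deployment controller could evaluate this per device when pushing desired state, widening the rollout from, say, 1% to 100% without redeploying earlier cohorts.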

C. Task Scheduling

Real‑time function task dispatch.

Scheduled function triggers.

Task callback interfaces.

Task result upload interfaces.

Task status query interfaces.
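Taken together, these interfaces describe a simple task lifecycle: dispatch, result upload, status query, and an optional callback. A minimal in‑memory sketch of that lifecycle, assuming hypothetical names rather than the real IPES API, might look like:

```python
import uuid
from enum import Enum
from typing import Callable, Optional

class TaskState(Enum):
    DISPATCHED = "dispatched"
    DONE = "done"

class TaskTracker:
    """Toy model of the task lifecycle described above: a task is
    dispatched, the edge node uploads a result, callers can query
    status, and an optional callback fires on completion."""

    def __init__(self) -> None:
        self._tasks: dict = {}

    def dispatch(self, payload: dict,
                 callback: Optional[Callable] = None) -> str:
        task_id = str(uuid.uuid4())
        self._tasks[task_id] = {"state": TaskState.DISPATCHED,
                                "payload": payload,
                                "result": None,
                                "callback": callback}
        return task_id

    def upload_result(self, task_id: str, result: dict) -> None:
        task = self._tasks[task_id]
        task["state"] = TaskState.DONE
        task["result"] = result
        if task["callback"]:
            task["callback"](task_id, result)  # notify the submitter

    def status(self, task_id: str) -> TaskState:
        return self._tasks[task_id]["state"]
```

In the real platform, dispatch and result upload would of course travel over the MQTT channel described in the next section rather than an in‑process dictionary.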

D. Common Services

Log collection.

Health checks.

Message routing.

Distributed storage.

The cloud side of IPES adopts a K8s‑like architecture:

API Server – unified entry for resources, authentication, and authorization.

Controller – maintains desired state of applications, handles auto‑scaling.

Scheduler – matches resources to tasks based on policies.

Data Manager – processes data reported from edge nodes.

MQTT Broker – maintains long‑lived, encrypted connections with edge agents.

Store – Docker repository, IPFS proxy for distributed data.

Observe – health checks and log collection (Prometheus, EFK, etc.).
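The Controller's job of "maintaining desired state" follows the familiar Kubernetes reconciliation pattern: compare what should be running against what nodes report, and emit corrective actions. A minimal sketch of that diff step, with illustrative action names not taken from IPES itself:

```python
def reconcile(desired: dict, reported: dict) -> dict:
    """Diff desired vs. reported application versions for one node.

    Both dicts map app name -> version string; an app missing from
    `reported` is not running on the node. Returns the corrective
    actions a controller would issue.
    """
    actions = {}
    for app, version in desired.items():
        if app not in reported:
            actions[app] = ("deploy", version)        # not running yet
        elif reported[app] != version:
            actions[app] = ("upgrade", version)       # wrong version
    for app, version in reported.items():
        if app not in desired:
            actions[app] = ("remove", version)        # no longer wanted
    return actions
```

Running this loop continuously against Data Manager reports is what keeps online nodes converging on the cloud's desired state, while offline nodes simply keep their last known state (the "offline autonomous operation" noted earlier).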

The edge side consists of:

Master – local API, lifecycle manager for both Docker and Native engines.

Agent – maintains an encrypted, long‑lived connection to the cloud.

Lambda – runtime for Python/JS functions, task queue, scheduling, callbacks.

MQTT‑Hub – intra‑edge message broker.

IPFS – content‑addressed distributed storage to reduce cloud bandwidth.

Filebeat – lightweight log collector.
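The bandwidth saving from IPFS comes from content addressing: identical bytes always map to the same key, so any edge node that already holds the content can serve it without going back to the cloud. Real IPFS uses multihash‑encoded CIDs; the simplified sketch below just uses a raw SHA‑256 hex digest to show the principle:

```python
import hashlib

def content_key(data: bytes) -> str:
    """Derive a content-addressed key for a blob.

    Identical bytes always produce the same key, so a request can be
    satisfied by any peer holding the content. (A simplification of
    IPFS's multihash-based CIDs.)
    """
    return hashlib.sha256(data).hexdigest()
```

Because the key is derived from the data itself, it also doubles as an integrity check: a node can verify a fetched blob by re‑hashing it and comparing against the requested key.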

IPES has been deployed on over one million devices covering x86, ARM, and MIPS architectures, supporting Linux, Windows, and OpenWrt. It currently hosts video‑on‑demand caching, live‑stream caching, video‑conference services, and ad‑material violation detection, among other workloads.

3. Edge Application Scenarios

iQIYI’s HCDN architecture, built since 2014, combines P2P and CDN to create a massive distributed storage network. Edge caching applications run on IPES nodes, delivering high‑quality video while reducing bandwidth consumption.

Additional scenarios include:

Live‑stream mirroring – edge nodes offload live traffic, lowering CDN pressure and latency.

SFU/RTCDN – low‑latency interactive live streaming and video conferencing using WebRTC‑based SFU cascades, scheduled via IPES.

Ad‑material violation detection – Python/JS functions run at the edge to audit advertising content, improving detection speed and cutting costs.

These use cases illustrate how edge computing can extend cloud capabilities, handle bursty workloads, and process data close to the user.

4. Future Directions

Video now accounts for more than half of Internet traffic, and edge computing is becoming a key medium for delivering it. Companies are actively building edge solutions; iQIYI’s focus remains on edge caching while exploring AI, big‑data, and further cloud‑native integrations.

Planned work includes:

Full compatibility with Kubernetes.

Enhanced cluster and configuration management.

More application‑friendly services and health‑prediction mechanisms.

Adopting cloud‑native architectural principles will enable faster, more reliable deployment of edge workloads.
