Design of a Low‑Latency Live Streaming System for the European Cup
The team built a reusable low‑latency live‑streaming platform for the European Cup that replaces HLS with RTMP over QUIC, reuses VOD bandwidth, automates stream control, and cuts playback latency by 20 seconds while dramatically lowering bandwidth costs and delivering stable 1080p 50 fps HDR video.
The European Cup attracts global attention, and fans require a real‑time, clear viewing experience. When picture quality is comparable, latency becomes the critical factor. Emerging technologies such as 5G and cloud computing have improved low‑latency live streaming, but high bandwidth costs remain a major obstacle for large‑scale events.
To address this, a low‑latency live streaming system that reuses VOD bandwidth was developed. During iQIYI Sports' European Cup coverage, the solution supported all 51 matches and achieved an 82% VOD‑bandwidth reuse rate. Playback latency was reduced by 20 seconds compared with traditional segment‑based streaming, while delivering stable 1080p 50 fps HDR video.
Low‑Latency Distribution Network

The system adopts RTMP/HTTP‑FLV low‑latency streaming instead of HLS, because HLS segment size and count lead to latencies above ten seconds. RTMP streams frames directly, significantly lowering distribution delay.
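The gap between the two approaches can be sketched with rough numbers: HLS latency scales with segment duration times the number of segments the player buffers, while frame‑level delivery adds only a small jitter buffer. The specific values below (6 s segments, a 3‑segment buffer, a 25‑frame jitter buffer) are illustrative assumptions, not measurements from the system.

```python
# Rough latency comparison between segment-based and frame-based delivery.
# All numeric values are assumed for illustration.

def hls_latency(segment_seconds: float, buffered_segments: int) -> float:
    """Approximate delay contributed by segmentation: the player must
    wait for whole segments before it can start rendering."""
    return segment_seconds * buffered_segments

def frame_latency(fps: float, buffered_frames: int) -> float:
    """Approximate delay when frames are pushed as soon as they are
    encoded, with only a small jitter buffer on the player side."""
    return buffered_frames / fps

# Typical HLS defaults: 6 s segments, player buffers 3 of them.
print(hls_latency(6, 3))      # 18 seconds
# Frame streaming at 50 fps with a ~25-frame jitter buffer.
print(frame_latency(50, 25))  # 0.5 seconds
```

This is why segment‑based delivery easily exceeds ten seconds of latency regardless of network quality, while frame streaming keeps the distribution delay well under a second.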
QUIC‑Based Protocol Reconstruction

RTMP was rebuilt on top of the QUIC protocol, providing faster connection establishment, eliminating head‑of‑line blocking, and offering better performance on weak networks through adaptive congestion control algorithms such as BBR and Cubic.
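The connection‑establishment advantage can be counted in round trips. A minimal sketch of the handshake budgets, using the standard RTT counts from the TCP/TLS and QUIC specifications (not figures from this system):

```python
# Handshake round trips before application data can flow.
# These counts follow the protocol specifications; actual wall-clock
# savings depend on the client's RTT to the server.

def tcp_tls_setup_rtts(tls13: bool = True) -> int:
    """TCP 3-way handshake costs 1 RTT; TLS 1.3 adds 1 more
    (TLS 1.2 adds 2)."""
    return 1 + (1 if tls13 else 2)

def quic_setup_rtts(resumed: bool = False) -> int:
    """QUIC folds transport and TLS 1.3 handshakes into 1 RTT,
    or 0 RTT when resuming a previously seen server."""
    return 0 if resumed else 1
```

On a 100 ms RTT mobile link, saving one round trip at connection time shaves 100 ms off stream startup; 0‑RTT resumption removes the handshake delay entirely on reconnects, which matters most on the weak networks the text mentions.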
VOD Bandwidth Reuse

Since VOD bandwidth costs roughly half as much as low‑latency live‑streaming bandwidth, the abundant VOD bandwidth pool was repurposed for live streams. Most matches occur at night, avoiding conflict with VOD usage. An Rcache service was deployed on VOD servers to provide low‑latency streaming.
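The economics follow directly from the two figures in the text: VOD bandwidth at roughly half the unit price, and an 82% reuse rate. A minimal blended‑cost calculation (unit prices normalized, total traffic assumed at 100 units):

```python
# Blended bandwidth cost when a fraction of live traffic is served
# from the cheaper VOD pool. Unit costs are normalized: live CDN = 1.0,
# VOD = 0.5 (the "roughly half" figure from the text).

def live_bandwidth_cost(total_gbps: float, reuse_ratio: float,
                        live_unit_cost: float = 1.0,
                        vod_unit_cost: float = 0.5) -> float:
    """Cost of serving total_gbps when reuse_ratio of it rides on VOD
    bandwidth and the rest stays on the low-latency live CDN."""
    vod_part = total_gbps * reuse_ratio * vod_unit_cost
    cdn_part = total_gbps * (1 - reuse_ratio) * live_unit_cost
    return vod_part + cdn_part

baseline = live_bandwidth_cost(100, 0.0)   # all traffic on the live CDN
reused = live_bandwidth_cost(100, 0.82)    # 82% served from the VOD pool
print(1 - reused / baseline)               # ≈ 0.41 → about 41% saved
```

At the reported 82% reuse rate, the blended cost drops by roughly 41% relative to serving everything from the live CDN, before counting any volume discounts on either pool.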
Real‑Time Stream Playback Control

A control flow was designed to enable or disable real‑time streams based on program schedule, bitrate, language, resolution, and device type (Android, iOS, PC, TV, etc.). Real‑time concurrent user counts are monitored to prevent overload, ensuring automatic, program‑driven playback management.
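Such a gate can be sketched as a per‑program policy check plus a concurrency guard. The policy table, field names, and the 500,000‑user cap below are hypothetical; the source describes the dimensions (program, bitrate, language, device, concurrency) but not the exact rules.

```python
from dataclasses import dataclass

@dataclass
class StreamRequest:
    program_id: str
    bitrate_kbps: int
    language: str
    device: str  # "android", "ios", "pc", "tv"

# Hypothetical policy table: which real-time variants a program allows.
POLICY = {
    "match-51": {
        "devices": {"android", "ios", "pc", "tv"},
        "max_bitrate_kbps": 8000,
        "languages": {"zh", "en"},
    },
}
MAX_CONCURRENT = 500_000  # assumed overload guard

def allow_realtime(req: StreamRequest, current_users: int) -> bool:
    """True if the request may use the real-time stream; False means
    the player falls back to segment-based delivery."""
    rule = POLICY.get(req.program_id)
    if rule is None or current_users >= MAX_CONCURRENT:
        return False
    return (req.device in rule["devices"]
            and req.bitrate_kbps <= rule["max_bitrate_kbps"]
            and req.language in rule["languages"])
```

The key design point is that the fallback path is always segment‑based streaming, so a denied request degrades latency but never availability.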
Program Production and Transcoding

The end‑to‑end workflow—from satellite signal reception to viewer playback—was streamlined. Dedicated fiber links connect the master control room to multiple studios, reducing latency by about three seconds. Transcoding clusters and SFU nodes were optimized to cut another two seconds, and RTMP back‑to‑origin latency was kept under 100 ms.
Bandwidth Cost Optimization

Only two stages of the live stream traverse the public Internet: CDN back‑to‑origin and viewer pull from CDN. By reusing VOD bandwidth via Rcache, the CDN load and associated costs were dramatically reduced.
Rcache Upload Performance

Rcache was engineered for high stability and high upload ratio. Connection validation, keep‑alive checks, and buffer threshold management mitigate packet loss and network congestion. Multi‑stream support and weighted load balancing enable efficient use of limited upstream bandwidth.
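The weighted load balancing can be sketched as weighted random selection, where each node's weight tracks its spare upstream bandwidth. The node records and weights below are invented for illustration; the source does not specify the weighting function.

```python
import random

# Hypothetical Rcache node records: weight reflects spare uplink capacity,
# so heavier nodes receive proportionally more upload sessions.
NODES = [
    {"id": "rcache-01", "weight": 40},
    {"id": "rcache-02", "weight": 25},
    {"id": "rcache-03", "weight": 10},
]

def pick_node(nodes: list[dict]) -> dict:
    """Weighted random selection: a node with weight w is chosen with
    probability w / sum(weights)."""
    total = sum(n["weight"] for n in nodes)
    r = random.uniform(0, total)
    for n in nodes:
        r -= n["weight"]
        if r <= 0:
            return n
    return nodes[-1]  # guard against floating-point edge cases
```

Updating weights as nodes report their buffer levels and keep‑alive status lets the balancer steer traffic away from congested uplinks without hard cutovers.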
Load Balancing and Scheduling Service

Prepush servers launch and manage Rcache nodes, ensuring timely startup and balanced distribution across regions and carriers. The scheduling service dynamically adjusts hash layers based on request volume, prioritizes nearby Rcache nodes, and provides primary‑backup failover for high availability. Observability tools monitor thousands of Rcache devices to quickly detect and resolve failures.
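A scheduler with these properties can be sketched with rendezvous‑style hashing: rank nodes by a hash of the stream key plus the node ID, prefer nodes in the viewer's region, and hand back the runner‑up as the failover target. The hashing scheme and field names are assumptions; the source only states that hash layers, proximity, and primary‑backup failover exist.

```python
import hashlib

def _hash(key: str) -> int:
    """Stable hash so the same stream key maps to the same node ranking."""
    return int(hashlib.md5(key.encode()).hexdigest(), 16)

def schedule(stream_key: str, nodes: list[dict], region: str) -> tuple:
    """Return (primary, backup) Rcache nodes for a viewer.

    Nodes in the viewer's region are preferred; the full pool is used
    only if no local node exists. Rendezvous-style ranking keeps the
    mapping stable as nodes join or leave."""
    local = [n for n in nodes if n["region"] == region] or nodes
    ranked = sorted(local, key=lambda n: _hash(stream_key + n["id"]))
    primary = ranked[0]
    backup = ranked[1] if len(ranked) > 1 else ranked[0]
    return primary, backup
```

Because the ranking is deterministic per stream key, a failed primary sends every affected viewer to the same backup, keeping cache hit rates high after a failover.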
Automated Operations

An automated workflow reduces night‑time operational burden. Prepush automatically fetches and launches Rcache nodes before a match, releases resources after completion, and forces stream termination when necessary to reallocate nodes for upcoming matches. Multi‑bitrate allocation is triggered when concurrency exceeds node capacity.
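Two small pieces of arithmetic sit behind this workflow: sizing the prepush (how many nodes a match needs) and splitting bitrate variants across nodes once one node cannot carry them all. A minimal sketch, with all capacities and bitrates assumed:

```python
def nodes_needed(expected_concurrency: int, per_node_capacity: int) -> int:
    """Ceiling division: how many Rcache nodes to prepush for a match."""
    return -(-expected_concurrency // per_node_capacity)

def allocate_bitrates(bitrates: list[int], node_count: int) -> dict:
    """Round-robin bitrate variants across nodes when concurrency
    exceeds what a single node can serve."""
    plan = {i: [] for i in range(node_count)}
    for i, b in enumerate(bitrates):
        plan[i % node_count].append(b)
    return plan

# Assumed figures: 120k expected viewers, 50k viewers per node.
print(nodes_needed(120_000, 50_000))             # 3
print(allocate_bitrates([8000, 4000, 2000], 2))  # {0: [8000, 2000], 1: [4000]}
```

Running the sizing step before kickoff and the release step after the final whistle is what lets the same node pool rotate through a night of back‑to‑back matches without manual intervention.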
Playback Experience Optimization

Key metrics—startup time, playback latency, and stutter—are monitored across self‑built and third‑party CDNs. Adaptive heartbeat logic, automatic frame‑catch‑up, seamless switching between Rcache and CDN, and fallback to segmented streams ensure a smooth viewing experience even under adverse network conditions.
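Frame catch‑up is typically implemented by playing slightly faster than real time whenever the client buffer grows past a threshold, draining accumulated latency without visible frame drops. The thresholds and the 1.25x cap below are illustrative assumptions, not the system's actual tuning.

```python
def playback_rate(buffer_seconds: float, high_water: float = 3.0) -> float:
    """Return the playback speed: 1.0x normally, ramping up to 1.25x
    while the buffer is above the high-water mark, so accumulated
    latency drains gradually and inaudibly."""
    if buffer_seconds <= high_water:
        return 1.0
    # Scale speed with the excess buffer; cap it so the speed-up
    # stays below the threshold where viewers would notice.
    return min(1.25, 1.0 + 0.05 * (buffer_seconds - high_water))
```

Paired with the fallback to segmented streams, this gives a graceful degradation ladder: speed up first, switch delivery path second, and only then accept the higher latency of segment‑based playback.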
Conclusion

With rapid technological advances and growing demand for high‑quality live streams, low‑latency streaming solutions will become increasingly widespread. The implemented system significantly reduces the cost of low‑latency live streaming, drives industry development, and delivers an excellent viewing experience to users.
iQIYI Technical Product Team