OceanBase TPC‑C Benchmark: Technical Insights and Future Directions
This article provides a detailed technical analysis of OceanBase’s TPC‑C benchmark performance, covering the database’s development history, benchmark methodology, identified shortcomings, future improvement plans, and the strict standards and audit processes that ensure reliable OLTP benchmarking results.
Following OceanBase’s top ranking in the TPC‑C benchmark, the core R&D team offers a professional technical interpretation of the test, marking the first of a series of forthcoming articles.
OceanBase was initiated in 2010 and, over nine years, has grown from serving Alipay to supporting customers across a wide range of industries; the TPC‑C benchmark now serves as a showcase of its capabilities.
The benchmark revealed that, while OceanBase’s distributed architecture allows performance scaling by adding ordinary hardware, per‑node performance still has room for improvement, and its functionality, usability, and ecosystem lag behind industry leaders.
Future development will focus on two key areas: enhancing Oracle compatibility to ease migration and improving OLAP processing to support both OLAP and OLTP workloads within a single engine.
The team also plans to open‑source the TPC‑C testing tool to foster industry collaboration on database technology development.
TPC‑C, created by the Transaction Processing Performance Council (TPC), is a widely accepted OLTP benchmark that simulates order‑creation, payment, and other retail transactions, measuring performance in tpmC (New‑Order transactions completed per minute).
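The TPC‑C specification fixes the transaction mix a driver must generate: Payment must be at least 43% of transactions, and Order‑Status, Delivery, and Stock‑Level at least 4% each, with New‑Order (the only type counted in tpmC) taking the remainder, roughly 45%. A minimal sketch of a driver selecting the next transaction type under those weights (the function and structure here are illustrative, not OceanBase's actual test harness):

```python
import random

# TPC-C minimum shares for four of the five transaction types;
# New-Order takes the remainder (~45%) and is the only one counted in tpmC.
MIX = [
    ("new_order",    0.45),
    ("payment",      0.43),
    ("order_status", 0.04),
    ("delivery",     0.04),
    ("stock_level",  0.04),
]

def pick_transaction(rng: random.Random) -> str:
    """Pick the next transaction type according to the benchmark mix."""
    r = rng.random()
    cumulative = 0.0
    for name, weight in MIX:
        cumulative += weight
        if r < cumulative:
            return name
    return MIX[-1][0]  # guard against floating-point rounding
```

Over a long run, a conforming driver's observed shares must stay at or above the spec minimums.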
Results must be audited by TPC‑recognized auditors and published on www.tpc.org; figures that have not passed an audit cannot legitimately be claimed as TPC‑C results. No other Chinese database vendor has a published TPC‑C report.
The benchmark builds on the earlier DebitCredit standard, which defined functional requirements, scalability criteria, transaction latency limits, and cost considerations for database performance testing.
Some vendors have altered the original standards to inflate results, highlighting the need for strict TPC audit enforcement.
TPC‑C mandates full ACID compliance, including serializable isolation, and challenges distributed databases with a significant proportion of distributed transactions.
Performance must scale proportionally with data volume: each warehouse (roughly 70 MB of initial data) can contribute at most 12.86 tpmC, so a high tpmC score implies a correspondingly large dataset and real hardware and storage capacity rather than a small, cache‑resident workload.
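The 12.86 tpmC ceiling follows from the spec's keying and think times: each warehouse drives 10 terminals, and the timing constraints cap each terminal at 1.286 New‑Order transactions per minute. A small sketch of the resulting sizing arithmetic (the 1,000,000 tpmC target below is purely a hypothetical example):

```python
import math

TPMC_PER_WAREHOUSE = 12.86   # spec ceiling: 10 terminals x 1.286 tpmC each
WAREHOUSE_MB = 70            # approximate initial data per warehouse

def min_warehouses(target_tpmc: float) -> int:
    """Smallest warehouse count whose theoretical ceiling covers the target."""
    return math.ceil(target_tpmc / TPMC_PER_WAREHOUSE)

def initial_data_tb(warehouses: int) -> float:
    """Approximate initial data volume in terabytes."""
    return warehouses * WAREHOUSE_MB / 1024 / 1024

# Hypothetical target of 1,000,000 tpmC:
w = min_warehouses(1_000_000)   # 77,761 warehouses, ~5.2 TB of initial data
```

This is why tpmC cannot be inflated by shrinking the dataset: the warehouse count, and therefore the data volume, must grow in lockstep with the claimed throughput.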
The benchmark also requires steady‑state performance for at least eight hours with less than 2% variation; OceanBase achieved less than 0.5% variation during the test.
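The stability requirement above can be checked by sampling throughput over fixed intervals and bounding each interval's deviation from the run's mean. A minimal sketch, assuming per‑interval tpmC samples (the sample values and 2% threshold application are illustrative, not OceanBase's audit tooling):

```python
def max_deviation(samples: list[float]) -> float:
    """Largest relative deviation of any interval's throughput from the mean."""
    mean = sum(samples) / len(samples)
    return max(abs(s - mean) / mean for s in samples)

# Hypothetical per-interval tpmC samples from a steady run:
samples = [10000, 10020, 9990, 10010, 9980]
assert max_deviation(samples) < 0.02  # within the 2% steady-state bound
```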
Checkpoint intervals must be under 30 minutes, but OceanBase’s multi‑replica architecture and baseline‑plus‑incremental storage model meet this requirement without relying on immediate disk writes.
While profile‑directed optimization (PDO) is allowed, OceanBase did not use PDO to avoid potential disputes.
The test was conducted on Alibaba Cloud ECS instances, demonstrating that cloud resources can reduce cost and simplify scaling for benchmark execution.
Readers are invited to watch an interview with OceanBase founder Yang Zhenkun to learn more about the ten‑year journey of this domestically developed distributed database.