FinBench: A Financial Graph Benchmark for System Selection and Evaluation
FinBench is a financial graph benchmark developed by Ant Group under the LDBC umbrella. This overview covers the benchmark's background, scenarios, design, chokepoints, and future plans, with the aim of guiding database selection for financial use cases.
FinBench (Financial Benchmark) is an open‑source benchmark project initiated by Ant Group and co‑developed with multiple vendors under the LDBC (Linked Data Benchmark Council) organization, targeting the evaluation of graph database systems in financial scenarios.
The background section explains why a dedicated benchmark is needed, comparing graph workloads to classic relational benchmarks such as TPC‑C and highlighting the diversity of graph query languages (Cypher, Gremlin) and data models (RDF, property graph). It reviews LDBC's existing benchmarks (SPB, Graphalytics, SNB) and motivates the creation of a financial‑specific one.
FinBench scenarios cover typical risk‑control use cases, such as detecting transfer cycles for fraud prevention and tracing fund flows from loans. These scenarios illustrate graph modeling of accounts and transactions, and they define read‑write queries that wrap risk‑control logic in a single transaction to allow early abort of suspicious transfers.
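To make the transfer‑cycle scenario concrete, here is a minimal sketch of ring detection in a directed account‑transfer graph. The account IDs, edge tuples, and the `find_transfer_cycle` helper are invented for illustration; in FinBench itself such patterns are expressed as graph queries against the benchmark schema, not as application code.

```python
from collections import defaultdict

def find_transfer_cycle(transfers, start, max_hops=4):
    """Return one transfer ring of at most max_hops hops that starts and
    ends at `start`, or None if no such ring exists.

    transfers: iterable of (src_account, dst_account) edges.
    """
    graph = defaultdict(list)
    for src, dst in transfers:
        graph[src].append(dst)

    # Iterative DFS over simple paths rooted at `start`.
    stack = [(start, [start])]
    while stack:
        node, path = stack.pop()
        if len(path) > max_hops + 1:
            continue
        for nxt in graph[node]:
            if nxt == start and len(path) >= 2:
                return path + [start]          # closed ring found
            if nxt not in path:                # simple paths only
                stack.append((nxt, path + [nxt]))
    return None

transfers = [("A", "B"), ("B", "C"), ("C", "A"), ("C", "D")]
print(find_transfer_cycle(transfers, "A"))  # ['A', 'B', 'C', 'A']
```

A risk‑control system would run a check like this inside the same transaction as the transfer itself, so a detected ring can abort the suspicious transfer before it commits.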
The design section details the benchmark suite components (DataGen, Driver, Reference Implementation, ACID Suite), the data schema (five vertex types, edge multiplicity), data distribution (power‑law degree, timestamped edges), and the transaction workload. The workload includes Complex Read, Simple Read, Write, and Read‑Write queries, as well as temporal window filtering, special patterns (transfer rings, guarantee chains), recursive path filtering, and load patterns observed in real systems.
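The temporal window filtering mentioned above can be sketched as follows. The edge tuples, field layout, and the `transfers_in_window` helper are assumptions made for this example; FinBench specifies such filters as query parameters over its timestamped edges.

```python
from datetime import datetime

def transfers_in_window(edges, start, end):
    """Keep only transfer edges whose timestamp falls in [start, end).

    edges: iterable of (src, dst, amount, timestamp) tuples.
    """
    return [e for e in edges if start <= e[3] < end]

edges = [
    ("acct1", "acct2", 100.0, datetime(2023, 1, 10)),
    ("acct2", "acct3", 250.0, datetime(2023, 1, 20)),
    ("acct1", "acct3",  50.0, datetime(2023, 2, 5)),
]
january = transfers_in_window(edges, datetime(2023, 1, 1), datetime(2023, 2, 1))
print(len(january))  # 2
```

Because almost every FinBench query carries such a time window, how a system stores and indexes edges by timestamp (the time‑based edge locality discussed under chokepoints) largely determines how cheap this filter is.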
Chokepoints identify technical challenges such as expressing recursive path filters in Cypher, optimizing time‑based edge locality, and handling the mixed read‑write nature of Read‑Write queries. The benchmark proposes possible optimizations for these challenges.
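To illustrate the recursive path‑filter chokepoint, the sketch below enumerates transfer paths where each hop's timestamp is strictly later than the previous hop's, a per‑hop constraint that is awkward to state inside a variable‑length Cypher pattern. The data and the `time_increasing_paths` function are invented for this example.

```python
from collections import defaultdict

def time_increasing_paths(transfers, src, dst, max_hops=3):
    """Enumerate simple paths from src to dst (up to max_hops hops) whose
    edge timestamps strictly increase along the path.

    transfers: iterable of (src_account, dst_account, timestamp) edges.
    """
    graph = defaultdict(list)
    for s, d, ts in transfers:
        graph[s].append((d, ts))

    results = []

    def dfs(node, last_ts, path):
        if node == dst and len(path) > 1:
            results.append(list(path))
            return
        if len(path) > max_hops:
            return
        for nxt, ts in graph[node]:
            if ts > last_ts and nxt not in path:  # timestamps must increase
                path.append(nxt)
                dfs(nxt, ts, path)
                path.pop()

    dfs(src, float("-inf"), [src])
    return results

transfers = [("A", "B", 1), ("B", "C", 2), ("B", "C", 0), ("A", "C", 5)]
print(time_increasing_paths(transfers, "A", "C"))
# [['A', 'B', 'C'], ['A', 'C']]
```

Note that the filter prunes during traversal (the `ts > last_ts` check), rather than enumerating all paths and filtering afterwards; pushing that pruning into the query engine is exactly the optimization the chokepoint calls for.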
Progress and plans: FinBench was proposed in February 2022, kicked off in June 2022, and released an alpha version at the end of January 2023. After internal testing with participating vendors (Ant Group, Chuanglin Technology, StarGraph, Ultipa, Intel, TigerGraph, Vesoft, etc.), a stable version is expected mid‑year.
Project links are provided for the documentation repository (https://github.com/ldbc/ldbc_finbench_docs) and the official benchmark page (https://ldbcouncil.org/benchmarks/finbench/).
DataFunSummit