
Tencent OlaChat: An LLM‑Powered Intelligent Business Intelligence Platform – Architecture, Capabilities, and Practice

This article presents the evolution from traditional to intelligent BI, explores how large language models enable natural‑language data analysis, details the OlaChat platform’s architecture, metadata‑enhanced retrieval methods, Text2SQL pipeline, multi‑turn dialogue system, and shares practical deployment insights and Q&A.

DataFunTalk

In the rapidly evolving data analysis landscape, intelligent analytics platforms are transitioning from traditional BI to agile analysis and now to LLM‑driven intelligent BI, enabling users to interact with data via natural language and reducing learning costs.

The talk outlines four main parts: the shift from traditional BI to intelligent BI, new possibilities brought by LLMs, the practical implementation of Tencent's OlaChat platform, and a Q&A session.

Traditional BI suffers from top‑down workflows, long development cycles, and high communication overhead, while agile analysis improves accessibility but still requires significant user training.

With the rise of large language models, intelligent BI can understand user intent, generate SQL, and provide intuitive insights, turning every user into a data analyst.

The presentation reviews the development of LLMs—from early probabilistic models, through word2vec and LSTM, to Transformers, BERT, GPT, and today’s trillion‑parameter models—highlighting their impact on data intelligence.

Two metadata‑retrieval strategies are described: FlattenedRAG, which converts structured metadata into natural‑language text for conventional retrieval, and StructuredRAG, which leverages the hierarchical nature of metadata for more precise searching.
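
The FlattenedRAG idea can be illustrated with a minimal sketch: structured table metadata is rendered into natural-language passages, which can then be indexed and retrieved like any other text corpus. The record shape and the keyword-overlap scorer below are illustrative assumptions, not OlaChat's actual implementation.

```python
def flatten_metadata(table: dict) -> str:
    """Render one table's structured metadata as a retrievable text passage."""
    cols = "; ".join(f"{c['name']} ({c['desc']})" for c in table["columns"])
    return f"Table {table['name']}: {table['desc']}. Columns: {cols}."

def retrieve(query: str, tables: list[dict], top_k: int = 1) -> list[str]:
    """Rank flattened passages by naive keyword overlap with the query.

    A production system would use embeddings; overlap keeps the sketch
    dependency-free while showing the same flatten-then-retrieve flow.
    """
    q_terms = set(query.lower().split())
    passages = [flatten_metadata(t) for t in tables]
    return sorted(
        passages,
        key=lambda p: len(q_terms & set(p.lower().split())),
        reverse=True,
    )[:top_k]

# Hypothetical metadata records for demonstration.
tables = [
    {"name": "dau_daily", "desc": "daily active users per product",
     "columns": [{"name": "dt", "desc": "stat date"},
                 {"name": "dau", "desc": "daily active users"}]},
    {"name": "revenue_daily", "desc": "daily revenue per product",
     "columns": [{"name": "dt", "desc": "stat date"},
                 {"name": "rev", "desc": "revenue"}]},
]
print(retrieve("daily active users last week", tables)[0])
```

StructuredRAG, by contrast, would keep the table/column hierarchy and search it level by level rather than flattening everything into one passage.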

Challenges of Text2SQL in real‑world scenarios (privacy, model hallucination, data quality, noise) are addressed by a fine‑tuned LLM combined with an agent framework, selective field feeding, few‑shot prompting, and post‑generation SQL validation.
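
One of the guardrails above, post-generation SQL validation, can be sketched cheaply: dry-run the generated statement with `EXPLAIN` against a schema-only, in-memory database, so hallucinated tables or columns are caught before anything touches production data. The schema and queries below are hypothetical; SQLite stands in for whatever engine OlaChat actually targets.

```python
import sqlite3

def validate_sql(sql: str, schema_ddl: str) -> tuple[bool, str]:
    """Return (ok, message) by EXPLAIN-ing the SQL against an empty schema."""
    conn = sqlite3.connect(":memory:")
    try:
        conn.executescript(schema_ddl)  # build tables only; no data needed
        conn.execute(f"EXPLAIN {sql}")  # parses SQL and resolves all names
        return True, "ok"
    except sqlite3.Error as e:
        return False, str(e)
    finally:
        conn.close()

schema = "CREATE TABLE dau_daily (dt TEXT, product TEXT, dau INTEGER);"
print(validate_sql("SELECT dt, SUM(dau) FROM dau_daily GROUP BY dt", schema))
print(validate_sql("SELECT dau FROM no_such_table", schema))
```

A failed validation can be fed back to the model as a repair prompt, which pairs naturally with the few-shot prompting mentioned above.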

OlaChat’s core components include a multi‑turn dialogue system, task orchestration engine, AI‑BI toolbox (query rewriting, Text2SQL, metric analysis), and shared services such as unified LLM scheduling, metadata search, and annotation systems.
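
The relationship between the orchestration engine and the toolbox can be sketched as a registry that dispatches each user turn to a named capability. The tool names, intent labels, and fallback policy here are illustrative assumptions, not OlaChat's real interfaces.

```python
from typing import Callable

# Registry mapping intent labels to toolbox capabilities.
TOOLS: dict[str, Callable[[str], str]] = {}

def tool(name: str):
    """Register a toolbox capability under an intent label."""
    def register(fn: Callable[[str], str]) -> Callable[[str], str]:
        TOOLS[name] = fn
        return fn
    return register

@tool("rewrite_query")
def rewrite_query(q: str) -> str:
    # Placeholder for the query-rewriting tool.
    return f"rewritten: {q}"

@tool("text2sql")
def text2sql(q: str) -> str:
    # Placeholder for the Text2SQL tool.
    return f"SELECT ... -- generated for: {q}"

def orchestrate(intent: str, query: str) -> str:
    """Dispatch one turn; unknown intents fall back to query rewriting."""
    return TOOLS.get(intent, TOOLS["rewrite_query"])(query)

print(orchestrate("text2sql", "daily active users last week"))
```

In a multi-turn dialogue, the orchestrator would also carry conversation state between calls, so a follow-up like "now split by product" can be resolved against the previous query.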

The architecture is layered from low-level services up through public services, agents, and a unified backend to a unified frontend, delivering a seamless user experience for data querying, insight generation, and analysis.

The session concludes with a Q&A covering model sizes, attribution accuracy, SQL correction and interpretation, and the trade‑offs between direct SQL generation and semantic simplification.

Tags: AI agents, LLM, RAG, Business Intelligence, intelligent analytics, Metadata Retrieval, Text2SQL
Written by

DataFunTalk

Dedicated to sharing and discussing big data and AI technology applications, aiming to empower a million data scientists. Regularly hosts live tech talks and curates articles on big data, recommendation/search algorithms, advertising algorithms, NLP, intelligent risk control, autonomous driving, and machine learning/deep learning.
