
Optimization Practices for Business Opportunity Slot Recognition in 58.com Intelligent Customer Service

This article details the background, challenges, architecture, model selection, and future directions of the business‑opportunity slot recognition module used in 58.com’s intelligent customer service, highlighting how regex‑model fusion and IDCNN‑CRF improve entity extraction for phone, WeChat, address, and time slots.

58 Tech

The 58.com platform connects millions of users and merchants. Its intelligent customer service assistant (Bang Bang) uses a business-opportunity (商机) chatbot to capture four key pieces of information (phone, WeChat, address, and time) through slot recognition, and the quality of that extraction directly drives the conversion-rate metric.

Each opportunity type is broken down into specific slots (e.g., phone → mobile, carrier; WeChat → wechat, QQ; address → city, subway, etc.), and the conversion rate is defined as the number of conversations that yield an opportunity divided by total conversations.
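The taxonomy and metric above can be sketched as follows. This is a minimal illustration, not 58.com's production schema; the exact slot names are assumptions beyond the examples the article gives:

```python
# Illustrative slot taxonomy; the sub-slot names follow the article's examples
# (phone -> mobile, carrier; WeChat -> wechat, qq; address -> city, subway, ...),
# with placeholders where the article says "etc."
OPPORTUNITY_SLOTS = {
    "phone":   ["mobile", "carrier"],
    "wechat":  ["wechat", "qq"],
    "address": ["city", "subway"],
    "time":    ["date", "hour"],
}

def conversion_rate(opportunity_conversations: int, total_conversations: int) -> float:
    """Conversion rate = conversations that yield an opportunity / total conversations."""
    if total_conversations == 0:
        return 0.0
    return opportunity_conversations / total_conversations
```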

Key challenges include discontinuous user expressions (especially for address and time), ambiguous WeChat identifiers that resemble arbitrary strings, and insufficient context windows causing missed recalls. Solutions involve finer‑grained slot splitting, blacklist filtering for WeChat, and incorporating the last five dialogue turns as contextual information.
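Two of those fixes are simple enough to sketch directly: blacklist filtering for WeChat-like strings, and a five-turn context window. The candidate pattern below is an assumption based on WeChat's public ID rules (6-20 characters, starting with a letter), and the blacklist entries are illustrative:

```python
import re

# Illustrative blacklist of common words that look like arbitrary strings
# but are not WeChat IDs; the production list is not given in the article.
WECHAT_BLACKLIST = {"hello", "thanks", "please"}

def wechat_candidates(utterance: str) -> list:
    # Assumed pattern: starts with a letter, 6-20 chars of letters/digits/_/-.
    cands = re.findall(r"[A-Za-z][A-Za-z0-9_-]{5,19}", utterance)
    return [c for c in cands if c.lower() not in WECHAT_BLACKLIST]

def context_window(history: list, current: str, k: int = 5) -> list:
    """Keep the last k dialogue turns plus the current utterance as model input."""
    return history[-k:] + [current]
```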

The overall architecture combines regular‑expression rules with neural models. Regex quickly captures phone, WeChat, and QQ patterns, while the model (initially BILSTM‑CRF, later IDCNN‑CRF) handles phone, address, and time extraction. Results from both are merged, and context information is optionally added to improve accuracy.
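A minimal sketch of that fusion step, with simplified regex patterns (mainland-China mobile numbers and QQ numbers here; the production patterns and the exact merge policy are assumptions):

```python
import re

# High-precision patterns handled by the regex side of the fusion.
PATTERNS = {
    "mobile": re.compile(r"1[3-9]\d{9}"),                  # mainland-China mobile
    "qq":     re.compile(r"(?<!\d)[1-9]\d{4,10}(?!\d)"),   # 5-11 digit QQ number
}

def regex_slots(text: str) -> dict:
    """Return the first regex hit for each slot type, if any."""
    return {slot: m.group() for slot, pat in PATTERNS.items() if (m := pat.search(text))}

def merge_slots(regex_out: dict, model_out: dict) -> dict:
    """Merge model and regex results; regex wins on conflicts (assumed policy)."""
    merged = dict(model_out)
    merged.update(regex_out)
    return merged
```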

Model exploration compared BILSTM‑CRF, IDCNN‑CRF, and BERT‑augmented versions. IDCNN‑CRF was chosen for its comparable accuracy to BILSTM‑CRF and lower inference latency. The pipeline embeds the query, applies dilated convolutions (IDCNN), a fully‑connected layer, and a CRF layer to enforce BIO tagging constraints.
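What the CRF layer buys over per-token classification is exactly the BIO transition constraints. A pure-Python validity check makes the rule concrete (the CRF learns transition scores that rule these sequences out rather than checking them explicitly):

```python
def is_valid_bio(tags: list) -> bool:
    """A tag sequence is valid BIO iff every I-X follows a B-X or I-X of the same type X."""
    prev = "O"
    for tag in tags:
        if tag.startswith("I-"):
            entity = tag[2:]
            if prev not in (f"B-{entity}", f"I-{entity}"):
                return False
        prev = tag
    return True
```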

Further experiments introduced a context‑aware slot recognizer using bi‑GRU to encode historical turns, concatenating current and historical representations, and feeding them into a BILSTM‑CRF. Although this approach outperformed a simple multi‑turn concatenation, it still lagged behind the strong regex baseline for WeChat slots.
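The concatenation step of that context-aware recognizer can be sketched with NumPy. Mean pooling stands in here for the bi-GRU history encoder the article describes, so this shows only the shape of the fusion, not the actual encoder:

```python
import numpy as np

def fuse_context(current_tokens: np.ndarray, history_vecs: np.ndarray) -> np.ndarray:
    """Concatenate a history summary onto each current-token representation.

    current_tokens: (T, d) token vectors for the current utterance.
    history_vecs:   (H, d) one vector per historical turn.
    Returns:        (T, 2d) fused input for the downstream BILSTM-CRF tagger.
    """
    ctx = history_vecs.mean(axis=0)                       # (d,) history summary
    ctx = np.tile(ctx, (current_tokens.shape[0], 1))      # repeat per token: (T, d)
    return np.concatenate([current_tokens, ctx], axis=1)  # (T, 2d)
```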

In summary, the optimized slot‑recognition module has boosted opportunity conversion rates close to human‑level performance. Future work aims to expand recognition to ~40 new opportunity types (e.g., product, quantity, price), enhance the context‑aware model with more data and augmentation, and better leverage pre‑trained models like BERT.

Tags: NLP, BERT, entity extraction, dialogue system, CRF, context modeling, IDCNN, slot recognition
Written by

58 Tech

Official tech channel of 58, a platform for tech innovation, sharing, and communication.
