Tag: knowledge integration


DataFunSummit
Sep 20, 2024 · Artificial Intelligence

Exploring and Applying Large Language Models in Recommendation Systems

Professor Wang Yichao from Huawei Noah's Ark Lab presents a comprehensive exploration of large language models in recommendation systems, covering background, challenges, two key projects (LLM4Rec and Uni-CTR), experimental results, and future directions for open, knowledge‑enhanced, generative recommendation pipelines.

AI · LLM4Rec · Recommendation systems
13 min read
DataFunTalk
Jan 15, 2023 · Artificial Intelligence

Advances in Dialogue Systems: Baidu PLATO Large‑Scale Conversational Models

This article reviews the evolution of dialogue systems from modular task‑oriented designs to end‑to‑end large‑scale models, detailing Baidu's PLATO series, their technical innovations, real‑world deployments, challenges such as inference efficiency and safety, and future research directions in conversational AI.

AI Safety · PLATO · conversational AI
13 min read
DataFunTalk
Dec 1, 2022 · Artificial Intelligence

Advances and Challenges in Controllable Text Generation with Pretrained Language Models

This report reviews the background, recent research progress, practical applications, and future directions of controllable text generation using transformer‑based pretrained language models, highlighting methods such as decoding strategies, prompt learning, memory networks, continual learning, contrastive training, and knowledge integration.

continual learning · contrastive training · controllable text generation
13 min read
DataFunSummit
Jul 18, 2022 · Artificial Intelligence

Advances in Natural Language Generation: ProphetNet, Knowledge‑Enhanced Generation, Non‑Autoregressive Pre‑training, Long‑Text Modeling, and Efficient Attention

This talk presents recent research on natural language generation, covering the ProphetNet pre‑trained generation model, external‑knowledge integration for generation, non‑autoregressive pre‑training (BANG), the Poolingformer long‑text architecture, EL‑attention for faster decoding, and a new multi‑task generation benchmark.

Natural Language Generation · efficient attention · knowledge integration
22 min read
DataFunTalk
Nov 5, 2021 · Artificial Intelligence

End-to-End Entity Extraction for Tmall Genie: Speech2Slot Model and Unsupervised Pre‑Training

This article presents the business background of Tmall Genie’s voice‑driven content‑on‑demand service, critiques the traditional pipeline for entity extraction, and details an end‑to‑end speech‑semantic model—including the Speech2Slot architecture, knowledge‑enhanced encoding, and Phoneme‑BERT unsupervised pre‑training—demonstrating significant performance gains in both generation and classification tasks.

Speech Recognition · Voice Assistant · end-to-end model
14 min read
JD Tech
Feb 2, 2021 · Artificial Intelligence

Advances and Trends in Multimodal Digital Content Generation and Automatic Text Summarization

The article reviews recent research on multimodal digital content generation and automatic text summarization, outlining the evolution from extractive to abstractive methods, highlighting four key technology trends (pretrained language models, transformer dominance, knowledge‑enhanced generation, and multimodal‑knowledge joint modeling), and describing an industrial e‑commerce application built on these advances.

Generative Models · Text Summarization · e-commerce
12 min read
DataFunTalk
Apr 16, 2020 · Artificial Intelligence

Comprehensive Survey of Pre-trained Models for Natural Language Processing

This article provides a detailed survey of pre‑trained models (PTMs) for natural language processing, classifying them into shallow embeddings and contextual encoders, discussing extensions such as knowledge integration and model compression, and offering guidance on transfer learning and future challenges.

Natural Language Processing · knowledge integration · model compression
25 min read