Tag: text representation

1 view collected around this technical thread.

Baidu Tech Salon
Mar 21, 2025 · Artificial Intelligence

Semantic Embedding with Large Language Models: A Comprehensive Survey

This survey reviews the evolution of semantic embedding, from Word2Vec and GloVe to BERT, Sentence-BERT, and recent contrastive methods, then examines how large language models improve embeddings through synthetic data generation and by serving as backbone architectures, detailing techniques such as contrastive prompting, in-context learning, and knowledge distillation, and discussing resource, privacy, and interpretability challenges.

In-Context Learning · NLP · contrastive learning
0 likes · 27 min read
DataFunSummit
Jul 17, 2023 · Artificial Intelligence

Introduction to ModelScope Community's Fundamental NLP Models and Their Applications

This article introduces the ModelScope community's suite of foundational NLP models, including tokenization, POS tagging, NER, and text representation, detailing their architectures, performance, and application scenarios, and highlighting research contributions such as the ACE framework and retrieval-enhanced techniques.

Entity Recognition · ModelScope · NLP
0 likes · 21 min read
Taobao Tech
Apr 12, 2022 · Artificial Intelligence

ArcCSE: Angular Margin Contrastive Learning for Self‑Supervised Text Representation

ArcCSE introduces an angular-margin contrastive loss and both pairwise (dropout-augmented) and triple-wise (span-masked) relationship modeling to self-supervise text embeddings, yielding tighter decision boundaries, better alignment and uniformity, and superior performance on unsupervised STS, SentEval, and Alibaba's retrieval and recommendation systems.

NLP · angular margin · contrastive learning
0 likes · 8 min read
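The core of the angular-margin loss described in the summary adds a margin to the anchor-positive angle before applying an InfoNCE-style softmax. A minimal numpy sketch of that idea follows; the margin and temperature values are illustrative, not the paper's settings:

```python
import numpy as np

def arc_contrastive_loss(anchor, positive, negatives, margin=0.1, tau=0.05):
    """InfoNCE loss with an additive angular margin on the positive pair."""
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Penalize the positive pair by widening its angle, which forces a
    # tighter decision boundary between positives and negatives.
    theta = np.arccos(np.clip(cos(anchor, positive), -1.0 + 1e-7, 1.0 - 1e-7))
    pos_logit = np.cos(theta + margin) / tau
    neg_logits = np.array([cos(anchor, n) / tau for n in negatives])

    # Numerically stable -log softmax of the margin-penalized positive.
    logits = np.concatenate(([pos_logit], neg_logits))
    m = logits.max()
    return float(m - pos_logit + np.log(np.exp(logits - m).sum()))
```

Setting `margin=0` recovers a plain SimCSE-style contrastive objective; a positive margin strictly increases the loss for the same embeddings, which is what drives the tighter decision boundaries the summary mentions.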
DataFunTalk
Oct 13, 2021 · Artificial Intelligence

Intelligent Recruitment: Deep Semantic Matching, Interview Assistance, and Text Representation

This article explores how AI techniques such as deep semantic matching, attention mechanisms, variational autoencoders, and neural topic models can transform traditional recruitment by improving person‑job matching, interview assistance, and text representation, supported by experiments on real‑world hiring data.

AI recruitment · Topic Modeling · VAE
0 likes · 18 min read
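The VAE-based text representation mentioned in the summary encodes text into a latent Gaussian and trains through sampling via the reparameterization trick, regularized by a KL term. A minimal numpy sketch of those two pieces, with illustrative function names and shapes not taken from the article:

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    """Sample z = mu + sigma * eps so the sampling step stays differentiable."""
    eps = rng.standard_normal(np.shape(mu))
    return np.asarray(mu) + np.exp(0.5 * np.asarray(log_var)) * eps

def kl_to_standard_normal(mu, log_var):
    """Closed-form KL(N(mu, sigma^2) || N(0, I)), the VAE regularizer."""
    mu, log_var = np.asarray(mu), np.asarray(log_var)
    return float(-0.5 * np.sum(1.0 + log_var - mu ** 2 - np.exp(log_var)))
```

The KL term is zero exactly when the encoder outputs the standard normal and grows as the posterior drifts away from it, which keeps the latent text representations smooth enough for downstream matching.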
DataFunSummit
Oct 13, 2021 · Artificial Intelligence

Intelligent Recruitment: Deep Semantic Matching, Interview Assistance, and Text Representation with VAE and Neural Topic Models

This article presents a comprehensive overview of applying AI techniques—semantic matching models, attention mechanisms, VAE‑based text representation, and neural topic models—to improve talent acquisition, candidate‑job matching, interview assistance, and recruitment text analysis, supported by experiments on real‑world hiring data.

AI in HR · Intelligent Recruitment · Neural Topic Model
0 likes · 19 min read
58 Tech
Aug 5, 2021 · Artificial Intelligence

Exploration and Practice of Text Representation Algorithms in the 58 Security Scenario

This article presents a comprehensive study of text representation techniques, from weighted word-vector methods to supervised SimBERT and unsupervised contrastive-learning models, applied to large-scale unstructured data in 58's information-security workflows, evaluating their effectiveness for classification and content-recall tasks.

BERT · SimCSE · contrastive learning
0 likes · 11 min read
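Weighted word-vector methods of the kind the summary mentions are typically frequency-weighted averages of pre-trained embeddings; SIF-style a / (a + p(w)) weights are one common choice. A minimal sketch under that assumption; the vocabulary and frequencies below are illustrative, not from the article:

```python
import numpy as np

def weighted_sentence_vector(tokens, word_vecs, word_freq, a=1e-3):
    """Average word vectors, down-weighting frequent words: w = a / (a + p(w))."""
    vecs, weights = [], []
    for tok in tokens:
        if tok in word_vecs:  # skip out-of-vocabulary tokens
            vecs.append(word_vecs[tok])
            weights.append(a / (a + word_freq.get(tok, 0.0)))
    if not vecs:
        raise ValueError("no in-vocabulary tokens")
    w = np.asarray(weights)[:, None]
    return (w * np.asarray(vecs)).sum(axis=0) / w.sum()
```

Rare, content-bearing words dominate the weighted average, which is one reason such baselines stay competitive on classification and content-recall tasks before moving to SimBERT- or SimCSE-style models.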
DataFunTalk
May 17, 2020 · Artificial Intelligence

Improving Text Representation and Clustering for Small‑Sample Scenarios in 58 Second‑Hand Car Intelligent Customer Service

This article presents a study on enhancing text representation and clustering in a small‑sample setting for 58's second‑hand car intelligent customer service by introducing a Bi‑LSTM based pre‑training language model and an improved Deep Embedded Clustering (DEC) algorithm, demonstrating significant gains in accuracy, silhouette score, and answer‑rate through extensive experiments.

AI · Bi-LSTM · DEC
0 likes · 16 min read
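Standard DEC, which the improved algorithm in the summary builds on, alternates a Student's-t soft assignment with a sharpened target distribution and minimizes the KL divergence between them. A minimal numpy sketch of that baseline objective (the article's specific improvements are not reproduced here):

```python
import numpy as np

def soft_assign(z, centroids, alpha=1.0):
    """Student's t soft assignment: q_ij ~ (1 + ||z_i - mu_j||^2 / alpha)^-(alpha+1)/2."""
    d2 = ((z[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    q = (1.0 + d2 / alpha) ** (-(alpha + 1.0) / 2.0)
    return q / q.sum(axis=1, keepdims=True)

def target_distribution(q):
    """Sharpen assignments and normalize by cluster frequency to get P."""
    weight = q ** 2 / q.sum(axis=0)
    return weight / weight.sum(axis=1, keepdims=True)

def kl_divergence(p, q):
    """KL(P || Q), which DEC minimizes by updating the encoder and centroids."""
    return float((p * np.log(p / q)).sum())
```

Because P squares and renormalizes Q, confident assignments are reinforced each iteration, which is what drives the clustering-purity gains the summary reports.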
58 Tech
May 15, 2020 · Artificial Intelligence

Improving Text Representation and Clustering for Small‑Sample Scenarios in 58.com Used‑Car Customer Service with a Bi‑LSTM Pre‑trained Language Model and Deep Clustering

This article presents a study on enhancing text representation and clustering purity in the small‑sample 58.com used‑car customer‑service scenario by introducing a Bi‑LSTM based pre‑trained language model and an improved Deep Embedded Clustering (DEC) algorithm, demonstrating significant gains in accuracy, silhouette score, and answer‑rate.

Bi-LSTM · DEC · Machine Learning
0 likes · 16 min read