Tag

contrastive learning

20 posts collected around this technical thread.

Baidu Tech Salon
Mar 21, 2025 · Artificial Intelligence

Semantic Embedding with Large Language Models: A Comprehensive Survey

This survey reviews the evolution of semantic embedding, from Word2vec and GloVe to BERT, Sentence‑BERT, and recent contrastive methods; it then examines how large language models improve embeddings, both as synthetic‑data generators and as embedding backbones, detailing techniques such as contrastive prompting, in‑context learning, and knowledge distillation, and discussing resource, privacy, and interpretability challenges.

In-Context Learning · NLP · contrastive learning
0 likes · 27 min read
DeWu Technology
Feb 19, 2025 · Artificial Intelligence

Scenario-aware Multi-Scenario Recommendation Models: SACN, SAINet, and DSWIN

The paper presents a comprehensive multi‑scenario recommendation study introducing three models—SACN, SAINet, and DSWIN—that integrate scene‑aware attention, attribute‑level preferences, and contrastive disentanglement to capture distinct user interests, achieving consistent AUC gains and online CTR improvements across real‑world datasets.

CTR prediction · contrastive learning · deep learning
0 likes · 43 min read
Xiaohongshu Tech REDtech
Dec 26, 2024 · Artificial Intelligence

Instruction Embedding: Latent Representations of Instructions for Task Identification

The paper introduces Instruction Embedding—a task‑focused text representation learned on the new Instruction Embedding Benchmark—and shows that Prompt‑based Instruction Embedding (PIE) outperforms standard embeddings in clustering, similarity, and downstream tasks such as data selection, in‑context example retrieval, test‑set compression, and task‑correlation analysis.

Fine‑tuning · contrastive learning · instruction embedding
0 likes · 15 min read
DataFunTalk
Aug 5, 2024 · Artificial Intelligence

Enhancing Taobao Display Advertising with Multimodal Representations: Challenges, Approaches, and Insights

This article presents a comprehensive study on integrating multimodal image‑text representations into large‑scale e‑commerce advertising CTR models, introducing a semantic‑aware contrastive pre‑training (SCL) method and two application algorithms (SimTier and MAKE) that together achieve over 1% GAUC improvement and significant online gains.

CTR prediction · E-commerce · contrastive learning
0 likes · 21 min read
Alimama Tech
Aug 2, 2024 · Artificial Intelligence

Multimodal Representations Boost Taobao Display Advertising CTR

Alibaba’s advertising team introduces semantic‑aware contrastive learning to pre‑train multimodal image‑text embeddings, integrates them via SimTier and MAKE into ID‑based CTR models, achieving up to 6.9% lift in Taobao display ad click‑through rates and improving long‑tail item performance.

CTR prediction · E-commerce · contrastive learning
0 likes · 21 min read
DataFunSummit
Dec 8, 2023 · Artificial Intelligence

Multimodal Cold‑Start Techniques for Music Recommendation at NetEase Cloud Music

This article presents NetEase Cloud Music's multimodal cold‑start solution, detailing the problem background, feature selection using CLIP, two modeling approaches (I2I2U indirect and U2I DSSM direct), contrastive learning enhancements, interest‑boundary modeling, and evaluation results showing significant gains in user engagement.

AI · cold start · contrastive learning
0 likes · 15 min read
Alimama Tech
Nov 15, 2023 · Artificial Intelligence

Hybrid Contrastive Constraints for Multi-Scenario Ad Ranking (HC²)

The HC² framework enhances multi‑scenario ad ranking by jointly applying a generalized contrastive loss on shared representations and an individual contrastive loss on scenario‑specific layers, using label‑aware positive sampling, diffusion‑noise negative sampling, and inverse‑similarity weighting, achieving consistent offline gains and up to 2.5% CVR and 3.7% GMV improvements in Alibaba’s live system.

ad ranking · contrastive learning · machine learning
0 likes · 16 min read
DataFunTalk
Nov 10, 2023 · Artificial Intelligence

Multimodal Cold-Start Techniques for Music Recommendation at NetEase Cloud Music

This article presents NetEase Cloud Music's multimodal cold-start recommendation approach, detailing the problem's significance, feature extraction using CLIP, I2I2U indirect modeling, U2I DSSM direct modeling with contrastive learning and interest‑boundary mechanisms, deployment pipeline, evaluation results, and future optimization directions.

cold start · contrastive learning · deep learning
0 likes · 14 min read
NetEase Media Technology Team
Nov 6, 2023 · Artificial Intelligence

Overview of Sequential Recommendation Models

The article surveys sequential recommendation models from early non-deep approaches like FPMC, through RNN-based GRU4Rec and CNN-based Caser, to Transformer-based methods such as SASRec, BERT4Rec, TiSASRec, and recent contrastive-learning techniques, recommending SASRec or its variants for production use.

contrastive learning · deep learning · recommender systems
0 likes · 17 min read
Alimama Tech
Nov 1, 2023 · Artificial Intelligence

BOMGraph: Boosting Multi-Scenario E-commerce Search with a Unified Graph Neural Network

BOMGraph introduces a unified heterogeneous graph neural network that jointly models text, image, and similar‑item search across multiple e‑commerce scenarios, using meta‑path‑guided attention, disentangled scenario‑specific and shared embeddings, and contrastive learning to alleviate sample sparsity, achieving consistent offline and online performance gains.

E-commerce · contrastive learning · graph neural network
0 likes · 13 min read
Kuaishou Tech
Sep 26, 2023 · Artificial Intelligence

Cross-Domain Product Representation (COPE): A Large-Scale Dataset and Baseline Model for Rich‑Content E‑Commerce

The paper introduces ROPE, the first large‑scale cross‑domain product recognition dataset covering detail pages, short videos and live streams, and proposes COPE, a dual‑tower multimodal model that learns unified product embeddings using contrastive and classification losses, achieving superior retrieval and few‑shot classification performance across domains.

E-commerce · contrastive learning · cross-domain
0 likes · 13 min read
Alimama Tech
Sep 12, 2023 · Artificial Intelligence

Content Collaborative Graph Neural Network for Large‑Scale E‑commerce Search

CC‑GNN addresses three drawbacks of existing graph‑neural retrieval for e‑commerce by adding content phrase nodes, scalable meta‑path message passing, and difficulty‑aware noisy contrastive learning with counterfactual augmentation, achieving up to 16% recall improvement and notably larger gains on long‑tail queries and cold‑start items.

Graph Neural Networks · cold start · content collaboration
0 likes · 19 min read
DataFunTalk
Jun 21, 2023 · Artificial Intelligence

Low‑Resource NLP Pretraining: Methodology, Experiments, and Zero‑Shot Applications

This article presents a low‑resource NLP pretraining approach that combines transformer‑based language modeling with contrastive vector learning, details the unsupervised sample‑pair construction, introduces a camel‑shaped masking distribution, and demonstrates through extensive experiments that the resulting model achieves strong zero‑shot NLU, NLG, and retrieval performance while requiring minimal compute and data.

NLP · Zero-shot · contrastive learning
0 likes · 10 min read
DataFunSummit
May 23, 2023 · Artificial Intelligence

Continuous Semantic Enhancement for Neural Machine Translation: Methodology, Experiments, and Community Deployment

This article introduces a continuous semantic enhancement approach for neural machine translation that overcomes the limitations of discrete data‑augmentation techniques, details the vicinal risk minimization training objective, presents benchmark improvements on ACL‑2022 datasets, and describes practical deployment and fine‑tuning workflows in the Modu community.

continuous semantic augmentation · contrastive learning · data augmentation
0 likes · 19 min read
AntTech
May 10, 2023 · Artificial Intelligence

Brainwave and Behavior Recognition: Multi‑Modal Biometric Authentication with Adversarial Contrastive Transfer Learning

This article presents Ant Security's research on novel biometric methods, brainwave and behavior recognition, detailing their scientific background, data collection, multi‑modal deep‑learning algorithms, adversarial and contrastive training strategies, experimental results, and practical applications for inclusive, secure identity verification.

Accessibility · adversarial learning · behavior recognition
0 likes · 17 min read
Kuaishou Tech
Apr 25, 2023 · Artificial Intelligence

DCCL: A Contrastive Learning Framework for Causal Representation Decoupling in Recommendation Systems

The paper introduces DCCL, a model‑agnostic contrastive learning framework that decouples user interest and conformity representations to address popularity bias and out‑of‑distribution challenges in recommendation systems, demonstrating significant offline and online performance gains on real‑world datasets.

OOD robustness · causal inference · contrastive learning
0 likes · 8 min read
Alimama Tech
Dec 14, 2022 · Artificial Intelligence

Contrastive Image Representation Learning with Debiasing for CTR Prediction

The article proposes a three-stage contrastive learning framework—pre‑training, fine‑tuning, and debiasing—to generate unbiased, fine‑grained image embeddings for mobile Taobao CTR prediction, achieving higher accuracy, fairness, and a 4‑5% CTR lift in large‑scale offline and online evaluations.

Bias Mitigation · CTR prediction · contrastive learning
0 likes · 14 min read
Alimama Tech
Nov 9, 2022 · Artificial Intelligence

Graph-based Weakly Supervised Framework for Semantic Relevance Learning in E-commerce

The paper introduces a graph‑based weakly supervised contrastive learning framework that uses heterogeneous user‑behavior graphs, e‑commerce‑specific augmentations, and a hybrid fine‑tuning/transfer learning strategy to improve semantic relevance matching between queries and product titles, achieving significant gains on a large‑scale Taobao dataset.

Weak Supervision · contrastive learning · e-commerce
0 likes · 12 min read
Zhuanzhuan Tech
Oct 28, 2022 · Artificial Intelligence

Contrastive Learning: Definitions, Principles, Classic Algorithms, and Applications in Recommendation Systems

This article introduces contrastive learning, explains its definition, principles, and classic algorithms such as SimCLR and MoCo, and details its practical applications in recommendation systems, including a case study of its deployment at Zhuanzhuan that boosted order rates by over 10%.

AI · contrastive learning · machine learning
0 likes · 12 min read
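For readers new to the topic: the SimCLR/MoCo family of algorithms named in the summary above shares one core objective, the InfoNCE loss, which pulls two augmented views of the same item together while treating all other items in the batch as negatives. A minimal NumPy sketch follows; the batch size, dimensionality, and temperature are arbitrary illustration values, not settings from any article listed here.

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.5):
    """InfoNCE over a batch: z1[i] and z2[i] are two views of item i;
    every other row in the batch serves as an in-batch negative."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)  # cosine geometry
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature           # (B, B) similarity matrix
    # Row i's positive sits on the diagonal; softmax cross-entropy per row
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_prob)))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
# Nearly identical views are easy to match; unrelated pairs are not
aligned = info_nce_loss(z, z + 0.01 * rng.normal(size=(8, 16)))
shuffled = info_nce_loss(z, rng.normal(size=(8, 16)))
print(aligned < shuffled)  # True: aligned views yield the lower loss
```

In practice `z1` and `z2` come from an encoder applied to two augmentations of the same input; the classic algorithms differ mainly in where negatives come from (the current batch for SimCLR, a momentum-updated queue for MoCo).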
Youzan Coder
Oct 24, 2022 · Artificial Intelligence

Knowledge Base Retrieval Matching: Algorithm and Engineering Service Practice

The article outlines a comprehensive knowledge‑base retrieval matching solution—combining PageRank‑enhanced DSL rewriting, keyword and dual‑tower vector recall, contrastive fine‑ranking, and optimized vector‑based ranking—implemented via offline DP training and Sunfish online inference on Milvus, with applications in enterprise search and recommendations and future plans for graph‑neural embeddings.

InfoNCE · Milvus · NLP
0 likes · 12 min read
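The dual‑tower vector recall mentioned in this last summary can be illustrated independently of any serving stack: a query tower and a document tower embed into a shared space, document vectors are indexed offline, and recall is a top‑k inner‑product search online. In the sketch below the two "towers" are stand‑in random projections rather than trained encoders, brute‑force matrix multiplication stands in for an ANN index such as Milvus, and all names (`embed`, `recall_top_k`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)
VOCAB, DIM = 100, 32

# Stand-in towers: in production these are two trained encoders sharing
# a vector space; here each is a fixed random token-embedding table.
W_query = rng.normal(size=(VOCAB, DIM))
W_doc = W_query + 0.05 * rng.normal(size=(VOCAB, DIM))  # loosely aligned towers

def embed(token_ids, W):
    """Average the token vectors and unit-normalize (cosine geometry)."""
    v = W[token_ids].mean(axis=0)
    return v / np.linalg.norm(v)

# Offline step: embed every document; a vector store such as Milvus would
# hold this matrix, with brute force standing in for ANN search here.
docs = [[1, 5, 9], [2, 6, 10], [1, 5, 42], [7, 8, 11]]  # token-id "titles"
doc_matrix = np.stack([embed(d, W_doc) for d in docs])

def recall_top_k(query_ids, k=2):
    """Online step: embed the query, return indices of the top-k docs."""
    q = embed(query_ids, W_query)
    scores = doc_matrix @ q                  # inner product = cosine here
    return np.argsort(-scores)[:k].tolist()

print(recall_top_k([1, 5]))  # the two docs containing tokens 1 and 5 rank first
```

The payoff of this split is that only the cheap query-side embedding and the index lookup happen at request time, which is what makes the pattern viable for the offline-training, online-inference deployments described in the summary.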