Tag

embedding normalization


DataFunSummit
Oct 29, 2021 · Artificial Intelligence

Contrastive Learning Perspectives on Retrieval and Ranking Models in Recommendation Systems

This talk explains the fundamentals of contrastive learning and typical image‑domain models such as SimCLR, MoCo and SwAV, then shows how their core principles—positive/negative sample construction, encoder design, loss functions, alignment and uniformity—can be applied to dual‑tower retrieval and ranking models through techniques such as embedding normalization, temperature scaling, and graph‑based recommender extensions.
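The tricks the talk names—in‑batch negatives, embedding normalization, and temperature scaling—all meet in the InfoNCE loss of a dual‑tower model. A minimal NumPy sketch (function name, shapes, and the 0.1 temperature are illustrative choices, not taken from the talk):

```python
import numpy as np

def info_nce_in_batch(user_emb, item_emb, temperature=0.1):
    """InfoNCE with in-batch negatives: row i of user_emb pairs with
    row i of item_emb; every other row in the batch acts as a negative."""
    # L2-normalize both towers so dot products become cosine similarities
    u = user_emb / np.linalg.norm(user_emb, axis=1, keepdims=True)
    v = item_emb / np.linalg.norm(item_emb, axis=1, keepdims=True)
    # Temperature-scaled similarity matrix: logits[i, j] = <u_i, v_j> / t
    logits = u @ v.T / temperature
    # Cross-entropy against the diagonal (the positive pairs)
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
users = rng.normal(size=(8, 16))
loss_aligned = info_nce_in_batch(users, users)                      # towers agree
loss_random = info_nce_in_batch(users, rng.normal(size=(8, 16)))    # towers unrelated
```

Lowering the temperature sharpens the softmax over in‑batch negatives, penalizing hard negatives more aggressively—one reason the talk treats temperature as a tuning knob rather than a constant.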

InfoNCE · contrastive learning · dual-tower models
DataFunTalk
Oct 26, 2021 · Artificial Intelligence

Contrastive Learning Perspective on Retrieval and Reranking Models in Recommendation Systems

This article explains how contrastive learning, originally popular in computer vision, can be interpreted and applied to recommendation‑system recall and coarse‑ranking models, covering its theoretical roots, typical architectures such as SimCLR, MoCo and SwAV, and practical tricks including in‑batch negatives, embedding normalization, temperature scaling, and graph‑based extensions.

contrastive learning · dual-tower models · embedding normalization