
MiNet: Mixed Interest Network for Cross-Domain Click-Through Rate Prediction

This article reviews the MiNet model, which leverages cross‑domain information by modeling long‑term, source‑domain short‑term, and target‑domain short‑term user interests with hierarchical attention and an auxiliary task to improve CTR prediction and alleviate cold‑start issues.

DataFunTalk

The paper introduces a cross‑domain CTR prediction model called MiNet, designed for Alibaba's UC News feed scenario where news articles serve as the source domain and ads as the target domain. By exploiting overlapping user, item, and feature spaces across domains, the model mitigates data sparsity and cold‑start problems, thereby enhancing recommendation performance.

Cross‑Domain Recommendation Concepts – A "domain" is any collection of items (e.g., a news section or a video category). "Cross‑domain" recommendation transfers knowledge from a source domain to a target domain, using shared users or items to bridge the gap. Advantages include cold‑start alleviation, performance gains, and increased diversity; disadvantages include the risk of transferring irrelevant or noisy knowledge across domains and the need to weight each domain's contribution carefully.

Model Design – MiNet models three types of user interest:

1. Long‑term interest across domains (profile features such as age, gender, location, device).
2. Short‑term interest from the source domain (recent news interactions).
3. Short‑term interest from the target domain (recent ad interactions).

Item‑level attention captures relevant historical behaviors, while a transfer matrix maps source‑domain embeddings into the target‑domain space. An interest‑level attention layer then assigns dynamic weights to the three interests for each candidate ad.
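The interest‑level attention described above can be sketched in a few lines. This is a simplified NumPy illustration, not the paper's exact architecture: the single‑layer scorer (`W`, `v`) and all dimensions are assumptions, while the transfer matrix `M` mapping source‑domain embeddings into the target space follows the article's description.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def interest_level_attention(long_term, src_short, tgt_short, ad_emb, M, W, v):
    """Fuse three user-interest vectors with attention conditioned on the ad.

    long_term, tgt_short, ad_emb live in the target-domain space;
    src_short lives in the source-domain space and is mapped over by M.
    """
    # Transfer matrix maps the source-domain short-term interest
    # into the target-domain space (as in the article).
    src_mapped = M @ src_short
    interests = [long_term, src_mapped, tgt_short]
    # Score each interest against the candidate ad embedding.
    # A one-layer tanh scorer (W, v) is our simplifying assumption here.
    scores = np.array([v @ np.tanh(W @ np.concatenate([h, ad_emb]))
                       for h in interests])
    weights = softmax(scores)
    # Dynamically weighted sum of the three interests
    fused = sum(w * h for w, h in zip(weights, interests))
    return fused, weights
```

The fused vector would then feed the final CTR prediction layer; the softmax weights make the relative importance of the three interests interpretable per ad.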

Auxiliary Task – An additional CTR prediction task on source‑domain items helps learn better long‑term interest representations. Both the primary and auxiliary objectives are cross‑entropy losses, combined as a weighted sum.
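A minimal sketch of that combined objective, assuming a single weighting hyperparameter `alpha` on the auxiliary term (the specific name and value are illustrative, not from the paper):

```python
import numpy as np

def cross_entropy(y, p, eps=1e-12):
    """Binary cross-entropy between labels y and predicted probabilities p."""
    p = np.clip(p, eps, 1 - eps)  # guard against log(0)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def minet_loss(y_ad, p_ad, y_news, p_news, alpha=0.1):
    # Primary loss on target-domain (ad) clicks plus a weighted
    # auxiliary loss on source-domain (news) clicks.
    # alpha is a hypothetical hyperparameter for the auxiliary weight.
    return cross_entropy(y_ad, p_ad) + alpha * cross_entropy(y_news, p_news)
```

With `alpha = 0` this reduces to plain single‑domain CTR training, which makes the auxiliary task's contribution easy to ablate.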

Experimental Details – Two datasets are used: (1) UC News/Ad logs (6 days training, 1 day validation, 1 day test) and (2) Amazon ratings (books as source, movies as target). Users with fewer than five ratings are filtered; 4‑5 stars are positive, others negative. Temporal splitting ensures no leakage.
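The Amazon preprocessing steps above (filter users with fewer than five ratings, binarize 4–5 stars as positive, split by time) can be sketched as follows; the tuple layout and function name are assumptions for illustration:

```python
from collections import Counter

def preprocess(ratings, min_ratings=5, pos_threshold=4):
    """ratings: list of (user, item, stars, timestamp) tuples.

    Returns (user, item, label, timestamp) rows, label = 1 for
    ratings of pos_threshold stars or more, 0 otherwise.
    """
    # Drop users with fewer than min_ratings ratings.
    counts = Counter(u for u, _, _, _ in ratings)
    kept = [(u, i, int(s >= pos_threshold), t)
            for u, i, s, t in ratings if counts[u] >= min_ratings]
    # Sort by timestamp so a temporal train/validation/test split
    # never lets future interactions leak into training.
    kept.sort(key=lambda r: r[3])
    return kept
```

A temporal split taken over this sorted list mirrors the article's leakage‑free setup for both the UC logs and the Amazon data.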

The results show MiNet outperforms baselines on both datasets, confirming the benefit of modeling multiple interests and using attention mechanisms. The authors conclude that cross‑domain CTR prediction effectively addresses cold‑start scenarios by leveraging user behavior in related domains.

Tags: CTR prediction, attention mechanism, cross-domain recommendation, cold start, auxiliary task, MiNet
Written by DataFunTalk

Dedicated to sharing and discussing big data and AI technology applications, aiming to empower a million data scientists. Regularly hosts live tech talks and curates articles on big data, recommendation/search algorithms, advertising algorithms, NLP, intelligent risk control, autonomous driving, and machine learning/deep learning.
