Privacy-Preserving Cross-Domain Recommendation via Differential Privacy and Subspace Embedding
The article reviews a TheWebConf 2022 paper that introduces a two‑stage framework combining differential‑privacy‑based random subspace publishing (using Johnson‑Lindenstrauss and sparse‑aware transforms) with asymmetric deep models to achieve accurate, privacy‑preserving cross‑domain recommendation, and discusses broader differential‑privacy applications.
The paper "Privacy‑Protected Cross‑Domain Recommendation via Differential Privacy" was accepted at TheWebConf 2022, an A‑class AI conference, which received 1,822 submissions and had a 17.7% acceptance rate.
Differential Privacy (DP), originally proposed by Dwork in 2006, adds carefully calibrated noise to data or computation results to prevent inference of any individual's presence in a dataset, and has become a rapidly growing privacy‑preserving technology.
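The core idea of calibrated noise can be illustrated with the classic Laplace mechanism (a standard textbook example, not the mechanism used in this paper): a numeric query is released with noise scaled to its sensitivity divided by the privacy budget ε. The function name and parameters below are illustrative.

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon, rng=None):
    """Release `value` with Laplace noise of scale sensitivity/epsilon,
    the standard way to achieve epsilon-DP for a numeric query."""
    rng = np.random.default_rng() if rng is None else rng
    return value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Example: privately release a count of 100 (adding or removing one
# person changes the count by at most 1, so sensitivity = 1).
rng = np.random.default_rng(0)
noisy_count = laplace_mechanism(100, sensitivity=1.0, epsilon=0.5)
```

Smaller ε means more noise and stronger privacy; the noisy release is unbiased, so averages over many releases concentrate around the true value.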
The authors address the privacy‑protected cross‑domain recommendation problem: improving recommendation performance in a target domain while preserving data privacy in both source and target domains, thereby mitigating data sparsity and cold‑start issues.
In the first stage, two random‑transform based DP publishing methods are proposed: the standard Johnson‑Lindenstrauss Transform (JLT) and a Sparse‑aware Johnson‑Lindenstrauss Transform (SJLT), both designed to securely embed rating matrices into lower‑dimensional subspaces while guaranteeing DP.
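A minimal sketch of the JLT-style publishing step, under assumptions of my own (row clipping for bounded sensitivity, Gaussian-mechanism noise calibration); the paper's exact transforms and noise analysis differ, and `jl_publish`, `clip`, and the parameter choices here are illustrative:

```python
import numpy as np

def jl_publish(R, k, epsilon, delta, clip=1.0, rng=None):
    """Sketch: embed a (users x items) rating matrix into a k-dim
    subspace with a Gaussian JL projection, then add Gaussian noise
    calibrated for (epsilon, delta)-DP at the user (row) level."""
    rng = np.random.default_rng() if rng is None else rng
    n_users, n_items = R.shape
    # Clip each user's row to L2 norm <= clip so one user's
    # contribution to the release is bounded (the sensitivity).
    norms = np.maximum(np.linalg.norm(R, axis=1, keepdims=True), 1e-12)
    R_clipped = R * np.minimum(1.0, clip / norms)
    # Random JL projection: pairwise distances are roughly preserved.
    P = rng.normal(0.0, 1.0 / np.sqrt(k), size=(n_items, k))
    # Gaussian-mechanism noise scale for (epsilon, delta)-DP.
    sigma = clip * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return R_clipped @ P + rng.normal(0.0, sigma, size=(n_users, k))

published = jl_publish(np.ones((5, 100)), k=10, epsilon=1.0, delta=1e-5,
                       rng=np.random.default_rng(0))
```

The published matrix has far fewer columns than items, so it both hides individual ratings and reduces the downstream model's input dimension.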
The second stage introduces an asymmetric cross‑domain recommendation model. A deep auto‑encoder learns user and item embeddings from the DP‑published source‑domain subspace, while a deep neural network regresses the target‑domain rating matrix to obtain target user/item embeddings; the learned source embeddings are then aligned and transferred to the target model.
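The transfer step can be pictured with a deliberately simplified stand-in: align source-domain embeddings of overlapping users to target-domain embeddings with a least-squares linear map (the paper uses deep networks for both the encoder and the alignment; all names and shapes below are illustrative).

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy embeddings for 50 users shared by both domains, dimension 8.
src_emb = rng.normal(size=(50, 8))                  # from source auto-encoder
W_true = rng.normal(size=(8, 8))                    # unknown ground-truth map
tgt_emb = src_emb @ W_true + 0.01 * rng.normal(size=(50, 8))  # from target DNN

# Learn the alignment map on shared users by least squares.
W, *_ = np.linalg.lstsq(src_emb, tgt_emb, rcond=None)

# Source knowledge carried into the target embedding space.
transferred = src_emb @ W
```

Once aligned, source-domain embeddings can initialize or regularize the target model, which is how the cross-domain signal mitigates sparsity and cold start.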
Theoretical analysis proves that the subspace embeddings satisfy user‑level DP, and extensive experiments show that the DP‑published matrices improve target‑domain recommendation accuracy compared with plain cross‑domain baselines.
Beyond the paper, the article surveys real‑world DP deployments: Apple uses local DP for emoji usage trends, statistics on energy‑heavy pages in Safari, and new‑word discovery, while Google applies DP in its open‑source libraries and mobility analytics, illustrating both the local and central DP settings.
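Local DP, as in the Apple deployments, can be illustrated with classic randomized response (a textbook local-DP primitive, not Apple's actual algorithm): each device perturbs its own bit before reporting, and the server debiases the aggregate.

```python
import numpy as np

def randomized_response(bit, epsilon, rng=None):
    """Report the true bit with probability e^eps / (1 + e^eps),
    otherwise flip it; this satisfies epsilon-local-DP."""
    rng = np.random.default_rng() if rng is None else rng
    p_true = np.exp(epsilon) / (1.0 + np.exp(epsilon))
    return bit if rng.random() < p_true else 1 - bit

def estimate_proportion(reports, epsilon):
    """Unbiased estimate of the true bit frequency from noisy reports."""
    p = np.exp(epsilon) / (1.0 + np.exp(epsilon))
    return (np.mean(reports) - (1.0 - p)) / (2.0 * p - 1.0)

# Simulation: 20,000 users, 30% of whom hold the sensitive bit.
rng = np.random.default_rng(2)
true_bits = (rng.random(20000) < 0.3).astype(int)
reports = [randomized_response(b, epsilon=1.0, rng=rng) for b in true_bits]
estimate = estimate_proportion(reports, epsilon=1.0)
```

No single report reveals much about its sender, yet the debiased aggregate tracks the true proportion, which is exactly the trade-off behind local-DP telemetry.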
Future work will integrate the proposed methods into Ant Group’s "YinYu" privacy‑computing framework for federated DP, further advancing privacy‑preserving AI services.
AntTech