
Embedding Techniques for Real Estate Recommendation at 58.com

This article explains how 58.com applies various embedding methods—including ALS, Skip‑gram, and DeepWalk—to vectorize users and properties, improve similarity calculations, and enhance both recall and ranking stages of its real‑estate recommendation system, with detailed technical descriptions and evaluation results.

DataFunTalk

The article introduces the real‑estate business and recommendation scenarios of 58.com, describing two main user groups—agents publishing listings and house‑searching users—and the platform’s role in connecting them.

It outlines five recommendation slots (home page, default list, zero‑result page, property detail page, and feed channel) and explains that similarity calculations between users and properties, as well as between properties, are essential.

Two embedding approaches are presented: label‑based vectors derived from property attributes (price, area, orientation, etc.) and relation‑based vectors learned from user‑property interaction graphs.

For label‑based embedding, continuous features (such as price and area) are normalized and discrete features (such as orientation) are one‑hot encoded; these vectors can be updated in real time when a listing's attributes change, but they cannot capture behavioral similarity between listings.
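The label-based construction above can be sketched as follows. The attribute schema, value ranges, and vocabulary here are illustrative assumptions, not 58.com's actual feature set:

```python
import numpy as np

# Hypothetical attribute schema for illustration only.
ORIENTATIONS = ["south", "north", "east", "west"]
PRICE_RANGE = (500.0, 20000.0)   # assumed price bounds (CNY)
AREA_RANGE = (10.0, 300.0)       # assumed area bounds (m^2)

def min_max(value, lo, hi):
    """Scale a continuous attribute into [0, 1], clipping outliers."""
    return float(np.clip((value - lo) / (hi - lo), 0.0, 1.0))

def one_hot(value, vocab):
    """One-hot encode a discrete attribute over a fixed vocabulary."""
    vec = [0.0] * len(vocab)
    if value in vocab:
        vec[vocab.index(value)] = 1.0
    return vec

def label_embedding(listing):
    """Concatenate normalized continuous and one-hot discrete features."""
    return np.array(
        [min_max(listing["price"], *PRICE_RANGE),
         min_max(listing["area"], *AREA_RANGE)]
        + one_hot(listing["orientation"], ORIENTATIONS)
    )

v = label_embedding({"price": 3500, "area": 65, "orientation": "south"})
```

Because the vector depends only on the listing's own attributes, it can be recomputed the moment an attribute changes, which is what makes real-time updates cheap.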

Relation‑based embedding leverages co‑view and interaction data, using collaborative filtering or word2vec‑style methods to capture latent relationships, though storing full similarity matrices is impractical.

The article details three concrete embedding techniques:

- ALS (Alternating Least Squares) matrix factorization, which decomposes the user-item interaction matrix to obtain user and item latent factors.
- Skip-gram, which treats a user's sequence of viewed properties as a sentence and learns embeddings via negative sampling.
- DeepWalk, which constructs a weighted directed graph of user-property interactions and performs random walks to generate sequences before applying Skip-gram.
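The ALS step can be sketched with a minimal dense NumPy implementation. This is a simplified explicit-feedback variant on an invented rating matrix; a production system would more likely use an implicit-feedback ALS implementation (e.g. Spark MLlib's) over sparse interaction data:

```python
import numpy as np

def als(R, k=2, reg=0.1, iters=20, seed=0):
    """Factor R (users x items) into U @ V.T by alternating least squares."""
    rng = np.random.default_rng(seed)
    n_users, n_items = R.shape
    U = rng.normal(scale=0.1, size=(n_users, k))
    V = rng.normal(scale=0.1, size=(n_items, k))
    I = np.eye(k)
    for _ in range(iters):
        # Fix V, solve a ridge regression for the user factors.
        U = np.linalg.solve(V.T @ V + reg * I, V.T @ R.T).T
        # Fix U, solve for the item factors.
        V = np.linalg.solve(U.T @ U + reg * I, U.T @ R).T
    return U, V

# Toy user-listing interaction matrix (rows: users, cols: listings).
R = np.array([[5., 4., 0.],
              [4., 5., 1.],
              [0., 1., 5.]])
U, V = als(R)
```

The rows of `U` and `V` are the user and item latent factors; a user-item score is just their dot product.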
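The Skip-gram step, treating each user's viewing session as a sentence, can be sketched as a toy negative-sampling trainer. The listing IDs and sessions are invented, and real systems would use an optimized library such as word2vec rather than this loop:

```python
import numpy as np

# Toy viewing sessions: each list is one user's sequence of viewed listings.
sessions = [
    ["a1", "a2", "a3", "a2"],
    ["b1", "b2", "b3"],
    ["a1", "a3", "a2"],
]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_skipgram(sessions, dim=8, window=2, neg=3, lr=0.05,
                   epochs=200, seed=0):
    rng = np.random.default_rng(seed)
    vocab = sorted({w for s in sessions for w in s})
    idx = {w: i for i, w in enumerate(vocab)}
    W_in = rng.normal(scale=0.1, size=(len(vocab), dim))   # target vectors
    W_out = rng.normal(scale=0.1, size=(len(vocab), dim))  # context vectors
    for _ in range(epochs):
        for s in sessions:
            for pos, center in enumerate(s):
                c = idx[center]
                lo, hi = max(0, pos - window), min(len(s), pos + window + 1)
                for ctx in s[lo:pos] + s[pos + 1:hi]:
                    # One positive pair plus `neg` uniformly sampled negatives.
                    targets = [idx[ctx]] + list(rng.integers(0, len(vocab), neg))
                    labels = [1.0] + [0.0] * neg
                    for t, y in zip(targets, labels):
                        p = sigmoid(W_in[c] @ W_out[t])
                        g = (p - y) * lr
                        grad_in = g * W_out[t]
                        W_out[t] -= g * W_in[c]
                        W_in[c] -= grad_in
    return vocab, idx, W_in

vocab, idx, emb = train_skipgram(sessions)
```

After training, listings that are frequently co-viewed end up with nearby vectors, so item-to-item recall reduces to a nearest-neighbor lookup.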
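The DeepWalk walk-generation step can be sketched as weighted random walks over a small directed graph; the node IDs and edge weights below are illustrative, not real interaction counts:

```python
import random

# Weighted directed interaction graph: edge weight ~ transition frequency.
graph = {
    "a1": {"a2": 3, "a3": 1},
    "a2": {"a1": 2, "a3": 4},
    "a3": {"a2": 1},
}

def random_walk(graph, start, length, rng):
    """Walk the graph, choosing each next node proportionally to edge weight."""
    walk = [start]
    for _ in range(length - 1):
        nbrs = graph.get(walk[-1])
        if not nbrs:
            break  # dead end in the directed graph
        nodes, weights = zip(*nbrs.items())
        walk.append(rng.choices(nodes, weights=weights)[0])
    return walk

def generate_walks(graph, walks_per_node=10, length=5, seed=0):
    rng = random.Random(seed)
    walks = []
    for _ in range(walks_per_node):
        for node in graph:
            walks.append(random_walk(graph, node, length, rng))
    return walks

walks = generate_walks(graph)
# These walks are then fed to Skip-gram exactly like viewing sessions.
```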

Effectiveness is visualized with t‑SNE plots and evaluated using AUC on constructed positive and negative samples, showing DeepWalk excels in first‑tier cities while Skip‑gram performs better in lower‑tier cities.
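The AUC evaluation described above can be sketched as follows: score each candidate pair by cosine similarity and measure how often a positive pair outscores a negative one. The embeddings and pair labels here are synthetic stand-ins for the constructed samples:

```python
import numpy as np

rng = np.random.default_rng(0)

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def auc(scores, labels):
    """AUC = probability a random positive pair outscores a random negative."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Synthetic setup: a "positive" pair is a vector plus small noise
# (mimicking co-clicked listings); a "negative" pair is random.
base = rng.normal(size=(50, 16))
pairs, labels = [], []
for v in base:
    pairs.append((v, v + 0.1 * rng.normal(size=16)))  # positive pair
    labels.append(1)
    pairs.append((v, rng.normal(size=16)))            # negative pair
    labels.append(0)
scores = [cosine(u, w) for u, w in pairs]
result = auc(scores, labels)
```

This pairwise formulation of AUC makes the metric's meaning concrete: it measures ranking quality of the similarity scores, independent of any threshold.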

Embedding vectors are then integrated into a wide‑and‑deep ranking model, where wide components handle tag features and deep components process embedding features, leading to a 5.4% increase in click‑through rate and a 4.4% rise in conversion.
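The wide-and-deep split described above can be sketched as a single forward pass: a linear "wide" component over sparse tag features plus a small MLP "deep" component over pretrained embedding features. All dimensions and weights below are illustrative, not the production configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def wide_and_deep(tag_onehot, emb_features, params):
    """Wide: linear over tag features. Deep: MLP over embedding features."""
    wide_logit = tag_onehot @ params["w_wide"]
    h = relu(emb_features @ params["W1"] + params["b1"])
    deep_logit = h @ params["w2"]
    return sigmoid(wide_logit + deep_logit + params["b"])

params = {
    "w_wide": rng.normal(scale=0.1, size=20),    # 20 assumed tag features
    "W1": rng.normal(scale=0.1, size=(32, 16)),  # 32-dim assumed embedding input
    "b1": np.zeros(16),
    "w2": rng.normal(scale=0.1, size=16),
    "b": 0.0,
}
tags = np.zeros(20); tags[[2, 7]] = 1.0  # active tag features
emb = rng.normal(size=32)                # concatenated pretrained embeddings
ctr = wide_and_deep(tags, emb, params)   # predicted click probability
```

In training, both components' weights are learned jointly, while the embedding inputs come pretrained from the recall-stage models, which is what "serving as pretrained features" means in practice.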

The article concludes with a summary of embedding benefits: vectorizing user‑item relationships, enabling similarity‑based recall, and serving as pretrained features for deep models.

Tags: embedding, recommendation systems, real estate, Skip-gram, DeepWalk, ALS
Written by

DataFunTalk

Dedicated to sharing and discussing big data and AI technology applications, aiming to empower a million data scientists. Regularly hosts live tech talks and curates articles on big data, recommendation/search algorithms, advertising algorithms, NLP, intelligent risk control, autonomous driving, and machine learning/deep learning.
