Which deep learning architectures are popular for recommendation tasks?

Neural Collaborative Filtering (NCF) is a widely used architecture for recommendation tasks. It replaces traditional matrix factorization with neural networks to model user-item interactions. Instead of relying on linear combinations of user and item embeddings, NCF uses multi-layer perceptrons (MLPs) to capture non-linear relationships. For example, in a movie recommendation system, NCF can learn complex patterns from user watch history and item features, such as genre preferences or viewing frequency. This approach was popularized by a 2017 paper demonstrating its effectiveness on datasets like MovieLens, where it outperformed traditional methods, especially with implicit feedback (e.g., clicks or views). NCF is particularly useful when interaction data is sparse or noisy, as neural networks can generalize better than linear models.
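As a rough illustration, a minimal NCF-style model can be sketched in PyTorch: user and item IDs are embedded, concatenated, and passed through an MLP that outputs an interaction probability. The layer sizes, names, and training setup here are illustrative assumptions, not the exact configuration from the 2017 paper.

```python
import torch
import torch.nn as nn

class NCF(nn.Module):
    """Minimal NCF sketch: user/item embeddings fed through an MLP."""
    def __init__(self, num_users, num_items, embed_dim=32):
        super().__init__()
        self.user_emb = nn.Embedding(num_users, embed_dim)
        self.item_emb = nn.Embedding(num_items, embed_dim)
        # The MLP replaces the dot product of matrix factorization,
        # letting the model learn non-linear user-item interactions.
        self.mlp = nn.Sequential(
            nn.Linear(2 * embed_dim, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, 1),
        )

    def forward(self, user_ids, item_ids):
        x = torch.cat([self.user_emb(user_ids), self.item_emb(item_ids)], dim=-1)
        return torch.sigmoid(self.mlp(x)).squeeze(-1)  # interaction probability

# Hypothetical usage: score a batch of (user, item) pairs; with implicit
# feedback this would typically be trained with a binary cross-entropy loss.
model = NCF(num_users=1000, num_items=500)
users = torch.tensor([1, 2, 3])
items = torch.tensor([10, 20, 30])
scores = model(users, items)  # values in (0, 1)
```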

Wide & Deep Learning, introduced by Google, combines two components: a “wide” linear model for memorization (e.g., capturing known user-item interactions) and a “deep” neural network for generalization (e.g., discovering new patterns). The wide component handles explicit feature interactions, such as a user downloading an app after searching for its name, while the deep component processes dense embeddings of categorical features like user demographics. This architecture powers Google Play’s app recommendations, balancing familiarity with serendipity. Variants like DeepFM integrate factorization machines into the wide part to automate feature interaction learning, reducing manual engineering. These models excel in scenarios with mixed data types, such as combining user behavior logs with contextual metadata.
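The structure is easy to see in a small sketch: a linear layer over hand-crafted or crossed features (the wide part) and an MLP over embedded categorical features (the deep part), with their logits summed before a sigmoid. Feature dimensions and layer sizes below are placeholder assumptions, not Google's production configuration.

```python
import torch
import torch.nn as nn

class WideAndDeep(nn.Module):
    """Illustrative Wide & Deep model: linear memorization + deep generalization."""
    def __init__(self, num_wide_features, num_categories, embed_dim=16):
        super().__init__()
        # Wide: memorizes explicit feature interactions (e.g. crossed one-hot features).
        self.wide = nn.Linear(num_wide_features, 1)
        # Deep: generalizes via dense embeddings of categorical inputs.
        self.cat_emb = nn.Embedding(num_categories, embed_dim)
        self.deep = nn.Sequential(
            nn.Linear(embed_dim, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, 1),
        )

    def forward(self, wide_x, cat_ids):
        # Logits from both components are summed, so memorized and
        # generalized signals contribute to the same prediction.
        logit = self.wide(wide_x) + self.deep(self.cat_emb(cat_ids))
        return torch.sigmoid(logit).squeeze(-1)

# Hypothetical usage: 100 crossed features on the wide side,
# 50 categorical IDs (e.g. app category buckets) on the deep side.
model = WideAndDeep(num_wide_features=100, num_categories=50)
wide_x = torch.rand(4, 100)
cat_ids = torch.randint(0, 50, (4,))
probs = model(wide_x, cat_ids)
```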

Graph Neural Networks (GNNs) are gaining traction for recommendations involving relational data. GNNs propagate information across graph structures, such as user-item interaction graphs or social networks, to generate embeddings. For instance, Pinterest’s PinSage uses GNNs to recommend content by aggregating features from a pin’s visual and textual neighbors in a graph of pins and boards. In e-commerce, GNNs model user-item interactions as bipartite graphs, capturing indirect relationships (e.g., “users who bought X also liked Y”). This approach is effective for cold-start scenarios, where new items or users benefit from graph-based neighborhood signals. Frameworks like PyTorch Geometric simplify GNN implementation, making them accessible for tasks like social recommendations or knowledge graph-enhanced systems.
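To make the idea concrete, here is a toy two-layer GraphSAGE encoder written with PyTorch Geometric's SAGEConv layer. It is a generic sketch of neighborhood aggregation over a user-item graph, not Pinterest's PinSage implementation; the graph, feature sizes, and scoring by dot product are all illustrative assumptions.

```python
import torch
from torch_geometric.nn import SAGEConv

class GraphRecEncoder(torch.nn.Module):
    """Toy GraphSAGE encoder over a user-item interaction graph.
    Final-layer node embeddings can be compared (e.g. by dot product)
    to score user-item pairs."""
    def __init__(self, in_dim, hidden_dim=64, out_dim=32):
        super().__init__()
        self.conv1 = SAGEConv(in_dim, hidden_dim)
        self.conv2 = SAGEConv(hidden_dim, out_dim)

    def forward(self, x, edge_index):
        h = torch.relu(self.conv1(x, edge_index))  # aggregate neighbor features
        return self.conv2(h, edge_index)           # final node embeddings

# Hypothetical usage: 6 nodes (users + items) with 8-dim input features
# and a few interaction edges from users (0-3) to items (4-5).
x = torch.rand(6, 8)
edge_index = torch.tensor([[0, 1, 2, 3],   # source nodes (users)
                           [4, 5, 4, 5]])  # target nodes (items)
model = GraphRecEncoder(in_dim=8)
embeddings = model(x, edge_index)
score = (embeddings[0] * embeddings[4]).sum()  # user 0's affinity for item 4
```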
