Towards Equivalent Transformation of User Preferences in Cross Domain Recommendation (2009.06884v2)

Published 15 Sep 2020 in cs.IR

Abstract: Cross domain recommendation (CDR) is a popular research topic in recommender systems. This paper focuses on a common CDR scenario in which different domains share the same set of users but have no overlapping items. The majority of recent methods explore a shared-user representation to transfer knowledge across domains. However, a shared-user representation resorts to learning the overlapping features of user preferences and suppresses the domain-specific features. Other works try to capture the domain-specific features with an MLP mapping, but they require heuristic human knowledge to choose the samples used to train that mapping. In this paper, we attempt to learn both kinds of features of user preferences in a more principled way. We assume that each user's preferences in one domain can be expressed by those in the other domain, and that these preferences can be mutually converted into each other with a so-called equivalent transformation. Based on this assumption, we propose an equivalent transformation learner (ETL) that models the joint distribution of user behaviors across domains. The equivalent transformation in ETL relaxes the idea of a shared-user representation and allows the learned preferences in different domains to preserve the domain-specific features as well as the overlapping features. Extensive experiments on three public benchmarks demonstrate the effectiveness of ETL compared with recent state-of-the-art methods. Code and data are available online: https://github.com/xuChenSJTU/ETL-master
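The abstract only describes the equivalent transformation at a high level. The sketch below is one plausible reading of the idea, not the authors' implementation (for that, see the linked repository): each domain gets its own autoencoder over the user's interaction vector, and a pair of learnable linear maps converts the latent preference of one domain into that of the other, so cross-domain behavior can be reconstructed from a transformed preference. The class names, network sizes, latent dimension, and loss weighting are all hypothetical choices made for illustration.

```python
# Minimal sketch of the "equivalent transformation" idea from the abstract.
# NOT the ETL implementation from the paper; dimensions and losses are illustrative.
import torch
import torch.nn as nn


class DomainAutoencoder(nn.Module):
    """Encodes a user's interaction vector in one domain into a latent preference."""
    def __init__(self, n_items: int, latent_dim: int):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_items, 256), nn.ReLU(),
                                     nn.Linear(256, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                                     nn.Linear(256, n_items))

    def forward(self, x):
        z = self.encoder(x)
        return z, self.decoder(z)


class EquivalentTransformationSketch(nn.Module):
    """Two per-domain autoencoders plus linear maps that convert preferences
    between the two latent spaces in both directions."""
    def __init__(self, n_items_a: int, n_items_b: int, latent_dim: int = 64):
        super().__init__()
        self.ae_a = DomainAutoencoder(n_items_a, latent_dim)
        self.ae_b = DomainAutoencoder(n_items_b, latent_dim)
        # The "equivalent transformation": linear maps between latent spaces.
        self.a_to_b = nn.Linear(latent_dim, latent_dim, bias=False)
        self.b_to_a = nn.Linear(latent_dim, latent_dim, bias=False)

    def forward(self, x_a, x_b):
        z_a, rec_a = self.ae_a(x_a)  # within-domain reconstructions
        z_b, rec_b = self.ae_b(x_b)
        # Cross-domain reconstruction: decode domain-B behavior from the
        # transformed domain-A preference, and vice versa.
        cross_b = self.ae_b.decoder(self.a_to_b(z_a))
        cross_a = self.ae_a.decoder(self.b_to_a(z_b))
        return rec_a, rec_b, cross_a, cross_b


def sketch_loss(model, x_a, x_b):
    rec_a, rec_b, cross_a, cross_b = model(x_a, x_b)
    mse = nn.functional.mse_loss
    # Within-domain plus cross-domain reconstruction; equal weights are arbitrary.
    return (mse(rec_a, x_a) + mse(rec_b, x_b)
            + mse(cross_a, x_a) + mse(cross_b, x_b))


if __name__ == "__main__":
    # Toy batch: 8 users with binary interaction vectors over hypothetical catalogs.
    model = EquivalentTransformationSketch(n_items_a=500, n_items_b=300)
    x_a = torch.randint(0, 2, (8, 500)).float()
    x_b = torch.randint(0, 2, (8, 300)).float()
    loss = sketch_loss(model, x_a, x_b)
    loss.backward()
    print(f"sketch loss: {loss.item():.4f}")
```

Because the per-domain encoders are separate, the latent preferences are free to keep domain-specific structure, while the cross-domain reconstruction terms push the two latent spaces to be convertible into each other; the paper's ETL pursues this in a probabilistic form by modeling the joint distribution of user behaviors across domains.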

Citations (22)
