Application of Kullback-Leibler divergence for short-term user interest detection (1507.07382v1)

Published 27 Jul 2015 in cs.IR

Abstract: Classical approaches in recommender systems, such as collaborative filtering, concentrate mainly on extracting static user preferences. This works well in domains like music recommendation, where user behavior tends to be stable over long periods of time; the most common situation in e-commerce is different, however, and requires reactive algorithms based on analysis of short-term user activity. This paper introduces a small mathematical framework for short-term user interest detection, formulated in terms of item properties, and its application to enhancing recommender systems. The framework is based on a fundamental concept of information theory: Kullback-Leibler divergence.
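A minimal sketch of the general idea, not the paper's exact formulation (which the abstract does not spell out): compare the distribution of a categorical item property in the user's recent activity against a long-term baseline via KL divergence, so a large divergence signals a short-term interest shift. All function names, variable names, and data below are illustrative assumptions.

```python
import math
from collections import Counter

def property_distribution(items, vocabulary, smoothing=1e-6):
    """Empirical distribution of a categorical item property (e.g. category),
    with additive smoothing so the KL divergence stays finite."""
    counts = Counter(items)
    total = len(items) + smoothing * len(vocabulary)
    return {v: (counts.get(v, 0) + smoothing) / total for v in vocabulary}

def kl_divergence(p, q):
    """D_KL(p || q) for two distributions over the same categorical support."""
    return sum(p[v] * math.log(p[v] / q[v]) for v in p)

# Illustrative data: long-term item views vs. the current session.
long_term = ["books", "books", "music", "books", "music", "movies"]
session   = ["phones", "phones", "books", "phones"]

vocab = set(long_term) | set(session)
q = property_distribution(long_term, vocab)  # baseline (long-term) interests
p = property_distribution(session, vocab)    # short-term activity

score = kl_divergence(p, q)
print(f"interest shift score: {score:.3f}")  # larger => stronger short-term deviation
```

In such a setup the divergence score could be used to decide when to switch from long-term collaborative-filtering recommendations to ones driven by the current session, though the thresholding and the choice of item properties are design decisions not specified in the abstract.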

Authors (3)
  1. Maxim Borisyak (15 papers)
  2. Roman Zykov (1 paper)
  3. Artem Noskov (1 paper)
