Kalman Filtering Attention for User Behavior Modeling in CTR Prediction (2010.00985v2)

Published 2 Oct 2020 in cs.LG and stat.ML

Abstract: Click-through rate (CTR) prediction is one of the fundamental tasks for e-commerce search engines. As search becomes more personalized, it is necessary to capture user interest from rich behavior data. Existing user behavior modeling algorithms develop different attention mechanisms to emphasize query-relevant behaviors and suppress irrelevant ones. Despite being extensively studied, these attentions still suffer from two limitations. First, conventional attentions mostly limit the attention field to a single user's behaviors, which is not suitable in e-commerce, where users often hunt for new demands that are irrelevant to any historical behaviors. Second, these attentions are usually biased towards frequent behaviors, which is unreasonable since high frequency does not necessarily indicate great importance. To tackle these two limitations, we propose a novel attention mechanism, termed Kalman Filtering Attention (KFAtt), that treats the weighted pooling in attention as a maximum a posteriori (MAP) estimation. By incorporating a prior, KFAtt resorts to global statistics when few user behaviors are relevant. Moreover, a frequency capping mechanism is incorporated to correct the bias towards frequent behaviors. Offline experiments on both a public benchmark and a 10-billion-scale real production dataset, together with an online A/B test, show that KFAtt outperforms all compared state-of-the-art methods. KFAtt has been deployed in the ranking system of a leading e-commerce website, serving the main traffic of hundreds of millions of active users every day.
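
The MAP view of attention pooling described in the abstract can be made concrete with a small numerical sketch. The code below is a minimal illustration, assuming a Gaussian prior on the query-specific interest and independent Gaussian observation noise for each behavior; the function and variable names (map_pooling, mu_prior, sigma_obs, sigma_prior) are our own illustrative choices, not the paper's notation, and the paper's frequency-capping correction is not shown.

```python
import numpy as np

def map_pooling(behaviors, sigma_obs, mu_prior, sigma_prior):
    """MAP estimate of the query-specific interest vector (illustrative sketch).

    behaviors:   (T, d) array of behavior embeddings, treated as noisy
                 observations of the user's current interest.
    sigma_obs:   (T,) per-behavior observation std (small = query-relevant).
    mu_prior:    (d,) global prior mean, e.g. statistics pooled over all users.
    sigma_prior: scalar prior std.
    """
    prior_precision = 1.0 / sigma_prior ** 2
    obs_precision = 1.0 / sigma_obs ** 2                 # (T,)
    # Precision-weighted combination of the prior mean and the observations.
    weighted_sum = prior_precision * mu_prior + (obs_precision[:, None] * behaviors).sum(axis=0)
    total_precision = prior_precision + obs_precision.sum()
    return weighted_sum / total_precision                # (d,)

# Usage example: when every sigma_obs is large (no behavior is query-relevant),
# the estimate falls back toward mu_prior, i.e. global statistics; when some
# sigma_obs are small, those behaviors dominate the pooled interest.
rng = np.random.default_rng(0)
behaviors = rng.normal(size=(5, 8))                  # 5 historical behaviors, 8-dim embeddings
sigma_obs = np.array([0.5, 2.0, 2.0, 5.0, 5.0])      # one clearly relevant behavior
mu_prior = np.zeros(8)
interest = map_pooling(behaviors, sigma_obs, mu_prior, sigma_prior=1.0)
```

This precision-weighted average is what makes the pooling "Kalman-like": the prior term keeps the estimate anchored to global statistics when the relevant evidence is weak, which is the behavior the abstract attributes to KFAtt. The frequency-capping variant described in the abstract additionally limits how much repeated behaviors of the same kind can contribute; its exact form is given in the paper and is omitted from this sketch.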

Authors (11)
  1. Hu Liu (20 papers)
  2. Jing Lu (158 papers)
  3. Xiwei Zhao (14 papers)
  4. Sulong Xu (23 papers)
  5. Hao Peng (291 papers)
  6. Yutong Liu (21 papers)
  7. Zehua Zhang (16 papers)
  8. Jian Li (667 papers)
  9. Junsheng Jin (5 papers)
  10. Yongjun Bao (17 papers)
  11. Weipeng Yan (14 papers)
Citations (22)
