
FE-TCM: Filter-Enhanced Transformer Click Model for Web Search (2301.07854v2)

Published 19 Jan 2023 in cs.IR and cs.LG

Abstract: Constructing click models and extracting implicit relevance feedback from the interaction between users and search engines are very important for improving the ranking of search results. Using neural networks to model users' click behavior has become one of the effective ways to construct click models. In this paper, we use a Transformer as the backbone network for feature extraction, add a novel filter layer, and propose a new Filter-Enhanced Transformer Click Model (FE-TCM) for web search. First, to reduce the influence of noise in user behavior data, we use learnable filters to filter out log noise. Second, following the examination hypothesis, we model an attraction estimator and an examination predictor to output attractiveness scores and examination probabilities, respectively, using the Transformer to learn deeper representations across different features. Finally, we apply a combination function to integrate the attractiveness scores and examination probabilities into the click prediction. Experiments on two real-world session datasets show that FE-TCM outperforms existing click models on click prediction.
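To make the pipeline in the abstract concrete, the sketch below illustrates one plausible reading of it in PyTorch: an FFT-based learnable filter layer for denoising behavior logs, a Transformer encoder for deeper feature interactions, separate attraction and examination heads, and a product as the combination function under the examination hypothesis. All class names, dimensions, and the choice of an FFT-style filter and multiplicative combination are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn

class LearnableFilterLayer(nn.Module):
    """Frequency-domain denoising: rFFT -> learnable complex filter -> inverse rFFT.
    (One common realization of "learnable filters"; an assumption here.)"""
    def __init__(self, max_len: int, hidden_dim: int):
        super().__init__()
        # One learnable complex weight per (frequency bin, channel).
        self.weight = nn.Parameter(
            torch.randn(max_len // 2 + 1, hidden_dim, dtype=torch.cfloat) * 0.02
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden); assumes seq_len == max_len.
        spec = torch.fft.rfft(x, dim=1)          # to frequency domain
        spec = spec * self.weight                 # suppress noisy frequencies
        return torch.fft.irfft(spec, n=x.size(1), dim=1)

class FETCMSketch(nn.Module):
    """Hypothetical FE-TCM-style click model: filter -> Transformer -> two heads."""
    def __init__(self, max_len=10, hidden_dim=64, n_heads=4, n_layers=2):
        super().__init__()
        self.filter_layer = LearnableFilterLayer(max_len, hidden_dim)
        enc = nn.TransformerEncoderLayer(hidden_dim, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc, n_layers)
        self.attraction_head = nn.Linear(hidden_dim, 1)   # attractiveness score
        self.examination_head = nn.Linear(hidden_dim, 1)  # examination probability

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (batch, seq_len, hidden) encoded query/doc/behavior features.
        h = self.filter_layer(feats)              # denoise the interaction logs
        h = self.encoder(h)                       # deeper cross-feature representation
        alpha = torch.sigmoid(self.attraction_head(h))    # P(attractive)
        beta = torch.sigmoid(self.examination_head(h))    # P(examined)
        return alpha * beta                       # examination hypothesis: P(click)

model = FETCMSketch()
clicks = model(torch.randn(2, 10, 64))            # -> (2, 10, 1) click probabilities
```

The multiplicative combination reflects the classical examination hypothesis (a result is clicked only if it is both examined and attractive); the paper's actual combination functions may differ.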

Authors (6)
  1. Yingfei Wang (20 papers)
  2. Jianping Liu (5 papers)
  3. Jian Wang (969 papers)
  4. Xiaofeng Wang (310 papers)
  5. Meng Wang (1065 papers)
  6. Xintao Chu (2 papers)
Citations (1)
