
Attention: to Better Stand on the Shoulders of Giants (2005.14256v1)

Published 27 May 2020 in cs.SI, cs.LG, and stat.ML

Abstract: Science of science (SciSci) is an emerging discipline in which science itself is used to study the structure and evolution of science through large data sets. The increasing availability of digital data on scholarly outcomes offers unprecedented opportunities to explore SciSci. As science progresses, previously discovered knowledge principally inspires new scientific ideas, and citation is a reasonably good reflection of this cumulative nature of scientific research. Research that cites potentially influential references gains a lead over other emerging publications. Although peer review is the most reliable way of predicting a paper's future impact, the ability to foresee lasting impact from citation records is increasingly essential for scientific impact analysis in the era of big data. This paper develops an attention mechanism for long-term scientific impact prediction and validates the method on a real large-scale citation data set. The results break with conventional thinking: rather than accurately simulating the original power-law distribution, emphasizing limited attention better stands on the shoulders of giants.
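The abstract does not specify the paper's architecture, but the core idea it names, an attention mechanism applied to a paper's citation history for long-term impact prediction, can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the yearly-citation embeddings, the learned query vector, and the function names are hypothetical stand-ins, not the authors' actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def scaled_dot_product_attention(query, keys, values):
    """Standard scaled dot-product attention over a citation history.

    query:  (d,)   learned query vector (hypothetical)
    keys:   (T, d) one embedding per observed year (hypothetical)
    values: (T, d) typically the same embeddings as keys
    Returns a context vector (d,) and the attention weights (T,).
    """
    d = keys.shape[-1]
    scores = keys @ query / np.sqrt(d)       # (T,) similarity scores
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()                 # weights sum to 1
    return weights @ values, weights

# Hypothetical setup: embed each of the first T yearly citation counts
# into a d-dimensional feature vector, then let a learned query attend
# over the history; the context vector would feed an impact regressor.
T, d = 5, 8
history = rng.normal(size=(T, d))  # stand-in for yearly citation embeddings
query = rng.normal(size=d)         # stand-in for a learned query vector

context, weights = scaled_dot_product_attention(query, history, history)
```

The attention weights make the model's "limited attention" explicit: only a few years of the citation record dominate the prediction, rather than every year contributing under a fitted power-law curve.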

Authors (7)
  1. Sha Yuan (9 papers)
  2. Zhou Shao (6 papers)
  3. Yu Zhang (1400 papers)
  4. Xingxing Wei (60 papers)
  5. Tong Xiao (119 papers)
  6. Yifan Wang (319 papers)
  7. Jie Tang (302 papers)
