A Survey on Recent Hardware Data Prefetching Approaches with An Emphasis on Servers (2009.00715v1)

Published 1 Sep 2020 in cs.AR

Abstract: Data prefetching, i.e., the act of predicting an application's future memory accesses and fetching those that are not in the on-chip caches, is a well-known and widely-used approach to hide the long latency of memory accesses. The fruitfulness of data prefetching is evident to both industry and academia: nowadays, almost every high-performance processor incorporates a few data prefetchers for capturing various access patterns of applications; besides, there is a myriad of proposals for data prefetching in the research literature, where each proposal enhances the efficiency of prefetching in a specific way. In this survey, we discuss the fundamental concepts in data prefetching and study state-of-the-art hardware data prefetching approaches.

Additional Key Words and Phrases: Data Prefetching, Scale-Out Workloads, Server Processors, Spatio-Temporal Correlation.
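To make the abstract's definition concrete, the following is a minimal software sketch of a classic stride prefetcher, one of the simplest hardware prefetching schemes covered by surveys like this one. The class name, table layout, and two-hit confidence rule are illustrative assumptions, not the paper's design; real prefetchers are hardware structures and considerably more elaborate.

```python
class StridePrefetcher:
    """Illustrative stride prefetcher model (not from the paper).

    Tracks, per load instruction (PC), the last address seen and the
    last stride. When the same non-zero stride is observed twice in a
    row, it predicts the next access and returns a prefetch address.
    """

    def __init__(self):
        # pc -> (last_addr, last_stride)
        self.table = {}

    def access(self, pc, addr):
        """Record a demand access; return a prefetch address or None."""
        entry = self.table.get(pc)
        if entry is None:
            self.table[pc] = (addr, 0)
            return None
        last_addr, last_stride = entry
        stride = addr - last_addr
        self.table[pc] = (addr, stride)
        if stride != 0 and stride == last_stride:
            # Stride confirmed: prefetch the predicted next address.
            return addr + stride
        return None


# A sequential scan with a 0x40-byte stride trains the predictor in two
# accesses; the third access triggers a prefetch of the next line.
pf = StridePrefetcher()
pf.access(pc=0, addr=0x100)  # first sighting: just record
pf.access(pc=0, addr=0x140)  # stride 0x40 observed once: no prefetch yet
pf.access(pc=0, addr=0x180)  # stride repeats: returns 0x1C0
```

Sequential and strided patterns are the easy case; much of the survey's subject matter concerns the harder irregular patterns (e.g., pointer chasing) that simple stride tables like this cannot capture.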

Authors (6)
  1. Mohammad Bakhshalipour (7 papers)
  2. Mehran Shakerinava (7 papers)
  3. Fatemeh Golshan (2 papers)
  4. Ali Ansari (5 papers)
  5. Pejman Lotfi-Kamran (1 paper)
  6. Hamid Sarbazi-Azad (14 papers)
Citations (2)
