Efficient On-Chip Communication for Parallel Graph-Analytics on Spatial Architectures (2108.11521v2)

Published 26 Aug 2021 in cs.AR and cs.DC

Abstract: Large-scale graph processing has drawn great attention in recent years. Many modern datacenter workloads, such as MapReduce, can be expressed as graph processing, and a number of domain-specific accelerators have consequently been proposed for it. Spatial architectures are promising for graph processing: the graph is partitioned across several processing nodes that work in parallel. We conduct experiments to analyze on-chip data movement during graph processing on a spatial architecture and, based on these observations, identify a data movement bottleneck in such highly parallel accelerators. To mitigate the bottleneck, we propose a novel power-law aware graph partitioning and data mapping scheme that reduces communication latency by minimizing hop counts on a scalable network-on-chip. Experimental results on popular graph algorithms show that our implementation is 2-5x faster and 2.7-4x more energy-efficient than a baseline implementation by reducing data movement time.
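The abstract does not spell out the partitioning and mapping algorithm, so the sketch below is only an illustration of the general idea under stated assumptions: a square 2D mesh NoC with XY routing, a synthetic power-law-like graph, and a simple degree-aware heuristic that places the highest-degree (hub) vertices on tiles near the centre of the mesh so that the traffic they generate travels fewer hops on average. The function names, tile layout, and hop-count model are hypothetical and are not taken from the paper.

```python
# Illustrative sketch only; the heuristic, mesh layout, and cost model are assumptions,
# not the paper's published algorithm.
import random
from collections import defaultdict


def mesh_coords(n_tiles_per_side):
    """Enumerate (x, y) tile coordinates of a square mesh NoC."""
    return [(x, y) for y in range(n_tiles_per_side) for x in range(n_tiles_per_side)]


def centrality_order(coords):
    """Order tiles from the centre of the mesh outward (smallest average distance first)."""
    cx = max(c[0] for c in coords) / 2.0
    cy = max(c[1] for c in coords) / 2.0
    return sorted(coords, key=lambda c: abs(c[0] - cx) + abs(c[1] - cy))


def powerlaw_aware_mapping(edges, n_tiles_per_side, vertices_per_tile):
    """Greedy degree-aware placement: assign high-degree vertices to central tiles
    so the edges they touch span fewer hops on average."""
    degree = defaultdict(int)
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    tiles = centrality_order(mesh_coords(n_tiles_per_side))
    placement = {}
    # Fill tiles from the centre outward, highest-degree vertices first.
    for i, vtx in enumerate(sorted(degree, key=degree.get, reverse=True)):
        placement[vtx] = tiles[min(i // vertices_per_tile, len(tiles) - 1)]
    return placement


def average_hop_count(edges, placement):
    """Average Manhattan (XY-routing) hop count over all edges."""
    hops = [abs(placement[u][0] - placement[v][0]) + abs(placement[u][1] - placement[v][1])
            for u, v in edges]
    return sum(hops) / len(hops)


if __name__ == "__main__":
    random.seed(0)
    # Synthetic skewed graph: a few hub vertices touch most edges.
    hubs = list(range(8))
    edges = [(random.choice(hubs), random.randrange(8, 256)) for _ in range(2000)]
    placement = powerlaw_aware_mapping(edges, n_tiles_per_side=4, vertices_per_tile=16)
    print(f"average hops: {average_hop_count(edges, placement):.2f}")
```

Placing hubs centrally is one common way to exploit power-law degree distributions on a mesh, since most edges touch a hub and the central tiles minimize worst-case Manhattan distance; the paper's actual scheme may differ in both the partitioning objective and the mapping strategy.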
