
Parallel Minimum Spanning Tree Algorithms and Evaluation (2005.06913v1)

Published 14 May 2020 in cs.DC

Abstract: Minimum Spanning Tree (MST) is an important graph algorithm with wide-ranging applications in computer networks, VLSI routing, and wireless communications, among other areas. Today virtually every computer is built from multi-core processors, so it is important to exploit this parallel computing power by parallelizing existing algorithms and applications. Most earlier work on parallelizing MST focused on algorithms for PRAM models, and such studies have two limitations. First, PRAM models assume infinite memory bandwidth, which is unrealistic. Second, PRAM-based algorithms require at least O(n) processors, where n is the total number of vertices; for large graphs this is infeasible. Very few implementations target real systems. In this paper I present and evaluate two new parallel MST algorithms that are variants of the parallel Boruvka algorithm: (i) the first uses lock variables without spin-locks; (ii) the second uses only the atomic compare-and-swap (CAS) primitive. I evaluated the performance of these algorithms on a six-core, 12-thread Intel system on input graphs with up to 1 million vertices. The first algorithm showed a speedup of up to 1.94 over an un-optimized sequential algorithm and up to 1.4 over an optimized sequential algorithm. The second algorithm showed a speedup of up to 2.03 over an un-optimized sequential algorithm and up to 1.403 over an optimized sequential algorithm. Compared with the first algorithm, the CAS-based second algorithm is up to 1.15 times faster at four threads.
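
The second algorithm relies only on an atomic compare-and-swap primitive. The C++ sketch below is a minimal illustration of that general idea, not the paper's actual implementation: in one Boruvka step, threads scan disjoint slices of the edge list and race via CAS to record the cheapest outgoing edge of each component. The packed 64-bit (weight, edge index) encoding and all names (`pack`, `best`, `worker`) are assumptions made for this example.

```cpp
#include <atomic>
#include <cstdint>
#include <iostream>
#include <limits>
#include <thread>
#include <vector>

struct Edge {
    int u, v;
    uint32_t w;
};

// Pack the weight into the high 32 bits so that plain numeric comparison
// of the packed word orders candidates by weight first.
static inline uint64_t pack(uint32_t weight, uint32_t edgeIdx) {
    return (static_cast<uint64_t>(weight) << 32) | edgeIdx;
}

int main() {
    // Toy graph: 4 vertices; in the first Boruvka round each vertex is its
    // own component.
    std::vector<Edge> edges = {{0, 1, 4}, {1, 2, 1}, {2, 3, 3}, {0, 3, 2}, {0, 2, 5}};
    const int numComponents = 4;

    // best[c] holds the packed (weight, edge index) of the cheapest edge
    // seen so far that leaves component c; "no edge yet" is the max value.
    std::vector<std::atomic<uint64_t>> best(numComponents);
    for (auto &b : best) b.store(std::numeric_limits<uint64_t>::max());

    // Each thread scans a slice of the edge list and installs cheaper
    // candidates with CAS only -- no locks, no spinning on a lock variable.
    auto worker = [&](size_t lo, size_t hi) {
        for (size_t i = lo; i < hi; ++i) {
            uint64_t cand = pack(edges[i].w, static_cast<uint32_t>(i));
            for (int c : {edges[i].u, edges[i].v}) {  // both endpoint components
                uint64_t cur = best[c].load();
                while (cand < cur && !best[c].compare_exchange_weak(cur, cand)) {
                    // On failure, cur is refreshed; retry only if still cheaper.
                }
            }
        }
    };

    std::thread t1(worker, 0, edges.size() / 2);
    std::thread t2(worker, edges.size() / 2, edges.size());
    t1.join();
    t2.join();

    for (int c = 0; c < numComponents; ++c) {
        uint32_t idx = static_cast<uint32_t>(best[c].load() & 0xffffffffu);
        std::cout << "component " << c << ": cheapest edge (" << edges[idx].u
                  << "," << edges[idx].v << ") w=" << edges[idx].w << "\n";
    }
    return 0;
}
```

A full parallel Boruvka implementation would repeat this edge-selection step, contract the chosen edges into super-vertices, and iterate until one component remains; the sketch shows only the lock-free candidate-update pattern that a CAS-only variant could use.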

Authors (1)
Citations (2)
