
Abstract

With the advent of social networks and the web, graph sizes have grown too large to fit in main memory, precipitating the need for alternative approaches to the efficient, scalable evaluation of queries on graphs of any size. Here, we use a divide-and-conquer approach by partitioning a graph and processing queries over partitions to obtain all or a specified number of answers. This entails correctly computing answers that span multiple partitions or even require the same partition more than once. Given a set of partitions, there are several approaches to evaluating a query: i) the One Partition At a Time approach, ii) the traditional use of multiple processors, and iii) the Map/Reduce multi-processor approach. Approach (i), detailed in this paper, achieves scalability through independent processing of partitions. The other two approaches address response time in addition to scalability. For approach (i), the necessary minimal bookkeeping has been identified and its correctness established in this paper. Query answering on partitioned graphs also requires analyzing partitioning schemes for their impact on query processing, as well as determining the number of partitions, and the sequence in which they are loaded, to reduce query response time. We correlate query properties and partition characteristics to reduce query processing time in terms of the resources available. We also identify a set of quantitative metrics and use them to formulate heuristics that determine the order in which partitions are loaded for efficient query processing. For approach (i), experiments on large graphs (synthetic and real-world) using different partitioning schemes analyze the proposed heuristics on a variety of query types. The other two approaches are fleshed out and analyzed. An existing graph querying system has been extended to evaluate queries on partitioned graphs. Finally, all approaches are contrasted.
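The abstract describes a one-partition-at-a-time evaluation strategy in which partitions are loaded in a heuristically chosen order and minimal bookkeeping tracks partial answers that span partitions. The sketch below is a minimal illustration of that idea under assumptions not taken from the paper: the functions `load_partition`, `evaluate`, and `order_heuristic` are hypothetical stand-ins for the components the abstract names, not the authors' actual algorithm.

```python
# Minimal sketch of one-partition-at-a-time query evaluation (an assumption-based
# illustration, not the paper's algorithm). The caller supplies the partition
# loader, the per-partition evaluator, and the load-order heuristic.

from collections import deque

def evaluate_query_one_partition_at_a_time(partition_ids, load_partition,
                                           evaluate, order_heuristic,
                                           max_answers=None):
    """Load partitions one at a time, keeping bookkeeping for partial answers
    that span partitions, until all (or max_answers) complete answers are found."""
    answers = []        # completed query answers
    partials = deque()  # bookkeeping: partial answers awaiting other partitions

    # Heuristic load order (e.g., based on partition size or connectivity metrics).
    for pid in order_heuristic(partition_ids):
        partition = load_partition(pid)   # only this partition resides in memory

        # Try to extend previously stashed partial answers using the new partition.
        carried = list(partials)
        partials.clear()
        new_complete, new_partials = evaluate(partition, carried)

        answers.extend(new_complete)
        partials.extend(new_partials)     # still need partitions not yet (re)loaded

        if max_answers is not None and len(answers) >= max_answers:
            return answers[:max_answers]

    return answers
```

In this sketch the bookkeeping is simply the queue of partial answers carried from one partition to the next; the paper's contribution, per the abstract, is identifying the minimal such bookkeeping and proving its correctness, along with metrics-driven heuristics for the load order.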
