
Using MapReduce for Large-scale Medical Image Analysis (1510.06937v1)

Published 23 Oct 2015 in cs.DC

Abstract: The growth of the amount of medical image data produced on a daily basis in modern hospitals forces the adaptation of traditional medical image analysis and indexing approaches towards scalable solutions. The number of images and their dimensionality increased dramatically during the past 20 years. We propose solutions for large-scale medical image analysis based on parallel computing and algorithm optimization. The MapReduce framework is used to speed up and make possible three large-scale medical image processing use-cases: (i) parameter optimization for lung texture segmentation using support vector machines, (ii) content-based medical image indexing, and (iii) three-dimensional directional wavelet analysis for solid texture classification. A cluster of heterogeneous computing nodes was set up in our institution using Hadoop, allowing for a maximum of 42 concurrent map tasks. The majority of the machines used are desktop computers that are also used for regular office work. The cluster proved to be minimally invasive and stable. The runtimes of each of the three use-cases were significantly reduced compared to a sequential execution. Hadoop provides an easy-to-employ framework for data analysis tasks that scales well for many tasks but requires optimization for specific tasks.
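The first use-case, SVM parameter optimization, maps naturally onto MapReduce: each map task evaluates one parameter combination independently, and a reduce step selects the best-scoring combination. The following is a minimal pure-Python sketch of that pattern; `evaluate` is a hypothetical stand-in for the SVM cross-validation score the paper computes on its Hadoop cluster, and no Hadoop machinery is involved here.

```python
from functools import reduce
from itertools import product

def evaluate(C, gamma):
    """Hypothetical stand-in for SVM cross-validation accuracy.

    In the paper, each Hadoop map task would train and evaluate an SVM
    for one (C, gamma) pair; this toy surface just peaks at C=1.0,
    gamma=0.1 so the sketch is runnable without training anything.
    """
    return 1.0 - abs(C - 1.0) * 0.1 - abs(gamma - 0.1)

def map_task(params):
    # Emit (key, value) = (parameter pair, score), as a Hadoop mapper would.
    C, gamma = params
    return (params, evaluate(C, gamma))

def reduce_best(a, b):
    # The reducer keeps the highest-scoring parameter pair.
    return a if a[1] >= b[1] else b

# Grid of candidate parameters; on the cluster these map tasks
# would run concurrently (up to 42 at a time in the paper's setup).
grid = list(product([0.1, 1.0, 10.0], [0.01, 0.1, 1.0]))
results = map(map_task, grid)
best_params, best_score = reduce(reduce_best, results)
print(best_params, best_score)
```

The point of the pattern is that each parameter evaluation is independent, so the grid search is embarrassingly parallel and the only coordination needed is the final reduction.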

Citations (65)
