
Evaluating the Self-Optimization Process of the Adaptive Memory Management Architecture Self-aware Memory (1405.2910v1)

Published 12 May 2014 in cs.DC

Abstract: With the continuously increasing integration level, manycore processor systems are likely to become the dominant system structure not only in HPC but also in desktop and mobile systems. Current manycore processors such as the Tilera TILE, KALRAY MPPA, or Intel SCC combine a growing number of cores in a tiled architecture and are mainly designed for high-performance applications with a focus on direct inter-core communication. These architectures are limited by centralized or sparse components such as memory controllers and memory I/O, as well as by inflexible memory management. In the future, these manycore systems will have to handle highly dynamic workloads with multiple concurrently running applications, changing I/O characteristics, and unpredictable memory usage. Consequently, memory management has to become more flexible and distributed in nature, and adaptive mechanisms and system structures are needed. With Self-aware Memory (SaM), a decentralized, scalable, and autonomously self-optimizing memory architecture is developed. This adaptive memory management achieves higher flexibility and easier use of memory. In this paper, the concept of an ongoing decentralized self-optimization is introduced and an evaluation of its various parameters is presented. The results show that, with appropriate parameter settings, the overhead of the decentralized optimization process is amortized by the improved runtime.
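
The amortization claim at the end of the abstract can be read as a simple break-even condition: a self-optimization cycle pays off once the accumulated per-access savings exceed the cycle's overhead. The following Python sketch is purely illustrative; the function name, parameters, and numbers are assumptions chosen for this example and do not come from the paper.

```python
# Illustrative sketch only: a toy break-even model for the amortization claim.
# The names and numbers below are assumptions, not values from the paper.

def amortized(overhead_per_cycle, saving_per_access, accesses_per_cycle):
    """Return True if one optimization cycle saves more time than it costs.

    overhead_per_cycle  -- cost (seconds) of running the decentralized optimization once
    saving_per_access   -- average latency reduction (seconds) per memory access afterwards
    accesses_per_cycle  -- memory accesses served until the next optimization run
    """
    return saving_per_access * accesses_per_cycle > overhead_per_cycle

# Example: a 2 ms optimization step is amortized if it shaves 10 ns off each of
# 250,000 subsequent accesses (2.5 ms saved > 2 ms overhead).
print(amortized(overhead_per_cycle=2e-3,
                saving_per_access=10e-9,
                accesses_per_cycle=250_000))  # True
```

In this reading, the "appropriate parameter settings" mentioned in the abstract are those that keep the optimization frequency low enough, and its benefit large enough, for the inequality above to hold.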
