A Cache Energy Optimization Technique for STT-RAM Last Level Cache (1312.2207v2)

Published 8 Dec 2013 in cs.AR

Abstract: Last-level caches (LLCs) occupy a large chip area, and their size is expected to grow further to offset the limitations of memory bandwidth and speed. Due to the high leakage power of SRAM devices, caches designed with SRAM consume a large amount of energy. To address this, emerging technologies with lower leakage power dissipation, such as spin transfer torque RAM (STT-RAM), have been investigated. However, STT-RAM's high write latency and write energy may lead to large energy consumption, which presents challenges to its use. In this report, we propose a cache-reconfiguration-based technique for improving the energy efficiency of STT-RAM based LLCs. Our technique dynamically adjusts the active cache size to reduce cache leakage energy with minimal performance loss. We also choose a suitable STT-RAM retention time to avoid refresh overhead and improve performance. Single-core simulations have been performed using SPEC2006 benchmarks and the Sniper x86-64 simulator. The results show that, compared to an STT-RAM LLC of similar area, an SRAM LLC incurs nearly 100% higher energy consumption and 7.3% performance loss, whereas our technique with the STT-RAM cache saves 21.8% energy and incurs only 1.7% performance loss.
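The abstract does not spell out the reconfiguration algorithm itself, but interval-based way gating is one common way to realize "dynamically adjusts the active cache size." Below is a minimal Python sketch under that assumption: a hypothetical controller samples the LLC miss rate each interval and enables or gates off cache ways. The class names, thresholds, and the IntervalStats interface are illustrative assumptions, not the paper's actual mechanism.

```python
# Hypothetical sketch of interval-based cache-way reconfiguration for
# leakage reduction. All thresholds, way counts, and the stats
# interface are assumptions made for illustration.

from dataclasses import dataclass

@dataclass
class IntervalStats:
    accesses: int   # LLC accesses observed in the last interval
    misses: int     # LLC misses observed in the last interval

class WayReconfigController:
    """Periodically turns cache ways on or off, trading a small
    performance loss for lower leakage energy."""

    def __init__(self, total_ways=16, min_ways=2,
                 miss_rate_high=0.10, miss_rate_low=0.02):
        self.total_ways = total_ways
        self.min_ways = min_ways
        self.active_ways = total_ways
        self.miss_rate_high = miss_rate_high  # too many misses -> grow cache
        self.miss_rate_low = miss_rate_low    # cache underused -> shrink cache

    def end_of_interval(self, stats: IntervalStats) -> int:
        """Return the number of ways to keep active in the next interval."""
        miss_rate = stats.misses / max(stats.accesses, 1)
        if miss_rate > self.miss_rate_high and self.active_ways < self.total_ways:
            self.active_ways += 1   # performance is suffering: enable a way
        elif miss_rate < self.miss_rate_low and self.active_ways > self.min_ways:
            self.active_ways -= 1   # working set fits: gate a way off
        return self.active_ways

    def leakage_fraction(self) -> float:
        """Leakage energy scales roughly with the active fraction of the
        cache, since power-gated ways leak close to zero."""
        return self.active_ways / self.total_ways
```

In a real design, dirty lines in a way would have to be flushed before it is gated off, and the interval length trades responsiveness against reconfiguration overhead; the abstract's 21.8% energy saving at 1.7% performance loss reflects exactly this kind of trade-off.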

Citations (1)

