Stack-Summarizing Control-Flow Analysis of Higher-Order Programs (1009.1560v1)

Published 8 Sep 2010 in cs.PL

Abstract: Two sinks drain precision from higher-order flow analyses: (1) merging of argument values upon procedure call and (2) merging of return values upon procedure return. To combat the loss of precision, these two sinks have been addressed independently. In the case of procedure calls, abstract garbage collection reduces argument merging, while in the case of procedure returns, context-free approaches eliminate return value merging. It is natural to expect that a combined analysis could enjoy a mutually beneficial interaction between the two approaches. The central contribution of this work is a direct product of abstract garbage collection with context-free analysis. The central challenge to overcome is the conflict between the core constraint of a pushdown system and the needs of garbage collection: a pushdown system can only see the top of the stack, yet garbage collection needs to see the entire stack during a collection. To make the direct product computable, we develop "stack summaries," a method for tracking stack properties at each control state in a pushdown analysis of higher-order programs.
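
The following is a minimal, hypothetical Python sketch of the idea the abstract describes: each abstract-machine control state carries a "stack summary," here modeled as a conservative set of store addresses that frames on the (unseen) pushdown stack may touch, so that abstract garbage collection can compute a root set without walking the stack. All names below (State, Frame, push, abstract_gc) are illustrative inventions under these assumptions, not the paper's formalism; in particular, maintaining summaries across returns (stack pops) is the hard part the paper actually solves and is omitted here.

```python
from dataclasses import dataclass
from typing import Dict, FrozenSet, Tuple

Addr = Tuple[str, int]                 # variable name paired with an allocation context
Env = Tuple[Tuple[str, Addr], ...]     # immutable environment: variable -> address

@dataclass(frozen=True)
class Frame:
    """A continuation frame: the variable awaiting the return value plus its saved environment."""
    var: str
    env: Env

    def touches(self) -> FrozenSet[Addr]:
        # Addresses this frame can reach once the call returns.
        return frozenset(addr for _, addr in self.env)

@dataclass(frozen=True)
class State:
    """A control state annotated with a summary of the stack beneath it."""
    control: str
    env: Env
    stack_summary: FrozenSet[Addr] = frozenset()

def push(state: State, frame: Frame, callee_control: str, callee_env: Env) -> State:
    # A call pushes `frame` onto the (unmodelled) stack; the summary grows
    # monotonically by everything that frame can touch.
    return State(callee_control, callee_env, state.stack_summary | frame.touches())

def abstract_gc(state: State, store: Dict[Addr, FrozenSet[Addr]]) -> Dict[Addr, FrozenSet[Addr]]:
    # Roots: addresses reachable from the current environment, plus the stack
    # summary standing in for the stack contents the pushdown system cannot see.
    roots = frozenset(addr for _, addr in state.env) | state.stack_summary
    reachable, worklist = set(roots), list(roots)
    while worklist:
        addr = worklist.pop()
        for succ in store.get(addr, frozenset()):   # addresses reachable from values stored at `addr`
            if succ not in reachable:
                reachable.add(succ)
                worklist.append(succ)
    return {a: succs for a, succs in store.items() if a in reachable}
```

A small usage example under the same assumptions: the address ("x", 0) is live only through the stack summary (a pending frame still needs it), so the collection keeps it, while the unreachable ("z", 2) is dropped.

```python
store = {("x", 0): frozenset(),
         ("y", 1): frozenset({("x", 0)}),
         ("z", 2): frozenset()}
caller = State("call-site", (("x", ("x", 0)),))
callee = push(caller, Frame("r", caller.env), "callee-entry", (("y", ("y", 1)),))
print(sorted(abstract_gc(callee, store)))   # [('x', 0), ('y', 1)]
```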
