Parametric Prediction of Heap Memory Requirements

Parametric Prediction of Heap Memory Requirements, by Victor Braberman, Federico Fernandez, Diego Garbervetsky, and Sergio Yovine:

This work presents a technique to compute symbolic polynomial approximations of the amount of dynamic memory required to safely execute a method without running out of memory, for Java-like imperative programs. We consider object allocations and deallocations made by the method and the methods it transitively calls. More precisely, given an initial configuration of the stack and the heap, the peak memory consumption is the maximum space occupied by newly created objects in all states along a run from it. We over-approximate the peak memory consumption using scoped-memory management, where objects are organized in regions associated with the lifetime of methods. We model the problem of computing the maximum memory occupied by any region configuration as a parametric polynomial optimization problem over a polyhedral domain and resort to the Bernstein basis to solve it. We apply the developed tool to several benchmarks.
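To make the abstract concrete, here is a small Java method of my own (not one of the paper's benchmarks) whose heap requirement naturally depends on a parameter n. A parametric analysis of the kind described would report a symbolic bound such as c1*n + c2*n*(n+1)/2, where c1 and c2 stand for the platform's sizes of a reference and an int, rather than a single constant.

    // Hypothetical illustration (not from the paper): the peak heap usage of
    // this method grows with its parameter n, so any safe bound must be
    // symbolic in n.
    class TriangleDemo {
        // Allocates n row references plus 1 + 2 + ... + n = n*(n+1)/2 ints,
        // i.e. a quadratic amount of memory in n.
        static int[][] buildTriangle(int n) {
            int[][] rows = new int[n][];
            for (int i = 0; i < n; i++) {
                rows[i] = new int[i + 1];
            }
            return rows;
        }

        public static void main(String[] args) {
            // Nothing allocated here becomes unreachable before the method
            // returns, so peak consumption equals total allocation.
            int[][] t = buildTriangle(4);
            System.out.println("rows allocated: " + t.length);
        }
    }

In the paper's setting the rows would live in a region tied to the lifetime of buildTriangle's caller, and the bound would be the maximum over the region configurations reachable during the run.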

We've briefly discussed analyses to predict heap usage here on LtU, but I can't seem to find them. Anyone with a reference handy, please post in the comments!


FunLoft

FunLoft is related, at least in its goal of controlling peak memory consumption.

I'd be more interested in techniques that can actually provide an epsilon bound (e.g. 15%) on how much they overestimate space requirements.

A few pointers

I don't remember the discussion, so I'll just dump a few pointers that might point you in the right direction:

The people involved in COSTA had been analyzing Java bytecode since before that project started. There was a paper about resource usage for higher-order functional languages at POPL last year. Martin Kero's thesis might be of interest to you, or the (shorter) APLAS paper.