Avoiding worst case GC with large amounts of data?

Garbage collection has improved greatly over the years, but there are worst cases that are still difficult to avoid without writing your own custom memory management system. For example, say you have 128MB of 3D geometry data loaded in Haskell/OCaml/ML-NJ/Erlang/Lisp/YourFavoriteLanguage. This data is stored in the native format of your language, using lists and tuples and so on. At some point, the GC is going to traverse that entire 128MB of data. Generational collection delays this, but at some point it will still happen. Or do any systems handle this situation gracefully?

The best I've seen in this regard is Erlang, because each process has its own heap, and those heaps are collected individually. But if you put 128MB of data in one heap, there will still be a significant pause. Perhaps Erlang's GC is still better than most in this regard, as a single-assignment language creates a unidirectional heap, which has some nice properties. And then of course there are the languages with reference-counting implementations, which perform better than languages with true GC in this regard.

Thoughts?
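To make the per-process-heap point concrete, here is a minimal Erlang sketch (the module name geometry_store and the keyed-tuple representation are my own illustration, not from the post). The large structure lives in a dedicated process, so a collection of that heap pauses only this process rather than the whole system, and callers get back copies of just the pieces they ask for. It isolates the pause; as the post notes, it does not eliminate it.

-module(geometry_store).
-export([start/1, lookup/2]).

%% Spawn a process whose private heap owns the large structure.
%% Erlang collects each process heap independently, so a pause
%% here does not stall other processes.
start(Geometry) ->
    spawn(fun() -> loop(Geometry) end).

%% Synchronous lookup; only the matching tuple is copied into
%% the caller's heap.
lookup(Pid, Key) ->
    Pid ! {lookup, self(), Key},
    receive
        {reply, Pid, Result} -> Result
    end.

loop(Geometry) ->
    receive
        {lookup, From, Key} ->
            From ! {reply, self(), lists:keyfind(Key, 1, Geometry)},
            loop(Geometry)
    end.

Usage would look like Pid = geometry_store:start([{mesh1, Verts1}, {mesh2, Verts2}]) followed by geometry_store:lookup(Pid, mesh1).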
By James Hague at 2005-01-31 16:07 | LtU Forum