Avoiding worst case GC with large amounts of data?

By James Hague at 2005-01-31 16:07 | LtU Forum

Garbage collection has improved greatly over the years, but there are worst cases which are still difficult to avoid without writing your own custom memory management system. For example, let's say you have 128MB of 3D geometry data loaded in Haskell/OCaml/ML-NJ/Erlang/Lisp/YourFavoriteLanguage. This data is stored in the native format of your language, using lists and tuples and so on. At some point, the GC is going to go through that 128MB of data. Generational collection delays this, but at some point it will still happen. Or do any systems handle this situation gracefully?

The best I've seen in this regard is Erlang, because each process has its own heap, and those heaps are collected individually. But if you put 128MB of data in one heap, there will still be a significant pause. Perhaps Erlang's GC is still better than most in this regard, as a single-assignment language creates a unidirectional heap (new terms can only point at older ones), which has some nice properties.

And then of course there are the languages with reference counting implementations, which perform better than languages with true GC in this regard.

Thoughts?
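To make the per-process-heap point concrete, here is a minimal Erlang sketch (the module name geometry_holder and the proplist representation are my own, purely illustrative): the large structure lives in a dedicated process, so collecting that process's heap never touches anyone else's, and collections elsewhere never scan the 128MB. The flip side is that every reply is copied between heaps, since Erlang processes share nothing.

-module(geometry_holder).
-export([start/1, lookup/2]).

%% Spawn a process whose private heap holds the large data set.
%% Its heap is only scanned when this one process is collected.
start(Data) ->
    spawn(fun() -> loop(Data) end).

loop(Data) ->
    receive
        {lookup, From, Key} ->
            %% The reply value is copied into the caller's heap.
            From ! {result, self(), proplists:get_value(Key, Data)},
            loop(Data)
    end.

%% Synchronous lookup from the calling process.
lookup(Pid, Key) ->
    Pid ! {lookup, self(), Key},
    receive
        {result, Pid, Value} -> Value
    end.

This only bounds which heap pauses, not the pause itself: a lookup that returned the whole data set would copy it, and a collection of the holder process still walks everything it owns.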