Tuesday, September 13, 2011

Added Memory Mapped support to HugeCollections

Overview

The Huge collections library is designed to support large collections of data in memory efficiently, without GC impact. It does this by using off-heap memory and generated code for efficiency.

One of the benefits of this approach is that memory-mapped files can be larger than the available memory, while using a trivial amount of heap and no direct memory.

Memory mapping

Loading large amounts of data is time consuming, so being able to re-use a store persisted to disk can improve efficiency. With SSD drives, the amount of "memory" a Java system can access is effectively extended, provided the data can be accessed this way.
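
To illustrate the underlying mechanism the library builds on (this is not the HugeCollections API, just a minimal plain-Java sketch using java.nio), a file region can be mapped and read or written without putting the data on the heap:

import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class MappedFileSketch {
    public static void main(String[] args) throws Exception {
        long size = 1L << 30; // a 1 GB region, more than we would want on the heap
        try (RandomAccessFile file = new RandomAccessFile("store.dat", "rw");
             FileChannel channel = file.getChannel()) {
            // The OS pages the mapped region in and out on demand, so it can be
            // larger than the heap and uses no heap or direct memory for the data itself.
            MappedByteBuffer buffer = channel.map(FileChannel.MapMode.READ_WRITE, 0, size);
            buffer.putLong(0, 42L);                 // write straight into the mapping, no garbage created
            System.out.println(buffer.getLong(0));  // read back directly from the mapping
        }
    }
}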

How does it perform

Adding an object with 12 fields, or reading objects sequentially, takes about 110 ns per element for a collection of one billion elements. It takes less than two minutes to add all the entries, and about the same to read them all back.

Random access is much more expensive until all the data is in memory. With a fast SSD drive it takes about 600 ns per element when most of the data is in the page cache, and slower when it is not.
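
The figures above were measured with the library itself. Purely to illustrate why sequential and random access differ, a rough sketch using a raw mapped buffer (the file name and element count here are arbitrary) might look like this:

import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.util.Random;

public class AccessPatternSketch {
    public static void main(String[] args) throws Exception {
        int count = 10 * 1000 * 1000; // 10 million 8-byte elements, 80 MB in total
        try (RandomAccessFile file = new RandomAccessFile("access.dat", "rw");
             FileChannel channel = file.getChannel()) {
            MappedByteBuffer buffer = channel.map(FileChannel.MapMode.READ_WRITE, 0, 8L * count);

            long start = System.nanoTime();
            for (int i = 0; i < count; i++)
                buffer.putLong(i * 8, i); // sequential access streams through the page cache
            System.out.printf("sequential: %.0f ns/element%n",
                    (System.nanoTime() - start) / (double) count);

            Random random = new Random(1);
            long sum = 0;
            start = System.nanoTime();
            for (int i = 0; i < count; i++)
                sum += buffer.getLong(random.nextInt(count) * 8); // random access can miss the cache
            System.out.printf("random: %.0f ns/element (sum=%d)%n",
                    (System.nanoTime() - start) / (double) count, sum);
        }
    }
}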

[Screenshot of top: a Java process holding a collection with one billion elements]


As you can see, the machine has only 24 GB of memory, and yet the Java process is using 37.6 GB of virtual memory because this includes 33 GB of mapped files. The actual (resident) memory used is 3.4 GB, as this is how much is kept in memory.

The shared memory is high because the memory-mapped files can be shared with other processes (however, writing to them in a safe manner is not supported).

Creating the huge collection

As with previous huge collections, an anonymous sub-class of the builder is used to create the collection. The baseDirectory field is the directory in which the list's mapped files are stored.

HugeArrayList<MutableType> list = new HugeArrayBuilder<MutableType>() {{
    baseDirectory = TEMPORARY_SPACE;    // directory where the mapped files are stored
    capacity = length;                  // maximum number of elements
    allocationSize = 32 * 1024 * 1024;  // size of each memory-mapped allocation
}}.create();

// update or lookup the collection
list.add(...);

MutableType mt = list.get(900*1000*1000);

// close to flush the data, close files and un-map the collection.
list.close();
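
Since the point of persisting the store is to re-use it (see the Memory mapping section above), a later run can rebuild the list against the same directory. The following is only a sketch, assuming a builder pointed at the same baseDirectory re-attaches to the existing files rather than starting empty:

// Hypothetical re-use of the persisted store: build against the same
// baseDirectory so the previously written mapped files are picked up again.
HugeArrayList<MutableType> reopened = new HugeArrayBuilder<MutableType>() {{
    baseDirectory = TEMPORARY_SPACE;   // same directory as the earlier run
    capacity = length;
    allocationSize = 32 * 1024 * 1024;
}}.create();

MutableType mt = reopened.get(900 * 1000 * 1000); // data written earlier is still available
reopened.close();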

Links

Wiki
Downloads
Unit Tests and examples


To follow

Support for a Huge Map with one index is under development. Additional non-unique indexes are also planned.
