Journal of a Girl in IT: Java HashMap with millions and millions of records
Monday, June 3, 2013

I am just trying to find an answer to this problem: if you have millions and millions of records stored in a HashMap, you are sure to run out of memory. Here are some alternatives:

- MapDB
- JDBM2
- Buzz Hash

Some basics on HashMap

An instance of HashMap has two parameters that affect its performance: initial capacity and load factor. The capacity is the number of buckets in the hash table, and the initial capacity is simply the capacity at the time the hash table is created. The load factor is a measure of how full the hash table is allowed to get before its capacity is automatically increased. When the number of entries in the hash table exceeds the product of the load factor and the current capacity, the hash table is rehashed (that is, the internal data structures are rebuilt) so that it has approximately twice the number of buckets.
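Pre-sizing the map won't fix the out-of-memory problem, but it does avoid the repeated rehashing described above when you already know roughly how many entries are coming. A minimal sketch, assuming a hypothetical expected size of one million entries (the class name and values are illustrative, not from the original post):

```java
import java.util.HashMap;
import java.util.Map;

public class PreSizedMap {

    // Hypothetical expected number of entries, for illustration only.
    static final int EXPECTED = 1_000_000;

    public static Map<Integer, String> build() {
        // A default HashMap starts with 16 buckets and load factor 0.75,
        // so a million inserts would trigger many capacity-doubling rehashes.
        // Choosing capacity >= expected / loadFactor means the threshold
        // (capacity * loadFactor) is never exceeded, so no rehash happens.
        float loadFactor = 0.75f;
        int capacity = (int) (EXPECTED / loadFactor) + 1;

        Map<Integer, String> map = new HashMap<>(capacity, loadFactor);
        for (int i = 0; i < EXPECTED; i++) {
            map.put(i, "value-" + i);
        }
        return map;
    }

    public static void main(String[] args) {
        Map<Integer, String> map = build();
        System.out.println(map.size());
    }
}
```

This only trades rehashing time for up-front memory; once the entries themselves no longer fit on the heap, an off-heap or disk-backed map such as the ones listed above is the real fix.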