Java Heap
The Java heap is the memory that Solr requires in order to run. Certain things require a lot of heap memory. The following list is incomplete, but in no particular order, these include:
- A large index.
- Frequent updates.
- Super large documents.
- Extensive use of faceting with the default facet.method value.
- Using a lot of different sort parameters.
- Very large Solr caches.
- A large RAMBufferSizeMB.
- Use of Lucene's RAMDirectoryFactory.
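How much heap Solr actually gets is fixed at startup, and the factors above determine how large that allocation must be. As a sketch using the standard `bin/solr` launcher (the `2g` value is illustrative, not a recommendation):

```shell
# Start Solr with an explicit 2 GB heap (-m sets both -Xms and -Xmx).
bin/solr start -m 2g

# Equivalently, set SOLR_HEAP in solr.in.sh so the value persists
# across restarts:
#   SOLR_HEAP="2g"
```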
Reducing heap requirements
Here is an incomplete list, in no particular order, of ways to reduce heap requirements, based on the list above of things that require a lot of heap:
- Take a large index and make it distributed - shard your index onto multiple servers.
- One very easy way to do this is to switch to SolrCloud.
- This doesn't actually reduce the overall memory requirement for a large index (it may actually increase it slightly), but spreads it across multiple servers, so each server will have lower memory requirements.
- Don't store all your fields, especially the really big ones.
- Instead, have your application retrieve detail data from the original data source, not Solr.
- Note that doing this will mean that you cannot use Atomic Updates.
- Use facet.method=enum for your facets.
- Reduce the number of different sort parameters.
- Reduce the size of your Solr caches.
- Reduce RAMBufferSizeMB. The default in recent Solr versions is 100.
- This value can be particularly important if you have a lot of cores, because a buffer will be used for each core.
- Don't use RAMDirectoryFactory - instead, use the default directory factory and install enough system RAM that the OS can cache your entire index, as discussed above.
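Several of the remedies above are plain solrconfig.xml settings. A sketch with illustrative values (the element names are standard Solr configuration, but the sizes are only examples, and the cache class depends on your Solr version - `solr.CaffeineCache` in recent releases):

```xml
<config>
  <!-- Smaller index-time buffer; remember one buffer is allocated per core. -->
  <indexConfig>
    <ramBufferSizeMB>32</ramBufferSizeMB>
  </indexConfig>

  <query>
    <!-- Smaller caches: size is the maximum number of entries, and
         autowarmCount="0" avoids re-populating the cache on every commit. -->
    <filterCache class="solr.CaffeineCache" size="256" initialSize="256" autowarmCount="0"/>
    <queryResultCache class="solr.CaffeineCache" size="256" initialSize="256" autowarmCount="0"/>
    <documentCache class="solr.CaffeineCache" size="256" initialSize="256" autowarmCount="0"/>
  </query>

  <!-- Make enum the default facet method for this handler, so clients
       don't have to pass facet.method=enum on every request. -->
  <requestHandler name="/select" class="solr.SearchHandler">
    <lst name="defaults">
      <str name="facet.method">enum</str>
    </lst>
  </requestHandler>
</config>
```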
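Not storing your large fields is a schema change. A sketch of what that might look like in the schema (the field name and type here are hypothetical):

```xml
<!-- Index the big body text for searching, but don't store it; the
     application retrieves the full text from the original data source.
     Note: unstored fields cannot be used with Atomic Updates. -->
<field name="body_text" type="text_general" indexed="true" stored="false"/>
```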
Read full article from SolrPerformanceProblems - Solr Wiki