Installing CDP Private Cloud Base

Search

Component: Solr

Java Heap:
  • Small workloads, or evaluations: 16 GB
  • Smaller production environments: 32 GB
  • Larger production environments: 96 GB is sufficient for most clusters.

Set this value using the Java Heap Size of Solr Server in Bytes Solr configuration property (see the scripted example after this table).

CPU:
  • Minimum: 4
  • Recommended: 16 for production workloads

Disk: No requirement. Solr uses HDFS for storage.
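The heap property can be updated through the Cloudera Manager UI or its REST API. The sketch below shows one possible scripted update over the REST API using Python; the API version, host, cluster and service names, the role config group name (solr-SOLR_SERVER-BASE), and the internal property name (solr_java_heapsize) are assumptions to verify against your own deployment, not values confirmed by this page.

```python
import requests

# Assumed values -- confirm all of these against your Cloudera Manager instance.
CM_HOST = "https://cm.example.com:7183"
API = f"{CM_HOST}/api/v41"                      # API version is an assumption
CLUSTER = "Cluster1"
SERVICE = "solr"
ROLE_CONFIG_GROUP = "solr-SOLR_SERVER-BASE"     # assumed role config group name
HEAP_PROPERTY = "solr_java_heapsize"            # assumed internal property name
HEAP_BYTES = 32 * 1024 ** 3                     # 32 GB, per the sizing table above


def set_solr_heap(session: requests.Session) -> None:
    """Set the Solr Server heap size (in bytes) via the Cloudera Manager REST API."""
    url = (f"{API}/clusters/{CLUSTER}/services/{SERVICE}"
           f"/roleConfigGroups/{ROLE_CONFIG_GROUP}/config")
    payload = {"items": [{"name": HEAP_PROPERTY, "value": str(HEAP_BYTES)}]}
    resp = session.put(url, json=payload)
    resp.raise_for_status()
    print("Updated config items:", resp.json())


if __name__ == "__main__":
    with requests.Session() as s:
        s.auth = ("admin", "admin-password")    # use your own credential handling
        set_solr_heap(s)
```

A restart of the affected Solr Server roles is still needed for a heap change to take effect, as with any change made in Cloudera Manager.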
Note the following considerations for determining the optimal amount of heap memory:
  • Size of searchable material: The more searchable material you have, the more memory you need. All things being equal, 10 TB of searchable data requires more memory than 1 TB of searchable data.
  • Content indexed in the searchable material: Indexing all fields in a collection of logs, email messages, or Wikipedia entries requires more memory than indexing only the Date Created field.
  • The level of performance required: If the system must be stable and respond quickly, more memory may help. If slow responses are acceptable, you may be able to use less memory.
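Whether the current heap is adequate is easiest to judge from the running JVM. The sketch below polls the Solr metrics endpoint (GET /solr/admin/metrics?group=jvm) and reports heap utilization; the host, port, security settings, and the exact metric key names and response layout are assumptions based on recent Solr releases and should be checked against your version.

```python
import requests

# Assumed Solr Server base URL; adjust host, port, TLS, and authentication
# (e.g. Kerberos/SPNEGO) for your environment.
SOLR_URL = "http://solr-server.example.com:8983/solr"


def heap_utilization() -> float:
    """Return the fraction of the Solr JVM heap currently in use."""
    resp = requests.get(f"{SOLR_URL}/admin/metrics",
                        params={"group": "jvm", "wt": "json"})
    resp.raise_for_status()
    # Metric names and JSON layout vary between Solr versions; these keys are
    # what recent releases expose under the "solr.jvm" registry.
    jvm = resp.json()["metrics"]["solr.jvm"]
    return jvm["memory.heap.used"] / jvm["memory.heap.max"]


if __name__ == "__main__":
    print(f"Solr heap in use: {heap_utilization() * 100:.1f}%")
```

Sustained utilization near the maximum, or frequent long garbage-collection pauses, suggests moving to the next heap tier in the table above.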
