Setting user limits for HBase
You must raise user limits so that HBase can open many files at the same time. If the limit is too low, operations fail and errors are logged.
Because HBase is a database, it opens many files at the same time. The default limit of 1024 open files on most Unix-like systems is insufficient. Any significant amount of loading results in failures and causes error messages such as java.io.IOException...(Too many open files) to be logged in the HBase or HDFS log files. For more information about this issue, see the Apache HBase Book. You may also notice errors such as:
2010-04-06 03:04:37,542 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.io.EOFException
2010-04-06 03:04:37,542 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning block blk_-6935524980745310745_1391901
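On Linux, the open-file limit is typically raised through /etc/security/limits.conf (applied at login by the pam_limits module). A minimal sketch; the user names hbase and hdfs and the value 32768 are illustrative assumptions, not values mandated by HBase:

```shell
# Show the current per-session limit on open file descriptors
ulimit -n

# Example /etc/security/limits.conf entries (edited as root).
# The users "hbase"/"hdfs" and the value 32768 are assumptions;
# use the accounts your HBase and HDFS daemons actually run as.
#
#   hbase  -  nofile  32768
#   hdfs   -  nofile  32768
#
# The new limit takes effect at the user's next login session.
```

Note that `ulimit -n` reports the limit for the current shell only; daemons started by init scripts may inherit a different limit.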
Another setting you should configure is the number of processes a user is permitted to start. The default is typically 1024. Consider raising this value if you experience java.lang.OutOfMemoryError errors when HBase cannot create new threads.
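The process limit can be inspected and raised the same way, using the nproc field of limits.conf instead of nofile. A sketch under the same assumptions (a hypothetical hbase user and an illustrative value of 32768):

```shell
# Show the current per-session limit on user processes/threads
ulimit -u

# Example /etc/security/limits.conf entry; "hbase" and 32768 are
# illustrative assumptions:
#
#   hbase  -  nproc  32768
```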
