If the user IDs for the service users (HDFS, MapReduce, HCatalog, HBase) or for the unprivileged users (users responsible for job submissions or for executing Pig or Hive queries) are less than 1000, the MapReduce tasks for the smoke tests fail and you get the following error message:
Error initializing attempt_201112292220_0001_m_000002_0:
...
Caused by: org.apache.hadoop.util.Shell$ExitCodeException:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
        at org.apache.hadoop.util.Shell.run(Shell.java:182)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
        at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:185)
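Before opening the logs, you can quickly check whether the accounts on the cluster actually fall below the 1000 threshold. The account names in the sketch below (hdfs, mapred, hcat, hbase) are assumed defaults; substitute the names used in your installation.

# Print the numeric UID of each service account (account names are assumed defaults).
for u in hdfs mapred hcat hbase; do
    echo "$u: $(id -u "$u")"
done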
Solution:
1. Open the TaskTracker log file to analyze the root cause. Find the incorrectly configured user ID by looking for an error message similar to the one shown below:
INFO org.apache.hadoop.mapred.TaskController: Reading task controller configuration /etc/hadoop/taskcontroller.cfg
INFO org.apache.hadoop.mapred.TaskController: requested user hdfs has id 201, which is below the minimum allowed 1000
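The minimum user ID reported in this log line is read from the task controller configuration file mentioned in the first line. The property name min.user.id is an assumption based on the standard LinuxTaskController configuration and may differ across Hadoop versions; the command below simply shows the configured threshold.

# Inspect the configured minimum user ID (property name assumed).
grep min.user.id /etc/hadoop/taskcontroller.cfg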
2. Change the user ID for the $user_name obtained in Step 1 above:

cd master-install-location/gsInstaller
usermod -u 10000 $user_name
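If more than one service or job-submitting account has a user ID below 1000, the same change must be applied to each of them. The account names and target UIDs in the sketch below are placeholders, not values from your installation; adjust them before running.

# Move each affected account to a UID of 1000 or higher (example values).
usermod -u 10000 hdfs
usermod -u 10001 mapred

# Confirm the new UIDs took effect.
id -u hdfs
id -u mapred

Note that usermod re-owns only files under the account's home directory; any existing Hadoop data or log directories owned by the old numeric UID may need to be chowned to the new UID as well.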
3. Restart the installation process.