This section describes some useful MapReduce Version 2 commands. The full list of commands is available on the Apache website at http://hadoop.apache.org/docs/r2.2.0/hadoop-project-dist/hadoop-common/CommandsManual.html.
Getting the Hadoop Version
Command:
hadoop version
Example:
[hdfs@sandbox run]$ hadoop version
Hadoop 2.2.0.2.0.6.0-76
Subversion git@github.com:hortonworks/hadoop.git -r 8656b1cfad13b03b29e98cad042626205e7a1c86
Compiled by jenkins on 2013-10-18T00:19Z
Compiled with protoc 2.5.0
From source with checksum d23ee1d271c6ac5bd27de664146be2
This command was run using /usr/lib/hadoop/hadoop-common-2.2.0.2.0.6.0-76.jar
Running the Pi Job
Command:
hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-*.jar pi 10 10
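The two positional arguments are the number of map tasks to launch (10) and the number of samples each map computes (10). The awk one-liner below is not Hadoop; it is a minimal, seeded sketch of the kind of Monte Carlo estimate the pi job computes in parallel (the job itself uses a quasi-Monte Carlo variant):

```shell
# Not Hadoop: a tiny, repeatable sketch of the Monte Carlo pi estimate.
# 100,000 random points in the unit square; the fraction landing inside
# the quarter circle, times 4, approximates pi.
awk 'BEGIN {
  srand(42); n = 100000; c = 0
  for (i = 0; i < n; i++) {
    x = rand(); y = rand()
    if (x*x + y*y <= 1) c++
  }
  printf "pi ~= %.2f\n", 4 * c / n
}'
```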
Getting the Running Job List
Command:
yarn application -list
Example:
[hdfs@sandbox run]$ yarn application -list
13/11/04 12:08:40 INFO client.RMProxy: Connecting to ResourceManager at sandbox/10.11.2.159:8050
Total number of applications (application-types: [] and states: [SUBMITTED, ACCEPTED, RUNNING]):1
Application-Id                  Application-Name   Application-Type   User   Queue     State      Final-State   Progress   Tracking-URL
application_1383594295029_0005  QuasiMonteCarlo    MAPREDUCE          hdfs   default   ACCEPTED   UNDEFINED     0%         N/A
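To feed the application IDs from this listing into a script, the first column can be filtered with awk. The sketch below reuses the sample row above in place of live yarn output:

```shell
# Sketch: extract application IDs from `yarn application -list` output.
# The sample row here stands in for live output; in practice, pipe the
# yarn command itself into the awk filter.
listing='application_1383594295029_0005  QuasiMonteCarlo  MAPREDUCE  hdfs  default  ACCEPTED  UNDEFINED  0%  N/A'
printf '%s\n' "$listing" | awk '$1 ~ /^application_/ { print $1 }'
```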
Getting the Queue List
Command:
hadoop queue -list
Example:
[hdfs@sandbox run]$ hadoop queue -list
DEPRECATED: Use of this script to execute mapred command is deprecated.
Instead use the mapred command for it.
13/10/31 14:07:55 INFO client.RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/10.0.2.15:8050
======================
Queue Name : default
Queue State : running
Scheduling Info : Capacity: 100.0, MaximumCapacity: 100.0, CurrentCapacity: 0.0
Getting the Access Control List for the Current User
Command:
hadoop queue -showacls
Example:
root@a2nn:~> hadoop queue -showacls
Queue acls for user : root

Queue          Operations
=====================
default        submit-job,administer-jobs
data-analysis  submit-job,administer-jobs
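A script can check these ACLs the same way. The sketch below reuses the sample -showacls rows above in place of live output and tests whether a given queue grants submit-job to the current user:

```shell
# Sketch: check whether a queue grants submit-job, using the sample
# -showacls rows above in place of live command output.
acls='default        submit-job,administer-jobs
data-analysis  submit-job,administer-jobs'
queue=data-analysis
printf '%s\n' "$acls" |
  awk -v q="$queue" '$1 == q && $2 ~ /submit-job/ { print q " allows submit-job" }'
```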
Kill an Application
Command:
yarn application -kill <application_id>
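Combined with the listing command above, this can kill every matching application in one pass. The sketch below uses the sample listing row from earlier and echoes each kill command instead of running it; drop the echo to actually issue the kills:

```shell
# Sketch: kill every application found in a `yarn application -list`
# listing. The sample row stands in for live output, and echo keeps the
# sketch safe to run without a cluster.
listing='application_1383594295029_0005  QuasiMonteCarlo  MAPREDUCE  hdfs  default  ACCEPTED  UNDEFINED  0%  N/A'
printf '%s\n' "$listing" |
  awk '$1 ~ /^application_/ { print $1 }' |
  while read -r app_id; do
    echo yarn application -kill "$app_id"
  done
```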
Kill a Task
Command:
hadoop job -kill-task <task-id>
Fail a Task
Command:
hadoop job -fail-task <task-id>
List Attempts
Command:
hadoop job -list-attempt-ids <job-id> <task-type> <task-state>
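For example, the sketch below lists the attempt IDs of a job's running map tasks. The job ID is hypothetical, the task type is MAP or REDUCE, and the task state is a state such as running or completed; echo keeps the sketch safe to run without a cluster:

```shell
# Sketch with a hypothetical job ID: list attempt IDs of running map
# tasks. Drop the echo to issue the real command on a cluster.
JOB_ID=job_1383594295029_0005
echo hadoop job -list-attempt-ids "$JOB_ID" MAP running
```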