Execution errors might occur when the resource thresholds configured for
YARN are too low. To run Spark jobs on an HDP cluster, apply the
following changes:
- From Ambari, open the YARN Queue Manager view.
- Choose or add a queue. You can configure the root queue, the default queue, or individual queues.
- Under the Resources section, increase the value in the
  Maximum AM Resource field so that the queues have enough
  resources to run application masters. For example, set it to
  80%.
Child queues can inherit the settings from the parent queue.
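If you manage the configuration outside the Ambari UI, the same limit maps to the maximum-am-resource-percent property in capacity-scheduler.xml. The following is a minimal sketch; root.default is an example queue path (substitute your own queue), and the value 0.8 corresponds to 80% in the UI:

```xml
<!-- Cluster-wide default: fraction of queue resources usable by
     application masters (0.8 = 80% in the Ambari UI). -->
<property>
  <name>yarn.scheduler.capacity.maximum-am-resource-percent</name>
  <value>0.8</value>
</property>

<!-- Per-queue override; "root.default" is an example queue path. -->
<property>
  <name>yarn.scheduler.capacity.root.default.maximum-am-resource-percent</name>
  <value>0.8</value>
</property>
```

After changing the file directly, refresh the queues (for example, with yarn rmadmin -refreshQueues) or restart the ResourceManager for the settings to take effect.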