Apache Ambari Release Notes

Known Issues

Ambari 2.6.2 has the following known issues, scheduled for resolution in a future release. Also, refer to the Ambari Troubleshooting Guide for additional information.

Table 1.4. Ambari 2.6.2 Known Issues

Columns: Apache Jira | HWX Jira | Problem | Solution
Apache Jira: N/A
HWX Jira: BUG-123182
Problem: SmartSense service stops when you upgrade to Ambari 2.6.2.45-8 on Ubuntu 18 with OpenSSL 1.1.x installed.
Solution: No known workaround. Upgrade to SmartSense 1.5.1, as it supports OpenSSL 1.1.x.
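This issue applies only to hosts where OpenSSL 1.1.x is installed. A quick check (a sketch; it assumes the standard `openssl version` output format, such as "OpenSSL 1.1.1f  31 Mar 2020"):

```shell
#!/bin/sh
# Detect whether this host runs the OpenSSL 1.1.x series, the
# combination that triggers this issue on Ubuntu 18.
ver=$(openssl version | awk '{print $2}')
case "$ver" in
  1.1.*) echo "OpenSSL $ver: affected; upgrade SmartSense to 1.5.1" ;;
  *)     echo "OpenSSL $ver: not the 1.1.x series" ;;
esac
```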
Apache Jira: AMBARI-24536
HWX Jira: BUG-109839
Problem: When SPNEGO is enabled (`ambari-server setup-kerberos`), the SSO redirect (`ambari-server setup-sso`) no longer works.
Solution: No known workaround. Do not enable both Kerberos and SSO using `ambari-server setup`.
Apache Jira: AMBARI-23932
HWX Jira: BUG-103818
Problem: Changes to the way metric aggregation operations are performed in Ambari 2.6.2.0 cause averages to be miscalculated in some situations. Some metric averages will have lower-than-normal values, causing Ambari alerts to fire and metrics to be reported inaccurately.
Solution: Upgrade to Apache Ambari 2.6.2.2.
Apache Jira: N/A
HWX Jira: BUG-103704
Problem: The Slider service check fails in a Kerberos environment when the HDFS client is not present on the host where the smoke test command is issued.
Solution: If the Slider service check fails on any host during an express upgrade, and the task log shows an error about the `kinit` command failing for an `hdfs` command, move an existing HDFS keytab from any other host to the host where the Slider service check failed.
Apache Jira: N/A
HWX Jira: BUG-93305
Problem: Oozie Spark workflows fail with the following exception after a patch upgrade, due to missing py4j and pyspark files:

    Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain],
    main() threw exception, Missing py4j and/or pyspark zip files.
    Please add them to the lib folder or to the Spark sharelib.
    org.apache.oozie.action.hadoop.OozieActionConfiguratorException:
     Missing py4j and/or pyspark zip files.
     Please add them to the lib folder or to the Spark sharelib.

Solution: Manually copy these files to HDFS by following the steps below:

  1. On the Oozie server, using the HDFS client, find the new sharelib location for the patched Oozie:

    # su - oozie
    $ hdfs dfs -ls /user/oozie/share/lib
    Found 2 items
    drwxr-xr-x   - oozie hdfs          0 2017-12-14 16:45 /user/oozie/share/lib/lib_20171214164505
    drwxr-xr-x   - oozie hdfs          0 2017-12-18 18:56 /user/oozie/share/lib/lib_20171218185543
  2. As the Oozie user, upload Spark files to the new location.

    $ cd /usr/hdp/2.6.3.0-86/spark/lib
    $ hdfs dfs -put datanucleus-* /user/oozie/share/lib/lib_20171218185543/spark
    $ cd ../python/lib/
    $ hdfs dfs -put py*.zip /user/oozie/share/lib/lib_20171218185543/spark
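The two steps above can be combined into one script. A sketch, run as the oozie user; it assumes the most recently created `lib_<timestamp>` directory is the patched sharelib, and the `2.6.3.0-86` paths come from the example listing above (adjust for your cluster's HDP version):

```shell
#!/bin/sh
# lib_YYYYMMDDHHMMSS directory names sort lexicographically, so
# `sort | tail -1` selects the newest (patched) sharelib.
SHARELIB=$(hdfs dfs -ls /user/oozie/share/lib | awk '{print $NF}' \
           | grep '/lib_' | sort | tail -1)

# Upload the datanucleus jars and the py4j/pyspark zips to the new
# sharelib's spark directory.
hdfs dfs -put /usr/hdp/2.6.3.0-86/spark/lib/datanucleus-* "$SHARELIB/spark"
hdfs dfs -put /usr/hdp/2.6.3.0-86/spark/python/lib/py*.zip "$SHARELIB/spark"
```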
Apache Jira: N/A
HWX Jira: BUG-93805
Problem: If any hosts with master components are in maintenance mode, a patch upgrade cannot proceed, even if those master components are unrelated to the patch upgrade.
Solution: Take these hosts out of maintenance mode before proceeding.
Apache Jira: N/A
HWX Jira: BUG-92232
Problem: When patching HDFS using Express Upgrade, HBase may stop responding because HDFS is shut down completely.
Solution: You may Ignore and Proceed when the HBase service check runs and fails. Alternatively, pause the upgrade, restart HBase, then resume the upgrade.