Apache Ambari Release Notes

Known Issues

Ambari 2.6.1 has the following known issues, scheduled for resolution in a future release. Also, refer to the Ambari Troubleshooting Guide for additional information.

Table 1.5. Ambari 2.6.1 Known Issues

Each entry below lists the Apache Jira and HWX Jira identifiers, followed by the problem and its solution.

AMBARI-24536 / BUG-109839

Problem: When SPNEGO is enabled (`ambari-server setup-kerberos`), the SSO (`ambari-server setup-sso`) redirect no longer works.

Solution: No known workaround. Do not enable both Kerberos and SSO using ambari-server setup.
AMBARI-22906 / BUG-95744

Problem: After upgrading Ambari to 2.6.1.0, registering a new HDP version fails with the following error message:

Cannot read property 'gpl.license.accepted' of undefined

Solution: Upgrade Ambari to version 2.6.1.5 before upgrading HDP to version 2.6.4.

N/A / ST-3620

Problem: SmartSense scheduler doesn't initiate capture.

Solution: After a successful upgrade, all services (including SmartSense) should be up and running.

N/A / BUG-94823

Problem: Upgrading Ambari from 2.5.x to 2.6.x and AMS to 2.6.1 without changing the JDK version from 1.7 to 1.8 causes the Metrics Collector service to fail to start, because Ambari Metrics Collector 2.6.1 requires JDK 8. The failure produces a traceback similar to the following:

Traceback (most recent call last):
 File "/var/lib/ambari-agent/cache/common-services/AMBARI_METRICS/0.1.0/package/scripts/service_check.py",
 line 207, in <module> AMSServiceCheck().execute()
 File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py",
 line 375, in execute method(env)
 File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py",
 line 89, in thunk return fn(*args, **kwargs)
 File "/var/lib/ambari-agent/cache/common-services/AMBARI_METRICS/0.1.0/package/scripts/service_check.py",
 line 159, in service_check raise Fail("All metrics collectors are unavailable.")
resource_management.core.exceptions.Fail: All metrics collectors are unavailable.

Solution:

  1. Upgrade the JDK version only for the AMS Metrics Collector.

    Only on the metrics collector host, do the following:

    wget -O /tmp/jdk-8u112-linux-x64.tar.gz  http://public-repo-1.hortonworks.com/ARTIFACTS/jdk-8u112-linux-x64.tar.gz
    cd /usr/jdk64
    tar -xzf /tmp/jdk-8u112-linux-x64.tar.gz
    
    wget -O /tmp/jce_policy-8.zip http://public-repo-1.hortonworks.com/ARTIFACTS/jce_policy-8.zip
    cd /usr/jdk64/jdk1.8.0_112/jre/lib/security
    unzip -o /tmp/jce_policy-8.zip
    cp UnlimitedJCEPolicyJDK8/local_policy.jar UnlimitedJCEPolicyJDK8/US_export_policy.jar .
    rm -r UnlimitedJCEPolicyJDK8
    
    rm /tmp/jdk-8u112-linux-x64.tar.gz /tmp/jce_policy-8.zip

  2. In Ambari Web, go to Services > Ambari Metrics > Configs > Advanced ams-env > ams-env template.

  3. Change

    export JAVA_HOME={{java64_home}}

    to

    export JAVA_HOME=/usr/jdk64/jdk1.8.0_112
  4. Restart the Metrics Collector service.
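After the restart, you can sanity-check that the collector actually came up on the new JDK. A minimal sketch, assuming the default collector web port 6188 and a placeholder host name (both are assumptions; adjust for your cluster):

```shell
# Placeholder host name; replace with your Metrics Collector host.
COLLECTOR_HOST=metrics-collector.example.com

# Confirm a running process was launched from the new JDK 8 install.
ps -ef | grep -o '/usr/jdk64/jdk1\.8[^ ]*' | head -1

# An HTTP 200 from the collector's web port indicates it is serving requests.
status=$(curl -s -o /dev/null -w '%{http_code}' "http://${COLLECTOR_HOST}:6188/ws/v1/timeline/metrics/metadata")
echo "collector HTTP status: ${status}"
```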

N/A / BUG-82900

Problem: If hosts are in maintenance mode during the Enable Kerberos wizard, the Kerberos client is not installed on them, and keytabs and principals are not created for these hosts.

Solution:

  1. Take the hosts out of Maintenance Mode.

  2. On each host's Host detail page, use Install Clients to install all of the clients, which will include the Kerberos client.

  3. Regenerate keytabs using Admin > Kerberos > Regenerate Keytabs.
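Step 1 can also be performed through the Ambari REST API when that is more convenient. A minimal sketch, assuming admin credentials, a cluster named c1, a host named host1.example.com, and the default server port 8080 (all placeholders):

```shell
# Placeholder credentials, server, cluster, and host names; substitute your own.
# The X-Requested-By header is required by the Ambari REST API.
response=$(curl -s -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Turn off maintenance mode"},"Body":{"Hosts":{"maintenance_state":"OFF"}}}' \
  "http://ambari-server.example.com:8080/api/v1/clusters/c1/hosts/host1.example.com")
echo "${response}"
```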

N/A / BUG-93305

Problem: After a patch upgrade, Oozie Spark workflows fail with the following exception due to missing py4j and pyspark files:

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain],
main() threw exception, Missing py4j and/or pyspark zip files.
Please add them to the lib folder or to the Spark sharelib.
org.apache.oozie.action.hadoop.OozieActionConfiguratorException:
 Missing py4j and/or pyspark zip files.
 Please add them to the lib folder or to the Spark sharelib.

Solution: Manually copy these files to HDFS by following the steps below:

  1. On the Oozie server, using the HDFS client, find the new sharelib location for the patched Oozie:

    # su - oozie
    $ hdfs dfs -ls /user/oozie/share/lib
    Found 2 items
    drwxr-xr-x   - oozie hdfs          0 2017-12-14 16:45 /user/oozie/share/lib/lib_20171214164505
    drwxr-xr-x   - oozie hdfs          0 2017-12-18 18:56 /user/oozie/share/lib/lib_20171218185543
  2. As the Oozie user, upload Spark files to the new location.

    $ cd /usr/hdp/2.6.3.0-86/spark/lib
    $ hdfs dfs -put datanucleus-* /user/oozie/share/lib/lib_20171218185543/spark
    $ cd ../python/lib/
    $ hdfs dfs -put py*.zip /user/oozie/share/lib/lib_20171218185543/spark
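After the files are uploaded, the running Oozie server still has to reload its sharelib before new jobs see the files; `oozie admin -sharelibupdate` is the standard Oozie CLI command for this (restarting the Oozie service from Ambari has the same effect). A sketch assuming the default Oozie URL on the local host (adjust for your cluster):

```shell
# Placeholder Oozie server URL; substitute your own.
OOZIE_URL=http://localhost:11000/oozie

# Ask the server to reload the sharelib so new jobs pick up
# the freshly uploaded Spark files without a service restart.
oozie admin -oozie "${OOZIE_URL}" -sharelibupdate

# Confirm that the spark sharelib now lists the py4j/pyspark zip files.
oozie admin -oozie "${OOZIE_URL}" -shareliblist spark
```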
N/A / BUG-93805

Problem: If any hosts with master components are in maintenance mode, a patch upgrade cannot proceed, even if those master components are unrelated to the patch upgrade.

Solution: Take these hosts out of maintenance mode before proceeding.
N/A / BUG-92232

Problem: When patching HDFS using Express Upgrade, HBase may stop responding because HDFS is shut down completely.

Solution: You may Ignore and Proceed when the HBase Service Check runs and fails. Alternatively, pause the upgrade, restart HBase, then resume the upgrade.
AMBARI-22491 / BUG-91896

Problem: When moving the Metrics Collector to a host on which a ZooKeeper server is not installed, a new ZooKeeper service is erroneously installed.

Solution: No known workaround.