Upgrading HDP Manually

Configure Apache Phoenix

Before you can upgrade Apache Phoenix, you must first upgrade your HDP components to the latest version (in this case, 2.3.6). This section assumes that you have already done so. If you have not, return to Getting Ready to Upgrade and Upgrade 2.2 Components for instructions on upgrading your HDP components to 2.3.6.

To configure Phoenix, complete the following steps:

  1. Add the following property to the /etc/hbase/conf/hbase-site.xml file on all HBase nodes (the Master and all RegionServers) to prevent deadlocks from occurring during maintenance on global indexes:

    <property>
      <name>hbase.regionserver.wal.codec</name>
      <value>org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodec</value>
    </property>
  2. To enable user-defined functions, add the following property to the hbase-site.xml file in /etc/hbase/conf on all HBase nodes:

    <property>
      <name>phoenix.functions.allowUserDefinedFunctions</name>
      <value>true</value>
      <description>enable UDF functions</description>
    </property>
  3. Ensure that the client-side hbase-site.xml matches the server-side configuration.

  4. If it is not already present, add the following property to the RegionServer configuration:

    <property>
      <name>hbase.coprocessor.regionserver.classes</name>
      <value>org.apache.hadoop.hbase.regionserver.LocalIndexMerger</value>
    </property>
  5. If the directory specified by the hbase.tmp.dir property in hbase-site.xml does not exist, create it with adequate permissions.

  6. Set the following property in the hbase-site.xml file on all RegionServers, but not on the client side:

    <property>
      <name>hbase.rpc.controllerfactory.class</name>
      <value>org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory</value>
    </property> 
    
  7. Restart the HBase Master and RegionServers.
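Step 3 above can be automated with a small comparison check. The sketch below is illustrative only; the paths shown in the trailing comment are typical HDP defaults, not locations this document guarantees, so substitute your cluster's actual client and server configuration files.

```shell
#!/bin/sh
# Hedged sketch: report whether two copies of hbase-site.xml are identical.
check_match() {
  # diff -q exits 0 only when the files have identical contents
  if diff -q "$1" "$2" >/dev/null 2>&1; then
    echo "match"
  else
    echo "differ"
  fi
}
# Example (adjust paths for your cluster):
#   check_match /etc/hbase/conf/hbase-site.xml \
#               /usr/hdp/current/phoenix-client/bin/hbase-site.xml
```

A "differ" result means the client-side file should be re-synchronized from the server-side configuration before proceeding.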
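For step 5, creating the hbase.tmp.dir directory can be sketched as follows. The /hadoop/hbase/tmp path and the hbase:hadoop owner shown in the comment are assumptions for illustration; use the actual value of hbase.tmp.dir from your hbase-site.xml and your site's service account.

```shell
#!/bin/sh
# Hedged sketch: create the hbase.tmp.dir directory if it is missing and
# give it standard directory permissions.
ensure_dir() {
  dir="$1"
  # -p creates intermediate directories and is a no-op if dir exists
  [ -d "$dir" ] || mkdir -p "$dir"
  chmod 755 "$dir"
}
# Run as root on each node, then hand the directory to the HBase user:
#   ensure_dir /hadoop/hbase/tmp && chown hbase:hadoop /hadoop/hbase/tmp
```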
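One way to carry out the restart in step 7 is via the hbase-daemon.sh script. The /usr/hdp/current/... paths below are assumptions based on the usual HDP 2.3.x layout; adjust them if your installation differs.

```shell
#!/bin/sh
# Hedged sketch: compose the HDP-style restart command for an HBase daemon.
restart_cmd() {
  # $1 is "master" or "regionserver"
  echo "/usr/hdp/current/hbase-$1/bin/hbase-daemon.sh restart $1"
}
# As the hbase service user, on the Master node:
#   su - hbase -c "$(restart_cmd master)"
# and on every RegionServer:
#   su - hbase -c "$(restart_cmd regionserver)"
```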

Configuring Phoenix to Run in a Secure Cluster

Perform the following additional steps to configure Phoenix to run in a secure Hadoop cluster:

  1. To link the HBase configuration file with the Phoenix libraries:

    ln -sf HBASE_CONFIG_DIR/hbase-site.xml PHOENIX_HOME/bin/hbase-site.xml

  2. To link the Hadoop configuration file with the Phoenix libraries:

    ln -sf HADOOP_CONFIG_DIR/core-site.xml PHOENIX_HOME/bin/core-site.xml
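The two linking steps above can be wrapped in a small helper that also verifies the result. HBASE_CONFIG_DIR and PHOENIX_HOME stand in for your actual directories, exactly as in the commands above; the helper itself is an illustrative sketch, not part of the product.

```shell
#!/bin/sh
# Hedged sketch: create a symlink and confirm it points at the intended file.
link_conf() {
  src="$1"; dest="$2"
  # -f replaces any existing link or file at the destination
  ln -sf "$src" "$dest"
  # readlink prints the link target, so a mismatch means the link is wrong
  [ "$(readlink "$dest")" = "$src" ]
}
# Example:
#   link_conf "$HBASE_CONFIG_DIR/hbase-site.xml" "$PHOENIX_HOME/bin/hbase-site.xml"
#   link_conf "$HADOOP_CONFIG_DIR/core-site.xml" "$PHOENIX_HOME/bin/core-site.xml"
```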

Note

When running the psql.py and sqlline.py Phoenix scripts in secure mode, you can safely ignore the following warnings:

14/04/19 00:56:24 WARN util.NativeCodeLoader: 
Unable to load native-hadoop library for your platform... 
  using builtin-java classes where applicable
 
14/04/19 00:56:24 WARN util.DynamicClassLoader: Failed to identify the fs of 
dir hdfs://<HOSTNAME>:8020/apps/hbase/data/lib, ignored java.io.IOException: 
No FileSystem for scheme: hdfs