Configure Apache Phoenix
Before you can upgrade Apache Phoenix, you must have first upgraded your HDP components to the latest version (in this case, 2.4.2). This section assumes that you have already upgraded your components for HDP 2.4.2. If you have not already completed these steps, return to Getting Ready to Upgrade and Upgrade 2.2 Components for instructions on how to upgrade your HDP components to 2.4.2.
To configure Phoenix, complete the following steps:
Add the following property to the /etc/hbase/hbase-site.xml file on all HBase nodes, the Master server, and all RegionServers to prevent deadlocks from occurring during maintenance on global indexes:

<property>
  <name>hbase.regionserver.wal.codec</name>
  <value>org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodec</value>
</property>
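To confirm that the change is in place on a given node, you can search the file for the codec entry. This is only a minimal check, assuming the configuration file path shown above:

# Verify that the IndexedWALEditCodec entry is present on this node
grep -A 1 "hbase.regionserver.wal.codec" /etc/hbase/hbase-site.xml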
To enable user-defined functions, configure the following property in /etc/hbase/conf on all HBase nodes:

<property>
  <name>phoenix.functions.allowUserDefinedFunctions</name>
  <value>true</value>
  <description>enable UDF functions</description>
</property>
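If you want to confirm that HBase resolves these values from its active configuration, HBase ships an HBaseConfTool utility that prints the value of a configuration key. This is a hedged sketch, assuming the hbase command is on the PATH of the HBase service user:

# Print the resolved values; they should match the properties configured above
hbase org.apache.hadoop.hbase.util.HBaseConfTool hbase.regionserver.wal.codec
hbase org.apache.hadoop.hbase.util.HBaseConfTool phoenix.functions.allowUserDefinedFunctions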
Ensure that the client-side hbase-site.xml file matches the server-side configuration.
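One way to check this is to diff the client's copy against the file on a RegionServer. This is only a sketch, assuming a bash shell, SSH access, and a host named regionserver1.example.com:

# No output means the client-side and server-side files are identical
diff /etc/hbase/conf/hbase-site.xml <(ssh regionserver1.example.com cat /etc/hbase/conf/hbase-site.xml)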
If the directory specified in the hbase.tmp.dir property in hbase-site.xml does not exist, create it with adequate permissions (a sketch of this is shown after the property below).
Set the following property in the hbase-site.xml file for all RegionServers, but not on the client side:
<property>
  <name>hbase.rpc.controllerfactory.class</name>
  <value>org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory</value>
</property>
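As noted above, the directory named by hbase.tmp.dir must exist. A minimal sketch of creating it is shown here; the /tmp/hbase path and the hbase:hadoop owner and group are assumptions, so substitute the value and ownership used in your cluster:

# Create the directory named by hbase.tmp.dir and give the HBase service
# user ownership of it (path, user, and group below are assumptions)
mkdir -p /tmp/hbase
chown hbase:hadoop /tmp/hbase
chmod 755 /tmp/hbase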
Restart the HBase Master and RegionServers.
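If the cluster is not managed through Ambari, the daemons can be restarted from the command line. The following is a sketch only; the /usr/hdp/current paths and the hbase service user are assumptions:

# On the HBase Master host
su - hbase -c "/usr/hdp/current/hbase-master/bin/hbase-daemon.sh restart master"

# On each RegionServer host
su - hbase -c "/usr/hdp/current/hbase-regionserver/bin/hbase-daemon.sh restart regionserver"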
Configuring Phoenix to Run in a Secure Cluster
Perform the following additional steps to configure Phoenix to run in a secure Hadoop cluster:
To link the HBase configuration file with the Phoenix libraries:
ln -sf HBASE_CONFIG_DIR/hbase-site.xml PHOENIX_HOME/bin/hbase-site.xml
To link the Hadoop configuration file with the Phoenix libraries:
ln -sf HADOOP_CONFIG_DIR/core-site.xml PHOENIX_HOME/bin/core-site.xml
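With typical HDP 2.4.2 locations substituted for the placeholders, the two links might look like the following; the /etc/hbase/conf, /etc/hadoop/conf, and /usr/hdp/current/phoenix-client paths are assumptions, so adjust them to your installation:

# Example with typical HDP paths substituted (verify these on your cluster)
ln -sf /etc/hbase/conf/hbase-site.xml /usr/hdp/current/phoenix-client/bin/hbase-site.xml
ln -sf /etc/hadoop/conf/core-site.xml /usr/hdp/current/phoenix-client/bin/core-site.xml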
Note: When running the psql.py and sqlline.py Phoenix scripts in secure mode, you can safely ignore the following warnings:
14/04/19 00:56:24 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/04/19 00:56:24 WARN util.DynamicClassLoader: Failed to identify the fs of dir hdfs://<HOSTNAME>:8020/apps/hbase/data/lib, ignored
java.io.IOException: No FileSystem for scheme: hdfs
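As a quick smoke test once the links are in place, you can obtain a Kerberos ticket and open a sqlline.py session against the cluster. This is a hedged sketch; the keytab, principal, ZooKeeper host, and /hbase-secure znode below are placeholders that must match your environment:

# Authenticate as a user that is allowed to access HBase, then connect
kinit -kt /etc/security/keytabs/myuser.keytab myuser@EXAMPLE.COM
/usr/hdp/current/phoenix-client/bin/sqlline.py zk1.example.com:2181:/hbase-secure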