To install the Phoenix RPM:
Install Phoenix on all HBase Region Servers:
For RHEL/CentOS/Oracle Linux:
yum install phoenix
For SLES:
zypper install phoenix
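To confirm the package installed and to see where the Phoenix jars landed, you can run a quick check; the path shown reflects a typical RPM layout and may differ on your system:
# Verify the phoenix package and list the bundled jars (illustrative path).
rpm -q phoenix
ls /usr/lib/phoenix/lib/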
Link the Phoenix core jar file to the HBase Master and Region Servers:
ln -sf /usr/lib/phoenix/lib/phoenix-core-4.0.0.2.1.1.0-385.jar /usr/lib/hbase/lib/phoenix.jar
Add the following configuration to hbase-site.xml on all HBase nodes (the Master Server and all Region Servers):
<property>
  <name>hbase.defaults.for.version.skip</name>
  <value>true</value>
</property>
<property>
  <name>hbase.regionserver.wal.codec</name>
  <value>org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodec</value>
</property>
Restart the HBase Master and Region Servers.
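On a manually managed cluster, one way to restart the daemons is with the hbase-daemon.sh script shipped with HBase; the install path, configuration directory, and service user below are assumptions based on a typical layout, so adjust them for your cluster:
# Restart the HBase Master (run on the Master host; assumed hbase service user and HDP-style paths).
su -l hbase -c "/usr/lib/hbase/bin/hbase-daemon.sh --config /etc/hbase/conf stop master"
su -l hbase -c "/usr/lib/hbase/bin/hbase-daemon.sh --config /etc/hbase/conf start master"
# Restart each Region Server (run on every Region Server host).
su -l hbase -c "/usr/lib/hbase/bin/hbase-daemon.sh --config /etc/hbase/conf stop regionserver"
su -l hbase -c "/usr/lib/hbase/bin/hbase-daemon.sh --config /etc/hbase/conf start regionserver"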
Add the phoenix-4.0-client.jar to the classpath of any Phoenix client.
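For example (the jar location below is an assumption; use the client jar shipped with your Phoenix build), a command-line or JDBC client can pick up the driver through HADOOP_CLASSPATH or the Java classpath:
# Assumed jar path; substitute the actual phoenix client jar on your system.
export HADOOP_CLASSPATH=/usr/lib/phoenix/phoenix-4.0-client.jar:$HADOOP_CLASSPATH
# Or pass it directly on the classpath of a JDBC application (class name is hypothetical).
java -cp /usr/lib/phoenix/phoenix-4.0-client.jar:/path/to/your/app.jar com.example.YourPhoenixClient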
Configuring Phoenix
Use the following procedure to configure Phoenix:
Link the Phoenix core jar file to the HBase Master and Region Servers:
ln -sf /usr/lib/phoenix/lib/phoenix-core-4.0.0.2.1.1.0-365.jar /usr/lib/hbase/lib/phoenix.jar
Add the following configuration to hbase-site.xml on all HBase nodes (the Master Server and all Region Servers):
<property>
  <name>hbase.regionserver.wal.codec</name>
  <value>org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodec</value>
</property>
Restart the Master and Region Servers.
Configuring Phoenix for Security
If you are a Phoenix administrator, you must perform the following additional steps to configure Phoenix to run in a secure Hadoop cluster:
Execute the following command to link the HBase configuration file with the Phoenix libraries:
ln -sf <HBASE_CONFIG_DIR>/hbase-site.xml <PHOENIX_HOME>/bin/hbase-site.xml
Execute the following command to link the Hadoop configuration file with the Phoenix libraries:
ln -sf <HADOOP_CONFIG_DIR>/core-site.xml <PHOENIX_HOME>/bin/core-site.xml
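For example, with commonly used default locations substituted for the placeholders (these directories are assumptions; verify the actual configuration and install paths on your cluster):
# Assumed default locations; confirm HBASE_CONFIG_DIR, HADOOP_CONFIG_DIR, and PHOENIX_HOME before running.
ln -sf /etc/hbase/conf/hbase-site.xml /usr/lib/phoenix/bin/hbase-site.xml
ln -sf /etc/hadoop/conf/core-site.xml /usr/lib/phoenix/bin/core-site.xml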
Note: Phoenix administrators may safely ignore the following warnings when running Phoenix commands:
14/04/19 00:56:24 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/04/19 00:56:24 WARN util.DynamicClassLoader: Failed to identify the fs of dir hdfs://<HOSTNAME>:8020/apps/hbase/data/lib, ignored java.io.IOException: No FileSystem for scheme: hdfs
Smoke Testing Phoenix
Execute the following commands to verify your installation on an unsecure cluster:
Note: The command assumes an unsecure cluster with the following configuration parameter values: ZooKeeper quorum localhost, ZooKeeper port 2181, and HBase znode /hbase-unsecure. For a secure cluster, replace /hbase-unsecure with the secure znode parent (typically /hbase-secure).
cd $PHOENIX_HOME/bin
./psql.py localhost:2181:/hbase-unsecure /usr/share/doc/phoenix-4.0.0.2.1.1.0/examples/WEB_STAT.sql /usr/share/doc/phoenix-4.0.0.2.1.1.0/examples/WEB_STAT.csv /usr/share/doc/phoenix-4.0.0.2.1.1.0/examples/WEB_STAT_QUERIES.sql
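As an optional follow-up check, you can connect interactively with sqlline.py from the same directory and query the sample data loaded by the script above:
# Optional interactive check using the same unsecure connection string.
cd $PHOENIX_HOME/bin
./sqlline.py localhost:2181:/hbase-unsecure
# At the sqlline prompt, a quick sanity query against the loaded sample table:
#   SELECT COUNT(*) FROM WEB_STAT;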
Troubleshooting Phoenix
You may encounter a runtime exception similar to the following:
Exception in thread "main" java.lang.IllegalAccessError: class com.google.protobuf.HBaseZeroCopyByteString cannot access its superclass com.google.protobuf.LiteralByteString
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
To resolve this issue, place hbase-protocol*.jar immediately after hbase-site.xml in the HADOOP_CLASSPATH environment variable, as shown in the following example:
HADOOP_CLASSPATH=/path/to/hbase-site.xml:/path/to/hbase-protocol.jar
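For instance, the variable might be exported before launching the client application; the locations below are illustrative, so substitute the actual paths on your cluster:
# Illustrative paths; point these at your actual hbase-site.xml and hbase-protocol jar.
export HADOOP_CLASSPATH=/etc/hbase/conf/hbase-site.xml:/usr/lib/hbase/lib/hbase-protocol.jar
# Then launch the Phoenix client application (jar name is a placeholder).
hadoop jar /path/to/your-phoenix-client-app.jar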