Getting Started with Streaming Analytics

Setting up an Enrichment Store, Creating an HBase Table, and Creating an HDFS Directory

To prepare to perform predictive analytics on streams, you need some HBase and Phoenix tables. This section describes how to set up the HBase and Phoenix tables timesheet and drivers, load them with reference data, and download the custom UDFs and processors that perform the enrichment and normalization.

Install HBase/Phoenix and download the sam-extensions

  1. If HBase is not installed, install/add an HBase service.
  2. Ensure that Phoenix is enabled on the HBase Cluster.
  3. Download the SAM extensions archive and save it to your local machine.
  4. Unzip the contents. This document refers to the unzipped folder as $SAM_EXTENSIONS.

Steps for Creating Phoenix Tables and Loading Reference Data

  1. Copy $SAM_EXTENSIONS/custom-processor/scripts.tar.gz to a node where the HBase/Phoenix client is installed.
  2. On that node, untar scripts.tar.gz. This document refers to the resulting directory as $SCRIPTS.
    tar -zxvf scripts.tar.gz
  3. Navigate to the directory containing the Phoenix script, which creates the Phoenix tables for enrichment and loads them with reference data.
    cd $SCRIPTS/phoenix
  4. Open the script and replace <ZK_HOST> with the FQDN of your ZooKeeper host.
  5. Make the script executable and execute it. Ensure that JAVA_HOME is set in your environment before running it.
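The <ZK_HOST> substitution in step 4 can also be done non-interactively with sed. The file name, its contents, and the host below are illustrative stand-ins for the actual script, not values from this document:

```shell
# Illustrative sketch: substitute the ZooKeeper FQDN for the <ZK_HOST>
# placeholder. The demo file and host name are assumptions.
ZK_HOST=zk1.example.com
printf 'jdbc:phoenix:<ZK_HOST>:2181:/hbase-unsecure\n' > /tmp/zk_demo.properties
sed -i "s/<ZK_HOST>/$ZK_HOST/g" /tmp/zk_demo.properties
cat /tmp/zk_demo.properties    # -> jdbc:phoenix:zk1.example.com:2181:/hbase-unsecure
```

In-place editing with `sed -i` works as shown on Linux; on other platforms the flag may require a backup-suffix argument.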

Steps for Verifying Data has Populated Phoenix Tables

  1. Start up the sqlline Phoenix client.
    cd /usr/hdp/current/phoenix-client/bin
    ./sqlline.py $ZK_HOST:2181:/hbase-unsecure
  2. List all the tables in Phoenix.
    !tables
  3. Query the drivers and timesheet tables.
    select * from drivers;
    select * from timesheet; 
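The verification above can also be scripted: sqlline.py accepts an optional SQL file as a second argument after the connection string. A sketch, assuming the same client path and $ZK_HOST as above (requires a running cluster, so the file name and queries here are illustrative):

```shell
# Sketch: run the verification queries non-interactively via sqlline.py
cd /usr/hdp/current/phoenix-client/bin
cat > /tmp/verify.sql <<'EOF'
!tables
select count(*) from drivers;
select count(*) from timesheet;
EOF
./sqlline.py $ZK_HOST:2181:/hbase-unsecure /tmp/verify.sql
```

If the load script succeeded, both counts should be nonzero.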

Steps for Starting HBase and Creating an HBase Table

  1. Start HBase. This can be done by adding the HDP HBase service using Ambari.
  2. Create a new HBase table by logging into a node where the HBase client is installed, then execute the following commands:
    cd /usr/hdp/current/hbase-client/bin
    ./hbase shell
    create 'violation_events', {NAME => 'events', VERSIONS => 3}
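To confirm the table was created, you can pipe a command into the HBase shell from the same node. A sketch (requires a running HBase cluster):

```shell
# Sketch: verify the new table from bash by piping a command
# into the HBase shell; prints the table schema if it exists.
cd /usr/hdp/current/hbase-client/bin
echo "describe 'violation_events'" | ./hbase shell
```

The output should show the events column family with VERSIONS => '3'.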

Steps for Creating an HDFS Directory

Create the following directory in HDFS and give it access to all users.

  1. Log into a node where HDFS client is installed.
  2. Execute the following commands:
    su hdfs
    hadoop fs -mkdir /apps/trucking-app
    hadoop fs -chmod 777 /apps/trucking-app
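The chmod step grants read, write, and execute permission to the owner, group, and all other users, which is what "access to all users" means above. The same permission bits can be illustrated on a local directory (the path below is just a stand-in for /apps/trucking-app):

```shell
# Local illustration of mode 777 (rwx for owner, group, and others);
# the directory name is an illustrative stand-in, not an HDFS path.
mkdir -p /tmp/trucking-demo
chmod 777 /tmp/trucking-demo
stat -c '%a' /tmp/trucking-demo    # prints 777
```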