Installation
Copyright © 2012-2017 Hortonworks, Inc.
Except where otherwise noted, this document is licensed under the Creative Commons Attribution-ShareAlike 4.0 License.
2017-07-12
Abstract
Hortonworks Cybersecurity Package (HCP) is a modern data application based on Apache Metron, powered by Apache Hadoop, Apache Storm, and related technologies.
HCP provides a framework and tools that enable greater efficiency in Security Operations Centers (SOCs), along with better and faster threat detection in real time at massive scale. It provides ingestion, parsing, and normalization of fully enriched, contextualized data; threat intelligence feeds; triage; and machine-learning-based detection. It also provides near-real-time dashboards for end users.
Based on a strong foundation in the Hortonworks Data Platform (HDP) and Hortonworks DataFlow (HDF) stacks, HCP provides an integrated advanced platform for security analytics.
Please visit the Hortonworks Data Platform page for more information on Hortonworks technology. For more information on Hortonworks services, please visit either the Support or Training page. Feel free to Contact Us directly to discuss your specific needs.
Contents
- 1. Hortonworks Cybersecurity Package Information Roadmap
- 2. Preparing to Install
- 3. Installing HCP on an Ambari-Managed Cluster Using Ambari
- Prerequisites for an Existing Cluster
- Setting up the REST Application Database
- Installing HCP on an Ambari Cluster
- Installing, Configuring, and Deploying an HDP Cluster with HCP
- Importing Zeppelin Notebook Using Ambari
- Streaming Data into HCP
- Verifying That HCP Deployed Successfully
- Launching HCP Management Module User Interface
- Installing Alerts User Interface
- Optimization Guidelines
- 4. Manually Installing HCP
- Installation Variables
- Preparing the Environment
- Installing REST Application
- Installing HCP
- Setting Environment Variables
- Creating a Repository
- Installing HCP
- Creating Kafka Topics
- Creating HBase Tables
- Creating an HCP Global.json File
- Setting up the Metron Enrichment
- Setting Up Indexing
- Pushing the Configuration Changes to ZooKeeper
- Loading GeoIP Data
- Streaming Data into HCP
- Starting Your Parsers
- Starting Your Enrichments
- Starting Indexing
- Importing the Apache Zeppelin Notebook Manually
- Verifying that HCP Deployed Successfully
- Launching the HCP Management Module
- Optimization Guidelines
- 5. Enabling Kerberos
List of Figures
- 3.1. Ambari Component
- 3.2. Sample Deployment Architecture
- 3.3. Ambari Choose Services Window
- 3.4. Ambari Assign Masters Window
- 3.5. Ambari Assign Slaves and Clients Window
- 3.6. Install, Start and Test Window
- 3.7. NiFi Configure Processor Dialog Box EC2 Dashboard
- 3.8. NiFi Configure Processor Dialog Box EC2 Dashboard
- 3.9. Storm UI with Enrichment Details
- 4.1. Ambari Component
- 4.2. Sample Deployment Architecture
- 4.3. NiFi Configure Processor Dialog Box EC2 Dashboard
- 4.4. NiFi Configure Processor Dialog Box EC2 Dashboard
- 4.5. Ambari Metron Dashboard
- 4.6. Storm UI with Enrichment
- 5.1. Ambari Storm Site
- 5.2. Add Property
- 5.3. Enable Kerberos Wizard
- 5.4. Enable Kerberos Wizard
- 5.5. Final Custom Storm-site
List of Tables
- 1.1. HCP Information Roadmap
- 2.1. Physical Nodes