Bulk Loading Enrichment Information
Enrichment data can either be bulk loaded from HDFS or streamed into the enrichment store via the pluggable loading framework. This section describes the steps to bulk load enrichment data.
You can bulk load enrichment information from the following sources:
- Taxii Loader
- HDFS via MapReduce
- Flat File Ingestion
Taxii Loader
The shell script $METRON_HOME/bin/threatintel_taxii_load.sh can be used to poll a Taxii server for STIX documents and ingest them into HBase. It is common for this Taxii server to be an aggregation server such as Soltra Edge.
In addition to the Enrichment and Extractor configs described in the following sections, this loader requires a configuration file describing the connection information to the Taxii server. The following is an example of a configuration file:
```json
{
  "endpoint" : "http://localhost:8282/taxii-discovery-service",
  "type" : "DISCOVER",
  "collection" : "guest.Abuse_ch",
  "table" : "threatintel",
  "columnFamily" : "t",
  "allowedIndicatorTypes" : [ "domainname:FQDN", "address:IPV_4_ADDR" ]
}
```
where:
- endpoint : The URL of the endpoint.
- type : POLL or DISCOVER, depending on the endpoint.
- collection : The Taxii collection to ingest.
- table : The HBase table to import into.
- columnFamily : The column family to import into.
- allowedIndicatorTypes : An array of acceptable threat intelligence types (see the "Enrichment Type Name" column of the Stix table above for the possibilities).
The parameters for the utility are as follows:
Short Code | Long Code | Is Required? | Description |
---|---|---|---|
-h | | No | Generate the help screen/set of options |
-e | --extractor_config | Yes | JSON document describing the extractor for this input data source |
-c | --taxii_connection_config | Yes | The JSON config file to configure the connection |
-p | --time_between_polls | No | The time between polling the Taxii server in milliseconds (default: 1 hour) |
-b | --begin_time | No | Start time to poll the Taxii server (all data from that point will be gathered in the first pull). The format for the date is yyyy-MM-dd HH:mm:ss |
-l | --log4j | No | The Log4j properties to load |
-n | --enrichment_config | No | The JSON document describing the enrichments to configure. Unlike other loaders, this is run first if specified. |
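As an illustrative sketch (the file names connection_config.json and extractor_config.json are hypothetical, and the poll interval of 3600000 milliseconds is one hour), an invocation might look like:

```
$METRON_HOME/bin/threatintel_taxii_load.sh -c connection_config.json \
    -e extractor_config.json -p 3600000
```

Here connection_config.json would be a file shaped like the connection example at the top of this section.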
HDFS via MapReduce
The shell script $METRON_HOME/bin/threatintel_bulk_load.sh will kick off a MapReduce job to load data staged in HDFS into an HBase table.
Note: Despite what the naming might suggest, this utility works for enrichment as well as threat intel due to the underlying infrastructure being the same.
The parameters for the utility are as follows:
Short Code | Long Code | Is Required? | Description |
---|---|---|---|
-h | | No | Generate the help screen/set of options |
-e | --extractor_config | Yes | JSON document describing the extractor for this input data source |
-t | --table | Yes | The HBase table to import into |
-f | --column_family | Yes | The HBase table column family to import into |
-i | --input | Yes | The input data location on HDFS |
-n | --enrichment_config | No | The JSON document describing the enrichments to configure. Unlike other loaders, this is run first if specified. |
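A hypothetical invocation, assuming an extractor config in extractor_config.json and input data staged at /tmp/reference_data on HDFS, importing into the enrichment table's t column family (all of these names are illustrative):

```
$METRON_HOME/bin/threatintel_bulk_load.sh -e extractor_config.json \
    -t enrichment -f t -i /tmp/reference_data
```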
Flat File Ingestion
The shell script $METRON_HOME/bin/flatfile_loader.sh will read data from local disk and load the enrichment or threat intel data into an HBase table.
Note that there is a configuration parameter in the Extractor config that is only considered by this loader:
- inputFormatHandler : Specifies how to consider the data. The two implementations are BY_LINE and org.apache.metron.dataloads.extractor.inputformat.WholeFileFormat. The default is BY_LINE, which makes sense for a list of CSVs where each line indicates a unit of information that can be imported. However, if you are importing a set of STIX documents, then you want each document to be considered as input to the Extractor.
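As a sketch, a CSV extractor config using the default line-oriented handler might look like the following; the column layout and the type name malicious_domain are illustrative, not prescribed:

```json
{
  "config" : {
    "columns" : { "domain" : 0, "source" : 1 },
    "indicator_column" : "domain",
    "type" : "malicious_domain",
    "separator" : ","
  },
  "extractor" : "CSV",
  "inputFormatHandler" : "BY_LINE"
}
```

For whole-file STIX ingestion, inputFormatHandler would instead name the WholeFileFormat class mentioned above.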
The parameters for the utility are as follows:
Short Code | Long Code | Is Required? | Description |
---|---|---|---|
-h | | No | Generate the help screen/set of options |
-e | --extractor_config | Yes | JSON document describing the extractor for this input data source |
-t | --hbase_table | Yes | The HBase table to import into |
-c | --hbase_cf | Yes | The HBase table column family to import into |
-i | --input | Yes | The input data location on local disk. If this is a file, then that file will be loaded. If this is a directory, then the files will be loaded recursively under that directory. |
-l | --log4j | No | The log4j properties file to load |
-n | --enrichment_config | No | The JSON document describing the enrichments to configure. Unlike other loaders, this is run first if specified. |
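A hypothetical invocation, assuming a CSV file top_domains.csv on local disk and an extractor config in extractor_config.json, importing into the enrichment table's t column family (all names illustrative):

```
$METRON_HOME/bin/flatfile_loader.sh -e extractor_config.json \
    -t enrichment -c t -i top_domains.csv
```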