Syslog TCP Source connector

The Syslog TCP Source connector is a Stateless NiFi dataflow developed by Cloudera that runs within the Kafka Connect framework. Learn about the connector, its properties, and configuration.

The Syslog TCP Source connector listens on a port for syslog messages over TCP and transfers them to Kafka. The connector accepts messages in one of the following formats: Syslog 3164, Syslog 5424, or Grok. If the input messages are in Grok format, the connector can either derive the schema using the field names from the value of the Grok Expression property or read the schema from Schema Registry.
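
For example, if Input Data Format is set to GROK and the schema is derived from the Grok expression, the relevant parameters might look like the following fragment. This is only a sketch: the Grok pattern shown is an arbitrary illustration, and the exact property names and allowed values are listed in the Syslog TCP Source properties reference.

 "parameter.Syslog TCP Source Parameters:Input Data Format": "GROK",
 "parameter.Syslog TCP Source Parameters:Grok Expression": "%{GREEDYDATA:message}",

In this case, the connector derives a record schema from the field names in the pattern (here, a single message field).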

The connector can write messages into Kafka in one of the following formats: Avro, JSON, or text. If the output format is text, the raw message is transferred to Kafka. If the output format is JSON or Avro, the input message is processed and converted into the corresponding format before it is transferred to Kafka. For Avro output, the record schema is embedded in the messages unless the schema is fetched from Schema Registry. If Schema Registry provides the schema of the records, the schema is not embedded in the Avro messages.

If Schema Registry is used, and it is on a Kerberized cluster, the krb5.file property must point to the krb5.conf file that provides access to the cluster on which Schema Registry is present. This means that the krb5.conf file must be on the same cluster node that the connector runs on. The Kerberos keytab that is used to access Schema Registry must also be on the same cluster node that the connector runs on.
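
A minimal sketch of the Kerberos-related part of the configuration is shown below. The krb5.file property is the Stateless NiFi Source property mentioned above and is not prefixed; the keytab and principal parameter names are illustrative placeholders, so check the Syslog TCP Source properties reference for the exact names used by the connector.

 "krb5.file": "/etc/krb5.conf",
 "parameter.Syslog TCP Source Parameters:Kerberos Keytab": "[***PATH TO KEYTAB ON THE CONNECT NODE***]",
 "parameter.Syslog TCP Source Parameters:Kerberos Principal": "[***KERBEROS PRINCIPAL***]",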

The connection to Schema Registry can be secured by TLS. The truststore file necessary for securing the connection must be on the same cluster node that the connector runs on. Mutual TLS for securing the communication between the message sender and the connector itself is also supported. The keystore and truststore files necessary for securing the connection must be on the same cluster node that the connector runs on.

Properties and configuration

Configuration is passed to the connector in a JSON file during creation. The properties of the connector can be categorized into three groups. These are as follows:

Common connector properties
These are the properties of the Kafka Connect framework that are accepted by all connectors. For a comprehensive list of these properties, see the Apache Kafka documentation.
Stateless NiFi Source properties
These are the properties that are specific to the Stateless NiFi Source connector. All Stateless NiFi Source connectors share and accept these properties. For a comprehensive list of these properties, see the Stateless NiFi Source property reference.
Connector/dataflow-specific properties
These properties are unique to this specific connector, or more precisely, to the dataflow running within the connector. These properties use the following prefix:
parameter.[***CONNECTOR NAME***] Parameters:
For a comprehensive list of these properties, see the Syslog TCP Source properties reference.
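
For example, the Port parameter of this connector appears in the configuration JSON as follows:

 "parameter.Syslog TCP Source Parameters:Port": "[***PORT***]"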

Notes and limitations

  • Required properties must be assigned a valid value even if they are not used in the particular configuration. If a required property is not used, either leave it at its default value or remove the property from the configuration JSON entirely.
  • If a property that has a default value is completely removed from the configuration JSON, the system uses the default value.
  • Properties not marked as required must be completely removed from the configuration JSON if not set.
  • The Syslog TCP Source connector must use at least one-way SSL. It cannot be used without SSL.
  • The Schema Registry URL property is mandatory even if Schema Registry is not used. If Schema Registry is not used, use the default value, or completely remove the property from the configuration JSON.
  • Schemas are only read from Schema Registry if Input Data Format is set to GROK and Schema Access Strategy is set to Schema Registry (see the configuration sketch after this list).
  • If Output Format is AVRO, the schema of the records can be embedded in the output data. Whether the schema is embedded is determined by the Schema Access Strategy property. If this property is set to Schema Registry, the schema is not embedded in the output messages. The schema is embedded in all other cases.
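
The following fragment is a minimal sketch of a configuration that reads the record schema from Schema Registry for Grok-formatted input and writes Avro to Kafka without embedding the schema. The Schema Access Strategy value and the Schema Registry URL format shown here are assumptions; see the Syslog TCP Source properties reference for the exact allowed values.

 "parameter.Syslog TCP Source Parameters:Input Data Format": "GROK",
 "parameter.Syslog TCP Source Parameters:Output Format": "AVRO",
 "parameter.Syslog TCP Source Parameters:Schema Access Strategy": "Schema Registry",
 "parameter.Syslog TCP Source Parameters:Schema Registry URL": "https://[***SCHEMA REGISTRY HOST***]:[***PORT***]/api/v1",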

Configuration example

In this example, the connector uses mutual TLS to receive data in Syslog 3164 format, which is then transferred to Kafka in JSON format.
{
 "connector.class": "org.apache.nifi.kafka.connect.StatelessNiFiSourceConnector",
 "meta.smm.predefined.flow.name": "Syslog TCP Source",
 "meta.smm.predefined.flow.version": "1.0.0",
 "key.converter": "org.apache.kafka.connect.storage.StringConverter",
 "value.converter": "org.apache.kafka.connect.converters.ByteArrayConverter",
 "tasks.max": "1",
 "nexus.url": "https://repository.cloudera.com/artifactory/repo",
 "extensions.directory": "/tmp/nifi-stateless-extensions",
 "working.directory": "/tmp/nifi-stateless-working",
 "topics": "[***KAFKA TOPIC NAME***]",
 "parameter.Syslog TCP Source Parameters:Port": "[***PORT***]",
 "parameter.Syslog TCP Source Parameters:Input Data Format": "Syslog 3164",
 "parameter.Syslog TCP Source Parameters:Output Format": "JSON",
 "parameter.Syslog TCP Source Parameters:Client Authentication": "REQUIRED",
 "parameter.Syslog TCP Source Parameters:SSL Keystore Filename": "[***THE FULLY-QUALIFIED FILENAME OF THE KEYSTORE***]",
 "parameter.Syslog TCP Source Parameters:SSL Keystore Key Password": "[***KEYSTORE KEY PASSWORD***]",
 "parameter.Syslog TCP Source Parameters:SSL Keystore Password": "[***KEYSTORE PASSWORD***]",
 "parameter.Syslog TCP Source Parameters:SSL Keystore Type": "[***KEYSTORE TYPE***]",
 "parameter.Syslog TCP Source Parameters:SSL Truststore Filename": "[***THE FULLY-QUALIFIED FILENAME OF THE TRUSTSTORE***]",
 "parameter.Syslog TCP Source Parameters:SSL Truststore Password": "[***TRUSTSTORE PASSWORD***]",
 "parameter.Syslog TCP Source Parameters:SSL Truststore Type": "[***TRUSTSTORE TYPE***]"
}

The following list collects the properties from the configuration example that must be customized for this use case.

topics
The name of the Kafka topic that the connector sends messages to.
Port
The port that the connector listens on for incoming messages.
Input Data Format
Determines what format incoming messages are expected in.
Output Format
Determines the format in which messages are transferred to Kafka.
Client Authentication
Determines whether one-way or two-way SSL is used. In this example, the property is set to REQUIRED, meaning that two-way SSL is used.
SSL Keystore *
These are the properties for accessing the keystore containing the key pair used for secure communication.
SSL Truststore *
These are the properties for accessing the truststore containing the message sender’s certificate used for secure communication.
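
If you create the connector directly through the Kafka Connect REST API (a POST request to the /connectors endpoint), the configuration example above becomes the config object of the request body, alongside a connector name of your choosing. The following is only a sketch of that request body; the connector name is a placeholder.

{
 "name": "[***CONNECTOR NAME***]",
 "config": {
  ...the key-value pairs from the configuration example above...
 }
}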