Stateless NiFi Source properties reference
Review the following reference for a comprehensive list of the connector properties that are specific to the Stateless NiFi Source connector.
In addition to the properties listed here, this connector also accepts the properties of the Kafka Connect framework. For a comprehensive list of these properties, see the Apache Kafka documentation.
dataflow.timeout
- Description
- Specifies the maximum amount of time to wait for the dataflow to complete. If the dataflow does not complete before this timeout, the thread is interrupted and the dataflow is considered a failure. The session is rolled back and the connector retriggers the flow. Defaults to 30 seconds if not specified. See the example following this property.
- Default Value
- 30 seconds
- Accepted Values
- Required
- false
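For example, to give a long-running dataflow more time to complete, you can raise the timeout in the connector configuration. The value below is illustrative only and assumes the same time period format as the default:

"dataflow.timeout": "120 seconds"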
extensions.directory
- Description
- Specifies the directory that stores downloaded extensions. Because the default directory might not be writable, and to aid in upgrade scenarios, Cloudera recommends that you always specify an extensions directory.
- Default Value
- /tmp/nifi-stateless-extensions
- Accepted Values
- Required
- true
flow.snapshot
- Description
- Specifies the dataflow to run. When using SMM to deploy a connector, the value you set in this property must be a JSON object. URLs, file paths, or escaped JSON strings are not supported when using SMM. Alternatively, if using the Kafka Connect REST API to deploy a connector, this can be a file containing the dataflow, a URL that points to a dataflow, or a string containing the entire dataflow as escaped JSON. Cloudera, however, does not recommend using the Kafka Connect REST API to interact with this connector or Kafka Connect. See the example following this property.
- Default Value
- Accepted Values
- Required
- true
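For example, when deploying through the Kafka Connect REST API, the property can reference a flow definition file available on the Connect server. The path below is a placeholder:

"flow.snapshot": "/path/to/my-dataflow.json"

When deploying through SMM, the value must instead be the dataflow JSON object itself, pasted directly into the configuration.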
header.attribute.regex
- Description
- A Java regular expression that is evaluated against all FlowFile attribute names. Any attribute name matching the regular expression is converted into a Kafka message header, with the name of the attribute used as the header key and the value of the attribute used as the header value. If not specified, headers are not added to the Kafka record. See the example following this property.
- Default Value
- Accepted Values
- Required
- false
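For example, assuming your dataflow sets attributes with a common prefix such as header. (a hypothetical naming convention chosen for this illustration), the following setting converts each matching attribute, such as header.origin, into a Kafka message header of the same name:

"header.attribute.regex": "header\\..*"

Note that the backslash in the regular expression must be escaped in the JSON configuration.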
key.attribute
- Description
- Specifies the name of a FlowFile attribute that is used as the key of the Kafka record. If not specified, the Kafka record does not have a key associated with it. If specified, but the attribute does not exist on a particular FlowFile, that record also has no key associated with it. See the example following this property.
- Default Value
- Accepted Values
- Required
- false
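For example, if the dataflow stores the desired record key in a FlowFile attribute named kafka.key (a hypothetical attribute name chosen for this illustration), you can configure:

"key.attribute": "kafka.key"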
krb5.file
- Description
- Specifies the krb5.conf file to use if the dataflow interacts with any services that are secured using Kerberos. Defaults to /etc/krb5.conf if not specified.
- Default Value
- /etc/krb5.conf
- Accepted Values
- Required
- false
name
- Description
- The name of the connector. On the SMM UI, the connector names are specified using the Enter Name field. The name that you enter in the Enter Name field is automatically set as the value of the name property when the connector is deployed. Because of this, the name property is omitted from the configuration template provided in SMM. If you manually add the name property to the configuration in SMM, ensure that the value you set matches the connector name specified in the Enter Name field. Otherwise, the connector fails to deploy.
- Default Value
- Accepted Values
- Required
- true
nexus.url
- Description
- Specifies the base URL of the Nexus instance to source extensions from. If configuring a Nexus instance that has multiple repositories, include the name of the repository in the URL, for example, https://nexus-private.myorganization.org/nexus/repository/my-repository/. If the property is not specified, the necessary extensions (the ones used by the flow) must be provided in the extensions directory before deploying the connector.
- Default Value
- Accepted Values
- Required
- true
output.port
- Description
- The name of the Output Port in the NiFi dataflow to pull data from. If the dataflow contains exactly one port, this property is optional and can be omitted. However, if the dataflow contains multiple ports (for example, a Success and a Failure port), this property must be specified. If any FlowFile is sent to any port other than the specified port, it is considered a failure. The session is rolled back and no data is collected. See the example following this property.
- Default Value
- Accepted Values
- Required
- false
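For example, if the dataflow ends in two Output Ports named Success and Failure, the following setting collects data from the Success port only. Any FlowFile routed to the Failure port is then treated as a failure and the session is rolled back:

"output.port": "Success"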
parameter.[***FLOW PARAMETER NAME***]
- Description
- Specifies a parameter to use in the dataflow. For example, assume that you have the following entry in your connector configuration: "parameter.Directory": "/mydir". In a case like this, any Parameter Context in the dataflow that has a parameter named Directory gets the specified value (/mydir). If the dataflow has child Process Groups, and those child Process Groups have their own Parameter Contexts, the value is used for all Parameter Contexts that contain a parameter named Directory. Parameters can also be applied to specific Parameter Contexts only. This can be done by prefixing the parameter name (Directory) with the name of the Parameter Context followed by a colon. For example, parameter.My Context:Directory only applies the specified value to the Directory parameter in the Parameter Context named My Context. See the configuration fragment following this property.
- Default Value
- Accepted Values
- Required
- false
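The following configuration fragment illustrates both forms described above. The first entry applies to every Parameter Context containing a parameter named Directory; the second overrides the value for the Parameter Context named My Context only (the override value is a placeholder):

"parameter.Directory": "/mydir",
"parameter.My Context:Directory": "/my-other-dir"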
topic.name.attribute
- Description
- Specifies the name of a FlowFile attribute to use for determining which Kafka topic a FlowFile is sent to. Either the topics or the topic.name.attribute property must be specified. If both are specified, topic.name.attribute takes precedence. However, if a FlowFile does not have the specified attribute, the connector falls back to using the topics property. See the example following the topics property.
- Default Value
- Accepted Values
- Required
- false
topics
- Description
- The name of the topic to deliver data to. All FlowFiles are delivered to the topic specified here. However, it is also possible to determine the topic individually for each FlowFile. To do this, ensure that the dataflow specifies the topic name in an attribute, and then use topic.name.attribute to specify the name of the attribute instead of the topic name. For example, if you want a separate Kafka topic for each data source, you can omit the topics property and instead specify the attribute (for example, datasource.hostname) corresponding to the topic using the topic.name.attribute property. See the example following this property.
- Default Value
- Accepted Values
- Required
- true
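For example, the following combination sends each FlowFile to the topic named by its datasource.hostname attribute, and falls back to the topic set in topics for FlowFiles that do not carry the attribute (the topic name below is a placeholder):

"topics": "default-topic",
"topic.name.attribute": "datasource.hostname"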
working.directory
- Description
- Specifies a directory on the Connect server that NiFi should use for unpacking extensions that it needs to perform the dataflow. Defaults to /tmp/nifi-stateless-working if not specified.
- Default Value
- /tmp/nifi-stateless-working
- Accepted Values
- Required
- false