Moving data out of Snowflake
You can create a NiFi dataflow to move data out of Snowflake. To do this, you must meet some prerequisites, download the Snowflake JDBC driver JAR file, update the NiFi truststore, build your dataflow, create your Controller Services, and configure the source and target Processors.
Before you begin
Before setting up a NiFi dataflow to pull data from a Snowflake database table, you must meet certain minimum prerequisites.

Downloading the Snowflake JDBC driver JAR file
Before you can create a dataflow that moves data out of a Snowflake database, you must ensure that NiFi can interact with the Snowflake database through a JDBC interface. To do this, download the Snowflake JDBC driver JAR file, upload it to each NiFi node in your cluster, and ensure that the proper permissions are set.

Adding Snowflake CA certificates to the NiFi truststore
You must ensure that NiFi can communicate securely with Snowflake. To do this, configure NiFi to trust the Snowflake Certificate Authority (CA) by merging the default Snowflake JDK truststore content into the NiFi truststore.

Building your dataflow
Set up the elements of your NiFi dataflow that enable you to move data out of Snowflake using Apache NiFi. This involves opening NiFi in CDP Public Cloud, adding processors to the NiFi canvas, and connecting the processors.

Creating Controller Services for your dataflow
You can add Controller Services to provide shared services to the processors in your dataflow. Create them after you build the NiFi dataflow and before you configure the processors, so that they are available when you configure your NiFi processors.

Configuring your source processor
You can use the ListDatabaseTables processor to get data from your Snowflake table. To do this, launch the Configure Processor window, specify the necessary configurations, and start the processor to verify that you can view the Snowflake table.

Configuring your target processor
In a dataflow that pulls data from a Snowflake database, configure the ExecuteSQLRecord processor to handle pulling data from remote tables. To do this, launch the processor configuration window and provide the configurations appropriate for your use case.
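The driver-installation step amounts to copying the JAR to every NiFi node and setting ownership and permissions so the NiFi service user can read it. The following is a minimal sketch; the driver version, host names, target directory, and service user are assumptions for illustration, not values from this document.

```shell
# Distribute the Snowflake JDBC driver JAR to each NiFi node and set permissions.
# DRIVER_JAR, TARGET_DIR, the host names, and the nifi:nifi owner are example
# values; substitute the ones for your deployment.
DRIVER_JAR=snowflake-jdbc-3.13.33.jar
TARGET_DIR=/opt/nifi/custom-libs

for host in nifi-node-1 nifi-node-2 nifi-node-3; do
  scp "$DRIVER_JAR" "admin@${host}:/tmp/"
  ssh "admin@${host}" "sudo mkdir -p $TARGET_DIR && \
    sudo mv /tmp/$DRIVER_JAR $TARGET_DIR/ && \
    sudo chown nifi:nifi $TARGET_DIR/$DRIVER_JAR && \
    sudo chmod 644 $TARGET_DIR/$DRIVER_JAR"
done
```

After copying the JAR, point your DBCPConnectionPool Controller Service at its location on each node.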
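The truststore merge can be performed with the JDK keytool utility, importing the default JDK truststore (which contains the CAs Snowflake chains to) into the NiFi truststore. The paths and passwords below are assumptions for illustration; use the locations and credentials of your own deployment.

```shell
# Merge the default JDK CA certificates into the NiFi truststore.
# Both keystore paths and both passwords are example values.
JDK_CACERTS=/usr/lib/jvm/java-11/lib/security/cacerts   # assumed default JDK truststore path
NIFI_TRUSTSTORE=/var/lib/nifi/truststore.jks            # assumed NiFi truststore path

keytool -importkeystore \
  -srckeystore "$JDK_CACERTS" -srcstorepass changeit \
  -destkeystore "$NIFI_TRUSTSTORE" -deststorepass <nifi-truststore-password>

# List the merged entries to confirm the CA certificates were imported.
keytool -list -keystore "$NIFI_TRUSTSTORE" -storepass <nifi-truststore-password>
```

Restart NiFi (or the affected nodes) after modifying the truststore so the change takes effect.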
Confirming your dataflow success
Confirm that you have successfully built a dataflow to move data out of Snowflake database tables by starting your dataflow and verifying that data is moving through it.