Connectors
SQL Stream Builder (SSB) supports a variety of connector types and data formats for Flink SQL tables, easing development and providing access to many kinds of data sources.
The following table summarizes the supported connectors and how they can be used in SSB (illustrative table definitions are sketched after the table):
| Connector | Type | Description |
|---|---|---|
| Kafka | source/sink | Supported as an exactly-once sink. |
| Hive | source/sink | Can be used as a catalog. |
| Kudu | source/sink | Can be used as a catalog. |
| Schema Registry | source/sink | Can be used as a catalog. |
| Iceberg | source/sink | Can be used with Flink SQL. Hive and HDFS catalogs are supported. |
| JDBC | source/sink | Can be used with Flink SQL. PostgreSQL, MySQL, and Hive are supported. |
| Filesystems | source/sink | Filesystems such as HDFS, S3, and so on. Can be used with Flink SQL. |
| Debezium CDC | source | Can be used with Flink SQL. PostgreSQL, MySQL, Oracle DB, Db2, and SQL Server are supported. |
| Webhook | sink | Can be used as an HTTP POST/PUT sink with templates and headers. |
| PostgreSQL | sink | Materialized View connection for reading views. Can be used with anything that speaks the PostgreSQL wire protocol. |
| REST | sink | Materialized View connection for reading views. Can be used with anything that reads REST (such as notebooks, applications, and so on). |
| BlackHole | sink | Can be used with Flink SQL. |
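
As an illustration of how a connector is exposed to Flink SQL, the following is a minimal sketch of a Kafka-backed table using the standard Flink SQL Kafka connector options; the topic name, broker address, schema, and format are placeholder assumptions, not values fixed by SSB.

```sql
-- Minimal sketch of a Kafka-backed table (topic, brokers, schema, and format
-- are placeholders). The same table can be read as a source or written to as a sink.
CREATE TABLE orders (
  order_id   STRING,
  amount     DOUBLE,
  order_time TIMESTAMP(3),
  WATERMARK FOR order_time AS order_time - INTERVAL '5' SECOND
) WITH (
  'connector' = 'kafka',
  'topic' = 'orders',
  'properties.bootstrap.servers' = 'kafka-broker:9092',
  'format' = 'json',
  'scan.startup.mode' = 'earliest-offset'
);
```

When writing to Kafka, exactly-once delivery generally depends on Flink checkpointing being enabled; on recent Flink versions it is typically controlled with the `sink.delivery-guarantee` connector option.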
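
A JDBC-backed table follows the same pattern, with the connection details supplied in the WITH clause; the PostgreSQL URL, credentials, and table name below are placeholder assumptions.

```sql
-- Sketch of a PostgreSQL table exposed through the JDBC connector
-- (URL, credentials, and table name are placeholders).
CREATE TABLE customers (
  customer_id INT,
  name        STRING,
  PRIMARY KEY (customer_id) NOT ENFORCED
) WITH (
  'connector' = 'jdbc',
  'url' = 'jdbc:postgresql://db-host:5432/shop',
  'table-name' = 'customers',
  'username' = 'ssb_user',
  'password' = 'ssb_password'
);
```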
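
Filesystem tables are declared in the same way; the HDFS path and Parquet format below are placeholder assumptions, and other supported filesystems (such as S3) and formats can be substituted.

```sql
-- Sketch of a filesystem-backed table (path and format are placeholders).
CREATE TABLE events_archive (
  user_id STRING,
  event   STRING,
  ts      TIMESTAMP(3)
) WITH (
  'connector' = 'filesystem',
  'path' = 'hdfs:///data/events',
  'format' = 'parquet'
);
```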