You can use APIs to develop Flink streaming applications where the data pipeline
consists of one or more data sources, data transformations, and data sinks. You can build the
architecture of your application with parallelism and windowing functions to benefit from the
scalability and state handling features of Flink.
The DataStream API is the core API for developing Flink streaming applications in the Java
or Scala programming languages. The core building blocks of a streaming application are
the data stream and the transformation. In a Flink program, the incoming data streams from a source are
transformed by defined operations, resulting in one or more output streams that flow to the sink, as
shown in the following illustration.
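The following Java sketch shows this source-transformation-sink structure with the DataStream API. It assumes an in-memory source of sample elements for brevity; in practice the source would typically be a connector such as Kafka or a socket, and the sink would be more than a print statement.

```java
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class SimplePipeline {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Source: a small in-memory stream of sample elements.
        DataStream<String> source = env.fromElements("flink", "streaming", "pipeline");

        // Transformation: a map operation applied to every incoming element.
        DataStream<String> transformed = source.map(String::toUpperCase);

        // Sink: write the transformed stream to standard output.
        transformed.print();

        env.execute("Simple DataStream pipeline");
    }
}
```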
The structure of this dataflow is implemented in a pipeline that gives a Flink application its
core logic. On a dataflow, one or more operations can be defined and processed in
parallel, independently of each other. With windowing functions, different computations can be
applied to different streams within defined time windows to further control how events are
processed. The following image illustrates the parallel structure of dataflows.