Kafka is a popular message queue for real-time processing. Use a Kafka sink to push a Flink data stream to Kafka after processing is complete.
- Kafka is a popular event source and sink for Flink pipelines. In this example, we will look at using Kafka as a sink for Flink pipelines. We will write the one-second summaries we created earlier with event time to a Kafka sink. The code for this example is in the same event time operations class in chapter four. To write to Kafka, we first need to create a Kafka producer. For this, we need to define a properties object with the bootstrap server list for Kafka. Then we create a producer that consumes strings as events. We provide the default topic name, which in this case is flink.kafka.streaming.sink. Next, we need to provide a serialization schema for the strings. This implements a producer record method that returns the name of the topic and the output data in bytes. This is standard Kafka implementation. We also add the properties object we defined earlier. Finally, we also set the Kafka producer semantic to exactly once. This tells Flink to publish a given event …
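The steps described in the transcript can be sketched in Java as follows. This is a minimal sketch, not the course's exact code: the topic name `flink.kafka.streaming.sink` and the exactly-once semantic come from the transcript, while the broker address, class name, and the `summaryStream` variable (standing in for the one-second summary stream built earlier) are illustrative assumptions.

```java
import java.nio.charset.StandardCharsets;
import java.util.Properties;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;
import org.apache.flink.streaming.connectors.kafka.KafkaSerializationSchema;
import org.apache.kafka.clients.producer.ProducerRecord;

public class KafkaSinkExample {

    // Builds a Kafka sink for string events, as outlined in the transcript.
    public static FlinkKafkaProducer<String> buildKafkaSink() {
        // Properties object with the bootstrap server list for Kafka
        // (localhost:9092 is an assumed local broker).
        Properties kafkaProps = new Properties();
        kafkaProps.setProperty("bootstrap.servers", "localhost:9092");

        // Default topic name, as given in the transcript.
        final String topic = "flink.kafka.streaming.sink";

        // Serialization schema for the strings: implements a producer-record
        // method that returns the topic name and the output data in bytes.
        KafkaSerializationSchema<String> schema = new KafkaSerializationSchema<String>() {
            @Override
            public ProducerRecord<byte[], byte[]> serialize(String element, Long timestamp) {
                return new ProducerRecord<>(topic,
                        element.getBytes(StandardCharsets.UTF_8));
            }
        };

        // Producer that consumes strings, with the Kafka producer
        // semantic set to exactly once.
        return new FlinkKafkaProducer<>(
                topic, schema, kafkaProps, FlinkKafkaProducer.Semantic.EXACTLY_ONCE);
    }

    // Attach the sink to the windowed summary stream (summaryStream is
    // assumed to be the DataStream<String> of one-second summaries).
    public static void attachSink(DataStream<String> summaryStream) {
        summaryStream.addSink(buildKafkaSink());
    }
}
```

With exactly-once semantics, the producer participates in Flink's checkpointing via Kafka transactions, so each summary record is published once even if the job restarts.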
- Streaming with Apache Flink
- Using the DataStream API for basic stream processing
- Working with process functions
- Windowing and joins
- Setting up event-time processing
- State management in Flink
Skill Level Advanced
1. Apache Flink
2. DataStream API
4. Event Time Processing
5. State Management
6. Use Case Project