Message queues like Kafka are a popular data source for streaming applications. Learn how to consume Kafka streams with Apache Flink.
- [Instructor] In this video, I will demonstrate using a Kafka streaming source in Flink. Before we proceed, we need to update our pom.xml to add a dependency for the Kafka connector in Flink. To run this example, you need a running instance of Kafka with two topics created in it, namely flink.kafka.streaming.source and flink.kafka.streaming.sink. For data generation, we first need a Kafka data stream generator. Let's now review the Kafka stream data generator class under the data sources package. The Kafka connection is set up by specifying a broker list. The code generates the same audit trail CSV record as the file stream data generator. It then creates a producer record and publishes the record to Kafka. Let me now switch to the windowing operations class under chapter three to show how to consume a Kafka data stream in Flink. We first set up the streaming environment as before. To connect to Kafka, we first set up a properties object …
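The consumer side described above can be sketched as follows. This is a minimal illustration, not the course's exact code: it assumes the `flink-connector-kafka` Maven artifact has been added to pom.xml, a broker at `localhost:9092`, and a consumer group id of your choosing; the topic name matches the `flink.kafka.streaming.source` topic mentioned in the video, and the class name `KafkaStreamingJob` is hypothetical.

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class KafkaStreamingJob {
    public static void main(String[] args) throws Exception {
        // Set up the streaming environment as in the earlier examples
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        // Properties object holding the Kafka connection settings
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // broker list (assumed)
        props.setProperty("group.id", "flink-kafka-example");     // consumer group (assumed)

        // Consume the audit-trail CSV records from Kafka as plain strings
        DataStream<String> auditTrail = env.addSource(
                new FlinkKafkaConsumer<>(
                        "flink.kafka.streaming.source",
                        new SimpleStringSchema(),
                        props));

        auditTrail.print();
        env.execute("Kafka Streaming Source Example");
    }
}
```

On the generator side, the transcript's description (create a producer record, then publish it) corresponds to the standard Kafka producer pattern: build a `ProducerRecord` with the topic and the CSV payload, then hand it to a `KafkaProducer` via `send()`.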
- Streaming with Apache Flink
- Using the DataStream API for basic stream processing
- Working with process functions
- Windowing and joins
- Setting up event-time processing
- State management in Flink
Skill Level: Advanced
1. Apache Flink
2. DataStream API
3. Windowing Operations
4. Event Time Processing
5. State Management
6. Use Case Project