Flink provides connectors to multiple sinks for pushing out processed data. This video shows how to use the DataStream API to write data to a file system sink.
- [Narrator] Flink allows the output of streaming operations to be sent to various types of streaming sinks. Supported sinks include Kafka, Kinesis, and Cassandra. In this case, we will write the output to a file system sink. We continue with the basic streaming operations example we discussed in the previous videos. We declare an output directory that we call data/five_sec_summary. Please make sure that this directory already exists. We clean out the directory before each rerun. In real-world situations, this directory would be kept archived. We first need to create a streaming file sink. We declare its data type to match the data that is being written. In this declaration, we specify the output directory to write to and the serialization format. In this case, the serializer is SimpleStringEncoder. Additional formats can be used based on the type of data. Finally, we use the build method to build the sink. The sink needs to be attached to a data stream, in order for the data stream …
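The steps the narrator describes can be sketched in Java with Flink's DataStream API. This is a minimal sketch, not the course's exact code: the stream variable `summaryStream` is a placeholder for the five-second summary stream from the earlier videos, and the snippet assumes a Flink version that provides `StreamingFileSink.forRowFormat`.

```java
import org.apache.flink.api.common.serialization.SimpleStringEncoder;
import org.apache.flink.core.fs.Path;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.functions.sink.filesystem.StreamingFileSink;

// Build a row-format file sink. The type parameter matches the data being
// written (plain strings here), the Path is the output directory, and
// SimpleStringEncoder is the serialization format.
StreamingFileSink<String> fileSink = StreamingFileSink
        .forRowFormat(
                new Path("data/five_sec_summary"),
                new SimpleStringEncoder<String>("UTF-8"))
        .build();

// Attach the sink to a data stream so its records are written to the
// output directory. `summaryStream` is assumed to be a DataStream<String>
// produced by the earlier windowed-summary example.
summaryStream.addSink(fileSink);
```

The sink writes part files into subdirectories of `data/five_sec_summary`, which is why the transcript asks you to make sure the directory exists and to clean it out before rerunning.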
- Streaming with Apache Flink
- Using the DataStream API for basic stream processing
- Working with process functions
- Windowing and joins
- Setting up event-time processing
- State management in Flink