Build code for processing US sales data in the solution.
- [Instructor] In this video, I'm going to show you how to build the pipeline for US sales data for this use case project. If you look at the top right corner window, it shows the data that is already set up for you in the US sales database. There is a table called garment sales, and there are a couple of records in there for garment sales with some order value. There is also the executive summary table, which is going to be used for displaying the dashboard. In it, the base record is already set up for the same time, but the sales, web hits, tweets, and tweets positive are at this point zero.
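As a rough sketch of the two tables just described (the table names, column names, and types here are assumptions inferred from the narration, not the course's actual schema), the US sales database might be set up like this:

```sql
-- Hypothetical DDL for the US sales database described above.
-- Names and types are illustrative assumptions only.
CREATE TABLE garment_sales (
    id          SERIAL PRIMARY KEY,     -- auto-incrementing record id
    order_time  TIMESTAMP NOT NULL,
    order_value DECIMAL(10, 2) NOT NULL -- the "order value" mentioned above
);

CREATE TABLE executive_summary (
    summary_time    TIMESTAMP PRIMARY KEY,
    sales           DECIMAL(12, 2) DEFAULT 0, -- all four metrics start at zero
    web_hits        INTEGER DEFAULT 0,
    tweets          INTEGER DEFAULT 0,
    tweets_positive INTEGER DEFAULT 0
);
```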
First, we are going to set up a Kafka Connect task that is going to listen for new records in this garment sales table and publish them to a Kafka topic. On the left side, you'll see the contents of that file. On line 24, the name of that task is going to be use case US sales. The connector is the JDBC source connector, and then on line 28 is the connection URL to the specific database.
The table we are going to listen to is in the whitelist…
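A minimal sketch of what such a Kafka Connect JDBC source configuration file might look like is below. The connector class, `connection.url`, `table.whitelist`, `mode`, `incrementing.column.name`, and `topic.prefix` are standard Confluent JDBC source connector properties; the task name, JDBC URL, table name, and column name are placeholders assumed for illustration, not the course's actual values:

```properties
# Hypothetical Kafka Connect JDBC source task for the US sales database
name=use-case-us-sales
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1

# Placeholder JDBC URL; point this at your own US sales database
connection.url=jdbc:postgresql://localhost:5432/us_sales

# Only publish rows from the garment sales table
table.whitelist=garment_sales

# Detect new records via an auto-incrementing id column (assumed name)
mode=incrementing
incrementing.column.name=id

# Records are published to topic.prefix + table name
topic.prefix=us-sales-
```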