Join Ben Sullins for an in-depth discussion in this video, What You Should Know, part of Kafka Essential Training.
- [Instructor] To be successful in this course, you should be familiar with database systems in general and how they've evolved. You should understand Hadoop architecture at a high level and what Hadoop does, at least its major components. And you should be comfortable in the Linux terminal. We'll be doing most of our work in the terminal window; that's where we'll see things happen and work with Kafka, so it's important that you're familiar with it. If you need to get up to speed, there are a few courses here that can help you out. First, I'd recommend Data Engineering Essentials Training for Data Science, then Hadoop Fundamentals, and then Learn the Linux Command Line: The Basics.
- Understanding the Kafka log
- Creating topics
- Partitioning topics across brokers
- Installing and testing Kafka locally
- Sending and receiving messages
- Setting up a multibroker cluster
- Testing fault tolerance
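The hands-on lessons above center on Kafka's command-line tools. As a rough sketch of what that work looks like, the commands below create a partitioned topic, send a message, and read it back on a single local broker (an assumption: a recent Kafka release with a broker at localhost:9092; older releases used --zookeeper and --broker-list in place of --bootstrap-server):

```shell
# Create a topic with three partitions on a single local broker
# (topic name "test-events" is illustrative).
bin/kafka-topics.sh --create --topic test-events \
  --bootstrap-server localhost:9092 \
  --partitions 3 --replication-factor 1

# Send one message to the topic via the console producer.
echo "hello kafka" | bin/kafka-console-producer.sh \
  --topic test-events --bootstrap-server localhost:9092

# Read it back from the beginning of the topic, then exit.
bin/kafka-console-consumer.sh --topic test-events \
  --bootstrap-server localhost:9092 \
  --from-beginning --max-messages 1
```

In a multibroker cluster, raising --replication-factor above 1 is what makes the fault-tolerance tests in the later lessons possible.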
Skill Level: Intermediate
1. Why Use Kafka?
2. Core Concepts
4. Installing and Testing Kafka Locally
5. Real-World Examples
6. Distributions and Packaging
Next steps (1m 4s)