Learn about a typical hardware specification for a Kafka cluster.
- [Instructor] When it comes to hardware, there are some high-level categories that we need to pay attention to for Kafka. The first one is memory. Kafka uses the file system for storing and caching its messages, so you need sufficient memory to buffer the active readers and writers. All the data that is flowing in needs enough memory to hold it until it can be written to disk. You can do a back-of-the-napkin estimate of your memory needs by assuming that you want to be able to buffer for 30 seconds.
To compute your memory needs, you take your write throughput times 30. A machine with 64 gigabytes of RAM is a decent choice, but there are plenty of people using 32-gigabyte machines out there as well. Less than 32 gigabytes of RAM tends to be counterproductive, meaning you need a lot of little machines instead of a few medium-sized machines. So I would recommend starting with 32 gigabytes of memory and going up from there. But again, you don't need really big machines here.
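The back-of-the-napkin estimate above can be sketched in a few lines. This is only an illustration of the rule of thumb from the transcript (write throughput times a 30-second buffer window); the 300 MB/s throughput figure is a hypothetical example value, not from the course.

```python
BUFFER_SECONDS = 30  # buffer roughly 30 seconds of incoming writes in memory


def estimate_buffer_memory_gb(write_throughput_mb_per_sec: float) -> float:
    """Estimate the RAM (in GB) needed to buffer incoming writes
    for BUFFER_SECONDS before they reach disk."""
    return write_throughput_mb_per_sec * BUFFER_SECONDS / 1024


# Hypothetical example: a 300 MB/s write load needs roughly 8.8 GB
# of buffer memory, comfortably within a 32-64 GB machine.
print(round(estimate_buffer_memory_gb(300), 1))
```

Running the sketch shows why 32 GB is a reasonable floor: even a fairly heavy write load fits in the page cache with plenty of headroom left for the OS and the broker itself.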