Learn about some best practices while building data transport modules within the big data architecture.
- [Instructor] Let us now turn our attention to best practices we need to consider when architecting the transport module in our data pipelines. We first need to look at guaranteed delivery: each event or message needs to be received, transported, and delivered once, and only once. Ideally, there should be neither loss of events nor duplicate events. Rather than doing point-to-point transport, architect for a publish-and-subscribe scheme as much as possible.
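One common way to approach the "once, and only once" requirement is to make the consumer idempotent: brokers may redeliver a message, but deduplicating on a unique message ID means each event takes effect only once. This is a minimal illustrative sketch, not production code; the message shape and IDs are hypothetical.

```python
# Hypothetical sketch: exactly-once *effect* via idempotent consumption.
# A broker may redeliver messages; tracking processed IDs ensures each
# event is applied only once.

processed_ids = set()
results = []

def handle(message):
    """Process a message at most once, even if the broker redelivers it."""
    if message["id"] in processed_ids:
        return False          # duplicate delivery: already handled
    processed_ids.add(message["id"])
    results.append(message["payload"])
    return True

# Simulated redelivery: the message with id=1 arrives twice.
stream = [
    {"id": 1, "payload": "order-created"},
    {"id": 2, "payload": "order-shipped"},
    {"id": 1, "payload": "order-created"},   # duplicate
]
for m in stream:
    handle(m)
```

In a real pipeline the `processed_ids` set would live in durable storage (or the broker itself would provide transactional semantics, as Kafka does), so deduplication survives restarts.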
This means the same information can be published by multiple sources without worrying about the destination, and the same information can be subscribed to by multiple sinks without worrying about the source. Intermediate brokers, agents like Flume or Kafka, can then take care of the magic of transporting data from the sources to the destinations. Data sources and data sinks do not work at the same speed.
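The source/sink decoupling described above can be sketched with a toy in-memory broker. This is only an illustration of the pattern, assuming hypothetical topic names; real deployments would use Kafka or Flume as the intermediary rather than an in-process object.

```python
from collections import defaultdict

class Broker:
    """Minimal in-memory publish/subscribe broker (illustrative only)."""

    def __init__(self):
        self.subscribers = defaultdict(list)   # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, event):
        # The publisher never names a destination; every subscribed sink
        # receives the event without knowing which source produced it.
        for callback in self.subscribers[topic]:
            callback(event)

broker = Broker()
audit_log = []
metrics = []
broker.subscribe("clicks", audit_log.append)   # sink 1
broker.subscribe("clicks", metrics.append)     # sink 2
broker.publish("clicks", {"user": "u1", "page": "/home"})   # any source
```

Note how adding a third sink requires no change to any publisher, which is exactly why the transcript recommends this scheme over point-to-point transport.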
Hence, buffering of data is required so that the sinks can work at their own speed and catch up later during volume spikes. It is a good practice to provide for intermediate persistent storage of data.
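The buffering idea can be demonstrated with a bounded queue between a fast source and a slow sink. This is a sketch under simplifying assumptions (in-process threads, an in-memory queue); in production the buffer would be a persistent broker such as Kafka, which also survives restarts.

```python
import queue
import threading
import time

# A bounded queue buffers a burst from a fast source so a slower sink
# can drain it at its own pace. A full queue applies backpressure.
buffer = queue.Queue(maxsize=100)
consumed = []

def source():
    for i in range(10):        # burst of events (volume spike)
        buffer.put(i)          # blocks only if the buffer is full
    buffer.put(None)           # sentinel: no more events

def sink():
    while True:
        event = buffer.get()
        if event is None:
            break
        time.sleep(0.001)      # the sink is slower than the source
        consumed.append(event)

producer = threading.Thread(target=source)
consumer = threading.Thread(target=sink)
producer.start(); consumer.start()
producer.join(); consumer.join()
```

The source finishes almost immediately while the sink catches up afterward; no events are lost and ordering is preserved, which is the behavior the transcript asks the transport layer to guarantee.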
There is no coding involved. Instead you will see how big data tools can help solve some of the most complex challenges for businesses that generate, store, and analyze large amounts of data. The use cases are drawn from a variety of industries, including ecommerce and IT. Instructor Kumaran Ponnambalam shows how to analyze a problem, draw an architectural outline, choose the right technologies, and finalize the solution. After each use case, he reviews related best practices for data acquisition, transport, processing, storage, and service. Each lesson is rich in practical techniques and insights from a developer who has experienced the benefits and shortcomings of these technologies firsthand.
- Components of a big data application
- Big data app development strategies
- Use cases: archiving audit logs and performing customer analytics
- Technology options
- Designing solutions
- Best practices
Skill Level: Advanced
1. Intro to Big Data Applications
2. Use Case 1: Data Warehouse (DW)
3. Use Case 2: Log Accumulation (LA)
4. Use Case 3: IT Operations Analytics (OA)
5. Use Case 4: Customer 360 (C360)
6. Use Case 5: Customer Analytics (CA)