From the course: Cloud Hadoop: Scaling Apache Spark

Continue learning for scaling

- [Lynn] To continue learning more about scaling Apache Spark in the cloud, I suggest three resources: examples, a community, and the AWS Data Heroes. Let me show you more. Whether or not you use Databricks with Apache Spark, I find their resources really useful, and as I've mentioned several times, a lot of the committers for Spark actually work at Databricks. So I'm in the Databricks documentation, and often when I'm struggling with a concept, I'll start there regardless of which cloud I'm on or how I'm running my Spark implementation. Second, I have the repository I've created for this course, Learning Hadoop and Spark. Because this is an evolving ecosystem, I will work to keep my examples updated for the various versions. Please feel free to submit pull requests and help me build a learning community for Hadoop and Spark around these examples on GitHub. Speaking of learning communities, if you are working in the AWS…
