From the course: Apache Spark Deep Learning Essential Training


Using deep learning in Spark

- [Instructor] There are three major ways to do deep learning in Spark: inference, transfer learning, and model training. Let's look at each of them in turn in a little more detail. We could take a model such as ResNet-50, Inception V3, or VGG16, all deep learning models that have been trained on the ImageNet dataset, and apply them to a large dataset in parallel using Spark. This allows you to run image classification over a large collection of images very quickly and assign each image to one of the 1,000 ImageNet classes. You could take the YOLO V3 model, for example, which is used for object detection, and apply it in parallel using a Spark function. You could also use PySpark's map function to get distributed inference by calling TensorFlow, Keras, or PyTorch on each partition. The second way to use deep learning in Spark is via transfer learning. Transfer learning means reusing a neural network that has already been trained on a dataset similar to the one you're…
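As a rough illustration of the distributed-inference idea, here is a minimal sketch that applies a pretrained Keras ResNet-50 model to a Spark DataFrame of image file paths using a pandas UDF. The column name `path`, the example file paths, and the top-1 label output are assumptions for illustration; it also assumes TensorFlow/Keras is installed on the workers and the image files are reachable from every executor.

```python
# Sketch: distributed inference over image paths with a pretrained ResNet-50.
from typing import Iterator

import numpy as np
import pandas as pd
from pyspark.sql import SparkSession
from pyspark.sql.functions import pandas_udf
from tensorflow.keras.applications.resnet50 import (
    ResNet50, preprocess_input, decode_predictions)
from tensorflow.keras.preprocessing import image

spark = SparkSession.builder.appName("distributed-inference").getOrCreate()

@pandas_udf("string")
def classify(paths: Iterator[pd.Series]) -> Iterator[pd.Series]:
    # Load the pretrained model once per executor process, not once per row.
    model = ResNet50(weights="imagenet")
    for batch in paths:
        # Read and preprocess each image in the batch to a 224x224 array.
        arrays = np.stack([
            preprocess_input(
                image.img_to_array(image.load_img(p, target_size=(224, 224))))
            for p in batch
        ])
        preds = decode_predictions(model.predict(arrays), top=1)
        # Return the human-readable top-1 ImageNet class label per image.
        yield pd.Series([p[0][1] for p in preds])

# Hypothetical input: a DataFrame with one image path per row.
df = spark.createDataFrame([("/data/images/cat.jpg",)], ["path"])
df.withColumn("label", classify("path")).show(truncate=False)
```

The same pattern works for other pretrained models: the model is loaded inside the UDF so each executor gets its own copy, and Spark parallelizes the per-partition batches of images across the cluster.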
