There are three main ways to do deep learning in Spark. In this video, learn what they are.
- [Instructor] There are three major ways to do deep learning in Spark. They are inference, transfer learning, and model training. Let's look at each of them in turn in a little more detail.

First, inference. We could take a model like ResNet-50, Inception V3, or VGG16 (deep learning models that have been trained on the ImageNet dataset) and apply it to a large dataset in parallel using Spark. This allows you to apply image classification to a large collection of images very quickly and sort them into one of the 1,000 ImageNet classes. You could take the YOLOv3 model, for example, which is used for object detection, and apply it in parallel using a Spark function. You could also use PySpark's map function to get distributed inference by calling TensorFlow, Keras, or PyTorch.

The second way to use deep learning in Spark is via transfer learning. Transfer learning is using a trained neural network that would have been trained on a dataset …