From the course: Building and Deploying Deep Learning Applications with TensorFlow
Options for loading data - TensorFlow Tutorial
- [Instructor] The first step of training a machine learning algorithm is loading the training data. TensorFlow supports several ways of loading datasets, depending on how much data you're dealing with, and the more data you have, the more complicated it gets. The simplest method is to preload all of your data into memory and pass it to TensorFlow as a single array. To do this, you just write plain Python code to load your data; there's nothing TensorFlow-specific about it. The second, more complicated, option is to write code that feeds your training data into TensorFlow step by step, as TensorFlow requests it. This gives you more control over when the data is loaded, but it requires you to manage everything yourself. The third option is to set up a custom data pipeline in TensorFlow. This is the best option when you're working with enormous datasets, like millions of images, because a data pipeline lets TensorFlow manage loading data into memory itself, as it needs it. Let's take a deeper look at…
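The three strategies above can be sketched in plain Python. This is an illustrative stand-in, not TensorFlow API code: the function names `load_all` and `batch_feeder` are hypothetical, and only the comment under option 3 refers to TensorFlow's actual `tf.data` pipeline API.

```python
# Plain-Python sketch of the three loading strategies described above.
# The helper names here are illustrative, not TensorFlow calls.

# Option 1: preload everything into memory up front and hand the whole
# collection to the framework as one array.
def load_all(records):
    # In practice this might be np.loadtxt, pandas.read_csv, etc.
    return list(records)

# Option 2: feed the data step by step, yielding one batch each time
# the training loop asks for more. You manage the loading yourself.
def batch_feeder(data, batch_size):
    for start in range(0, len(data), batch_size):
        yield data[start:start + batch_size]

# Option 3 (pipeline): in TensorFlow this is expressed declaratively,
# e.g. tf.data.Dataset.from_tensor_slices(data)
#            .shuffle(1000).batch(32).prefetch(1)
# and TensorFlow then loads and buffers data itself as training runs.

batches = list(batch_feeder(load_all(range(10)), batch_size=4))
print(batches)  # [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

The trade-off to notice: option 1 is the least code but caps your dataset at available RAM, option 2 trades that cap for hand-written bookkeeping, and option 3 hands the bookkeeping back to TensorFlow.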