From the course: Data Science on Google Cloud Platform: Predictive Analytics


Training using jobs

- [Instructor] Let's see how we can set up training jobs in Cloud ML that can work on large training data sets and run the process asynchronously. To do this, I will first create a shell script called submit_job.sh with the following content. In it, we first set the project ID under whose context this job needs to run; this is the GCP project ID. We set the bucket ID to the path where we want all the files to be created and stored. The job name is a unique name the system can use to create the job directory and the model. We append the date, with its year, month, day, hour, minute, and second, so we know when the model was created. The job_dir is essentially a temporary directory that Cloud ML uses for its internal purposes. The training_package_path points to the actual path under which we created the training package. The main_trainer_module points to the module name inside that same package. Region specifies the…
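A submit_job.sh along the lines described might look like the sketch below. All project, bucket, and package names here are placeholders, and the `gcloud ml-engine jobs submit training` command group from the Cloud ML Engine era is assumed (newer SDKs use `gcloud ai-platform` instead). The script echoes the command for illustration; removing the `echo` would actually submit the job.

```shell
#!/bin/bash
# Placeholder values -- replace with your own project, bucket, and package.
PROJECT_ID="my-gcp-project"                      # GCP project the job runs under
BUCKET="gs://my-ml-bucket"                       # bucket where files are created and stored
JOB_NAME="training_job_$(date +%Y%m%d_%H%M%S)"   # unique name: prefix plus creation timestamp
JOB_DIR="${BUCKET}/jobs/${JOB_NAME}"             # temporary directory used internally by Cloud ML
TRAINING_PACKAGE_PATH="./trainer"                # path to the training package
MAIN_TRAINER_MODULE="trainer.task"               # module name inside the package
REGION="us-central1"                             # region in which to run the job

# Echoed for illustration; remove 'echo' to actually submit the job.
echo gcloud ml-engine jobs submit training "$JOB_NAME" \
  --project "$PROJECT_ID" \
  --job-dir "$JOB_DIR" \
  --package-path "$TRAINING_PACKAGE_PATH" \
  --module-name "$MAIN_TRAINER_MODULE" \
  --region "$REGION"
```

Because the timestamp is baked into the job name, every run gets its own job directory under the bucket, so repeated submissions never overwrite each other's artifacts.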
