Up and Running with Google Cloud Platform

with Joseph Lowery

 


Google Cloud Platform makes the robust infrastructure of Google—including its high-speed network, servers, and software—available on an enterprise level but accessible to the everyday web developer. In this course, author Joseph Lowery introduces each of the service's five products: App Engine, Compute Engine, Cloud Storage, BigQuery, and Cloud SQL. Learn the basics of hosting a mobile app with App Engine and see how to analyze massive datasets in seconds with BigQuery. Then explore the benefits of Cloud Storage, including unlimited file storage and fast data retrieval, and learn how to establish a cloud-based private network with Compute Engine. Finally, the course walks through setting up and managing a cloud-based relational database with Cloud SQL.
Topics include:
  • Why Google Cloud Platform?
  • Deploying an app with Google App Engine
  • Activating and working with Google Cloud Storage
  • Loading, querying, and exporting data with BigQuery
  • Working with Cloud Storage buckets
  • Managing cloud-based private networks
  • Importing and exporting data
  • Scheduling backups
  • Working with Google Datastore


Author: Joseph Lowery
Subject: Developer, Cloud Computing
Software: Cloud
Level: Beginner
Duration: 1h 48m
Released: Jul 12, 2013





Introduction
Welcome
00:00 Hi, I am Joe Lowery, and I would like to welcome you to Up and Running with Google
00:08 Cloud Platform. In this course, we will explore all that
00:12 the Google Cloud Platform has to offer. Then, we'll delve into the various
00:17 interconnected components to show you how to host your web and mobile apps, analyze
00:22 massive data sets, and store your assets in Google's global network for fast,
00:28 efficient access. We'll also cover working with the Google
00:32 Compute Engine, to spin up new virtual machines as needed, and manage your data
00:38 regardless of the amount, via Google Cloud SQL.
00:42 There's a ton of services and options to explore, so let's get going with Up and
00:46 Running with Google Cloud Platform.
00:48
How to use the exercise files
00:00 I'm very happy to announce that regular and premium members of lynda.com have
00:05 access to the exercise files used throughout this title.
00:08 Naturally, the information in this course is ultimately intended to be applied to
00:13 your own work. However you may find it helpful to use the
00:16 supplied exercise files to practice the steps safely, and without creating your
00:21 own examples or assets. To begin using the files from the site,
00:25 download them, extract them, and then store those files in a convenient location,
00:30 such as your desktop. The exercise files folder is organized by
00:34 chapter, and the chapters are broken down into various lessons.
00:39 Within the lesson folders, there are a series of items,
00:42 typically HTML, CSS, images, and so forth, that make up the practice file assets,
00:48 which we'll use to explore the Google Cloud platform.
00:51 You can either follow the whole course, chapter by chapter, or jump in at any point.
00:56 If you'd rather not use the exercise files, you can definitely follow along
01:00 with your own assets.
01:01
1. Introduction to Google Cloud Platform
Understanding Google Cloud Platform
00:01 This lesson will be a 10,000 foot view of the Google Cloud Platform.
00:05 In other words, a Cloud's Eye View of the cloud.
00:09 There are a fair number of moving parts to the service.
00:13 And it's important that you understand what each component does and how they work
00:16 together so you can make the most of it. At its most basic, the Google Cloud
00:21 Platform is Google. It's the same infrastructure that Google
00:25 has developed over the last 14 years and is currently using for their own websites
00:31 and apps. Except now, you have access to it as well.
00:34 It's the physical network: thousands of servers, miles of fiber cable,
00:40 as well as the software constantly updated for peak efficiency.
00:46 It's what Steven Levy of Wired calls the Mother of All Clouds, as of this recording
00:52 a very important phrase when it comes to all things Google.
00:56 There are five major components of the Google Cloud Platform: App Engine, Compute
01:03 Engine, Cloud Storage, BigQuery, and Cloud SQL/Cloud Datastore.
01:08 We'll explore each of these components a little bit more later in this chapter.
01:14 But before we get into the details of what, I want to explain a bit of why.
01:21 Why should the Google Cloud platform be of interest to you?
01:24 I think it all comes down to the network. Google's network is both fast, and vast.
01:32 And each of these aspects supports the other: it's fast partly because it is
01:37 optimized for speed, and partly because it is vast.
01:42 And a network that utilizes multiple enormous data centers around the globe is
01:47 possible in part because it is so fast. So, what can you do with the
01:53 infrastructure of the Google Cloud Platform?
01:55 Let's highlight two key areas. Mobile apps are hot and getting hotter,
02:03 and Google Cloud Platform allows you to quickly deploy your apps across multiple
02:07 devices including iOS, and of course, Android.
02:12 Large files such as videos and high-resolution images are easily handled, and
02:18 your apps scale up as needed. There's no startup cost, and you only pay
02:24 for services used. Gaming is one app category that has
02:28 special demands. And one that Google Cloud Platform can
02:33 handle with aplomb. All the game media and large data files
02:37 can be maintained through Google's storage infrastructure.
02:41 The rapid computation often required in a gaming environment is available on demand.
02:47 Flexible backends are accessible for both Android and iOS games.
02:52 Game states can be synchronized across multiple devices.
02:56 Games can be distributed, and best of all, monetized through numerous Google-related services.
03:03 Both mobile apps and games can leverage virtually all the various parts of the
03:08 Google Cloud Platform. In the next lesson, we'll explore each of
03:12 those components in turn.
03:14
Exploring the core services
00:00 Although the Google Cloud Platform is quickly evolving, there are currently five
00:04 primary services or components. In this lesson, we'll take a closer look
00:09 at each of them in turn. The five key components are App Engine,
00:13 Compute Engine, Cloud Storage, BigQuery, and Cloud SQL/Cloud Datastore.
00:21 Let's start with App Engine. App Engine makes it possible for you to
00:25 run web applications on Google's infrastructure.
00:28 You don't have to mess with the hardware at all.
00:31 The servers are all managed for you. App Engine supports several runtime
00:35 environments, including Java, Python, Go, and the just recently added PHP.
00:43 You can serve your Apps through your own domain via Google Apps, or get a free name
00:49 on the appspot.com domain. There are a variety of storage options,
00:54 for non-relational data, use the App Engine Datastore's NoSQL solution.
00:59 For relational data, you would use Google Cloud SQL.
01:03 And there's Google Cloud Storage for really large objects and files.
01:09 App Engine is free to start, and you get one gigabyte of storage and enough CPU
01:14 bandwidth for 5 million page views a month.
01:18 If you need more, you just pay for what you use.
01:21 We'll dig into App Engine in much more detail in Chapter two, Hosting Mobile and
01:26 Web Apps. Now, let's turn to App Engine's companion
01:30 service, Compute Engine. Compute Engine comes into play when you
01:34 need to run large scale computing workloads.
01:37 It uses the same infrastructure as Google Search, Gmail, and Google Ads, so you know
01:42 you're working with some real oomph. Compute Engine's primary advantage is its
01:48 ability to spin up virtual machines or VMs, programmatically for faster
01:54 computations and efficient scaling. You can specify one, two, four or eight
02:00 virtual core instances, each with up to 3.75 Gigabytes of memory.
02:06 As with App Engine, there are a number of storage options.
02:10 You can use an ephemeral disk for temporary storage that just lasts the
02:14 lifetime of a VM, a persistent disk with network-connected
02:19 storage, or again, for large files and data, there is Google Cloud Storage.
02:24 We'll dive deeper into Compute Engine in Chapter five, Managing Cloud-based Private Networks.
02:30 Now, let's take a look at Google Cloud Storage.
02:33 Google Cloud Storage is great for storing your assets and data, whatever it's used for.
02:39 You get fast data access with hosting around the globe.
02:43 Reliability is guaranteed at 99.95% uptime.
02:48 Google also provides the ability to back up and restore your data, and believe it or
02:53 not, storage is unlimited. Other features include enhanced security:
02:59 OAuth 2.0 authentication protocols and group-based access controls, including
03:06 access control lists or ACLs. As with other Google Cloud platform
03:10 services, you pay only for what you use, and prices start, that's right, start,
03:16 at 8.5 cents per gigabyte for the first terabyte per month, and they only go down
03:22 from there. I'll show you more about Google Cloud
03:25 Storage in Chapter four, Enabling Multi-tiered Storage.
03:29 Now that you've got all this data available to you in the cloud,
03:32 how are you going to understand it? For that, let's look at BigQuery.
03:36 With BigQuery, you can analyze massive amounts of data.
03:40 We're talking millions of records within seconds.
03:43 Access is handled via a very straightforward UI or a REST interface.
03:50 REST stands for representational state transfer.
03:54 Data storage is, as you might suspect, not a problem, and scales to hundreds of terabytes.
04:00 BigQuery is accessible via a host of client libraries, including those for
04:06 Java, .NET, Python, Go, Ruby, PHP, and JavaScript.
04:12 A SQL-like syntax is available, which can be accessed through these client libraries
04:17 or a Web User Interface. We'll give BigQuery a run through in
04:22 Chapter three, Analyzing Massive Data Sets.
04:26 Finally, let's talk about the Google Cloud platform database options, Cloud SQL and
04:32 Cloud Datastore. I've put the two together because they
04:36 basically perform the same service, providing database functionality.
04:40 However, there is one major difference. Cloud SQL is for relational databases,
04:46 primarily MySQL. And Cloud Datastore is for non-relational
04:52 databases using NoSQL. With Cloud SQL, you have the choice of
04:58 hosting either in the United States or the European Union, with 100 Gigabytes of
05:04 storage and 16 Gigabytes of RAM per database instance.
05:08 Cloud Datastore, which was just introduced at the Google I/O conference, is available
05:14 at no charge for up to 50,000 read/write instructions, with 200 indexes and one
05:21 gigabyte of data stored per month. Check out Chapter six, Developing Databases
05:27 in the Sky, for more details on Cloud SQL and Cloud Datastore.
05:31 And that's the skinny, as if that word is even remotely applicable here, on the major
05:36 components of the Google Cloud Platform. Next, you'll see how to sign up and get started.
05:41
Getting underway
00:00 Okay. Time to get a little hands on.
00:02 In this lesson, I'll show you the sign up process for the Google Cloud platform, so
00:07 you can know what to expect. First thing you'll need, and that you
00:10 probably already have, is a Google account.
00:13 If you don't have one, go to google.com and click the sign in option in the upper right.
00:19 On the next page, click SIGN UP and complete the steps that appear.
00:24 Now, I've created a new Google account, so I can show you the entire sign up process
00:28 for Google Cloud platform. So, let's go to cloud.google.com.
00:34 The homepage of the site will give you some idea of what's available, and there
00:39 are various sections of the site that you can browse.
00:41 Once you're ready to give it a go, click any of the Try it now buttons.
00:48 Next, I'll sign into Google. If you've already signed in, you'll skip
00:52 this step. Now, you'll be asked to verify your account.
00:57 This could be handled through a text message to a mobile phone, or a voice
01:01 phone call. I'll opt for the text message and enter
01:04 the number of my mobile. Almost immediately, I get a text message
01:10 with a numeric verification code, which I'll enter and click verify.
01:16 Okay. We're in and we're asked to give our
01:20 project a name. Regardless of whether you're just
01:23 exploring or ready to begin work, you'll need to create a project.
01:27 I'm going to change the default name here. I'm allergic to pretty much anything that
01:31 starts with 'my', so let's go with Lynda Cloud 001.
01:36 If you're following along, you could use that same name.
01:40 Next, we have a project ID. This is a unique identifier among all
01:44 Google Cloud projects, and, once selected, it cannot be changed.
01:48 The default offering is automatically generated by Google, and if you don't like
01:52 its suggestion, you can enter your own or click the refresh icon.
01:56 Well, I am pretty happy with the next random selection that came up.
02:00 So, I'm going to go with that. Keep in mind, of course, yours will be
02:03 entirely different. Finally, we have to check off The Google
02:07 Cloud Platform's Terms of Service check box before proceeding.
02:10 That's mandatory. The other, marketing-oriented check box is optional.
02:15 When you're ready, click Get started. All right, the project is created, and
02:20 once that's done, you'll find yourself in the Google Cloud
02:23 platform console for the project. Up top is an invitation to get started
02:28 using one of the sample apps. Currently, there are two, Photofeed Java
02:35 App and the Mobile Backend App. And I can switch between the two by using
02:40 the next/previous arrows. Let's save those for later.
02:43 And collapse this section now by clicking the double up arrows that you see over on
02:48 the right. Now, you can focus on the Google Cloud
02:51 platform services that are available listed here.
02:53 The services you see may very well be different from the ones currently listed
02:57 on my screen. Google Cloud platform is a fast-moving beast.
03:01 We'll spend the rest of the course exploring the various major components.
03:04 But now, I want to show you the gateway to managing your project and setting up the
03:09 various services, and that's found under APIs, which I'll scroll down a little
03:13 to, and there you can see it in the center.
03:18 So, let's click that entry, not the documentation link, but just the main entry.
03:24 Now, you'll see a full listing of all the available Google services, each with a
03:28 status indicator. As you can see, when I scroll down, there
03:32 are a lot of them representing a great deal of power.
03:36 As you'll discover later, most of them have a very generous courtesy limit, so
03:41 you can really try out the Google Cloud platform at no additional cost.
03:45 I'm going to scroll down towards the very bottom here, and then let's just turn on
03:49 the URL Shortener API. The first time you turn on anything,
03:55 you'll be asked to agree to its terms of service.
03:57 So, I'll do that, and then click accept. And now, you can see that URL Shortener
04:05 API has been switched to "on". Now this page shows all the registered APIs.
04:09 Now, I can switch it to just Enabled, which will be the one I just enabled, or
04:14 if I want to see them all, click All. Now, I just refreshed the page, so that
04:19 you can see what happens after you've enabled something and you're looking at
04:23 the full list. Any enabled API will show up up top.
04:27 Let's go back to the Google Cloud Console page, and let me show you a little bit
04:31 more about projects. To see a list of your current projects,
04:34 click on the Google Cloud Console logo. Now, there's our one project with the
04:41 columns of relevant information. And if I want to create another project, I
04:45 can just click on the big red button to do so.
04:47 There's no need at this point, so let me click Cancel.
04:51 Now, let me return to the console page for the current project by clicking its name.
04:58 So, now you're up and running with Google Cloud Platform and ready to start
05:01 exploring the specific components.
05:03
2. Hosting Mobile and Web Apps
Understanding the Google App Engine
00:00 Have you ever had a website that posted an article, video, or application that became
00:05 wildly popular? Suddenly, you're getting frantic emails
00:08 from your web host about CPU and bandwidth overages, or they just shut it down
00:12 completely, and you're frantically trying to contact them to get it online again?
00:17 Admittedly, this is a great problem to have, but it's still a problem.
00:21 App Engine is designed to be the solution. With App Engine, you can put your web and
00:27 mobile applications online using the same infrastructure used for Google's own apps.
00:33 Once deployed, App Engine manages the server side.
00:37 New instances are generated, databases are sharded, bandwidth is increased, all as needed.
00:46 And because all you have to worry about is code, suddenly your app becomes
00:50 automatically scalable. Available to an unlimited number of users,
00:55 working with an unlimited number or size of assets.
01:00 Google's infrastructure also brings terrific performance, robust security, and
01:06 a high degree of reliability. So, that's what App Engine does and some
01:10 of its advantages. How do you work with it?
01:14 App Engine currently supports four different runtime environments or
01:18 languages: Java, Python, Go, and PHP. Both Go and PHP are still in what Google
01:28 calls an experimental phase. This means somewhat limited functionality,
01:32 and for PHP, as of this recording, limited deployment.
01:38 To work with the App Engine, you'll need to choose one of the language options and
01:42 install its SDK. There's a good gateway to these
01:45 environments at the link now shown. So, that's developers.google.com/appengine/docs.
01:53 All the application environments are tailored for the specific language, but
01:57 they do have many similarities. They all include a full-featured local
02:01 development environment to simulate the App Engine.
02:04 Authentication and email APIs integrated with Google Accounts and the ability to
02:10 run cron jobs that trigger events at specific times.
02:13 Finally, they all run in a secure sandbox. In the computing world, a Sandbox is
02:21 defined by the degree of access to the underlying system, which protects both
02:26 your app and those of others. Besides providing security, the sandbox
02:31 makes it possible for the App Engine to distribute web requests to multiple
02:35 servers and to start and stop servers as needed.
02:40 Let's look at each of the specific development environments a little closer,
02:44 starting with Java. The Java SDK works on any platform, but you'll need to use
02:49 either Java 6 or Java 7. Google has a plugin for Eclipse if you're
02:54 using that IDE. With it, you can create, test, and upload
02:59 your apps to the app engine. The Python SDK also works on Linux, Mac,
03:06 and Windows. It needs a minimum of Python 2.7.
03:09 It was written in Python and downloads as a Zip file.
03:14 There are installers for the SDK on both Windows and Mac platforms.
03:18 The Go SDK currently supports Linux and Mac.
03:23 There's a Windows version in the works. You don't need to have Go installed locally.
03:28 The installation is standalone and automatically compiles Go apps.
03:32 The SDK is also available as a zip download.
03:36 Finally, there is the PHP SDK. Like its Python and Java cousins, the PHP
03:43 implementation works on all platforms. You'll need PHP 5.4 or higher, installed.
03:49 Like the others, it comes as a Zip download, and there are installers for Mac
03:54 and Windows. You may be wondering how much all this
03:57 functionality costs. Well, the great news is that you can get
04:00 started at no cost. And although there is a limit to
04:03 resources, it's a pretty generous one. Each developer account can create ten
04:08 different apps. Each app is permitted one gigabyte of
04:12 storage, and 5 million page views per month.
04:16 So, what happens when you need to go over those limits?
04:19 For more resources, you'll need to enable billing which you can access by going to
04:24 the Google Cloud Console page and clicking Settings.
04:27 You'll be given an opportunity to set a daily budget, and then allocate your
04:31 resources within that budget. Now that you have a better grasp of App
04:35 Engine, you're ready to deploy a sample app and see how to work within the admin console.
04:40
Deploying an example app
00:00 Google Cloud platform comes with a couple of example apps that you can quickly
00:04 deploy to get a better sense of the service.
00:06 Let's set one up so you can really see what's going on.
00:10 Well, I've gone to cloud.google.com/console, which shows me my projects.
00:17 Now, from here, I'll click on my project name, and that will take me to the Google
00:23 Cloud Console for my particular project. Once you do that and you're on this page,
00:28 if you don't see the two example apps up top, click Get started with sample
00:33 applications to open those up. We're going to deploy the Photofeed Java app.
00:38 If you don't see it, click the Next or Previous arrows that you see on either
00:42 side of the screen. But once you have it, click Deploy.
00:45 You will get a message in yellow as it's installing.
00:49 If I quickly click Details I can track what's actually happening.
00:53 Once it's done, the message will change to successfully deployed the application.
00:59 And there is a view app link, let's click that.
01:03 Before you can view the app, you'll need to sign in with your Google account.
01:07 At this point, anyone with a Google account could view your app online if they
01:11 knew the URL. Note that the messaging that you see over
01:14 on the left repeatedly uses my project name, lyndacloud001.
01:18 And that the project owners, that's me, will be able to see the email address of
01:24 any visitors. Okay.
01:25 So, we're going to enter in my password and a quick sign in.
01:34 Now, sometimes this can take a moment or two to set up.
01:37 And you'll get a message that says the data store index is not yet ready to
01:41 serve, please wait a minute and try to reload.
01:44 The app is pretty bare bones, but it shows my Google account name and email address,
01:50 as well as an avatar. There is the Photofeed title and a Choose
01:54 An Image button, let's click that to load an image.
01:59 Now, you can see a bit more of an interface.
02:01 There is a file field to upload a file, as well as a text area for comments.
02:05 First, I will click Choose file and navigate to my Exercise Files folder,
02:10 which is stored on my desktop. Then, I'll drill down to chapter two,
02:14 02_02, and open up the Images folder. There are a couple of public domain images
02:23 there, both provided by NASA, courtesy of the Hubble Space Telescope.
02:27 Click on the one that starts hs 1998, and then click open.
02:33 Next, add in a bit of a description. This happens to be Saturn's Aurora, and
02:44 I'll give a little credit where credit is due, and then click Post.
02:52 If you look in the lower left, you can quickly see it uploading, and then there's
02:58 my image. I'll scroll down a little bit, and you can
03:01 see the comment. Let's bring in a second image.
03:04 Again, I'll click Choose File and now I'll click hs-2013.
03:10 Click Open. And this is the Horsehead Nebula, also from
03:20 NASA's Hubble Space Telescope. Click Post when you're ready, and I can see
03:25 that it's uploading in the lower left, and there's my horse head.
03:29 Now, when the image is loaded, it comes in sequentially: newer images are posted up
03:34 top and there's the first image that I put in.
03:38 Now, below the second image I loaded, I have my comment and room for others.
03:43 Let me go ahead and enter a comment on that.
03:47 Just enter coolness and click Post Comment.
03:51 And now, when I scroll back down, there's the horsehead nebula, my original
03:55 description, and a comment. With our app up and running now, we're
03:59 ready to see what's going on in the admin console.
04:02 And that's up next.
04:03
Using the admin console
00:00 In this lesson, we'll take a tour of the app engine's robust admin console.
00:05 If you're following along, make sure you've completed the steps of the previous
00:08 lesson before you proceed. To get to the admin console for the app
00:11 engine, all you need to do is click on App Engine from the Google Cloud Console
00:17 that we saw in the previous lesson. So, highlight App Engine and click into that.
00:22 The admin console contains a ton of information and details.
00:26 Let's hit a number of high points. We start out in the dashboard where, up
00:30 top, there's a chart that initially shows requests per second.
00:34 Let's switch the time frame to three hours, so we can zoom in a bit.
00:39 If you did the previous lesson much earlier, like a day before, you may need
00:44 to choose a different time option. So there's our spikes of activity, as we
00:48 deployed the app and uploaded the photos. There are a great many chart views to
00:53 select from, let's take a look at CPU seconds next.
00:57 So, I'll expand this list, and choose CPU Seconds Used per Second.
01:01 As you can see, most of the processing went to setting up the app.
01:05 That's the very first sharp spike. Uploading the files, even at 800 kilobytes
01:12 each, barely registered. After the chart is a list of our instances.
01:16 We currently only have one. Then there's the billing status, which as
01:20 you can see is still free. Below that are the resources used, and
01:25 we've barely touched them. Then all the way down on the left is the
01:28 current load. That's all the files used in the app and
01:32 errors, if any, over on the right. Next, let's look at the logs to see what
01:38 kind of activity we've had. So I'll click on Logs under Main and here
01:43 you can see all of the files uploaded during deployment.
01:47 Let me scroll down just a bit so you can get a full sense of what's going on.
01:51 Now let's skip down to quota details. Again, I'll click the link over on the
01:56 left hand side. You'll want to keep an eye on this panel,
01:58 when testing, to get a sense of how much bandwidth and storage your app is taking up.
02:03 Now, let's examine what's being stored. Under Data, on the left, click on Data
02:09 Store Indexes. This will show us the structure of the data.
02:14 As you can see, there are different indexes for the comment and the photo.
02:19 To see what data is stored in this structure, click data store viewer.
02:24 I'll just clean up the screen up a bit by dismissing this advertising message.
02:28 Now, in the query section of the data store viewer, there's a drop-down list
02:33 that controls the kind of data shown. It starts with the comment and there's our
02:38 one comment, coolness. Now, let's switch it to photo and you can
02:44 see the data for the two images. You can view the images by clicking on
02:49 View blob, which loads the image into a blob viewer.
02:55 The last stop on our brief tour is Application Settings, under the
02:59 administration section. There's a lot of basic information here,
03:04 much of which you'll need when building your app.
03:08 You also have some key options like changing the front facing name of your app.
03:12 Let's change that to Lynda Cloud photo feed.
03:18 Now, I hit return there. I could have gone down and clicked Save Settings.
03:22 Now, if I go back to my app and sign out, click on my Photofeed image, and click
03:31 sign out, you can see that the title has changed.
03:35 Most changes to the admin console, take effect immediately.
03:38 Okay, let's go back to the admin console and scroll down to the Performance section.
03:44 Should your app need more computing power from App Engine, this is where you
03:51 (INAUDIBLE) it up. The frontend instance class controls the
03:54 memory and processing power. You start out at F1,
03:57 with a 600 megahertz processor and 128 megabytes of memory.
04:02 Now, I open up the list here. You can see that I can double it by going
04:07 to F2. There's even a higher rate, at quadruple
04:10 the power and, again, double the memory.
04:13 Of course, your charge rate is higher if you do increase your processing power
04:19 and/or memory, double or quadruple.
04:21 So, use this power judiciously. As you can tell from the extensive
04:26 navigation in the left-hand column, there's a great deal more to the admin console,
04:30 but this lesson pinpoints some of the more crucial elements.
04:33
Approaching Google App Engine services
00:00 When you come right down to it, many, in fact, most apps include functionality to
00:05 perform the same kind of operations like sending email messages or working with
00:10 ephemeral data. To simplify the lives of its developers,
00:14 Google Cloud Platform includes a number of commonly used services for each of the
00:18 supported languages. In this lesson, we'll take a look at how
00:21 to access them in general, and then hone in on a common but extremely powerful service.
00:26 The number of services available for the supported runtime environments really
00:31 differs from language to language. Currently, Java has the most with 20,
00:35 followed by 18 for Python, and 11 for Go. PHP is the new kid on the block with only
00:43 6 to call its own. You can find out all about these by going
00:48 to the App Engine docs for the language you're interested in.
00:51 Under each language, there's a services section.
00:54 Let's take a look at the Java services. So, I'm going to go over into the
00:58 left-hand column, and expand Java. And there you see services, so let me
01:04 expand that. And I'll scroll down, so you can see the
01:07 full list. As I mentioned earlier, there are a good
01:11 number of Java services available. As of this recording, twenty in all.
01:15 Let's take a closer look at the Images service.
01:19 The Images Java API is great for a wide range of graphics manipulation including
01:24 cropping, resizing, rotating, and flipping.
01:27 There's also a filter for adjusting the image appearance and much more.
01:31 Click on Overview to dig a little deeper. The Overview gives us some more details
01:37 about the image capabilities, as well as specific code for implementation.
01:42 If we scroll down, you'll see graphic examples of common operations.
01:47 There's some initial code, and there are some handy graphic examples.
01:54 I head back up to the top of the page. Now, if you come down a little bit from
01:57 the top of the page, you'll see under Images, the specific services.
02:02 Besides Overview, there's also a Javadoc Reference.
02:05 Click that link and that will take you to a succinct breakdown of the various code elements.
02:11 Let me go back a couple more pages until I'm on the App Engine Doc's main page, and
02:16 then expand Python. Now, if you want to check out the images
02:20 service for any of the other languages, you'd go back to the App Engine Doc's page
02:24 and start there. You'll find an image service in all of
02:27 them except for PHP. Obviously, the service is tailored to the
02:31 specific runtime environment. So, let's take a look at the Python
02:35 version, I'll expand that, expand Services, scroll down, and there's Images.
02:44 And as you can see, the reference is structured completely differently.
02:48 One section that you'll definitely want to take a look at is Quotas and Limits,
02:52 that's found in the Overview page of each service.
02:55 Here, you can see how using the service affects your charges if at all, as well as
03:04 any limits. App Engine services are intended to reduce
03:07 your app development time. Be sure to look them over when you're in
03:10 the planning stage to see which ones you can take advantage of.
03:13
3. Analyzing Massive Datasets
Working with BigQuery
00:00 A major aspect of the digital life is the amount of data it outputs.
00:04 Every online interaction, whether it's a gaming event, or a purchase, generates a
00:09 good deal of raw data. Multiply that by millions upon millions of
00:14 users, and there's a lot of info that could reveal market trends, and
00:18 potentially influence business decisions. Analyzing that data is what BigQuery is
00:23 all about. The primary purpose of BigQuery is to
00:26 analyze massive amounts of data, we're talking billions of records,
00:31 interactively, and very, very quickly. As you might suspect from the name,
00:36 BigQuery can handle large amounts of data, and scale to hundreds of terabytes when needed.
00:41 This Google Cloud platform component uses SQL-like queries to get its results.
00:47 You can access BigQuery in a number of ways.
00:49 You can work with a simple web interface, which we'll take a closer look at in other
00:54 lessons in this chapter, a command-line interface called bq, or you
00:57 can access BigQuery programmatically, via a REST, or representational state transfer, interface.
01:06 It's important to understand what BigQuery is and what it isn't.
01:10 BigQuery is a very fast data analyzer based on the
01:15 principles of online analytical processing, or OLAP.
01:20 It imports comma separated value or CSV files.
01:25 It works best with a relatively small number of tables that are not related.
01:29 BigQuery is not, however, a relational database.
01:34 In fact, it's not really any kind of database at all.
01:38 There are no inserts, updates, or deletions.
01:41 There is no support for indexes, and it can only handle joins when one side is
01:47 much smaller than the other. If you need these types of features, use
01:51 Google Cloud SQL or Cloud Datastore, which we'll cover in Chapter six.
01:57 So, let me show you how to get started with BigQuery.
02:01 Head on over to your project page on the Google Cloud Console.
02:05 You can get there by going to cloud.google.com/console.
02:10 And you may have to sign in, if you're not already.
02:12 Once you're there, click on your project name, and then when the various components
02:17 are shown, click on BigQuery. You'll need to sign in again to verify
02:21 your access. After the intro message appears, click
02:26 Let's Go. That will take you to the Google API's page.
02:30 Now, this Google API page may be one that you haven't seen before.
02:35 It's actually the original one. And the one we looked at earlier is the
02:40 more experimental one, that Google appears to be rolling out per service.
02:45 So, you may get one or the other of these pages, and that's what it's like in this
02:50 fast-moving digital life of ours. If you do get this page, click on Services.
02:55 That will show you all the services with status switches, as well as courtesy limits.
03:02 So, go down to BigQuery API and click On. When you click it on, you'll be asked to
03:09 agree to the terms of service. I'll click the I agree to these terms
03:14 checkbox and then accept. This will return you to the services page
03:19 and we are ready to go. Notice that we have a courtesy limit of
03:22 10,000 requests per day. If you are concerned about your project
03:26 exceeding that limit, click the pricing link.
03:29 To track your BigQuery use, there are a number of tools available.
03:32 Let's click BigQuery API. That will take you to the Reports section,
03:39 and since we've just started, there's nothing to report.
03:42 But that will change over time. Now, in the left column notice we now have BigQuery.
03:47 Let's click that, and you can see some more specific charts.
03:52 These are dedicated to BigQuery. There are also links up top for accessing
03:57 BigQuery, which we'll start to do in the next lesson.
04:00 Although there is a pretty high request limit, the reality is that you'll still
04:05 need to enable billing to work with BigQuery.
04:08 If you're just testing, you're very likely to never run up a charge.
04:11 But Google requires a credit card or checking account connection, just in case.
04:16 To enable billing, go back to your console page, and from Settings choose Billing.
04:24 Now, I already have my billing set up, but when you go to enable billing for the
04:27 first time, you'll see a series of forms that you need to fill out.
04:31 After filling out the information, you'll need to verify your e-mail address.
04:35 And once you've done that, it could take up to an hour to validate the account.
04:40 However, once billing is enabled, all the preliminary work is done, and you're ready
04:44 to start building your first BigQuery.
04:47
Loading data
00:00 In this lesson, we'll create a data set to be used by BigQuery.
00:04 I've compiled a list of the most popular baby names as recorded by the US
00:08 government from 2002 to 2012 for our demo. Let's start by looking at the CSV file in Excel.
00:16 You can find the file in the Chapter 3 0302 Assets folder of the exercise files.
00:23 It's called babynames.csv. As you can see, it's a pretty simple file.
00:28 There are four columns: name, gender, frequency, and year.
00:33 If you look down at the bottom of the screen, you can see that there are 366,322
00:38 rows, maybe not massive, but pretty darn big.
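
For reference, the first few rows of babynames.csv look roughly like the following. The column layout and header row match the file described above, but the exact frequency values shown here are only illustrative:

    name,gender,frequency,year
    Sophia,F,22158,2012
    Jacob,M,18899,2012
    Emma,F,20791,2012
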
00:42 Okay, let's switch to the Google Cloud Console.
00:48 And, click on our Project Name. Now, click on BigQuery.
00:53 On the BigQuery page, click on the down arrow next to the project name on the
00:58 left, and choose Create new dataset. First, we'll give the data set an ID.
01:02 Use only numbers, letters, and underscores for this.
01:06 I'm going to enter baby_names_lynda, and then click OK.
01:12 You'll see the data set appear under the project name.
01:15 Hover over the name until you see a plus symbol and a down triangle.
01:20 Click the plus to bring up the Create and Import dialogue box.
01:24 Now, we'll add a table ID. Again, only letters, numbers, and underscores.
01:29 I'll enter bn_2002_2012 for baby names from 2002 to 2012.
01:41 When you have that, click Next. Next up, we choose the source of the data.
01:45 Ours is a CSV file, which happens to be the default.
01:49 We also have the option of importing a JSON file or using an App Engine datastore backup.
01:55 Let me click Choose File and I'll navigate to my Exercise Files.
02:02 I'll go into Chapter 3, 0302 > Assets, and then choose baby_names.csv.
02:10 If you want to look at the entire Zip of baby names, they're organized by year and
02:15 they go back to 1880. Those are also included for your own use
02:20 and play. Be warned though, you can waste many hours
02:23 of precious time investigating your own birth name and those of others.
02:28 All right, I have my file ready to upload. Now, I'll click Next.
02:34 And now, it's time to define the schema. BigQuery needs to know the name of each
02:38 column and its data type. The syntax is column name, colon, data
02:43 type, comma, and then the next column name and data type.
02:47 So, I'll enter the name of the first column, which is name, a colon, and then
02:53 the data type. And this is a string.
02:56 Enter in a comma and we can enter in the next name value pair.
02:59 Next column name is gender, colon, this is also a string.
03:04 Follow that with a comma, and then frequency, the number of times that that
03:09 baby name appeared in that year, colon, this is an integer, comma.
03:15 And then, our final column, year, colon. And you could set this up, either as an
03:19 integer, or a string. I'm going to put it in as an integer.
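
Put together, the full schema string entered in that field reads:

    name:string,gender:string,frequency:integer,year:integer
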
03:22 And once you have that, click Next. The final panel allows us to define the
03:27 field delimiter, which in our case is a comma, the default option.
03:31 We can also include the number of header rows to skip.
03:34 For us, that's going to be 1. I don't want to allow any errors, so I'm
03:39 going to leave the number of errors allowed at 0.
03:41 I'm ready to hit Submit. You'll see the table name appear under the dataset.
03:47 And in a few moments, it will show a loading message.
03:51 When it's finished loading, if all went well, you'll see a success message.
03:55 Depending on the size of the file and your internet connection, it could take a
03:58 moment or two. If the file upload completes, and you
04:01 don't see any success message, click on Job History, and that will verify that the
04:06 load has been completed. If there was a problem, you'd see an error
04:10 message here. Okay.
04:11 It looks like we're in. Our data set is established.
04:14 And the table is loaded with ten years of baby names.
04:17 We're ready to analyze our data, which we'll do in the next lesson.
04:21
Querying data
00:00 In the previous lesson, we loaded BigQuery with data from ten years of baby names in
00:05 the US. And now we are ready to analyze that data.
00:08 If you don't already have your BigQuery page open, start by going to the Google
00:12 Cloud Console page and click your project name.
00:16 Then, from the Cloud Console choose BigQuery.
00:20 If you played around with the query function, you'll see a list of recent queries.
00:24 I haven't, so my page is blank except for the welcome message.
00:29 Now, let's expand the data set we want to work with, baby_names_lynda, and then
00:35 select the table we're targeting, bn_2002_2012.
00:40 Initially you'll see the schema for your table.
00:43 Now, let's dive in a bit and click on Details tab over on the right hand side.
00:49 There is some basic info about your table: its ID as BigQuery views it, with
00:53 your project ID prefixed, the file size of the table, the number of rows, and when it
00:59 was created and last modified. Below that is a preview of the first five
01:04 rows of the actual data; as you can see, Sophia was the top choice for female names
01:09 in 2012. Okay, let's start building our query; let's
01:13 set as a goal to find the top ten names of men in the last ten years.
01:19 There are a couple of ways we can start, we could for example click Compose Query
01:24 and enter our SQL statement by hand. That's always an option but since we have
01:28 the table that we want to work with available, why not let Google write some
01:32 of the code for us? So, click Query Table and there is a basic
01:37 select statement, properly (INAUDIBLE) in our dataset ID and table ID as the from value.
01:43 We'll need to specify which columns to view.
01:45 So, again I could just enter in name, frequency after SELECT or I can switch
01:52 back to schema by clicking Schema here, and then point and click.
01:57 I'll click name first, and you can see it now says SELECT name, and then frequency;
02:03 it adds the comma and frequency. Now, let's change the limit to ten and set
02:08 up a WHERE clause. So, I'll go in and I'll change the LIMIT
02:13 to 10 and then after the FROM clause, I'll enter in WHERE gender equals M, in quotes.
02:24 Okay? Now, let's click Run Query and see what we get.
02:27 And in 1.3 seconds, it's run through the entire 366,000 plus rows and shows us a
02:36 list of ten boys' names. But, that's not from all ten years in
02:41 fact, it's just the first set of ten names that BigQuery runs into, those from 2012.
02:47 To check that, you could add AND year equals 2012 to the WHERE statement.
02:55 And the names shouldn't change. Remember that we set up year as an
02:58 integer, so don't put quotes around it because it's not a string.
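
At this point, the statement in the query box reads roughly as follows; the dataset and table names are the ones created in the previous lesson, and BigQuery may also show your project ID in front of them:

    SELECT name, frequency
    FROM baby_names_lynda.bn_2002_2012
    WHERE gender = 'M' AND year = 2012
    LIMIT 10
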
03:01 So, I'll go ahead and run this. We should get Jacob, Mason, Ethan, Noah,
03:06 William, Liam, and Jayden as the first seven. And, there it is.
03:10 Looks exactly the same for us. I do want to point out that down below
03:14 you'll only see the first seven rows here on the window.
03:16 There are next and previous buttons that we can use to take a look at those names.
03:22 Okay, let's lose the year we set up as a test, now I'm going to add an ORDER BY
03:28 clause, so that BigQuery will look at all the data and pick the top ten by adding a
03:34 DESC or descending attribute. And that will go after my where clause
03:40 ORDER BY frequency and I will click Run Query.
03:45 Well, that's an unusual set of names and if I look back to my SQL, I see that I
03:50 left out the descending attribute so, I'll add that back in and we run the query.
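
With the year test removed and the descending keyword added back in, the query looks roughly like this:

    SELECT name, frequency
    FROM baby_names_lynda.bn_2002_2012
    WHERE gender = 'M'
    ORDER BY frequency DESC
    LIMIT 10
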
03:56 Now I see that I've got the most popular names but there are duplicates, I see
04:01 Jacob in four different places in the top seven.
04:04 The duplicates I suspect are from different years, let me quickly verify
04:08 that by adding year to the list of columns I want to show.
04:13 So, after frequency I'll put a comma, year and then rerun the query.
04:18 Yep that's the story. Notice how quickly you can test out
04:21 different scenarios with BigQuery, even though it's going through almost 400,000
04:25 records, it's taking only 1.7 seconds this time.
04:30 Now, I can undo that last addition by clicking into the query box and pressing
04:35 Cmd+Z, Ctrl+Z on the PC, I'll need to click it a couple of times, so that I get
04:41 rid of all my additions. Now, I know that I'm going to have to use
04:45 an aggregate function to sum up the frequencies, but I also want to get rid of
04:50 this duplicate. A common way to do that in SQL is to use
04:53 the distinct keyword, so let's try that. Now, when I run the query, I get an
05:02 error message. BigQuery does not yet support SELECT DISTINCT.
05:06 Nicely, Google suggests that I use GROUP BY instead.
05:10 Okay so, let's do that and we'll also bring in our aggregate function.
05:16 First I'll go in and get rid of my DISTINCT and then, I'll sum up frequency
05:22 by putting in the keyword SUM wrapping frequency in parentheses, and give it an
05:27 alias as count and we do that so we can use ORDER BY.
05:31 And I'll change ORDER BY from frequency to the new alias count and now I'm ready to
05:38 drop in my GROUP BY clause, and I want to group it by name.
05:44 Okay, so the full thing reads, SELECT name, SUM frequency AS count FROM
05:50 baby_names_lynda.bn_2002_2012, WHERE gender equals M, GROUP BY name, ORDER BY
05:56 count DESC LIMIT 10. Let's run the query.
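
Laid out on separate lines, that full query is:

    SELECT name, SUM(frequency) AS count
    FROM baby_names_lynda.bn_2002_2012
    WHERE gender = 'M'
    GROUP BY name
    ORDER BY count DESC
    LIMIT 10
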
06:03 Bingo, there are the ten most popular names for boys from 2002 to 2012, and it
06:09 took just 1.2 seconds to run. We've done all of our BigQuery work
06:15 interactively through the web UI. I want to remind you that these same
06:20 results can be achieved programmatically by using a REST interface in your app.
06:24 BigQuery couples an extremely powerful engine with an available-from-anywhere web
06:29 interface to analyze a great deal of data quickly.
06:32
Exporting data
00:00 Once you've analyzed your data, the typical output is more data.
00:04 BigQuery makes working with your results very straightforward whether you are on
00:08 your own system or within BigQuery itself. Let's start again from the Google Cloud
00:13 console page for (INAUDIBLE) project. I want to show you how quickly you can
00:16 retrieve and use recently run queries. So, I'll enter the console page and then
00:22 BigQuery and there's a list of all my recent queries.
00:26 Now, when I hover over any of them, I'll see a lightning bolt.
00:30 When I see that, I could click for my query to run.
00:33 Let's go to the one that we finished with. I'll click that.
00:39 And I can click Show Previous Query Results or just click Run Query, and
00:44 there's my results again. To work with this data, I have two
00:48 options, both conveniently grouped on the right above my results.
00:51 Let's check out the external file option first.
00:54 I'll click Download as CSV, and because I am in Chrome, you can see the file being
01:01 downloaded on the lower left. Once the file's downloaded, I'll click it
01:06 to open. And let me increase the magnification here
01:11 and there are my top ten names along with the header row.
01:14 I can now turn this into a chart, a table, or a presentation, or import it into my
01:19 own database. Pretty easy, right?
01:21 Now let's head back to BigQuery, and this time, let's look at how we can work with
01:26 the same data in BigQuery again. I'm going to change my query to make it a
01:30 bit more substantial. Let me increase the limit to 1000 and I'll
01:36 remove the gender restriction. This should give me the top 1000 names
01:45 regardless of whether they're male or female.
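
In other words, the modified query is roughly:

    SELECT name, SUM(frequency) AS count
    FROM baby_names_lynda.bn_2002_2012
    GROUP BY name
    ORDER BY count DESC
    LIMIT 1000
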
01:47 I'll click Run Query. And in 1.5 seconds, as you can see here,
01:54 I'm seeing the first seven names of the top 1000.
01:57 Notice that it's a mix of male and female names, with Emily and Emma coming in at
02:01 five and six, respectively. Now, let's save these results as a table,
02:05 so I can do further work with it in BigQuery.
02:09 I'll click Save As Table. And that will bring up the Save as Table dialog.
02:13 It's correctly identifying my project, and my data set.
02:16 So, all I need to do is give it a table ID.
02:19 And I'm going to call this top 1000 names. And click OK.
02:24 And now, if I expand baby_names_lynda again, you can see I now have two tables.
02:30 Let's select it. And there's the schema, just name and count.
02:35 And if you wanted to you could click details to see that information.
02:38 Now, you can easily analyze the previous results.
02:41 Let's click Query Table again to set that up.
02:45 I'll insert an asterisk after Select so that I will get both of my columns and
02:51 let's change the limit to ten. Well and there's seven of the top ten names.
02:59 And we go to next, and there are the other three.
03:02 So basically, we just pull the top ten names off of our previous results, which
03:07 as you can see, is the same as what we have before.
03:10 Remember, Emily and Emma in five and six. Of course you can, and definitely should,
03:15 run much more significant queries. But you get the idea.
03:18 There's no limit to the number of data tables you can store and work with in BigQuery.
03:23
Managing data projects
00:00 BigQuery, along with the other components of the Google Cloud platform, provides a
00:05 basic interface for managing separate projects.
00:08 You can even create a team and control any member's access.
00:12 Let's start from our project-based Google Cloud Console page.
00:16 Now, I'll click BigQuery. There are my recent queries on the right,
00:22 and if you expand the down arrow, next to the project name, you'll see the start of
00:27 some management options. We can create a new data set, switch to a
00:31 different project, or refresh the page. Before we look at different projects, let
00:37 me click off of that, and let's look at the down arrow next to selected data sets,
00:42 so you can see those options. Here's where you would create a new table.
00:46 You can also share the data set and set permissions for those with access.
00:52 I'll click that, so you can see who currently has access, project readers,
00:56 writers, and owners. And we can easily add anyone via their
01:00 e-mail address. They'll need to accept the request to
01:03 confirm it. Let me hit Cancel.
01:05 Any people you've added will show up on your team, which I'll cover in a bit.
01:10 You can also delete the data set when you no longer need it.
01:14 Now, let's move on to table management. So, I'll expand the data set to show our
01:20 various tables. And again, when I hover a table, click the
01:24 down arrow, you can see the available options.
01:26 You can copy the table to the same, or a different data set in the same project.
01:31 You can also export the table. Let me choose Export Table.
01:35 You can export it in either CSV or JSON format.
01:39 Tables are exported to a specific Google Cloud storage location or URI.
01:44 We haven't set that up yet, so it's not available to us at this time.
01:48 And, of course, you could delete the table.
01:53 Okay. So, let's jump back to that switch to
01:55 project menu I found by expanding the project options.
02:00 It shows the current project, and I could display another project if I had one.
02:04 Since I don't, and if I choose display project, the Add Project dialog box is displayed.
02:13 Now, under the same menu, let's choose Switch to project > Manage projects.
02:19 This will take us to the dashboard page under Google APIs, which we saw in an
02:23 earlier lesson. We can get the BigQuery specific
02:26 information by clicking BigQuery in the left column.
02:31 And now, you can see our spike of usage over on the right side of the chart.
02:35 Earlier, I mentioned the team potential. This is were we set that up.
02:39 Click team to see the current line up. It shows the Google administrators, who
02:44 have access to your project, should anything go wrong, the overall project and yourself.
02:48 You can easily add members here. Again, I'll click Add Member, all you need
02:53 to know is the e-mail address and permission level.
02:55 Anyone added is listed on the "Teams" page, with their access level shown,
03:00 editable by any owners of the project. As you can see, many of the management
03:07 functions overlap. Team members set up in BigQuery are
03:10 available to the same project in App Engine and vice versa, as well as most
03:14 other components. Managing data is just one aspect of
03:18 overall project management on the Google Cloud platform.
03:21
4. Enabling Multi-Tiered Storage
Overview of Google Cloud Storage
00:00 As far as most consumers are concerned, and that's pretty much everyone, storing
00:05 your digital items was one of the first major uses of the cloud.
00:09 It's still big business today and getting bigger, and that's our cue to start
00:13 talking about Google Cloud Storage. All of the personal cloud storage
00:17 companies like Dropbox, iCloud, SkyDrive, even Google Drive all have limits.
00:23 Google Cloud Storage does not. As of this recording, you can upload an
00:27 unlimited number of files of an unlimited file size to Google Cloud Storage.
00:32 Of course, Cloud Storage uses Google's own infrastructure.
00:37 And that not only means infinite storage and high security, but also very fast
00:42 retrieval from data centers around the globe.
00:46 What are the primary uses for Google Cloud Storage?
00:49 Well, besides the digital storage and its partner retrieval, there is secure sharing.
00:56 Google Cloud Storage supports access control lists or ACLs with full OAuth 2.0
01:03 Authentication, so that you can be sure your data is only shared with those with
01:08 whom you want it to be shared. Once your massive amount of data is
01:11 stored, it can quickly be analyzed with Cloud Storage's integrated partner BigQuery.
01:18 And if your online game or app calls for the use of large assets, you've got
01:22 complete integration with Google Cloud's App Engine.
01:26 Moreover, you can also use cloud storage to serve your static web content.
01:33 There are many paths to accessing Cloud Storage.
01:36 First off, there is the Google Cloud Storage Manager, a web based interface
01:41 that we'll explore later in this chapter. You also have the standard HTTP protocol
01:46 access that supports standard file transfer commands like Get, Put, and Post.
01:52 For programmatic transfers, there's a REST or representational state transfer interface.
01:58 And finally, Cloud Storage has its own command line interface written in Python
02:02 called gsutil. Cloud Storage is highly structured. Like
02:07 other Google Cloud Platform components, it's project based.
02:11 Within projects, the major divisions or containers are called buckets, and what's
02:16 in those buckets? Objects.
02:20 We've looked at projects in earlier chapters on App Engine and BigQuery, so
02:24 let's focus on buckets. A bucket is the main storage container in
02:28 Google Cloud Storage. Buckets provide access level control and
02:32 are project specific. In other words, you can't share buckets
02:35 from one project to the next. Moreover, you cannot nest buckets, they're
02:39 not like Russian dolls. An object, in Cloud Storage parlance,
02:43 refers to the content. Objects are bucket specific and cannot be
02:47 shared between buckets. An object comes in two parts, the object
02:51 itself and the object's metadata, which is a series of name-value pairs that describe
02:57 the object. Objects are verified after upload and, if
03:03 they're readable, are immediately available.
03:05 It works the same way on delete; once removed, they're gone.
03:09 Cloud storage supports browser-authenticated uploading and
03:12 downloading with the ability to pause and resume.
03:16 In case of communication failure or other problems, the uploading and downloading
03:20 will resume automatically. You can also create a user-controlled
03:24 pause/resume feature with the XML API. Google Cloud Storage is a very robust
03:29 service for storing and retrieving the ever-expanding digital assets for your games,
03:35 applications, and sites.
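If you want to try the programmatic routes just described, here is a minimal sketch using the modern google-cloud-storage Python client library (which postdates this course); the project ID is a placeholder, and credentials are assumed to come from your environment:

    # pip install google-cloud-storage
    from google.cloud import storage

    # The client picks up credentials from the environment,
    # for example GOOGLE_APPLICATION_CREDENTIALS or gcloud auth.
    client = storage.Client(project="my-project-id")  # placeholder project ID

    # List every bucket in the project, then every object in each bucket.
    for bucket in client.list_buckets():
        print("Bucket:", bucket.name)
        for blob in bucket.list_blobs():
            print("  Object:", blob.name, blob.size, "bytes")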
03:36
Collapse this transcript
Activating Google Cloud Storage
00:00 Before we can start with the Google Cloud Storage buckets and objects, we'll have to
00:04 activate the service. Just to be clear, if you just want to
00:08 access something stored in Cloud Storage, you don't need to activate it.
00:12 Likewise, if you're trying to read or write to an existing bucket and you are set up
00:17 as a team member on a Cloud Storage project, activation is not required.
00:22 But because we want to play with our buckets and objects, we'll need to
00:25 activate Google Cloud Storage. So, let's get started.
00:28 I'm going to switch to my Google Console page, where I'll click on my project...
00:35 You must have a project already set up as described in the Chapter 1 lesson, Getting Underway.
00:41 The other requirement is to enable billing.
00:43 We did that back in the start of chapter 3, Working with BigQuery.
00:47 Now, depending on the path you've taken with the Google Cloud Platform, you may
00:52 have already activated Google Cloud Storage unknowingly.
00:55 Let's check that by going to the Google API page.
01:01 As of this recording, the self-proclaimed experimental version of the Google APIs
01:06 access page does not show the status of Google Cloud Storage.
01:10 To view that, click the API access page link in the lower left
01:13 to go to the earlier version.
01:18 In the left column, you can now see that Google Cloud Storage is listed, indicating
01:23 that it's already activated. And that's because we worked with App
01:26 Engine and BigQuery first, and had to upload some assets.
01:29 But, let me show you how to activate it if it's not enabled.
01:32 Let's click Services. Once you are on the services page, click
01:37 the Google Cloud Platform tab to just show those services.
01:41 As you can see, Google Cloud Storage is showing as enabled.
01:44 Let's turn that off for a moment. Notice that the listing on the left has
01:49 been removed, now let me enable it again. An alert appears to remind you that you'll
01:55 need to sign up for billing before you can create any buckets.
01:58 It'd be nice if Google could detect that I've already done that, but I'm sure
02:01 that's a bug that will be addressed in short order.
02:03 And they may have done so by the time you see this recording.
02:06 So, you may not see the message that you see on the screen right now.
02:10 If you're planning on working with Google cloud storage JSON API, that's beyond the
02:15 scope for this course, so we'll leave that unselected.
02:19 Now if we go back to our Google Cloud Console page and click on the project and
02:25 then click on Cloud Storage, you can see we're ready to create our
02:29 first bucket, and that's where we'll pick it up in the next lesson.
02:32
Collapse this transcript
Working with buckets
00:00 Buckets are the primary storage container in Google Cloud Storage.
00:04 As covered in the first lesson in the chapter, a bucket is located within a
00:08 project and itself contains the uploaded objects.
00:12 I'll show you all you need to know about Google Cloud Storage buckets in this lesson.
00:17 If you are following along and haven't completed the previous lesson, you need to
00:20 do that before proceeding. I'm logged into my Google Cloud Platform
00:24 project, LyndaCloud001, and I've clicked on Cloud Storage to get
00:30 to the page you see now. As it says, I'm ready to create a bucket.
00:34 Perhaps the hardest thing about using buckets is naming them.
00:37 Buckets must be unique across the entire Google Cloud Storage namespace.
00:42 And because they can appear in a DNS entry as part of a CNAME redirect, they have to
00:48 conform to DNS naming conventions. This means that they have to be between 3
00:54 and 63 characters long and can only use lowercase letters, numbers, and the
01:00 underscore, dash, and period symbols. Also, they must start with a letter or
01:05 number, but they cannot begin with the prefix goog.
01:10 Got that? Okay.
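To make those naming rules concrete, here is a rough, unofficial sanity check in Python; it covers only the rules listed above, not Google's full validation:

    import re

    def looks_like_valid_bucket_name(name: str) -> bool:
        """Rough check of the bucket-naming rules described above."""
        if not 3 <= len(name) <= 63:
            return False
        if name.startswith("goog"):                  # reserved prefix
            return False
        if not re.fullmatch(r"[a-z0-9_.-]+", name):  # allowed characters only
            return False
        return name[0].isalnum()                     # must start with a letter or number

    print(looks_like_valid_bucket_name("lynda_gcp_assets_001"))  # True
    print(looks_like_valid_bucket_name("goog-bucket"))           # False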
01:11 Let's go create our first bucket. I'm going to click on that big red button,
01:16 and in the dialog that pops up I'll enter lynda_gcp_assets_001.
01:26 You won't be able to use the same name, so substitute your own unique variation.
01:32 Google will let you know if it's not unique.
01:37 All right, there is our bucket. Let's try it out.
01:40 With that bucket selected (you can tell it's selected because it's in red and the
01:43 path name is displayed), click Upload > Files.
01:47 When the Files dialogue box appears, navigate to the Exercise Files > Chapter
01:55 Four > 04 03 images. Here, I'm going to select, tpa_bg.jpeg.
02:02 When I click Open, Google will warn you that any files with the same path and
02:08 name will be replaced and ask if we want to proceed.
02:11 I'll click OK. You can see a progress panel appears in
02:15 the lower right. The upload happens very quickly, so we
02:18 already can see it's successful. And the file now appears in my bucket.
02:23 Now, let's try the other option under Upload, folders.
02:27 I'll choose Upload > Folder. Again, we're back at our images folder and
02:32 this time I'm going to choose the Gallery folder.
02:35 Now, let me just expand that for one second to show you that there are four
02:39 files in here, and I could, if I wanted, select all of the files within
02:43 that folder. But by selecting
02:48 the folder, Google will recreate the folder in my bucket and transfer all the files in
02:54 one operation. I will choose Select; again, I get my
02:58 notice, I click OK. Now, you can see in the upload dialogue
03:02 box, the four paintings are already uploaded, and if you look under the
03:06 bucket, you can see that the folder Gallery has been created.
03:10 If I click on it, I will go into the gallery folder and there are the four images.
03:16 Notice that the path has also changed. Of course, you don't always have to upload
03:21 folders; you can create them on the fly. Let's go to the root of our bucket, and I
03:25 can do that by just clicking on the path here, and then I'll click New Folder.
03:29 I'm going to call this one temp, and obviously there are no naming restrictions
03:34 for creating folders. To upload a file to a folder, you have to
03:38 be in that folder. That might seem obvious, but what's not
03:42 obvious is that you're actually using the Google Cloud Storage Manager, which comes
03:48 with full drag and drop capabilities. To demonstrate, I'm going to drill down
03:54 into the Temp folder and then switch to my system file manager finder.
03:59 I've opened up the folder 0403 and exposed the single file within the images folder as well
04:05 as the folder of gallery images. Let's bring in tpa_bg.jpg just by dragging
04:13 it onto the screen. When I dragged it on, I got a plus
04:17 indicator, and when I released the mouse button, the transfer starts.
04:21 It's a relatively small file and the file's already been uploaded.
04:24 Now, let's go up a level. So, how do we get rid of a temp or other
04:29 directory or file? Just select the folder or file's checkbox
04:33 and then click Delete Objects. Google asks for confirmation and when I
04:38 give it a go ahead, the folder and all of its contents are removed.
04:42 You can batch delete objects as well. Let's go into the gallery.
04:46 And, if I wanted to remove these three images, all I'd have to do is select their
04:51 checkboxes, and then click Delete Objects. You may have noticed the bucket
04:55 permissions and object permissions options that become available when those elements
05:00 are checked. These features can be used to control who
05:02 has access to your project's Google Cloud Storage elements.
05:06 Because you have to use both bucket permissions and object permissions
05:10 together, we'll cover them in the next lesson, which deals with objects.
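If you'd rather script the steps from this lesson, a minimal sketch with the modern google-cloud-storage Python client might look like this; the bucket name echoes the lesson, and the local file path is illustrative:

    from google.cloud import storage

    client = storage.Client()  # project and credentials come from your environment

    # Create a bucket; names must be globally unique, so substitute your own.
    bucket = client.create_bucket("lynda_gcp_assets_001")

    # Upload a file; a slash in the object name acts like the folders seen in the console.
    blob = bucket.blob("gallery/tpa_bg.jpg")
    blob.upload_from_filename("exercise_files/04_03/images/tpa_bg.jpg")  # illustrative path

    # Delete an object once it's no longer needed.
    blob.delete()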
05:14
Collapse this transcript
Manipulating objects in buckets
00:00 Objects are what Google Cloud Storage is all about. They're the content that you are
00:05 storing in the project's buckets, as described in the previous lesson.
00:08 Cloud Storage objects are made up of two parts, the actual file and a metadata
00:14 description of that file. Let's take a closer look at what you can
00:17 do with objects. We saw in Working with Buckets, the
00:21 previous lesson, how you could easily upload files to your Cloud Storage project.
00:25 I now have that page open in Chrome, although you could use any browser.
00:30 I've expanded my bucket, lynda_gcp_assets_001 to display the
00:36 objects it already contains. In order to manage an object's metadata,
00:40 you have to select its checkbox. So, I'll do that for my tpa_bg.jpg file.
00:45 Now, the metadata option is active, let's click that to open the dialog.
00:51 As you can see, there are a number of descriptive name value pairs.
00:54 The most recognizable is undoubtedly the content type, which is also known as the
00:59 MIME type. Other descriptions include Content
01:02 Encoding, often used for compressed files, Content Disposition, which notes how the
01:08 user should handle the file, like open or save it.
01:12 The obvious Content Language and Cache Control, which tells browsers how long to
01:17 cache an object. Cache Control, can only be used if the
01:20 object is publicly readable, which we'll cover in just a moment.
01:23 You can also add custom metadata. Let's ensure that this object has a credit
01:28 properly attributed to it. So, in the first open field, I'll put in credit.
01:33 And then enter NASA as my value. If I wanted to add more metadata, I'd
01:38 click the Plus again. The minus sign, as you probably guessed
01:42 removes it. When I'm done, I'll click OK.
01:45 You can read the metadata of an object by using a tool called gsutil, a Python
01:50 command-line application. Detailing the specific coding of gsutil is beyond the
01:55 scope of this course, but you can find out all about it by visiting
01:59 developers.google.com/storage/docs/gsutil.
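As an alternative to gsutil, here is a rough sketch of reading and setting the same kind of custom metadata with the modern google-cloud-storage Python client; the bucket and object names are the ones used in this lesson:

    from google.cloud import storage

    client = storage.Client()
    blob = client.bucket("lynda_gcp_assets_001").blob("tpa_bg.jpg")

    # Read the current metadata (content type plus any custom name-value pairs).
    blob.reload()
    print(blob.content_type, blob.metadata)

    # Add or update a custom metadata entry, then push the change to Cloud Storage.
    blob.metadata = {**(blob.metadata or {}), "credit": "NASA"}
    blob.patch()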
02:07 The other aspect of objects that you can control in Cloud Storage is access.
02:12 It's possible to grant read or write access to a projects bucket and objects.
02:16 Lets say I wanted a particular user to be able to view a file, this tpa_bg.jpg that
02:23 I've uploaded to my Google Cloud Storage project.
02:27 Here's how you do it. First, select the bucket that contains the
02:30 file, the checkbox next to lynda_gcp_assets_001, and then click
02:37 Bucket Permissions. To add an individual to view an object
02:41 within a bucket, I need to add a new permission for a user, and then give that
02:45 user's email address and whatever access level I want.
02:49 So first, I'll change Group to User. And then enter in an email address that I
02:56 know is a Google account user name. It's mine.
02:58 I'll keep the user permissions at reader and then click Add.
03:05 And there's my new entry. Now, I have a particular problem here.
03:09 You'll notice that there doesn't appear to be an OK button anywhere on the screen.
03:13 That's actually positioned a little bit lower to the left, but the dialog box is
03:17 just a tad too big to fit on the screen correctly.
03:20 Even though it's off screen, I can tab down to it.
03:23 Now, the last button that had focus was Add.
03:26 So, I'll press Tab once and then hit Return.
03:29 Now, before we complete this step, let me show you what happens if I try to access
03:34 this file. To access the file you'll need to give
03:37 your user a specific URL to reference that particular file on Google Cloud storage.
03:43 The generic path is https://storage.cloud.google.com/ your
03:52 bucket name/your object name. So now, I'm going to Sign out, open a new
04:01 tab and I'm going to try that URL, https://storage.cloud.google.com/ and then
04:13 my bucket name, which you'll recall is lynda_gcp_assets_001/tpa_bg.jpeg.
04:25 Now, when I hit Enter, Google sees that that image isn't somewhere that
04:33 anyone can read it. So, it asks for an authenticated user, and
04:38 I will sign in. And even when I am signed in, I get an XML
04:44 response of access denied. So, it's time to take the necessary second
04:48 step, so I will go back to the Cloud console page and Sign in.
04:53 I will click on my project name and then Cloud Storage and now let me expand the
04:58 bucket and select the object that I want to make available.
05:03 And now I'll click Object Permissions. Now, I have a similar set of permission
05:07 options for my object as I did for my bucket.
05:09 I have a couple of choices here. I could add the same individual user for
05:14 this object, but what if I want other users to see it as well?
05:17 I'd rather not have to enter their names twice, once for the bucket and once for the object.
05:21 So, I'm going to set Authenticated Users, that is, all users with a Google account, to Reader.
05:28 And again, I will tab down through the various buttons, there is the Add button
05:33 and I click Tab one more time and hit Return.
05:36 Now, time to do that little Sign out dance again.
05:38 So, I'll sign out of here, open up a new tab and put in my URL.
05:44 I don't have to type much before I can take advantage of Google's memory of page history.
05:49 So, I'll just use my arrow key to go down to that and then hit Enter and again,
05:53 Google asks me to Sign in. So, I'll sign in and this time, I see the image.
06:00 You could also use the gsutil command line tool to manage permissions both on the
06:05 bucket and object level. Again, see the documentation for specific
06:09 coding examples.
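If you do turn to scripted access control, a rough equivalent of these console steps with the modern google-cloud-storage Python client is sketched below: grant one user read access on the bucket, then open a single object to all authenticated users. The email address is a placeholder.

    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket("lynda_gcp_assets_001")

    # Bucket permission: give a single user read access.
    bucket.acl.user("someone@example.com").grant_read()  # placeholder address
    bucket.acl.save()

    # Object permission: let all authenticated users read this one object.
    blob = bucket.blob("tpa_bg.jpg")
    blob.acl.all_authenticated().grant_read()
    blob.acl.save()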
06:10
Collapse this transcript
5. Managing Cloud-Based Private Networks
What is the Google Compute Engine?
00:00 Compute, the word, comes from the Latin for "to count."
00:03 And that's essentially what a computer does, it counts.
00:06 Our every day computers, that we have on our desk, in our briefcases, in our back
00:10 pockets and before you know it, in the clothes we wear, can handle an amazing
00:15 amount of counting. But they are nowhere near powerful enough
00:18 to solve the ever-increasing computing needs that programmers, scientists,
00:21 marketers and even artists come up with. And that's where, Compute Engine comes
00:25 into play. Compute Engine is intended to run
00:29 large-scale computing workloads as efficiently, quickly and consistently as possible.
00:35 It uses the same infrastructure as massively used tools like Google search,
00:40 Gmail and Google Ads. Compute Engine works by creating and
00:45 networking virtual machines or VMs, on demand.
00:49 In essence, Compute Engine is an engine that makes computers.
00:53 These VMs distribute the computing power for faster computations.
00:58 Compute Engine can scale up VMs very efficiently not only in terms of their
01:02 number, but also their processing power with the ability to apply 1, 2, 4 or 8
01:09 core processors per VM. As you might suspect when working with a
01:14 number of machines virtual and otherwise networking is critical.
01:18 Compute Engine offers both static and ephemeral external IP addresses.
01:23 Isolation from other computing systems and unauthorized access is robust
01:28 and automatic. The Compute Engine network is protected by
01:31 one or more easily configured firewalls. There are a variety of data storage
01:36 options with Compute Engine. If your data is only needed for the lifetime of the
01:41 virtual machine working with it, you can use an ephemeral disk.
01:44 For longer-term needs, persistent disks with network storage up to 10 TB are available.
01:50 Compute Engine also integrates with Google Cloud Storage, for unlimited storage of
01:55 large scale data and assets. The heart of Compute Engine is the Virtual
02:00 Machine and its ability to create different VM instances.
02:04 Each VM is considered an Instance resource and part of an Instance collection.
02:10 An Instance resource relies on other resources such as a Disk resource for
02:14 storage, a Network resource for connectivity and an Image resource, as in
02:19 disk image not graphics image. As with other Google Cloud platform
02:24 components, the VMs in Compute Engine are project-based.
02:28 Moreover, each resource for a particular VM is assigned a particular scope: global,
02:35 regional, or zonal. Global resources, like a network resource,
02:39 are available from any VM in the project. Regional resources such as a static IP
02:45 address, can only be used by instances in the same region.
02:49 Zones are specific geographic locations and resources can be assigned per-zone
02:54 status, to isolate them as needed. Access to Compute Engine is available via
03:02 a number of routes. There's a Python-based command-line
03:07 interface called gcutil. The Google Compute Engine Console, which
03:07 we'll be using, is a graphical UI, that requires no installation.
03:11 You have access to the Google API Client Libraries and finally, there's a RESTful
03:16 API for representational state transfer methods.
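As a taste of the client-library route, here is a minimal sketch that lists a project's VM instances with the Google API Python client; the project and zone values are placeholders:

    # pip install google-api-python-client google-auth
    from googleapiclient import discovery

    compute = discovery.build("compute", "v1")  # uses application default credentials

    # List the instances in one zone of the project (both values are placeholders).
    result = compute.instances().list(project="my-project-id", zone="us-central1-b").execute()
    for instance in result.get("items", []):
        print(instance["name"], instance["status"])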
03:20 Now that you've had a chance to get your head around Compute Engine, you're ready
03:23 to get started, by launching your very first virtual machine instance.
03:27
Collapse this transcript
Launching an instance
00:00 Before we can create our first virtual machine instance, we'll need to activate
00:04 the Compute Engine. Luckily, if you've already created the
00:07 project, it's a one-step operation. We'll start on the Google Console page for
00:12 our project, and then click Compute Engine.
00:16 This page is the Compute Engine Console, and again, there's a big tempting red
00:20 button just waiting for us, so let's click it.
00:23 I'll click New Instance. First, we need to give our instance a name.
00:27 Once more, the naming convention is particular,
00:30 but this time it's a little different. For instances, you can use only lowercase
00:34 letters, numbers, and hyphens (no underscores), so I am going to call this one lynda-vm-001.
00:45 And I'll enter a brief description. Call this my First VM Instance.
00:52 Now, we can add some tags as further identifiers.
00:55 These need to be single words separated by commas.
00:58 So, I'll add lynda,test. We can also add metadata for machine-readable content.
01:05 I'm going to set myself up as creator, not that I have any sort of complex or anything.
01:10 Next, let me scroll down a tad, in the Location and Resources section, we get to
01:16 choose the geographic zone where you want your instance housed.
01:20 Let me open up the list, so you can see what's available.
01:23 There are two in Europe and three in the US currently.
01:26 I'm going to select the one in the US that has the next maintenance date the furthest
01:31 out with the thought that it's the one that was most recently serviced.
01:36 So, let's go with us-central1-b. Now, you get to decide how much computing
01:41 power you want for your instance. Let me expand the list and show you the options.
01:46 The number after the name tells how many core processors are being used.
01:50 Keep in mind the greater the number of cores you use and the larger the memory,
01:54 the faster your computation will likely run but at a more expensive rate.
01:58 So, you need to balance costs with reward. I also want to point out the least
02:02 powerful, the f1-micro at the top of the list.
02:06 If you have a smaller job, that might still be adequate.
02:09 So, I'm going to go with the default which is n1-standard-1 for now.
02:13 Next, Boot Source determines the disk used to boot the instance.
02:17 You can choose a persistent disk from an existing image, a snapshot, which we don't
02:22 have yet, or an actual disk, something else not yet available.
02:26 There's also an option to use a scratch or ephemeral disk, but that's not recommended.
02:31 As you can see when I expand the options. So, let's go with persistent disk from image.
02:37 And, of course, now we can specify the image to use.
02:40 Well, this sets the OS, operating system, installed.
02:43 You can use any of the ones that you see listed here.
02:46 Of course, it's really not recommended that you use a deprecated version, but you could.
02:50 So, I'm going to go with the default which was debian-7-wheezy.
02:55 We have no additional disk, so we'll skip that one for now.
03:00 Now, we come to the network. Compute Engine creates a network for us
03:03 called default. You can create up to four additional
03:06 networks for isolating instances, or for testing and for production.
03:11 Keep in mind that networks cannot span projects.
03:14 They're project-specific. We'll keep the default network.
03:17 External IP addresses are set up to allow connections from outside the network.
03:22 You can select either an Ephemeral, meaning one that lives and dies with the
03:26 VM, Static, or None. We're going to stick with Ephemeral.
03:31 Finally, there's the Project Access section.
03:34 This helps us connect to other Google Cloud platform components within our project.
03:40 I'm going to leave everything at the default setting for now.
03:42 We can always change it later. Okay.
03:44 We're ready to click Create. As the message says, this could take a few moments.
03:48 And there's our message that it's being created.
03:51 And there's our instance, congratulations, your virtual machine is alive.
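For comparison, here is a rough sketch of creating a similar instance with the Google API Python client. The request body mirrors the console choices above: a name, a machine type, a boot disk from an image, and the default network with an ephemeral external IP. The project, zone, and image are placeholders, and the image family shown is a current one rather than the debian-7-wheezy used in the video.

    from googleapiclient import discovery

    compute = discovery.build("compute", "v1")
    project, zone = "my-project-id", "us-central1-b"  # placeholders

    config = {
        "name": "lynda-vm-001",
        "description": "My first VM instance",
        "machineType": f"zones/{zone}/machineTypes/n1-standard-1",
        "disks": [{
            "boot": True,
            "autoDelete": True,
            "initializeParams": {
                # A maintained image family; the course used debian-7-wheezy, long since deprecated.
                "sourceImage": "projects/debian-cloud/global/images/family/debian-12",
            },
        }],
        "networkInterfaces": [{
            "network": "global/networks/default",
            # An ephemeral external IP, as chosen in the console.
            "accessConfigs": [{"type": "ONE_TO_ONE_NAT", "name": "External NAT"}],
        }],
    }

    operation = compute.instances().insert(project=project, zone=zone, body=config).execute()
    print("Creating instance:", operation["name"])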
03:57
Collapse this transcript
Working with resources
00:00 As mentioned in the first lesson in this chapter, Compute Engine's VM instances are
00:05 considered a resource that uses other resources.
00:08 In this lesson, I'll show you how to work with two instance related resources,
00:12 snapshots and firewalls. A snapshot is a copy of your disk image
00:17 taken at a particular point in time, snapshots make great backups.
00:21 Let me show you how to create a snapshot with the Compute Engine console.
00:24 We are in the Compute Engine console page now having created our first instance in
00:30 the previous lesson. So, let's click Snapshots to begin.
00:34 As you can see, no snapshots have been created yet, so let's create a new one by
00:38 clicking New Snapshot. I'll enter the name of the snapshot first.
00:42 The same naming convention, lowercase letters, numbers and hyphens only, applies.
00:48 I'll call this snapshot-001. In the description, I'll enter the date
00:53 that I took the snapshot. Call this snapshot of persistent disk on 06/4/2013.
01:03 Finally, I'll choose the persistent disk I want to take the snapshot of, and of
01:07 course, this is easy because we only have one.
01:10 Because this is our boot disk, Google automatically prepends the word Boot to
01:16 our VM instance name. So, now I can click Create, and it looks
01:21 like it's going to be created just fine. It will take just a moment and when it's
01:24 done, you'll see a green check mark to the left of the snapshot name.
01:28 Alright, our snapshot resource has been created.
01:31 Now, let's dig a little deeper and create a new firewall.
01:34 As I'm sure you're aware, a firewall controls who has access to your network.
01:39 As you'll soon see the default network we chose when we created the VM instance
01:43 includes two firewalls, but you can add your own with Compute Engine.
01:47 Click on Networks to begin. Once you have your network shown click on
01:52 the entry. This will provide you with more details
01:55 about the network itself including the existing firewalls.
01:59 Let's expand default SSH to see the details on that one.
02:04 I'll scroll down ahead. This firewall allows TCP connections from
02:09 any source to any instance on the network over port 22.
02:13 So, let's open that up a bit, and create a new firewall that allows HTTP connections
02:18 over port 80. I'll click Create New next to Firewalls.
02:23 In the Create a new Firewall dialog, I'll first give it a name, and I'll call this allow-http.
02:30 Now, I'll enter a brief description, Allow HTTP access from port 80.
02:41 Okay. We'll keep the source IP ranges as is, so,
02:45 it can come from any IP address. Under Ports and Protocols, I'll enter in
02:50 tcp:80 (tcp is my protocol, 80 my port). If I wanted to add another option, I'd put
02:56 in a comma and then another protocol:port value.
03:00 Let's leave the optional source tags and target tags as they are and click Create.
03:07 A message indicates that our firewall is being created, and it looks like it was successful.
03:12 And there it is, showing up in our listing under Firewalls.
03:16 Let's expand it just to double check. It looks good, and you can also see that
03:21 there is a Delete This Firewall should you decide you no longer need it.
03:24 So, pretty cool, right? We were able to manage the snapshot and
03:27 firewall resources through the Compute Engine console.
03:30 These and other resources can also be managed through the gcutil tool that I
03:35 mentioned in the initial lesson for this chapter, as well as the RESTful interface.
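If you'd rather define that rule in code than in the console, a sketch along these lines with the Google API Python client should produce an equivalent allow-http firewall; the project name is a placeholder:

    from googleapiclient import discovery

    compute = discovery.build("compute", "v1")

    firewall_body = {
        "name": "allow-http",
        "description": "Allow HTTP access over port 80",
        "network": "global/networks/default",
        "sourceRanges": ["0.0.0.0/0"],  # any source IP, matching the console default
        "allowed": [{"IPProtocol": "tcp", "ports": ["80"]}],
    }

    compute.firewalls().insert(project="my-project-id", body=firewall_body).execute()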
03:40
Collapse this transcript
Managing instances
00:00 The biggest selling point for Compute Engine is its ability to create and work
00:04 with multiple VM instances. In this lesson, we'll look at how you can
00:08 manage those instances. So I'll start from my Google Cloud Console
00:12 from my project LyndaCloud001. And let's click on Compute Engine again.
00:19 And now, since I've created an instance, I see that instance listed and there's also
00:23 a chart showing a couple of different things.
00:26 CPU utilization, Network traffic and Disk traffic.
00:30 As you can see, everything's pretty low key at the moment.
00:33 Now let me click on my instance entry here.
00:37 This will show me details of the instance and give me some editing options.
00:42 I can for example edit the tags, so I'll click Edit next to tag and let me change
00:48 test to external. I'll click Save and there's my new tags.
00:56 We scroll down somewhat. Now if we had an additional disk setup, we
01:03 could go ahead and attach it to this instance.
01:05 I can also change the custom metadata. So I'm going to add a new metadata entry.
01:15 And, I'm going to just call this one Access, and set it to External since we
01:20 added that new firewall. I've clicked Save and now it's updating
01:25 the custom metadata. And, let's scroll on down...
01:32 Now if you do start to work in REST or SSH, the Compute Engine console has very
01:37 handy links to the code that you'll need. If I click on REST here, you'll see the
01:43 original REST response, and this is very useful for working with REST.
01:49 You also have a link to the REST API reference.
01:52 I'm going to click close. Same thing with SSH.
01:56 Finally, I want to point out the Delete Instance button obviously since this is my
02:00 only instance. I don't want to remove it.
02:02 Instead, let's clone it. Cloning an instance is a powerful feature
02:06 that allows you to make a copy of an existing instance and then modify key elements.
02:11 So I'll click Clone This Instance, and I get a new dialog box.
02:14 Compute Engine already labels this instance as a clone, and I'm fine with that.
02:19 So let's keep the name, but I'm going to change the description to Clone of first
02:25 VM instance. I could modify the tags or the metadata,
02:30 but honestly that's pretty inconsequential.
02:32 What I really wanted to show you is that you can create a cloned instance that uses
02:36 the same persistent disk image to boot from.
02:39 But change the processor so you can create your own network of machines with varying capacities.
02:45 So let me scroll down a little bit more to the location and resource section.
02:51 And I'll open up the Machine Type list, and then I'm going to go up to the very top, to f1-micro.
02:57 Compute Engine gives you control over how much processing power your VMs use, so you
03:02 can relegate a smaller job to a clone better-suited for the task.
03:06 Okay, those are all the changes I want to make.
03:08 Let me click Create. And it looks like my clone is underway,
03:15 and there it is. A lot faster than I could call down to my
03:18 IT department, if I had an IT department. The clone is up and running.
03:22 Of course you always have the possibility of selecting any instance and deleting it
03:26 when no longer needed. That's one power you definitely want to
03:29 use judiciously.
03:31
Collapse this transcript
6. Developing Databases in the Sky
Understanding Google Cloud SQL
00:00 In this chapter, we'll take a look at the Google Cloud Platform's two database
00:04 solutions, starting with Google Cloud SQL. Google Cloud SQL is a fully managed
00:11 relational database service for MySQL databases.
00:14 The service takes full advantage of its Google lineage by replicating your data
00:20 across multiple data centers. One of the key advantages of Cloud SQL is
00:26 its use of the widely used MySQL database. You can import your existing databases via
00:32 mysqldump and export them as well. Cloud SQL allows you to create multiple
00:40 database instances, each capable of 100 gigabytes of storage and allotted 16
00:45 gigabytes of RAM. As with most other Google Cloud platform
00:51 components, there are multiple ways to access Cloud SQL.
00:55 You can use either Java or Python, but there's also a Java-based command-line tool,
01:01 as well as a SQL prompt right in the Google APIs Console.
01:04 We'll be using the SQL prompt later in this chapter.
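To give a flavor of the programmatic route, here is a minimal Python sketch that connects to a Cloud SQL instance with the mysql-connector-python driver and runs a trivial query. All of the connection details are placeholders; how you reach the instance depends on where your code runs, for example via a /cloudsql/ socket from App Engine or an authorized IP address elsewhere.

    # pip install mysql-connector-python
    import mysql.connector

    conn = mysql.connector.connect(
        host="203.0.113.10",       # placeholder: an authorized IP for your instance,
                                   # or use unix_socket="/cloudsql/PROJECT:INSTANCE" on App Engine
        user="root",
        password="your-password",  # placeholder
        database="students",
    )

    cursor = conn.cursor()
    cursor.execute("SELECT VERSION()")
    print(cursor.fetchone())
    conn.close()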
01:09 There are a few restrictions to the Cloud SQL implementation.
01:13 Neither user defined functions nor MySQL replication are supported.
01:19 Moreover, certain MySQL statements won't work.
01:23 These include LOAD DATA INFILE, SELECT ... INTO OUTFILE, SELECT ... INTO DUMPFILE, INSTALL PLUGIN, UNINSTALL
01:27 PLUGIN, CREATE FUNCTION, and LOAD_FILE. As you would expect, Cloud SQL is tightly
01:40 integrated with the other Google Cloud Platform components.
01:43 For example, it uses Cloud Storage to handle its imports and exports.
01:47 Cloud SQL can be accessed from either App Engine or Compute Engine, so that's the
01:53 lay of the land for Cloud SQL. Now you are ready to create your first
01:57 database instance.
01:58
Collapse this transcript
Managing instances
00:00 As noted in the intro to Cloud SQL, you can easily create and manage database instances.
00:06 We'll walk through all the necessary steps in this video.
00:08 Like several other of the Google Cloud platform components, Cloud SQL is
00:13 automatically activated, but you will need to enable billing to get started.
00:17 Unfortunately, as of this recording, there is no more free access.
00:20 However, you can opt for a per-use plan which, for most testing, would cost only a
00:26 dollar or two for a month's use. Okay, so, from the Google Cloud Console,
00:31 for my project, let's go down to the bottom of the page, and click on Cloud SQL.
00:39 Now, I am ready to click New Instance, once that screen is up, I only have a few
00:44 fields to complete. First, let's give our database instance a
00:48 name, again, it has to be unique, I am going to enter lynda-db-001.
00:58 As you can see, Google checks as you're typing to make sure that it's unique.
01:02 Okay, I've entered my name. Feel to create your own unique instance name.
01:06 Next, I could choose which region I want my database hosted in.
01:10 Of the two options, let's stick with the U.S.
01:13 Then, I could choose my tier. I'm only going to be doing some minor testing.
01:19 So, I'm going to go with D0, the lowest tier, which is only 2 and a half cents per
01:23 hour, and still allows 128 megabytes of RAM.
01:27 Now, for the billing plan, I can choose either per use or package.
01:31 The least expensive way to go, especially for the testing I'm going to be doing, is
01:34 per use. So, I'll stick with that.
01:37 Let's scroll down to see the other elements, on the new instance dialog.
01:41 The next section deals with backups. Let's leave that to a later lesson.
01:45 And finally, we can pick which App Engine applications we want to work with.
01:48 Our existing app pre-populates the selection, and I'm good with that.
01:52 So, lets click Confirm. And our Cloud SQL instance is created
01:58 pretty quickly. Once it's created, I can click its name to
02:01 see the details and also handle some management chores.
02:07 You can see our current status running right up top.
02:09 If we want to change any settings, all I have to do is click Edit.
02:14 This allows me to change my tier if I should need some more database power, for
02:18 example, as well as my billing plan, backup options and App Engine connections.
02:22 I'm going to hit Cancel to go back because I don't need to make any changes at this time.
02:26 Next to the Edit button are Import and Export controls.
02:29 We'll discuss those in the next lesson. And next to them is the Restart button.
02:34 Occasionally, you may come across a need to reset some configuration on your
02:38 database instance that requires a restart. It's a one-click option here in the Cloud
02:42 SQL Console, so let's give it a shot. It asks for confirmation.
02:47 I'll click OK. You can see by the status that it is restarting.
02:51 And now, we're back to running. Great.
02:54 And finally, there is a delete button for deleting your database instance.
02:58 Of course, you want to be very careful before you delete an instance.
03:01 Not only will it wipe out any data you have stored on the database, but it also
03:05 prevents you from using the same instance name for two months.
03:08 So, be sure before you delete. And that completes our brief tour of Cloud
03:13 SQL instance management. In the next lesson, I'll show you how to
03:16 import and export your data.
03:18
Collapse this transcript
Importing and exporting data
00:00 Cloud SQL works with your existing MySQL data by importing it directly.
00:05 Likewise, you can export the existing Cloud SQL database content to a compatible
00:10 format, which can easily be imported into your local databases.
00:13 The key is Google Cloud Storage. All import and export operations for Cloud
00:18 SQL go through Cloud Storage. As you may recall from chapter four, Cloud
00:23 Storage allows you to upload files of any size, and that includes SQL files with all
00:29 the database structure and content necessary to restore the database.
00:33 Let's go to Cloud Storage from our Google Cloud Console page to bring in a file to import.
00:41 Now, I could use my existing bucket, lynda_gcp_assets_001.
00:46 You'll recall that buckets are containers for my uploaded objects, but let's create
00:51 one just for the data. So, I'll click New Bucket, and then give
00:55 it a unique name. I'm going to call this data_roux_001,
00:59 because we're pulling in data from Roux Academy.
01:03 I'll click Create, and there's our bucket. Now, with that selected, let's Upload the file.
01:09 I have an SQL file called students.sql with 2500 fictitious students from that
01:21 not-quite-real Roux Academy, prepared and waiting for us here in the Chapter 6 > 06
01:27 03 > Assets folder. I'll select my file and then click Open.
01:35 I'll give Google the go ahead and the uploading starts as you can see in the
01:39 lower right. All right, our file is already uploaded.
01:44 Now, I want to take a moment to discuss the file itself.
01:47 This SQL file was output from a MySQL database that has three interrelated
01:52 tables by using phpMyAdmin. You could also use the command-line
01:57 tool mysqldump. Before you import the file, however, make
02:01 sure that it creates the database and includes a use database statement, like I
02:06 have here on lines 20 and 21. You may have to alter it in a text editor.
02:11 I did for this file. Otherwise, you're likely to run into
02:14 import errors with Cloud SQL. Okay, let's head back to the Cloud Storage Console.
02:19 Here, we have our file and we're good to go.
02:21 So now, we can go to Cloud SQL by way of the main console page.
02:25 Now, I can select the database instance I've created in the previous lesson and
02:35 click Import. In the Import Data Dialogue, I'll enter in
02:41 the Cloud Storage path to my SQL file. The syntax here is critical.
02:45 I'll enter gs:// to point to Google Cloud Storage,
02:51 and then the unique name of my bucket, data_roux_001,
02:58 a slash, and the name of my SQL file, students.sql.
03:03 And because I explicitly create a database in my file, I'll leave the Into Database
03:08 field blank. Let's click OK.
03:10 And I'll give it a whirl. Now, you'll see the status Importing Data
03:14 and, after a moment, that message will disappear and status will change to
03:19 Running, but you're not in the clear yet. Click on the Operations Log to see whether
03:27 you've succeeded or if there's an error. Excellent, we're in.
03:27 Now, even though we haven't modified the file, click back to Summary.
03:33 And let's do an Export. In the Export dialogue box, I'll enter the
03:38 path I want the file to export to. I'll use the same bucket, but I'll give it
03:41 a different file name. And because I want to export all the
03:44 databases in the file, I can leave the second field blank.
03:48 So, I'll enter gs://, my bucket name, data_roux_001,
03:53 a slash, and then the file I want to export to; I'm going to call this
03:59 studentsExport.sql. As I said, we'll leave Databases to export
04:05 blank and click OK. Again, there'll be a status message of
04:09 exporting data and when that's done and the status has reverted to running, we can
04:13 check our operations log. Okay, let's click on Operations Log, and,
04:18 again, looks like we've succeeded. Now, let's head over to Cloud Storage to
04:22 verify the actual file is in our bucket. I'll click on data_roux_001, and there are
04:32 two new files, my exported file along with a log entry.
04:35 From here, you could download the file by clicking it and putting it to use locally.
04:39 So, good job, we transferred a SQL file to Cloud Storage.
04:43 Imported it, and then exported it back out to Cloud Storage.
04:47 A good day's work in a few minutes time, if I say so myself.
04:50
Collapse this transcript
Querying the SQL database
00:00 Once you have your database instance populated, how do you interact with it?
00:04 Well, as noted in this chapter's intro, there are a number of ways including Java,
00:08 Python and Command Line. But another way is to use the SQL prompt
00:13 accessible online, and that's what we're going to be working with in this lesson.
00:16 To get to the SQL prompt, let's go to the Cloud SQL console page for our project.
00:23 I'll scroll down to the bottom, and click Cloud SQL.
00:25 Once there, I'll click on my database instance.
00:29 Now, as of this recording, the SQL prompt is maintained on the Google API console.
00:35 There's a link to it on the lower left. Let's click that.
00:38 Again, you'll see a list of your database instances.
00:42 We have just the one. So, let's click on that instance, to get
00:47 more details. There you can see in the chart, the spike
00:50 of our activity. And now, we can also see the SQL prompt
00:54 tab, which I'll click on. Up top there's a field for entering your
00:58 SQL statements. Before we can use that, let's change the
01:01 database list to select our imported database students.
01:07 Now, we can enter any SQL statement. Let's try a simple one first.
01:11 I'll type SELECT *, so I get all the fields, FROM students; the name of my table is
01:17 also students. And now I'll click Execute.
01:21 The SQL Prompt returns the first 200 records in the record set.
01:25 You'll note, as I scroll down, that there is no way to show the rest of the record set.
01:33 So SQL Prompt is best used to quickly test out your SQL Syntax.
01:37 So, let's try out something a bit more challenging.
01:40 I'll scroll back to the top. I've prepared a code snippet for an SQL
01:44 statement that returns the student last name and GPA for all students who have
01:49 studied abroad, with a GPA of 3.0 or better.
01:52 This works with two different tables in a relational format.
01:55 You can find the snippet in the Chapter Six 06-04 assets folder.
02:00 I have it open in my code editor, so I'll switch to that.
02:03 As you can see, it uses backticks and proper MySQL syntax, and it also
02:08 limits the responses to the first 100. Okay, I'll select copy.
02:15 Now, back to my Cloud API's page. I'll remove my previous query, and paste
02:21 in the new one, and then click Execute. And there it is, in real time, very, very
02:27 quickly 114 milliseconds. Now, as I scroll down the records you can
02:32 see that there's a yes in the studied abroad column.
02:36 And, that they all have a GPA over 2.99. It starts at 3.99 and slowly descends.
02:45 Again, this is just the top 100 entries. Okay, so it looks like our database is up
02:52 and running on the relational database manager, Cloud SQL, that's excellent.
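For reference, a query of that general shape might look like the sketch below; the table and column names are guesses for illustration, not the actual snippet from the exercise files, and it reuses the mysql-connector-python connection pattern shown earlier in the chapter.

    import mysql.connector

    conn = mysql.connector.connect(host="203.0.113.10", user="root",
                                   password="your-password", database="students")  # placeholders
    cursor = conn.cursor()

    # Illustrative two-table join: last name and GPA for students who studied abroad,
    # GPA of 3.0 or better, limited to the first 100 rows.
    cursor.execute("""
        SELECT s.`last_name`, s.`gpa`
        FROM `students` AS s
        JOIN `study_abroad` AS a ON a.`student_id` = s.`id`
        WHERE s.`gpa` >= 3.0
        ORDER BY s.`gpa` DESC
        LIMIT 100
    """)
    for last_name, gpa in cursor.fetchall():
        print(last_name, gpa)
    conn.close()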
02:57
Collapse this transcript
Scheduling backups
00:00 Data is the lifeblood of many applications, and it's imperative that you
00:04 keep data safe and protected. Cloud SQL includes a full backup
00:08 capability so that you can easily schedule regular backups.
00:12 Let me show you how to set that up. We'll start in the Cloud SQL console.
00:16 I will click my database instance to see the details.
00:20 You may remember having seen the backup options when looking at the instance
00:24 details earlier. Right now, they're set to the default options.
00:28 So, let's click edit to make a change. I'll scroll down, so those options are
00:35 front and center. Now, from the backup window list, we can
00:39 choose a four-hour window for the backup to occur.
00:44 So, if I expand the list, I'd like to reset it to a pretty inactive time, like 1
00:49 AM to 5 AM. You can tell by the UTC (Coordinated
00:53 Universal Time) notation that it's correctly calculating the time according to where I am, which is California.
00:59 Now, I chose that inactive time because, rather than go with asynchronous
01:05 replication, which can be handled faster but a little less reliably,
01:08 I want to stick with synchronous. It takes a bit longer, but it's more
01:12 reliable, and the database would be unavailable during those backups.
01:16 Click Confirm when you're ready. After the first backup has run, you'll see
01:22 it listed in the backup section here at the bottom.
01:25 It will list the day and time, and it will also have a restore link.
01:29 Clicking that link will immediately restore the database, so be sure that's
01:33 what you want to do before taking that action.
01:36 As you can see, backup is integral to Cloud SQL, as well as being quite
01:40 straightforward and efficient.
01:41
Collapse this transcript
Working with Google Datastore
00:00 Not every database problem requires a relational solution.
00:04 Quite often, as long as the data is indexed properly, it can be served quickly and
00:09 efficiently, and what's more, it can scale to a massive size.
00:12 Google Cloud Datastore was designed to fill these needs.
00:16 Let's take a look at how it works. Cloud Datastore is intended for
00:21 non-relational data, and is schemaless, meaning that there are no pre-defined data columns.
00:27 It supports transactions as well as robust queries and like Cloud SQL it is fully
00:34 managed by Google. Although just recently introduced, Cloud
00:38 Datastore is really the next generation of the Google App Engine High Replication
00:44 Datastore, known as HRD. HRD has truly exceptional performance: over
00:49 one petabyte of data stored (a petabyte is 1,000 terabytes)
00:55 and 4.5 trillion transactions per month, all with 99.95% overall uptime.
01:04 There are several development environments for Cloud Datastore; these
01:09 include Java, Python, and Node.js. The SDK also includes a development server
01:15 for local simulation. There are some key concepts that will help you understand how
01:21 the Cloud Datastore works. The first of these are entities; entities
01:26 are Datastore's primary objects, and each entity can have one or more properties.
01:31 Each property can have one or more values; moreover, properties can be of
01:37 different types, types as in integers, strings, and so forth.
01:43 Each entity is of a particular kind, and a kind is a type of category; the example
01:50 that Google gives is that an employee would be a kind in a human resources app.
01:56 Kinds are used by queries for sorting and filtering.
02:00 Each entity must have a unique identifier, be it a key name string, or an integer.
02:05 Entities can be hierarchical, when you create one, you can designate another
02:10 entity as its parent. This hierarchy is referred to in Data
02:14 Store speak as "ancestor paths". An app retrieves results from the Cloud
02:19 Data Store through a series of queries. Queries can include the entity kind, a
02:25 filter to search entity property values, ID's, or ancestors, and, one or more sort
02:31 orders, for sequencing the results. Queries are aided by a robust set of
02:38 indexes so that they can become quicker and more efficient.
02:41 An index is a table of entities ordered according to their properties, and
02:46 optionally, ancestors. Indexes are automatically updated whenever
02:51 an entity is modified, for efficiency. Google Cloud Datastore can be a
02:55 little difficult to get your head around, particularly because it is quite different
02:59 from a typical database. Hands on experience can help and we'll
03:03 take a look at an example use in the next lesson.
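To make entities, kinds, and properties concrete, here is a minimal sketch using the modern google-cloud-datastore Python client; the kind, property names, and values are all illustrative:

    # pip install google-cloud-datastore
    from google.cloud import datastore

    client = datastore.Client()  # project and credentials come from the environment

    # An entity of kind "DemoUser" with two properties; Datastore assigns the numeric ID.
    key = client.key("DemoUser")
    entity = datastore.Entity(key=key)
    entity.update({"email": "joey@example.com", "nickname": "Joey"})  # placeholder values
    client.put(entity)

    print("Stored entity with key:", entity.key)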
03:06
Collapse this transcript
Running an example project
00:01 I'm going to ask you to think back, to the very first project we did.
00:04 Setting up the sample app photo feed. It probably escaped your notice at the
00:09 time, but that app used Cloud Datastore to save and retrieve the data.
00:15 In this lesson, we'll take a look, at how you can view the result, in Cloud Datastore.
00:20 From the Google Cloud Console page for our project, let's scroll down to the bottom,
00:26 and click Cloud Datastore. This will take you to the Cloud Datastore
00:30 console, where you're first presented with a series of stats.
00:35 From the intro to Cloud Datastore, you'll recognize the keyword kind.
00:41 And if I open up the list of kinds, you can see it says, All kinds and then the
00:46 three kinds already set up, Comment, Demo User, and Photo.
00:52 Let's leave it at All kinds for now for the broadest overview.
00:55 You can see over on the right, there is a graphical depiction of the storage used,
01:00 separated by entity kind. If I scroll over each of these entries,
01:05 you can see the exact values. Below that, are some entity statistics,
01:11 concerning the entity size and number. Then, there's a breakdown by property type.
01:17 Remember that properties are related to entities, including a pie chart for better visualization.
01:24 Now, let's turn to queries. I'll click Query over here on the left.
01:29 You'll recall that there were three form fields in our app.
01:33 Two that were visible, the Comment and the Photo, and one that was hidden: the User.
01:39 We can query each of them. Let's switch to Photo here, in the Kind list.
01:44 And click Run Query. That displays the two photo records stored
01:49 as Blob Objects. If I select the check box next to an
01:53 entry, the delete option appears. I could also click ID to see more of the
01:58 related details. We can also delve into the Indexes.
02:03 I'll click Indexes and you can see the two used in the app, Photo and Comment, and
02:08 how they relate to the other fields. We can also see a breakdown of the number
02:12 of entries and total size. Now, let's head back to the top level of
02:17 Cloud Datastore. As you can tell by that marvelous big red button.
02:23 We are not stuck with the previously created entities, we can create new ones.
02:28 Essentially, a new data point. I'll click Create Entity to start.
02:34 Let me switch the Kind from Comment to Demo User.
02:39 Now, I just need to complete two fields, e-mail and nickname.
02:43 So, I'll enter in a different e-mail address and a nickname.
02:49 Can't get much more of a nickname than Joey.
02:51 I could also add a custom property by clicking add property here.
02:56 But let's keep our examples simple. Now, I'll click Create.
03:00 After the new entity has been created, I'm taken to the query section.
03:04 Let's switch Kind to Demo user, and click Run Query.
03:11 And there's my new entity, complete with the required ID.
03:16 There's a great deal more you can do with Cloud Datastore.
03:19 We've barely scratched the surface. You can dive deeper into the subject by
03:23 investigating Google's robust documentation.
03:27 You'll find a link to it directly on the Google Cloud console page, right under
03:31 Cloud Datastore.
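The Run Query step has a direct code equivalent as well; here is a rough sketch with the google-cloud-datastore Python client that fetches every entity of one kind, much like selecting Demo User and clicking Run Query. The kind name follows the sample app; everything else is illustrative.

    from google.cloud import datastore

    client = datastore.Client()

    # Fetch all entities of the DemoUser kind, similar to Run Query in the console.
    query = client.query(kind="DemoUser")
    for entity in query.fetch():
        print(entity.key.id, dict(entity))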
03:32
Collapse this transcript
Goodbye
Next steps
00:00 Thanks for taking the time to check out Up and Running with Google Cloud Platform.
00:04 If you're shopping for Cloud services, be sure to look at Up and Running with Amazon
00:09 Web Services with John Peck, right here on lynda.com.
00:12 Thanks again for watching, and have a great day.
00:14
Collapse this transcript

