Learn how to create and configure an S3 bucket.
- [Instructor] Let's start by creating an S3 bucket. I've recently visited S3, so I see the icon here under Recently visited services. If you don't have that, you can either expand All services and find it in the list below under Storage, or you can simply type in S3, and the drop-down will reveal it. Going into the S3 console, you can see that we have a number of options here, primarily Create bucket. I want to point out the Global designation up here in the top right corner. Normally in AWS services, you select a region in which you are going to create your resources.
That is still the case in S3. S3 buckets are tied to a region, and it's something you want to consider when you create your buckets. However, it says Global because this interface makes no distinction about which region your buckets are in. Once you have a few, even if they're across different regions, there will be one place to come and view them, right here. Let's go ahead and create the bucket by clicking Create bucket. Now, we're going to have a primary bucket and a secondary bucket for logging. The name of the bucket needs to be in the format that would work as a URL because this bucket name will be embedded into URLs for the objects that are within it.
We'll call this demo-primary. Now, surprisingly, it seems this name is not already taken, but just to be sure, we're going to tack on my initials here, and you'll want to do the same, because remember, we have a global namespace for S3 bucket names, and we want to make sure that the names we choose are unique. We don't need to copy any settings from an existing bucket. Here you can choose the region, like I mentioned. You want to pick the region closest to where the workload that will be accessing this data is going to run.
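Because bucket names land in URLs and share one global namespace, S3 enforces DNS-compatible naming rules. Here is a minimal sketch of a few of those rules in Python; the name demo-primary-br is a hypothetical example with initials tacked on, and this checks only a subset of the full rule set.

```python
import re

def is_valid_bucket_name(name: str) -> bool:
    """Check a few of S3's bucket-naming rules: 3-63 characters,
    lowercase letters, digits, dots, and hyphens only, beginning and
    ending with a letter or digit, and not formatted like an IP address."""
    if not 3 <= len(name) <= 63:
        return False
    if not re.fullmatch(r"[a-z0-9][a-z0-9.-]*[a-z0-9]", name):
        return False
    if re.fullmatch(r"(\d{1,3}\.){3}\d{1,3}", name):  # looks like an IP
        return False
    return True

print(is_valid_bucket_name("demo-primary-br"))  # True
print(is_valid_bucket_name("Demo_Primary"))     # False: uppercase and underscore
```

Names that fail these checks are rejected by the console at creation time, so it pays to settle on a lowercase, hyphenated convention up front.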
Right now for purposes of this demo, US West (Oregon) is fine, that's close to where I am, so I'm going to click Next. Now you see we have a few options on the next screen. Versioning allows us to keep a history of the changes we make to objects in the S3 bucket. Let's go ahead and enable that so we can look at it later. Next, logging. Now, I do want to enable this, but I won't be able to do it right now because the first thing I have to choose is a target bucket to receive the logs. I could log into the very bucket we're creating right now, but I'd rather not.
Instead, we'll create another bucket, and we'll come back in a minute and enable that option. Here under Tags, we can set tags for the entire S3 bucket. Don't confuse this with tags for the individual objects that are inside the bucket. Rather, these tags will be applied to the entire bucket. So we could give this bucket a Creator tag, and I could put my name here. So that could help me later to understand who created which buckets in my account. I'll click Save, and we've got one tag. Click Next. Now you see the permissions that have been applied to this bucket.
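The versioning and tagging choices we just made in the wizard map directly to S3 API payloads. Below is a sketch of those payloads built locally in Python, with the boto3 calls shown as comments; the bucket name demo-primary-br and the Creator value are placeholders, and the actual calls would require configured AWS credentials.

```python
# Payloads matching what boto3's put_bucket_versioning and
# put_bucket_tagging expect; built locally here with no AWS call.
versioning = {"Status": "Enabled"}
tagging = {"TagSet": [{"Key": "Creator", "Value": "Brandon"}]}  # placeholder value

# With credentials configured, the equivalent API calls would look like:
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_versioning(Bucket="demo-primary-br",
#                          VersioningConfiguration=versioning)
# s3.put_bucket_tagging(Bucket="demo-primary-br", Tagging=tagging)

print(tagging["TagSet"][0]["Key"])  # Creator
```

Note that bucket tags replace the whole tag set on each call, so a later put_bucket_tagging must include every tag you want to keep.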
The owner of this account automatically gets read and write privileges over the whole bucket. If I had Identity and Access Management (IAM) users created within my AWS account, this would be the place where I could give those users individual permissions. You can also see that we have the option to set access for another AWS account, so if your organization has adopted a multi-account strategy for its AWS resources, this would be a great way to grant permissions on the bucket across those accounts. By the way, sharing across accounts works similarly elsewhere in AWS; for example, if you want to share an AMI image of an EC2 instance, you can share it across accounts in the same way.
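Cross-account access like this is often expressed as a bucket policy. Here is a hypothetical sketch of such a policy as a Python dict; the account ID 111122223333 and the bucket name demo-primary-br are placeholders, and a real policy would be tailored to the actions your other account actually needs.

```python
import json

# A sketch of a bucket policy granting a second AWS account read access.
# Account ID and bucket name below are placeholders, not real resources.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "CrossAccountRead",
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::111122223333:root"},
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::demo-primary-br",    # ListBucket applies here
            "arn:aws:s3:::demo-primary-br/*",  # GetObject applies to objects
        ],
    }],
}
print(json.dumps(policy, indent=2))
```

The bucket ARN and the objects ARN (with /*) are both listed because ListBucket operates on the bucket itself while GetObject operates on the objects inside it.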
We don't want to set anything public, so we'll just go ahead and click Next. Here you see the summary of the options that we just chose, and we'll go ahead and click Create bucket. Now we have our demo-primary bucket, and we'll create another bucket to be our logging bucket. Demo-logging and my initials. We'll keep it in US West (Oregon). We'll select all the same options, except we're not going to have logging, and we're not going to have versioning. We'll go ahead and click Next, Next again, and create the bucket. Now that we have the logging bucket, we can go back to the primary bucket and set its options.
Up here under Properties, you'll see some of the same things that we saw in the Creation Wizard, including Logging, and we'll click into there. Choose Enable logging and look at the drop-down. We'll choose the logging bucket that we just created, and we have the option here to set a target prefix. So what you might want to do in production is have one S3 bucket that is the target for logging from a number of other S3 buckets. Prefix in S3 refers to just the first part of the key name of objects, but remember that S3 will take the prefix of an object if it has a slash and treat that like a folder.
So what I think I'd like to do with this logging prefix is name it after the source. So we called it demo-primary, so let's say that the target prefix for these logs is going to be demo-primary-log/ with a forward slash. That way when the primary bucket does logging into the target bucket, it will do so in what appears to be a folder named demo-primary-logs/, I'll put an S there. Now hit Save. Now that we have our buckets set up, let's look at adding some data to the buckets and seeing how that logging works out.
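The logging setup above also maps to a single API payload, and the prefix-as-folder behavior is easy to see by building an example log key. This sketch constructs the payload locally with no AWS call; the bucket names carry placeholder initials, and the timestamp portion of the key is a made-up example of S3's log object naming.

```python
# The payload matching what boto3's put_bucket_logging expects,
# built locally (bucket names below are placeholders).
logging_status = {
    "LoggingEnabled": {
        "TargetBucket": "demo-logging-br",
        "TargetPrefix": "demo-primary-logs/",
    }
}

# S3 delivers log objects with keys of the form TargetPrefix + timestamp
# + unique id, so the console shows them under an apparent folder.
example_key = (logging_status["LoggingEnabled"]["TargetPrefix"]
               + "2024-01-15-22-10-00-ABCD1234")  # hypothetical log key
folder = example_key.split("/")[0]
print(folder)  # demo-primary-logs
```

Using a distinct prefix per source bucket is what lets one logging bucket cleanly receive logs from many buckets, each appearing in its own "folder."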
Join AWS architect Brandon Rich and learn how to configure object storage solutions and lifecycle management in Simple Storage Service (S3), a web service offered by AWS, and migrate, back up, and replicate relational data in RDS. Find out how to leverage flexible network storage with Elastic File System (EFS), and use the new AWS Glue service to move and transform data. Plus, learn how Snowball can help you transfer truckloads of data in and out of the cloud.
- What is data management?
- AWS S3 basics
- S3 bucket creation
- S3 upload and logging
- S3 event notifications
- S3 data lifecycle configuration
- Working with Amazon Elastic Block Store volumes
- Creating and mounting an EFS
- Creating an AWS RDS instance
- RDS backup and recovery
- Moving data with AWS Database Migration Service
- Moving data with Data Pipeline and Glue