In this video, Jeff Winesett introduces Amazon's CDN solution, CloudFront. CloudFront optimizes both static and dynamic website content delivery by caching content in edge locations and optimizing the request routing to origin servers.
Static assets are often stored on the web server itself, so every request for them must be served by that server. User requests are routed across the internet, potentially traversing many DNS lookups and network hops, and every one must resolve to the web server to return the content. This results in high latency for users who are geographically distant from the web server, and it places heavy load on the server, which has to respond to every request.
CloudFront offers another option. It can cache an application's most popular static content at edge locations distributed around the world. Keeping the data in cache ensures quick accessibility, which greatly improves performance, and it prevents every request from having to hit the origin server. CloudFront can also accelerate the delivery of dynamic content. Without it, users' requests for data are at the mercy of general internet routing rules, which are often not optimized.
CloudFront can be used to proxy requests for dynamic content, routing them back to your origin web server over paths optimized for performance. CloudFront optimizes the connections between its edge locations and your web server to avoid internet bottlenecks and to fully utilize the bandwidth between the end user and the edge location. At the highest level, CloudFront has two primary components: a distribution, which specifies what content to deliver along with configuration rules about how to deliver it, and the edge locations where that distribution is served, a geographically distributed network of caches positioned to best serve the content.
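At the API level, a distribution is essentially a configuration object that names the origin and the delivery rules. The sketch below builds a minimal distribution config in the shape expected by boto3's `create_distribution` call; the origin domain, IDs, and caller reference are all placeholder values, and the actual AWS call is shown only in a comment since it requires credentials and creates real resources.

```python
# Minimal CloudFront distribution config (all names/IDs are placeholders).
# This dict follows the DistributionConfig shape that
# boto3.client("cloudfront").create_distribution(...) expects.
distribution_config = {
    "CallerReference": "example-ref-001",  # any unique string per request
    "Comment": "Cache static assets from an example origin",
    "Enabled": True,
    "Origins": {
        "Quantity": 1,
        "Items": [{
            "Id": "my-web-origin",            # hypothetical origin ID
            "DomainName": "www.example.com",  # your origin web server
            "CustomOriginConfig": {
                "HTTPPort": 80,
                "HTTPSPort": 443,
                "OriginProtocolPolicy": "https-only",
            },
        }],
    },
    "DefaultCacheBehavior": {
        "TargetOriginId": "my-web-origin",       # must match an origin Id
        "ViewerProtocolPolicy": "redirect-to-https",
        "MinTTL": 0,  # let origin Cache-Control headers drive caching
        "ForwardedValues": {
            "QueryString": False,
            "Cookies": {"Forward": "none"},
        },
    },
}

# With AWS credentials configured, the distribution would be created with:
# import boto3
# boto3.client("cloudfront").create_distribution(
#     DistributionConfig=distribution_config)
print(distribution_config["Origins"]["Items"][0]["DomainName"])
```

The cache behavior is where the "how to deliver" rules live: protocol policy, TTLs, and what parts of the request (query strings, cookies) are forwarded to the origin and factored into the cache key.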
It's these edge locations that are key to the performance CloudFront provides. Because they are globally distributed, your content can reside as close as possible to your users. CloudFront uses latency-based routing to ensure the fastest delivery of content: when a request comes in, it is routed to the network node with the lowest latency, which may or may not be the one geographically closest to the end user.
To further improve performance, CloudFront has also introduced regional edge caches: regional caches around the world that provide an additional caching tier. On a cache miss, rather than going straight back to the origin server, an edge location first tries the regional cache, and only hits the origin if the data is not found there either. This further reduces load on the origin and increases the total cache size compared with having edge locations alone.
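The two-tier lookup described above can be illustrated with a toy sketch. This is not the AWS implementation, just a stand-in showing the edge → regional → origin fallback order and how each tier is populated on the way back:

```python
# Illustrative two-tier cache lookup mirroring CloudFront's
# edge -> regional edge cache -> origin flow. Plain dicts stand in
# for the real caches; this is a conceptual sketch only.
def fetch(key, edge_cache, regional_cache, origin):
    if key in edge_cache:                       # hit at the edge location
        return edge_cache[key], "edge"
    if key in regional_cache:                   # hit at the regional tier
        edge_cache[key] = regional_cache[key]   # populate the edge cache
        return edge_cache[key], "regional"
    value = origin[key]                         # last resort: fetch from origin
    regional_cache[key] = value                 # populate both cache tiers
    edge_cache[key] = value
    return value, "origin"

origin = {"/logo.png": b"<image bytes>"}
edge, regional = {}, {}
print(fetch("/logo.png", edge, regional, origin)[1])  # first request hits origin
print(fetch("/logo.png", edge, regional, origin)[1])  # repeat request hits edge
```

The point of the middle tier: even after an object is evicted from one edge location, a nearby edge can still find it regionally instead of going all the way back to the origin.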
Optimize for performance, rule number two: take advantage of Amazon CloudFront. It speeds up the delivery of web and mobile application content, and it helps an application scale while reducing the load on application servers, enabling more throughput with less server infrastructure.
- Benefits of cloud services
- Making architectures scalable
- Examining cloud constraints
- Virtual servers, EC2, and Elastic IP
- Using the Amazon machine image
- Elastic load balancing
- Using CloudWatch for monitoring
- Security models
- Elastic block storage
- S3, CloudFront, and Elastic Beanstalk
- Handling queues, workflows, and notifications
- Caching options and services
- Identity and access management
- Creating a custom server image
- Application deployment strategies
- Serverless architectures