Join Kevin Skoglund for an in-depth discussion in this video Denial of service, part of Programming Foundations: Web Security.
- A denial-of-service attack denies authorized users access to a server, service, or resource to which they would normally expect to have access. In other words, it is an attempt to prevent legitimate users from using a service. The underlying service may remain unaffected, but it is no longer available. Remember, data security has three goals: confidentiality, integrity, and availability. Denial of service is an attack on data availability.

Denial-of-service attacks usually use flooding or crashing to make data unavailable. Flooding is when a system is overwhelmed with too many requests. It could be too many requests to a web server, it could be using up a limited number of connections to a database, or it could be sending so many data packets to a router that legitimate traffic slows to a crawl. Crashing is when software or hardware crashes and stops operating. For example, an attacker could exploit a vulnerability to trigger a memory buffer overflow and use up all of a server's memory.

The motives for denial-of-service attacks vary, but most fall into three basic categories: revenge, extortion, and activism.

It's uncommon for a denial-of-service attack to be performed by a single computer. Servers and routers are robust and designed to handle a lot of traffic; it takes more to overwhelm them than a single computer can muster. Distributed denial-of-service (DDoS) attacks use hundreds or even thousands of computers working together. They could be controlled by humans, such as activists working together, but more often they are part of a zombie botnet. A botnet is a collection of computers or other Internet-enabled devices infected with malware. An attacker can send commands to the botnet and launch hundreds of simultaneous attacks. A famous recent example is a botnet built out of Internet of Things devices such as closed-circuit cameras, home routers, and baby monitors. These attacks are powerful enough to overwhelm a network or a server and may last for days.
In 2018, GitHub was hit with a record 1.35 terabits per second of incoming traffic. That's 127 million packets of data per second. The longest distributed denial-of-service attack was in 2018 and lasted for 297 hours. That's more than 12 days.

Preventing denial-of-service attacks is challenging. How can an average web developer be expected to design a system that will handle millions of packets of data per second? The first question you should ask is: is denial of service in my threat model? If your website is likely to be the target of retaliation, activism, or extortion attempts, then DDoS attacks should be in your threat model. For most developers it will not be. The odds of a regular website being hit by a DDoS attack are very low.

If it is in your threat model, then the best advice is to outsource the problem to experts. There are many companies which offer DDoS mitigation services. If you sign up with one, all incoming traffic will be routed through their extremely robust servers before it's sent to your servers. They have the hardware and expertise to handle millions of data packets per second. Cloudflare, Microsoft Azure, AWS Shield, and Akamai are some of the largest providers. Project Shield offers free DDoS mitigation to websites that have media, elections, political, and human-rights content. These are frequently targeted sites which typically cannot afford expensive solutions.

If you decide not to use a DDoS mitigation service and suddenly see a spike in your traffic, first check whether it might be legitimate traffic. A low-traffic website can get a big boost if it gets mentioned on television or in popular culture. Increasing available RAM, deploying additional servers behind a load balancer, or talking with your ISP may be enough to get you through the rush period. If it is malicious traffic, then your remaining choices are throttling, filtering, sinkholing, or blackholing the traffic. Throttling is regulating the flow of input to keep it below a maximum level.
It's also called rate limiting, and it slows down malicious activity. Filtering applies a set of rules to incoming traffic: requests of certain types or with certain characteristics are allowed, and all others are quickly rejected. Sinkholing redirects the traffic to a new destination, usually a server which can capture and analyze the traffic. Blackholing is a more extreme version of sinkholing: all traffic is rerouted to nowhere, also known as the null route. These are all actions that can be performed by firewalls. Hardware firewalls are more robust, but software firewalls are a less expensive option. Internet service providers, or ISPs, may also be able to help. If a website gets hit by a DDoS attack, the ISP may start blackholing the traffic immediately to avoid harming other customers who share hardware or infrastructure with the targeted server. A combination of relying on experts and these techniques will help to mitigate the potential damage from denial-of-service attacks.
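To make the throttling idea concrete, here is a minimal sketch of a token-bucket rate limiter, one common way to implement rate limiting. The class name, rate, and capacity values are illustrative assumptions, not anything from the course; real deployments would do this at the firewall, load balancer, or web-server layer rather than in application code.

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter (a common throttling technique).

    Hypothetical sketch: names and numbers are illustrative only.
    """

    def __init__(self, rate, capacity):
        self.rate = rate            # tokens added per second
        self.capacity = capacity    # maximum burst size
        self.tokens = capacity      # start with a full bucket
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill tokens based on elapsed time, capped at bucket capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True             # request is within the allowed rate
        return False                # over the limit: reject, delay, or queue

# Allow bursts of up to 5 requests, refilling at 2 requests per second.
limiter = TokenBucket(rate=2, capacity=5)
results = [limiter.allow() for _ in range(8)]
print(results)  # first 5 allowed (the burst), the rest rejected
```

Filtering works similarly in spirit: each incoming request is checked against a set of rules, but the decision is based on the request's characteristics rather than its rate.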