The advantages of a static site are many, but the most salient are extremely fast load times, strong security, and long-term stability.
Unlike dynamic sites, static sites use no database or server-side scripting language. That means their pages are served to website visitors quickly and efficiently. A dynamic site has to do a fair amount of processing to serve even a simple page: a scripting language queries a database and builds the page as it is requested. Caching helps reduce this per-request work, but static sites are generally much faster and score well on search engine performance metrics. This means happy visitors and higher search engine rankings.
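The difference can be sketched in a few lines of Python. This is an illustration only, not any particular framework: the table schema, file name, and markup are placeholders. The point is what happens per request: the static model does a single file read, while the dynamic model runs a query and assembles HTML every time.

```python
import sqlite3
import tempfile
from pathlib import Path

# --- Static model: the page already exists as a file on disk. ---
def serve_static(path: Path) -> str:
    # One filesystem read; nothing is computed at request time.
    return path.read_text()

# --- Dynamic model: the page is assembled on every request. ---
def serve_dynamic(conn: sqlite3.Connection, page_id: int) -> str:
    # A database query plus template assembly happen per request.
    title, body = conn.execute(
        "SELECT title, body FROM pages WHERE id = ?", (page_id,)
    ).fetchone()
    return f"<h1>{title}</h1><p>{body}</p>"

# Demo: both models produce the same page; only the work per request differs.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (id INTEGER PRIMARY KEY, title TEXT, body TEXT)")
conn.execute("INSERT INTO pages VALUES (1, 'Hello', 'A simple page.')")

static_file = Path(tempfile.mkdtemp()) / "hello.html"
# The "build step" runs once, ahead of time, instead of on every request.
static_file.write_text(serve_dynamic(conn, 1))

assert serve_static(static_file) == "<h1>Hello</h1><p>A simple page.</p>"
```

A static site generator effectively runs the "dynamic" step once at build time and publishes the results, so visitors only ever hit the cheap path.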
The technology behind static sites is relatively simple. That means fewer moving parts and fewer points of failure. While complexity is called for in the case of elaborate web applications, it is completely unnecessary for straightforward content. If not carefully maintained and vigilantly monitored, dynamic sites are susceptible to all manner of breakage: errors, slowness, and security warnings, to name a few. Such problems may crop up without your even knowing about them because dynamic sites are, well, dynamic. Static sites change only when you’re ready to change them.
The content and design of a static site can be changed at any time, but nothing has to change. Dynamic sites run on a “stack” of sophisticated software that must be kept up to date as new versions are released. These are the same kinds of updates required on your own computer: updates to the operating system or your web browser, for example. Such updates can’t be ignored indefinitely because security is at stake. The simplicity of static sites makes such requirements unnecessary; they can continue operating indefinitely without a thought given to software maintenance.
Most common website hacking vectors involve vulnerabilities in databases and scripting languages. Because static sites use neither, there’s almost nothing to hack. Even if a static site’s files are tampered with, a clean version of the site can easily be regenerated with no “back doors” or other malicious residue. The code that generates the site is not accessible through the web server and remains out of reach of everyone but the developer.
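That regeneration step can be sketched as follows. This is a toy stand-in for a real generator, under obvious assumptions: the `source`/`public` directory names and the trivial text-to-HTML conversion are illustrative. What matters is that a rebuild wipes the published output and recreates it entirely from source, erasing any tampering along the way.

```python
import shutil
import tempfile
from pathlib import Path

def rebuild(source_dir: Path, output_dir: Path) -> None:
    # Delete the published output entirely, then regenerate it from source.
    # Any tampering with the published files is erased in the process.
    if output_dir.exists():
        shutil.rmtree(output_dir)
    output_dir.mkdir(parents=True)
    for page in source_dir.glob("*.txt"):
        html = f"<html><body><pre>{page.read_text()}</pre></body></html>"
        (output_dir / f"{page.stem}.html").write_text(html)

# Demo: build, simulate tampering, then rebuild a clean copy.
root = Path(tempfile.mkdtemp())
src, out = root / "source", root / "public"
src.mkdir()
(src / "index.txt").write_text("Welcome.")

rebuild(src, out)
(out / "index.html").write_text("<script>evil()</script>")  # simulated tampering
rebuild(src, out)  # one command restores a clean copy from source
assert "evil" not in (out / "index.html").read_text()
```

Because only the generated output is exposed to the web server, the source files that drive the rebuild never leave the developer's control.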