Is Your Varnish A Ticking Security Vulnerability?

Varnish is an HTTP accelerator used by many content-based applications. It is used by small and large companies, enterprises, and content sites such as Pinterest, Udemy, Twitch, and the New York Times. But what exactly is an HTTP accelerator? What are its vulnerabilities? And is your current Varnish installation a ticking security vulnerability?

What is an HTTP Accelerator?

An HTTP accelerator, also known as a web accelerator, is a proxy server aimed at reducing the time it takes to access content on a website. While some web accelerators also handle protocols such as FTP and SMTP, Varnish is designed purely as an HTTP accelerator. Varnish runs as a separate instance, isolated from the application itself, and is often used by content-heavy sites to reduce database query loads and increase the responsiveness of a site’s content.

A common technique used by an HTTP accelerator is caching, where frequently accessed documents are ‘saved’ on the proxy and sent to the client without actually requiring a call to the source. This often reduces the latency experienced by the client, giving a sense of responsiveness and speed.

Caching works particularly well when the content is mostly static, that is, not expected to change often. This is why it is often employed by content sites, where the expectation is that the content remains the same after its publication. If a change does occur, depending on the HTTP accelerator’s settings, a manual clearing of the cache may be required before the updated content is served. If not, HTTP accelerators tend to work on an ‘eventually up to date’ policy.
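As a rough sketch, one common way to clear cached content manually is the ban facility exposed through the varnishadm CLI; the paths below are purely illustrative:

# Invalidate every cached object under a hypothetical /articles/ path
sudo varnishadm ban req.url '~' '^/articles/'
# Or invalidate the entire cache (use sparingly, since the backend must repopulate it)
sudo varnishadm ban req.url '~' '.'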

Varnish Encryption, Throttling, and Authentication

Content Delivery Networks, or CDNs, use caching as one of their backbone techniques to prioritize speed under the large volumes of traffic they are expected to handle. Varnish is the first HTTP gate that a user’s request goes through, which means it is exposed to various types of attacks and exploitation. Varnish comes with multiple tools to mitigate these problems, including encryption, request inspection, throttling, authentication, and authorization logic.

Varnish supports encryption through HTTPS, where data passed between the server and the user is encrypted. SSL/TLS is handled by Hitch, an event-driven TLS proxy that sits in front of Varnish and can handle around 3,000 TLS protocol negotiations per second.
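To make the moving parts concrete, a minimal sketch of such a setup is shown below, where Hitch terminates TLS on port 443 and hands decrypted traffic to Varnish over the PROXY protocol; the ports, paths, and certificate file are illustrative assumptions only:

# Varnish serves plain HTTP on port 80 and accepts PROXY-protocol traffic from Hitch on 8443
sudo varnishd -a :80 -a 127.0.0.1:8443,PROXY -f /etc/varnish/default.vcl
# Hitch terminates TLS on 443 and forwards decrypted requests to Varnish
sudo hitch --frontend='[*]:443' --backend='[127.0.0.1]:8443' --write-proxy-v2 /etc/hitch/example.pem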

The significance of native SSL/TLS support in Varnish is one less third-party module to maintain. The built-in functionality allows network architects to enable a caching system that is tightly integrated with encryption, without extra modifications to make it work seamlessly with Varnish.

Varnish uses an AES-256 encryption key system that ties cached objects to unique request fingerprints, so a request only ever returns the intended cache object and nothing else.

Varnish Security Vulnerabilities

Due to the nature and volume of content availability requirements, the biggest security threats Varnish faces are brute-force and DoS (Denial of Service) attacks. Although these attacks are common and often mitigated quickly by Varnish updates, brute-force and DDoS attacks have evolved over time in response.

The latest reported Varnish vulnerability involves a combination of Varnish Cache and varnish-modules installed on the same proxy instance, which is a common setup on Varnish servers. While Varnish Cache on its own does not pose a security risk, the combination of Varnish Cache and varnish-modules can trigger an assertion failure, or the infamous NULL pointer dereference, in Varnish Cache. This is triggered through the header.append() and header.copy() functions of varnish-modules. Because varnishd is typically configured to automatically restart its worker process after such a failure, the cache comes back empty, which can reduce overall availability and performance. During this downtime, backend resources may experience higher loads and spikes, which can lead to a domino effect of network failures.

For instances running older Varnish Cache versions, specifically 6.0.x prior to 6.0.4 LTS, and 6.1.x and 6.2.x prior to 6.2.1, specially crafted HTTP/1 requests from a malicious user can trigger an automatic restart with a clean cache. When this occurs, Varnish needs to communicate with the backend again in order to repopulate its cache with content. This type of DoS is a trigger attack that forces a backend network to essentially attack itself over time, overloading its own systems through high volumes of requests.

Updating and Patching Varnish

The first step to securing your Varnish proxy server is to upgrade it. Varnish’s major vulnerability lies in its restart behavior. While there may be legitimate reasons to restart a Varnish server, such as to enforce a content or application update, restarts triggered by malicious users can result in your backend being overloaded.

The easiest way to prevent exploitation from occurring is to ensure that your Varnish is always up to date. To upgrade your Varnish cache to the latest version, you can run the following commands on your instances.

For Linux:
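As a rough sketch for RPM-based distributions such as CentOS or RHEL, assuming packages are pulled from the Varnish project’s packagecloud repository:

# Add the official Varnish Cache 6.0 LTS repository, then upgrade the package
curl -s https://packagecloud.io/install/repositories/varnishcache/varnish60lts/script.rpm.sh | sudo bash
sudo yum update varnish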

For Ubuntu 14.04 LTS:
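A comparable sketch for apt-based systems, assuming the packagecloud repository still publishes packages for your Ubuntu release:

# Add the official Varnish Cache 6.0 LTS apt repository, then upgrade the package
curl -s https://packagecloud.io/install/repositories/varnishcache/varnish60lts/script.deb.sh | sudo bash
sudo apt-get update
sudo apt-get install --only-upgrade varnish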

This will upgrade your Varnish version to 6+.

Version 6+ was originally released in March 2018. This means that if you are still running version 4 (released 2014) or 5 (released 2016), your Varnish is highly susceptible to DoS attacks and various restart methods that have been discovered and exploited by malicious users for nearly a decade.
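If you are unsure which release an instance is running, the varnishd binary can report its own version:

# Print the installed Varnish version and build information
varnishd -V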

Older versions of Varnish, that is, anything at version 5 or below, tend to be vulnerable to memory leaks and sensitive information exposure. This is due to older implementations of SSL/TLS methodologies that are now largely obsolete. It is also easy to crash an older Varnish server: a mishandled if statement in the varnishd source code triggers an assert related to an integer overflow, which can cause the varnishd worker process to abort and restart, losing the cached contents in the process.

Conclusion: Where To From Here?

By design, Varnish only focuses on HTTP acceleration, which means that the attack surface is limited to HTTP-based requests. Your other protocols and ports are not exposed through Varnish.

But what about the pesky restart issue? How do you prevent it from occurring in general? Apart from keeping your Varnish updated to the latest version, the second place to assess this vulnerability is your VCL (Varnish Configuration Language) file. Understanding how your VCL file is structured and configured is beyond the scope of this piece, but it is a good area to explore further if you want to deepen your understanding of how Varnish works.

While there are instances where a restart is necessary, when it comes to updating the VCL file for your Varnish proxy server, it is possible to load it without restarting Varnish itself. This means that your new VCL can be put into effect without losing the cache. Ways to do this include init.d scripts, the varnishreload helper script, or a custom script, as sketched below.
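For example, varnishadm can compile, load, and activate a new VCL while the running cache stays warm. A minimal sketch, where the label newconfig and the file path are arbitrary placeholders:

# Compile and load the updated VCL under a new label
sudo varnishadm vcl.load newconfig /etc/varnish/default.vcl
# Switch traffic to the new configuration without restarting varnishd
sudo varnishadm vcl.use newconfig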

Overall, Varnish is generally secure. The main issue Varnish experiences is when it is forced to restart unnecessarily. Overwhelming a Varnish proxy server takes more than sheer effort; attackers instead exploit restart-based vulnerabilities to instigate network failures. However, if your infrastructure is elastic by design, bringing down Varnish on its own is generally not enough to topple your content’s availability. It might cause major spikes, but these are easily mitigated if alerting and monitoring are put in place.

Aphinya Dechalert / About Author

Aphinya is a skilled technical writer with field experience in software development, agile, and full-stack JavaScript with AWS and Google Cloud. She is a developer advocate and community builder, helping others navigate their journeys and careers as developers.