Serverless Architecture Vulnerabilities and How to Prevent Them

Serverless architectures are becoming a staple in applications big and small. The idea dates back to 2008, and with more than a decade of developer adoption and cloud services offering it as an option, the eventual questions arise: what exactly is serverless, what are its vulnerabilities, and how do you prevent them?

Serverless in a nutshell

The idea behind serverless is a simple one: you never have to manage servers yourself, because the cloud provider does all the heavy lifting for you. All you have to do is turn up with your code, upload it, wire everything together, and that is it. You don’t have to worry about compute or networking models, architectural topologies, or anything relating to infrastructure, which is what gives the model its name.

Serverless cloud offerings are structured as FaaS, or Function as a Service, platforms, which means that pricing is based on usage rather than uptime. With FaaS, you are not paying by the hour; you pay for the number of requests your deployed functions handle and the activity they generate. This model is beneficial for startups and for general cost optimization.

As with any code, the robustness of an application in the cloud also depends on the quality of the code itself. If the same code-level vulnerabilities are present, your serverless Lambdas are roughly as likely to be breached as the equivalent application running on EC2.

However, this doesn’t quite answer the questions: are there any vulnerabilities and how do you prevent them?

How Serverless Works

The nature of serverless opens up a new realm of potential security issues that are not covered by traditional mitigation methods. For starters, the attack surface is different because of how the architecture is composed and how everything gets connected. DAST (dynamic application security testing) is no longer sufficient because it only covers HTTP interfaces, while serverless functions consume many of their inputs from non-HTTP sources, such as cloud storage events and NoSQL databases, in addition to exposing REST APIs. These are some of the many factors that shape serverless architectures and increase the potential attack surface and complexity.

Despite this, serverless can still be viewed as a secure solution for your software application needs. Why? From a developer’s perspective, it reduces the amount of administrative workload that’s associated with security updates, patches, and operating system compatibility support. However, common programming mistakes can reduce the effectiveness of a serverless setup.

Serverless providers such as AWS, Azure, and Google Cloud are responsible for securing the resources you use. The developer, however, is still responsible for writing the application’s logic in a way that prevents malicious acts from occurring.

But what are these malicious acts and how do you prevent them?

Function Event-Data Injections in Serverless Lambdas

Traditionally, event-based data injection is associated with form inputs: a malicious user injects code via a form submission to gain access or perform unwanted actions on your application. When it comes to serverless, however, event-data injection is much more complex because of the increased number of non-HTTP entry points.

A collection of serverless functions often includes connections to other services within the FaaS provider’s ecosystem. This leads to an increased number of trigger sources beyond plain HTTP input. Serverless functions can consume events from sources such as database streams, IoT telemetry signals, message queues, and push notifications. Each source delivers a different message format that may not be handled properly, opening the way to common injection flaws such as OS command and SQL injection, publish/subscribe message data tampering, object deserialization attacks, and server-side request forgery.
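
To make the attack path concrete, here is a minimal sketch, in TypeScript, of a hypothetical Lambda that is triggered by a cloud storage upload rather than an HTTP request. The event shape matches an AWS S3 notification; the table name and the runQuery helper are illustrative stand-ins, not a real API.

    import { S3Event } from 'aws-lambda';

    // Hypothetical function: runs on every S3 upload and records the file name.
    export const handler = async (event: S3Event): Promise<void> => {
      for (const record of event.Records) {
        const key = record.s3.object.key;
        // Unsafe: the object key is attacker-controlled and is concatenated
        // straight into a SQL string, so a crafted file name becomes a payload.
        await runQuery(`INSERT INTO images (name) VALUES ('${key}')`);
      }
    };

    // Placeholder for a real database client; it only logs here.
    async function runQuery(sql: string): Promise<void> {
      console.log('executing:', sql);
    }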

A way to mitigate this issue is to never assume that an input originates from an expected, trusted trigger. For example, suppose your original function is triggered by an image upload, and the image is then passed to another function that processes it. Inadequate checks and sanitization could allow a maliciously crafted image name to trigger a database operation, which then cascades through the chain of connected serverless functions. Validating every event payload and keeping data out of query strings stops that domino effect early, as in the sketch below.
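
Continuing the same hypothetical pipeline, a hardened version of that handler would treat the event as untrusted input: validate the key against an allow-list and pass it to the database as a bound parameter rather than as part of the SQL string. Names and the runQuery helper are again placeholders.

    import { S3Event } from 'aws-lambda';

    // Allow-list of characters expected in an image key; everything else is rejected.
    const SAFE_KEY = /^[\w ./-]{1,256}$/;

    export const handler = async (event: S3Event): Promise<void> => {
      for (const record of event.Records) {
        // S3 URL-encodes object keys in event payloads, so decode before validating.
        const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));
        if (!SAFE_KEY.test(key)) {
          console.warn('Rejecting unexpected object key:', key);
          continue; // treat the event as hostile, even though it "came from S3"
        }
        // A bound parameter keeps the key as data, never as part of the statement.
        await runQuery('INSERT INTO images (name) VALUES (?)', [key]);
      }
    };

    // Placeholder for a real database client that supports bound parameters.
    async function runQuery(sql: string, params: unknown[]): Promise<void> {
      console.log('executing:', sql, params);
    }

An allow-list like this is deliberately strict: it is easier to loosen a pattern later than to enumerate every malicious input a block-list would need to catch.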

Authentication Inconsistencies

Serverless architectures are closely associated with microservice-oriented system design. This means that applications are built from dozens, if not hundreds, of distinct serverless functions, each with a clear and specific purpose. The functions are stitched together and orchestrated to perform the overall system logic, and they may consume events and data from multiple source types, some requiring authentication while others do not.

The issue with authentication is that it is complex to implement by its very nature. Even in a standard, non-serverless application, authentication systems are prone to security vulnerabilities, and you also need to factor in their ongoing management and maintenance.

When you throw serverless into the mix, traditional authentication architectures can buckle under the need to support a separate layer for each function and for its communication with the multitude of services it relies on. A flaw in the authentication system can allow an attacker to bypass application logic, manipulate its flow, and expose private data to unauthenticated users.

To mitigate this, it is best to use your chosen cloud provider’s authentication facilities rather than try to build your own. The provider’s services are designed to work with each other and already expose the access controls and functionality needed to grant the right permissions between them.

For example, AWS offers Cognito and SSO (single sign-on) for user-based authentication, in addition to API Gateway authorization facilities. Google has Firebase Authentication, Azure has App Service Authentication / Authorization, and IBM has its Bluemix App ID and SSO implementations. The risk of authentication failures is delegated to the cloud provider, while controlling access within authenticated areas remains the responsibility of the code you implement.
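
As a rough sketch of what that delegation looks like in practice, assume an AWS REST API Gateway endpoint with a Cognito user pool authorizer configured in front of a TypeScript Lambda. By the time the handler runs, API Gateway has already rejected requests without a valid token, and the verified claims arrive on the event; the handler only decides what the authenticated user may do. The field names below assume that specific Cognito setup.

    import { APIGatewayProxyEvent, APIGatewayProxyResult } from 'aws-lambda';

    export const handler = async (
      event: APIGatewayProxyEvent
    ): Promise<APIGatewayProxyResult> => {
      // With a Cognito user pool authorizer on a REST API, the verified token
      // claims are passed through on the request context.
      const claims = event.requestContext.authorizer?.claims;

      if (!claims) {
        // Defence in depth: refuse to serve if the function is ever invoked
        // without passing through the authorizer.
        return { statusCode: 401, body: JSON.stringify({ message: 'Unauthorized' }) };
      }

      // Authentication is the provider's job; deciding what this particular
      // user is allowed to see is still the code's responsibility.
      return { statusCode: 200, body: JSON.stringify({ userId: claims.sub }) };
    };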

Conclusion

Serverless offerings from cloud providers do not lack the security protocols required to keep your applications safe. Rather, providers offer plenty of tools and capabilities to make your serverless orchestrations secure.

However, the vulnerabilities ultimately lie in the software development itself, through common mistakes such as inadequate input checks or over-granting resource access permissions to serverless functions. The latter is often the highest security risk.

Over-granting access permissions is easy to do. It may make development feel faster because you don’t have to deal with denied-request logs and errors. However, restricting each function’s access to only the scope it needs is safer and better general practice. It is better to keep security tight from the beginning than to try to tighten it up after the code has been written. As standard practice, a serverless function should only have the essential privileges needed to perform its intended logic, which is also known as the principle of “least privilege”.
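
As an illustration, here is what a scoped-down policy might look like for the hypothetical image-processing function from earlier, written as a TypeScript object in the shape of an IAM policy document that your deployment tool would attach to that one function’s execution role. The bucket, table, region, and account ID are placeholders.

    // Least-privilege sketch: this function can read objects from one bucket and
    // write items to one table, and nothing else. No s3:* or dynamodb:* wildcards,
    // and no Resource set to "*".
    const imageProcessorPolicy = {
      Version: '2012-10-17',
      Statement: [
        {
          Effect: 'Allow',
          Action: ['s3:GetObject'],                     // read-only on uploads
          Resource: 'arn:aws:s3:::my-upload-bucket/*',  // one named bucket only
        },
        {
          Effect: 'Allow',
          Action: ['dynamodb:PutItem'],                 // write metadata, nothing more
          Resource: 'arn:aws:dynamodb:us-east-1:123456789012:table/ImageMetadata',
        },
      ],
    };

    export default imageProcessorPolicy;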

Forgotten or misconfigured authentication, authorization, and permissions can cause widespread weakness across the application. This weakness also extends to the connected services that may act as triggers for other parts of your cloud-based deployment.

Overall, serverless is secure through your cloud provider’s implementation of the service. The final strength of a serverless orchestration lies in the software development processes a team has in place to ensure that security is a priority.

About the Author

Aphinya Dechalert

Aphinya is a skilled technical writer with field experience in software development, agile, and JavaScript full stack with AWS and Google Cloud. She is a developer advocate and community builder, helping others navigate their journeys and careers as developers.