5 ways to make your serverless functions more secure

Eric Johnson Principal Instructor, SANS Institute
 

Developers love serverless applications. The technology offers programmers the ability to create custom functions without worrying about infrastructure, and speed deployment without having to wait for operations to deploy containers or virtual machines.

No wonder, then, that serverless cloud services are taking off. In 2019, more than one third of companies surveyed stated that they had adopted the technology in some way, representing 50% growth from the previous year, according to RightScale's 2019 State of the Cloud Report.

Yet because serverless platforms make it so easy for development teams to write code and deploy applications quickly, teams often assume that the provider also takes care of security. That's not true.

Here are five steps developers should take to lock down their functions.

1. Know that your code is your responsibility

If you are already in the cloud, you are responsible for maintaining your infrastructure, such as virtual machines, container images, and orchestrators. With functions, all of that time-consuming overhead goes away. Developers can basically say, "Here is my code," include dependencies, set the permissions that the code needs, and then have a working feature.

Developers love serverless functions because they make it easy to get their code out into the world. Yet security is still the responsibility of the owner of the function. All traditional application security controls still apply. The owner of the code, 100% of the time, needs to sanitize inputs, perform validation, and test for security issues, such as those on the OWASP Serverless Top 10 list.

One way to learn about serverless execution environments is by reverse engineering the environment hosting your functions. We created some vulnerable functions simulating command injection and local file inclusion (LFI) vulnerabilities, which are often found in application assessments.
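To illustrate the input-validation responsibility described above, here is a minimal sketch of guarding a hypothetical Lambda-style handler against LFI. The directory name, event field, and handler shape are assumptions for illustration, not from the original article:

```python
import os.path

ALLOWED_DIR = "/var/task/templates"  # hypothetical read-only directory the function may serve

def safe_path(user_supplied_name: str, base: str = ALLOWED_DIR) -> str:
    """Return an absolute path under `base`, or raise on a traversal attempt."""
    # Normalize first so sequences like "../" are resolved before checking.
    candidate = os.path.normpath(os.path.join(base, user_supplied_name))
    if not candidate.startswith(base + os.sep):
        raise ValueError("path traversal attempt blocked")
    return candidate

def handler(event, context):
    # event["template"] is attacker-controlled input arriving via the API gateway
    with open(safe_path(event["template"])) as f:
        return {"statusCode": 200, "body": f.read()}
```

The key design point is validating the *resolved* path, not the raw input: naive blocklists of `..` substrings are easy to bypass with encoding tricks.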


2. Keep a secret

Cloud-based systems almost always use secrets, whether passwords for their GitHub and Bitbucket repositories, authentication tokens for third-party services, or API keys for authenticating to other services.

Historically, IT organizations have used a variety of techniques for keeping secrets, with varying security impact. The worst-case scenario is that a developer hard-codes a password or key into the application, and an inadvertent leak of the code then exposes the key and leaves the application and its users open to attack. Deploying secrets in configuration files and environment variables is slightly better from a security standpoint, but an attacker who compromises the runtime environment can still steal the credentials.

Knowing the credentials for serverless functions will allow attackers to pivot and extend the compromise. We have demonstrated this attack using local file inclusion (LFI) to read secrets from a YAML configuration file.

The best approach is to use a cloud key management service (KMS) or an on-premises key manager. Developers can rely on the service provider to keep the service updated and secure. In addition, the secrets are not stored on disk or in environment variables. Instead, the function reads the value into memory on startup.
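As a sketch of the read-into-memory-on-startup pattern, the function below pulls a secret from AWS Secrets Manager (a KMS-backed secrets service) using `boto3`, which is available in the AWS Lambda runtime. The secret name and the injectable `client` parameter (useful for testing) are illustrative assumptions:

```python
import json
import functools

@functools.lru_cache(maxsize=None)
def _secrets_client():
    # Created lazily and cached, so the client is built once per container.
    import boto3  # available by default in the AWS Lambda runtime
    return boto3.client("secretsmanager")

def get_secret(secret_id: str, client=None) -> dict:
    """Fetch a secret into memory at startup; nothing lands on disk or in env vars."""
    client = client or _secrets_client()
    response = client.get_secret_value(SecretId=secret_id)
    return json.loads(response["SecretString"])
```

Because the value lives only in process memory, rotating the secret does not require redeploying the function; the next cold start picks up the new value.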

3. Establish access controls and permissions

The proper access controls and permissions are essential for keeping serverless functions secure.

If a Google Cloud Platform (GCP) function, for example, runs under the Default Service Account, a vulnerability such as command injection could allow attackers to extract the service account OAuth token from the environment. While the function only needs read access to one secret to do its job, the OAuth token grants read and write access to all cloud resources in the project.

Application storage and secrets, for example, should be reachable only through a private cloud endpoint accessible from the function's private subnet. If an attacker attempts to access a storage account from the Internet using a stolen OAuth token, proper access controls can detect and block the request.
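To make the least-privilege point concrete, here is a sketch of an AWS IAM policy scoped to reading a single secret, expressed as the Python dict a deployment script might emit. The function name and ARN are hypothetical; only the `GetSecretValue` action and policy version string come from AWS's documented IAM format:

```python
def scoped_secret_policy(secret_arn: str) -> dict:
    """Build an IAM policy granting read access to exactly one secret."""
    return {
        "Version": "2012-10-17",  # current IAM policy language version
        "Statement": [
            {
                "Sid": "ReadOneSecretOnly",
                "Effect": "Allow",
                "Action": "secretsmanager:GetSecretValue",
                "Resource": secret_arn,  # exactly one ARN, no wildcards
            }
        ],
    }
```

Contrast this with a default service account: the function can read the one secret it needs and nothing else, so a stolen token buys an attacker far less.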

4. Get visibility into your functions

Having visibility into your serverless infrastructure and regularly—if not continuously—looking for anomalies is critical. While protecting secrets and establishing access control policies are both important, monitoring is necessary to detect any misconfigurations being exploited by attackers.

In the Amazon cloud, for example, we made a call to an S3 bucket and got an access-denied error, because the bucket can be reached only from an internal private endpoint. Attempts to access storage from an external source leave a log entry in AWS CloudTrail, Amazon's logging service, which automated detection and incident-response controls can use to monitor software that relies on serverless functions.
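A detection rule over those CloudTrail entries can be quite small. The sketch below flags denied S3 calls that did not arrive through a VPC endpoint; the field names (`eventSource`, `errorCode`, `vpcEndpointId`) are standard CloudTrail record fields, but the rule itself is a simplified illustration, not a production detector:

```python
def flag_denied_s3_access(records: list) -> list:
    """Return CloudTrail records showing denied S3 calls from outside a private endpoint."""
    suspicious = []
    for record in records:
        if (record.get("eventSource") == "s3.amazonaws.com"
                and record.get("errorCode") == "AccessDenied"
                and not record.get("vpcEndpointId")):  # no private endpoint involved
            suspicious.append(record)
    return suspicious
```

In practice a rule like this would run inside the SIEM mentioned below, with the output feeding an alerting or automated-response pipeline.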

As soon as possible, application security teams should create an inventory of functions the development teams are using and which cloud providers host those functions. Then, they should focus on getting application audit logs, network audit logs, and activity audit logs into a security information and event management (SIEM) service.

5. Automate security controls for function code

Organizations moving toward serverless functions will be responsible for maintaining and monitoring thousands of function definitions in their cloud environments. Automating the configuration process and creating test-driven checks that can catch misconfigurations are both necessary.

Development and application-security teams should build out automated code scanning, such as static analysis security testing (SAST), to identify hard-coded secrets or secrets stored in cleartext config files in source code. Such testing can also help identify local file inclusion (LFI) and command injection scenarios.
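As a toy version of such a SAST check, the scanner below greps source text for two common hard-coded-secret shapes: the documented `AKIA` prefix of AWS access key IDs, and quoted `password`/`api_key` assignments. Real SAST tools use far richer rules; this is only a sketch of the idea:

```python
import re

# Simplified patterns that often indicate hard-coded secrets.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),  # AWS access key ID format
    re.compile(r"(?i)(password|api[_-]?key)\s*=\s*['\"][^'\"]+['\"]"),
]

def scan_source(text: str) -> list:
    """Return descriptions of lines that match a hard-coded-secret pattern."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        if any(p.search(line) for p in SECRET_PATTERNS):
            hits.append(f"line {lineno}: {line.strip()}")
    return hits
```

Wired into a CI pipeline, a check like this fails the build before a leaked credential ever reaches a repository.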

As soon as possible, start checking your functions for excessive permissions that could allow attackers to pivot to, and compromise, other cloud resources.

Within six months, your development and deployment teams should move to infrastructure as code—architecting the creation of functions and their associated resources through code. The Serverless Framework, Terraform, and AWS CloudFormation are all ways to do this.

The benefit is that infrastructure code can be scanned as well, enabling automated discovery of excessive function privileges and enforcement of audit-logging and network-configuration policies that help forensics and incident-response teams detect compromised functions and credentials.
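One such infrastructure-code scan is a check for wildcard IAM actions in a parsed CloudFormation-style template. The walk below follows the documented `Resources → Properties → Policies → PolicyDocument → Statement` shape of an inline IAM role policy; the resource names in the example are hypothetical:

```python
def find_wildcard_actions(template: dict) -> list:
    """Walk a parsed CloudFormation-style template and flag wildcard IAM actions."""
    findings = []
    for name, resource in template.get("Resources", {}).items():
        policies = resource.get("Properties", {}).get("Policies", [])
        for policy in policies:
            for stmt in policy.get("PolicyDocument", {}).get("Statement", []):
                actions = stmt.get("Action", [])
                if isinstance(actions, str):
                    actions = [actions]  # IAM allows a single string or a list
                if any(a == "*" or a.endswith(":*") for a in actions):
                    findings.append(name)
    return findings
```

Run against every template in the repository as a CI gate, a check like this catches an over-privileged role before it is ever deployed.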

Security has to come first

As development teams continue to adopt serverless functions to ease programming and speed deployment, they should not overlook security. While serverless is enticing because the cloud provider manages the infrastructure, the serverless users are responsible for security.

Eric Johnson, a principal instructor with SANS Institute, will discuss potential attacks against serverless functions and what defenders can do to harden their programs at the RSA Conference.
