How to secure your software supply chain

The advent of DevOps and the large-scale automation of software construction and delivery has elevated the software supply chain—and its underpinning delivery pipeline—to mission-critical status in every modern enterprise. But this new status also means that your software delivery pipeline is a potential point of failure when it comes to security.

The increased velocity of modern pipelines and the removal of manual checks and balances have created additional security risks.

The automotive, consumer electronics, and pharmaceutical industries have long understood the need for both provenance (understanding the origin of materials) and veracity (ensuring the integrity of their manufacturing processes) in their supply chains. But even vigorous and thorough control of supply chains has unfortunately not removed the risk of contamination. Think of the Chicago Tylenol murders or, more recently, the Bloomberg exposé of modified motherboards from Supermicro.

If a physical supply chain can be easily subverted, how does an organization even begin to govern its software supply chain, which is, by its very nature, open by design?

Here are key steps to secure your supply chain.

Scope out your pipeline

The first step in securing your software supply chain is understanding the construction of the pipeline and how raw materials—software source, components, and packages—enter this pipeline.

Produce a simple map of the pipeline and entry points where components can be ingested; this allows you to determine where controls can be introduced. For example, if developers have full access to component repositories on the Internet, then it is essential that all software builds be scanned for vulnerable components using a software composition analysis tool.
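
To illustrate the control point, here is a minimal sketch of what a software composition analysis check does at build time: compare the build's pinned dependencies against an advisory list and fail the build on a match. The package names, versions, and advisory data below are purely illustrative; real tools (such as OWASP Dependency-Check or Snyk) use full CVE feeds and version-range matching.

```python
# Hypothetical advisory data: package -> set of known-vulnerable versions.
KNOWN_VULNERABLE = {
    "struts": {"2.3.31", "2.5.10"},
    "log-lib": {"1.2.0"},
}

def scan_dependencies(manifest):
    """Return a list of 'package==version' strings flagged as vulnerable."""
    findings = []
    for package, version in manifest.items():
        if version in KNOWN_VULNERABLE.get(package, set()):
            findings.append(f"{package}=={version}")
    return findings

# A build manifest with one vulnerable component (illustrative values).
build_manifest = {"struts": "2.3.31", "requests": "2.31.0"}
findings = scan_dependencies(build_manifest)
if findings:
    # Fail the build: vulnerable components must not enter the pipeline.
    print("BLOCKED:", ", ".join(findings))
```

The essential design point is that the scan gates the build rather than merely reporting, so a vulnerable component cannot silently flow downstream.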

A more secure operating model is to restrict unfettered use of components and provide a local instance of known, vetted, and approved components on a repository server. In any case, a well-publicized policy or guideline for the use of components and libraries should be created, so developers are aware of the risks and develop software within acceptable safety bounds.
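
Where the scanning approach blocks known-bad components, the curated-repository model inverts the logic and admits only known-good ones. A minimal sketch of that ingestion gate, with an illustrative allowlist:

```python
# Hypothetical allowlist of vetted (package, version) pairs, as would be
# mirrored on an internal repository server. Contents are illustrative.
APPROVED_COMPONENTS = {
    ("requests", "2.31.0"),
    ("flask", "3.0.0"),
}

def may_ingest(package, version):
    """Policy check: is this exact component vetted and approved?"""
    return (package, version) in APPROVED_COMPONENTS

# An unknown component is rejected by default, unlike a blocklist model
# where anything not yet flagged as vulnerable slips through.
print(may_ingest("requests", "2.31.0"))  # True
print(may_ingest("leftpad", "0.0.1"))    # False
```

The deny-by-default posture is what makes this model stronger: a component missing from the list is treated as unapproved, not as safe.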

Finally, it is imperative that an organization constantly scan and assess for risks arising from third-party components. Many software composition analysis tools allow scanning either as code is written or at the point of deployment.

Couple this with a pragmatic approach to addressing technical debt inherited via third-party components; doing so can help ensure that major problems such as the Equifax breach become a thing of the past.

No bypasses allowed

The second step in securing your software supply chain is ensuring that the security controls in your pipeline cannot be trivially bypassed or circumvented. In my experience running an application security program at a large bank, I encountered many instances where developers under pressure to meet a deadline simply bypassed security tests in their build process!

In other cases, pipelines may be poorly constructed and governed, resulting in poor test coverage and repeatability.

The use of immutable pipelines built from a "golden source" helps ensure that build pipelines are constructed in a uniform fashion, reducing variability. The use of digital signing methods and application manifests allows the veracity of a pipeline process to be enforced and verified. Potentially, a release candidate could carry a "signature of authenticity" asserting that all security testing has been completed in accordance with policy.
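
A minimal sketch of such a "signature of authenticity": hash the release artifact, then sign the digest with a key held only by the pipeline. HMAC with a shared secret stands in for asymmetric signing here; a real pipeline would use something like Ed25519 keys or Sigstore's cosign, and the key material below is purely illustrative.

```python
import hashlib
import hmac

# Illustrative key; real signing keys belong in a KMS or HSM, never in code.
PIPELINE_KEY = b"pipeline-signing-key"

def sign_artifact(artifact):
    """Produce the pipeline's seal over an artifact's SHA-256 digest."""
    digest = hashlib.sha256(artifact).digest()
    return hmac.new(PIPELINE_KEY, digest, hashlib.sha256).hexdigest()

def verify_artifact(artifact, signature):
    """Check the seal; any change to the artifact invalidates it."""
    return hmac.compare_digest(sign_artifact(artifact), signature)

release = b"release-candidate-1.4.2"
tag = sign_artifact(release)
print(verify_artifact(release, tag))         # True: untampered
print(verify_artifact(release + b"x", tag))  # False: any change breaks the seal
```

Deployment tooling can then refuse any candidate whose signature does not verify, turning "all security gates passed" into a checkable cryptographic claim rather than a convention.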

The rise of the cloud-native movement affords an excellent opportunity to bake supply chain security into the build pipeline. Much of the container lifecycle development process enables security as a first-class citizen, rather than as a bolt-on or afterthought.

For example, it is possible to use only approved and vetted base images (from the Docker Trusted Registry) to build containers. Also, numerous tools allow for vulnerability scanning in near real time as containers are built, and the build chain itself can be hardened and made impervious to external tampering or tweaking.

Most exciting, perhaps, is the ability to enforce governance at the point of deployment using the concept of an "admission controller." This is a robust mechanism within an orchestrator to enforce the veracity of the pipeline before allowing a candidate image to be released to production.
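
The policy logic an admission controller applies might look like the following sketch. The metadata field names and registry hostname are hypothetical; in Kubernetes, this kind of check would sit behind a ValidatingWebhookConfiguration and run on every deployment request.

```python
def admit(image):
    """Admit only images that are signed, scanned, and from a trusted registry.

    `image` is hypothetical metadata the pipeline attaches to a candidate.
    Returns (allowed, reason).
    """
    if not image.get("registry", "").startswith("registry.internal.example/"):
        return False, "image not from the trusted registry"
    if not image.get("signature_verified"):
        return False, "pipeline signature missing or invalid"
    if image.get("critical_vulnerabilities", 1) > 0:
        return False, "unresolved critical vulnerabilities"
    return True, "admitted"

candidate = {
    "registry": "registry.internal.example/payments",
    "signature_verified": True,
    "critical_vulnerabilities": 0,
}
print(admit(candidate))  # (True, 'admitted')
print(admit({}))         # (False, 'image not from the trusted registry')
```

Note the fail-closed defaults: missing metadata is treated as a policy violation, so an image cannot reach production simply by omitting evidence.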

Tools to the rescue

In many cases, organizations won't have the relative luxury of building container-centric toolchains and must leverage other mechanisms to enforce veracity. The Update Framework (TUF) provides an excellent reference for a robust software update process, protecting against attack or compromise. In fact, Docker's Notary, a Cloud Native Computing Foundation project, implements TUF to further ensure container security.

An excellent and generically applicable implementation of a supply chain protection framework is In-Toto. This allows a pipeline designer (the design authority) to pictorially design the steps within the build pipeline and then to use In-Toto's command-line tools to cryptographically sign each stage of the process.

This way, a final release candidate can be formally verified to have achieved all stages of the specified process. In-Toto's generic and modular design allows it to be applied to legacy build pipelines, allowing supply chain security to be bolted on. A full transition to a container lifecycle is not a prerequisite.
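
The core idea can be sketched as follows: each stage records a signed "link" covering the artifact it produced, and the final verifier checks that every step in the required layout ran, in order, with valid signatures. This is a simplification of what In-Toto actually does (its link metadata also records inputs, commands, and per-functionary asymmetric keys); the stage names and HMAC keys below are illustrative stand-ins.

```python
import hashlib
import hmac

# Illustrative per-stage signing keys and the layout the design authority requires.
STAGE_KEYS = {"build": b"k-build", "test": b"k-test", "package": b"k-package"}
LAYOUT = ["build", "test", "package"]

def record_link(stage, artifact):
    """A stage signs the digest of the artifact it handled."""
    digest = hashlib.sha256(artifact).hexdigest()
    sig = hmac.new(STAGE_KEYS[stage], digest.encode(), hashlib.sha256).hexdigest()
    return {"stage": stage, "digest": digest, "sig": sig}

def verify_chain(links, final_artifact):
    """Check that all required stages ran, in order, with valid signatures."""
    if [link["stage"] for link in links] != LAYOUT:
        return False  # a stage was skipped, repeated, or reordered
    for link in links:
        expected = hmac.new(STAGE_KEYS[link["stage"]],
                            link["digest"].encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, link["sig"]):
            return False  # tampered or forged link
    # The last stage's digest must match the artifact actually being released.
    return links[-1]["digest"] == hashlib.sha256(final_artifact).hexdigest()

artifact = b"app-1.0.tar.gz"
links = [record_link(stage, artifact) for stage in LAYOUT]
print(verify_chain(links, artifact))        # True: all stages present and signed
print(verify_chain(links[:-1], artifact))   # False: a missing stage fails verification
```

Because verification needs only the layout, the links, and the final artifact, it can be bolted onto a legacy pipeline without restructuring the builds themselves.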

Pipelines themselves are next

A final note of caution: As organizations race to adopt continuous delivery and enforce security scanning in their pipelines, the pipeline itself will become an increasingly attractive attack vector.

If the pipeline itself can be compromised, then all the other security controls are fruitless. Think back to the example of the Supermicro motherboard compromise; it wasn't necessary to inject suspect components into the supply chain; it was all too easy to subvert the pipeline itself by bribing the factory owner.

For a software pipeline, consider who has root access to the pipeline and apply principles of least privilege access as far as possible.

Container methods matter

The threats to the software supply chain look certain to increase, either by the careless ingestion of vulnerable components or by negligent or rogue actors tampering with the supply chain.

Simple hygiene steps such as mandating vulnerability scanning or using known good components address a significant portion of supply chain risk. The opportunities afforded by modern container lifecycle development methodologies can help ensure that supply chain security is built in by design, rendering compromise highly improbable.

Colin Domoney spoke on this topic at DevSecCon London. 

The DevSecCon London conference and training sessions took place on October 18-19, 2018. DevSecCon Singapore runs February 28-March 1, 2019.
