BSIMM12 is here: 9 key takeaways for software security teams

John P. Mello Jr., freelance writer

Since 2008, organizations have been using the Building Security in Maturity Model, or BSIMM, to compare notes on how they're securing software. Through BSIMM, researchers and consultants in the Synopsys Software Integrity Group gather data about the security practices of participating organizations to compare the paths they've taken to secure their software.

BSIMM has grown from nine companies in 2008 to 128 in 2021, covering an array of verticals that include financial services, FinTech, independent software vendors, IoT, healthcare, and technology organizations. It has more than 3,000 software security group members and another 6,000 satellite members, also known as "security champions."

Data from the 2021 BSIMM12 report provides a high-level summary of observed trends and insights about the participating companies. Here are key takeaways from that report.

1. 92% of participants implemented risk-based controls across their software portfolios

Risk-based controls allow software security teams to focus on the applications with the highest risk to their organizations, said Susan Jackson, head of DevSecOps at life insurance company MassMutual.

"It could be applications with PII, PHI, PCI, or other regulatory requirements as opposed to an application that is just marketing material."
—Susan Jackson

Michael Isbitski, technical evangelist at Salt Security, a provider of API security, noted that risk-based approaches and controls underpin most security practices.

"There are limited personnel to work on problems in an organization, and they’re often working with constrained resources or tight timelines. Prioritization is key, and making choices based on relative business or security risk is fundamental to security strategy."
—Michael Isbitski

In practice, the term risk-based controls means that the organization has a way of evaluating the underlying risk of individual applications, asking, for example, whether an application is customer-facing or internal, or whether the data it manages includes any privacy-sensitive information, explained Larry Maccherone, DevSecOps transformation evangelist at Contrast Security, a maker of self-protecting software solutions.

"It’s less about find and fix earlier and more about focusing on where the biggest risk is."
—Larry Maccherone
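A minimal sketch of what such a risk evaluation might look like in practice. The attribute names and weights below are purely illustrative assumptions, not values from the BSIMM report:

```python
# Hypothetical risk-scoring sketch. Attribute names and weights are
# illustrative only; a real program would derive them from its own
# regulatory and business context.
RISK_WEIGHTS = {
    "customer_facing": 3,   # internet-exposed apps carry more risk
    "handles_pii": 4,       # personally identifiable information
    "handles_phi": 4,       # protected health information
    "handles_pci": 4,       # payment card data
    "marketing_only": -2,   # static marketing content is lower risk
}

def risk_score(app_attributes):
    """Sum the weights of the risk attributes an application has."""
    return sum(RISK_WEIGHTS[a] for a in app_attributes if a in RISK_WEIGHTS)

def prioritize(portfolio):
    """Order applications so the highest-risk ones are reviewed first."""
    return sorted(portfolio,
                  key=lambda app: risk_score(app["attributes"]),
                  reverse=True)

portfolio = [
    {"name": "billing-portal", "attributes": ["customer_facing", "handles_pci"]},
    {"name": "brochure-site", "attributes": ["customer_facing", "marketing_only"]},
    {"name": "hr-records", "attributes": ["handles_pii", "handles_phi"]},
]
for app in prioritize(portfolio):
    print(app["name"], risk_score(app["attributes"]))
```

The point of the sketch is Maccherone's: the scoring sends the limited security staff to `hr-records` and `billing-portal` before the marketing site, rather than treating all three applications identically.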

2. 91% had host and network security basics in place across data centers and networks

This trend demonstrates that many have a solid understanding of how to secure data centers and networks, observed Caroline Wong, chief strategy officer at Cobalt Labs, a penetration testing company. "The solutions are known, mature, and accessible," she said.

Isbitski pointed out, however, that many organizations operate network and infrastructure scanning as a bare minimum for achieving security or satisfying compliance.

Unfortunately, host scanning and infrastructure hardening still leave many gaps in the security of applications and APIs. "The high number isn’t surprising, but it also doesn’t equate to effective application security, in my mind," Isbitski said.

"It’s funny because the 'BSI' in BSIMM stands for 'build security in.' Host and network security are bolted-on, not built-in, approaches. They are necessary and the most mature part of cybersecurity, but the reality is that we’ve over-invested in them precisely because security folks favor bolted-on silver bullets over the tough work of actual culture change within the software development world."
—Larry Maccherone

3. 89% of organizations are identifying PII requirements

Jackson explained that a number of regulations, such as the EU's GDPR, focus on the protection of personally identifiable information. "If an attacker gains access to PII, it can be used in a number of ways to leverage an attack against the company or target a specific person," she said.

"Many regulations use the language of PII directly, or they label additional types of information that are private or sensitive. Organizations must comply with these regulations and standards or face severe financial penalties."
—Michael Isbitski

4. Pen testing is popular—87% say they use external penetration testers 

Manual pen testing is extremely important because entire classes of vulnerabilities cannot be discovered via automated means. These include race conditions, business logic flaws, and chained exploits, Wong said.

She added that for the past 40 years, the biggest challenge to pen testing has been access to skilled talent. In the last few years, she continued, a new model has emerged: pen testing as a service (PTaaS).

"It's changing the game and allowing organizations to pen-test their software applications on demand. Historically, an organization might only be able to pen-test 10% to 25% of their software portfolio, but PTaaS enables much greater coverage."
—Caroline Wong

But Isbitski cautioned that issues can pop up with pen testing, including contractors with low skill sets, projects with a limited scope of what is tested, and testing that is performed only at a point in time rather than continuously.

5. Companies pay more attention to identification and management of open source

The increased interest in open source can be tied to supply chain integrity. "We've seen increased efforts to identify and control open-source risk, which is one of the big factors in understanding overall supply chain security," observed Eli Erlikhman, managing principal at Synopsys and one of the authors of the BSIMM12 report.

Wong noted that open-source software can be vulnerable, and introducing it into a codebase can introduce vulnerabilities that significantly impact an organization’s risk posture.

"I think it’s wise that security groups are identifying and managing free and open-source software to build visibility into the inevitable consequences of including code that you did not write yourself into your software products."
—Caroline Wong

6. Dramatic increase in cloud and container technologies use

Cloud and container technologies have significant operational and business advantages, Wong noted. "Cloud platforms and container technologies are relatively low-cost, high-flexibility, on-demand, and redundant compared to traditional hosting," she said.

Data center consolidation is often a driver behind cloud migrations, observed Isbitski. "It’s often cheaper to pay a cloud service provider than to procure and maintain hardware on premises," he said.

"Adoption of container technology is often driven by cloud-native design patterns. This evolution also brings an array of benefits, including consistent infrastructure provisioning, reduced discrepancies between operating environments, and accelerated application delivery."
—Michael Isbitski

7. Software security groups shift from mandating behaviors to a partnership role

The big change is that software security groups now provide resources, staff, and knowledge to DevOps practices instead of dictating terms.

"I think that, for decades, security practitioners have tried to tell technologists and engineers what to do, and that approach hasn’t worked out well," Wong observed. "I believe that the best way for security folks, development folks, and operations folks to actually increase security and reduce risk is to collaborate and work together to find, fix, and prevent security vulnerabilities."

Isbitski maintained that the shift from mandates is necessary. "Collaboration is essential within DevOps and DevSecOps practices so that an organization can operate and secure its networks, systems, applications, APIs, and data," he said.

"To succeed in application security program work, mature organizations employ the security champion concepts that the BSIMM lays out and that speaks to this collaborative aspect."
—Michael Isbitski

8. More firms implement defect discovery and continuous monitoring and reporting

This shift involves an alternative approach to using a point-in-time defect discovery approach. "We're seeing organizations taking large, monolithic security discovery activities, such as static analysis and software composition analysis, that were done as a point-in-time assessment at the end of the development lifecycle and breaking them down into smaller, incremental activities that are done throughout the SDL," Erlikhman said.

That's part of the trend toward "shifting left" with security, toward the development phase, he continued.

At the same time, he noted, there is an effort to shift other activities to the right. Those include defining security parameters; monitoring assets in production as they are created, which is common in container environments; and automatically verifying infrastructure security in production.

"It's all part of 'shift everywhere.' You want to do your security testing at the time these artifacts are created."

As we move toward CI/CD, some artifacts are being created farther right, closer to deployment and closer to the production environment, he explained. "You can't test them early in the SDLC because they don't exist yet."
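The incremental approach Erlikhman describes can be sketched simply: instead of scanning the entire codebase in one monolithic pass at the end of the lifecycle, a pipeline rescans only what changed. The manifest format and the stand-in scanner below are assumptions for illustration; a real pipeline would invoke a SAST or SCA tool at that step:

```python
# Sketch of "incremental" defect discovery: scan only the files touched
# by a change instead of the whole codebase at the end of the lifecycle.
def changed_files(old_manifest, new_manifest):
    """Compare two {path: content-hash} manifests; return paths to rescan."""
    return sorted(
        path for path, digest in new_manifest.items()
        if old_manifest.get(path) != digest
    )

def incremental_scan(old_manifest, new_manifest, scanner):
    """Run the scanner on changed files only; return findings per path."""
    return {path: scanner(path)
            for path in changed_files(old_manifest, new_manifest)}

# Example: only app/api.py changed between commits, so only it is scanned.
before = {"app/api.py": "aaa", "app/db.py": "bbb"}
after = {"app/api.py": "ccc", "app/db.py": "bbb"}
findings = incremental_scan(before, after, scanner=lambda path: [])
print(list(findings))  # only the changed file appears
```

Each small scan runs in seconds inside CI, which is what makes "shift everywhere" practical: artifacts get tested at the moment they are created, wherever in the pipeline that happens to be.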

9. BSIMM12 notes increase in software bill of materials activities

This, too, is tied to an increased concern with supply chain security.

An additional effort to manage supply chain risk is overlaying software inventory with a software bill of materials (SBOM), Erlikhman noted.

"They're taking the existing software inventories that they have and expanding them with the SBOM so they have greater visibility into all their software assets."
—Eli Erlikhman

Isbitski explained that SBOMs are a way to catalog dependencies in use by application packages and the digital supply chain. "Organizations use SBOMs to understand potential open-source license compliance or known vulnerabilities."

"You can’t secure what you don’t know you have."
—Caroline Wong
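As a concrete illustration of how an SBOM supports the vulnerability checks Isbitski describes, here is a sketch that matches a minimal CycloneDX-style component list against an advisory feed. The SBOM fragment and the advisory data are fabricated for illustration and heavily abbreviated compared with a real SBOM:

```python
import json

# A minimal CycloneDX-style SBOM fragment (fields abbreviated for
# illustration; real SBOMs carry many more attributes per component).
sbom_json = """
{
  "bomFormat": "CycloneDX",
  "components": [
    {"name": "log4j-core", "version": "2.14.1",
     "licenses": [{"license": {"id": "Apache-2.0"}}]},
    {"name": "left-pad", "version": "1.3.0",
     "licenses": [{"license": {"id": "MIT"}}]}
  ]
}
"""

# A hypothetical advisory feed mapping package name -> vulnerable versions.
ADVISORIES = {"log4j-core": {"2.14.1", "2.15.0"}}

def flag_vulnerable(sbom):
    """Return (name, version) pairs in the SBOM matching a known advisory."""
    return [
        (c["name"], c["version"])
        for c in sbom.get("components", [])
        if c["version"] in ADVISORIES.get(c["name"], set())
    ]

sbom = json.loads(sbom_json)
print(flag_vulnerable(sbom))  # [('log4j-core', '2.14.1')]
```

The same component list can be cross-referenced against license policies, which is the other SBOM use Isbitski mentions; the value in both cases is Wong's point that the inventory exists at all.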

Follow the leader

These takeaways are just a taste of what BSIMM can provide an organization. By using BSIMM's data, an organization can formulate its security strategy based on what leading peers in its industry are doing. Sometimes following the leader can be a good thing.
