by David Lindner, Chief Information Security Officer – Contrast Security
Are you ready to strip naked, technology-wise?
Or are you still keeping the details of your software ecosystem close to the vest, hidden from data pickpockets like it’s the cyber-equivalent of a tourist’s tucked-away money belt?
If you’re like many organizations, you’re not ready — even though governments are moving to mandate transparency in an effort to stop cyberattacks. As it is, an increasing number of cyberattacks target the software supply chain. Just a few examples: SolarWinds, Colonial Pipeline, Kaseya and Log4j — a vulnerability in millions of applications that many companies weren’t even aware they had, given that software is often bundled with other software … that’s bundled with other software.
Transparency is coming sooner, rather than later. Now is the time to get ready for the type of transparency that will reveal where flawed components — such as the open-source Log4j library — are tucked inside your applications.
There’s a roster of new cybersecurity regulations and enforcement in the works, both at the state and federal level in the U.S. as well as around the world. Most recently, in mid-September 2022, the White House built on President Biden’s May 2021 cybersecurity executive order when the Office of Management and Budget (OMB) announced (PDF) new guidance with the aim of ensuring that federal agencies only use secure software.
At a minimum, federal agencies will be empowered to require a Software Bill of Materials (SBOM) that can prove a vendor’s compliance. Agencies may also require software producers to hand over results from automated tools that validate source code integrity and check for known or potential vulnerabilities, as well as require that software providers run a vulnerability disclosure program.
TL;DR: Get ready to reveal what’s going on under the hood, to an unprecedented extent.
It’s a bigger problem than many of us realize.
Suppose a third-party, commercial software producer published an SBOM. It could list every software build it has released, replete with links — perhaps to CVE data, for example — for each line item.
You download the SBOM, and then what? Is it on the consumer to somehow consume that SBOM and make sense of what it means? And does the consumer then have to make sense out of every SBOM for every product they’re using? Will they be forced to go to 50 different sources for the 50 software products they’re using? How do they deal with wrangling all that?
Complicating matters, those 50 or 100 or fill-in-the-blank number of future SBOMs wouldn’t share a common format: For example, some allow you to add associated Common Vulnerabilities and Exposures (CVEs); some don’t. It’s messy. Without a standard format, SBOMs aren’t readily consumable, making it difficult to turn them into useful and actionable reports.
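To make the divergence concrete, here is a minimal Python sketch. The field names follow the CycloneDX and SPDX JSON conventions, but the two documents themselves are made up for illustration — and note that only the CycloneDX-style one can carry CVE references at all:

```python
# Two hypothetical SBOMs for the same component, in two real formats.
# Field names follow CycloneDX and SPDX JSON conventions; the
# documents are illustrative, not from any actual vendor.

cyclonedx_sbom = {
    "bomFormat": "CycloneDX",
    "components": [
        {"name": "log4j-core", "version": "2.14.1"},
    ],
    # CycloneDX can embed vulnerability data alongside components...
    "vulnerabilities": [
        {"id": "CVE-2021-44228", "affects": [{"ref": "log4j-core"}]},
    ],
}

spdx_sbom = {
    "spdxVersion": "SPDX-2.3",
    "packages": [
        {"name": "log4j-core", "versionInfo": "2.14.1"},
    ],
    # ...while SPDX has no native CVE field, so the consumer must
    # correlate packages against vulnerability feeds themselves.
}

def inventory(sbom):
    """Normalize either format into a set of (name, version) pairs."""
    if sbom.get("bomFormat") == "CycloneDX":
        return {(c["name"], c["version"]) for c in sbom["components"]}
    if "spdxVersion" in sbom:
        return {(p["name"], p["versionInfo"]) for p in sbom["packages"]}
    raise ValueError("unrecognized SBOM format")

combined = inventory(cyclonedx_sbom) | inventory(spdx_sbom)
print(combined)  # {('log4j-core', '2.14.1')}
```

Even this toy normalizer needs one code path per format — multiply that by every vendor and every format variant, and the case for a central clearinghouse becomes obvious.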
What’s needed is a clearinghouse of sorts for SBOMs, akin to the role MITRE plays in managing CVEs. We need a clearinghouse or some other type of entity — be it a public company or government agency — that manages and maintains SBOMs in a form end users can consume, whether those users are consumers trying to assess the security of their mobile banking app or government agencies responsible for securing critical infrastructure.
As it is, many organizations lack even an inventory of what third-party components they’re using, what known vulnerabilities those components contain and which libraries are actually used.
Here’s a real-world anecdote to drive home the consequences of a lack of transparency: Recently, Contrast reached out to a third party that creates a commercial product that is essentially a library its consumers pull into their applications. This third party was unwilling to share the information that Software Composition Analysis (SCA) tools, such as Contrast SCA, need to provide insight into any known CVEs or issues with the libraries it produces. That unwillingness led to breaches of its customers through use of the product, leaving those customers scrambling and, in some cases, dealing with lawsuits.
The time for that kind of thinking is over. We’ve all got to think outside of that box and make such data available. Creating transparency around the supply chain and encouraging continuous library updates will reduce both risk and cost.
Detection needs to be in the tooling. We must empower developers to build secure code. Security needs to be woven into customers’ typical processes. That means SCA: the process of automating visibility into the use of open-source software (OSS) for risk management, security and license compliance. Besides enumerating OSS components, developers also must be empowered to learn to write secure code that gives them highly accurate, real-time feedback when they introduce a vulnerability. This involves automated, continuous application security testing running in the background, rather than periodic scans that require days or weeks to get error-laden feedback to developers.
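As a toy illustration of the SCA idea — flagging dependencies whose pinned versions carry known CVEs — here is a short sketch. The vulnerability map is hard-coded and hypothetical (real SCA tools resolve transitive dependencies and query live advisory feeds such as the NVD), though the Log4Shell entry itself is the well-known real CVE:

```python
# Toy SCA check: flag dependencies whose pinned versions appear in a
# known-vulnerable map. Real tools query live advisory feeds; this
# map is illustrative, apart from the real Log4Shell entry.

KNOWN_VULNS = {
    ("log4j-core", "2.14.1"): ["CVE-2021-44228"],  # Log4Shell
}

def scan(dependencies):
    """Return {(name, version): [CVE ids]} for any vulnerable pins."""
    findings = {}
    for name, version in dependencies:
        cves = KNOWN_VULNS.get((name, version))
        if cves:
            findings[(name, version)] = cves
    return findings

deps = [("log4j-core", "2.14.1"), ("guava", "31.0.1")]
print(scan(deps))  # {('log4j-core', '2.14.1'): ['CVE-2021-44228']}
```

A check this cheap can run on every build in the background — which is exactly why detection belongs in the tooling rather than in periodic, weeks-late scans.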
Clearly, SBOMs are just one lug nut in the car. There are plenty of other risk factors: For example, what’s a vendor’s breach history? What kind of risk score should be assigned to, say, Office 365? What’s the exploitability or reachability of a particular CVE? People don’t have that full picture yet, and SBOMs alone won’t give it to them.
It requires a combination of modern application security tooling and having the industry frame the whole picture, beyond just SBOMs. We need a standardized mechanism for relaying security posture — not just a SOC 2 or ISO report, but one that enables end users to make semi-educated, risk-based decisions.
To get there, we need to let the sunshine in. We need to stop keeping our security profiles in the dark. We need to stop hiding our breach histories and stop shielding our software components from sight. We need to brace ourselves to grapple with — and make public — the dirty laundry of vulnerabilities that comes with them.
David Lindner, Chief Information Security Officer
David is an experienced application security professional with over 20 years in cybersecurity. In addition to serving as the chief information security officer, David leads the Contrast Labs team that is focused on analyzing threat intelligence to help enterprise clients develop more proactive approaches to their application security programs. Throughout his career, David has worked within multiple disciplines in the security field — from application development, to network architecture design and support, to IT security and consulting, to security training, to application security. Over the past decade, David has specialized in all things related to mobile applications and securing them. He has worked with many clients across industry sectors, including financial, government, automobile, healthcare and retail. David is an active participant in numerous bug bounty programs.