Innovation at the Expense of Cybersecurity? No More!

Earlier this month, Jen Easterly and Eric Goldstein of the Cybersecurity and Infrastructure Security Agency (CISA) at the Department of Homeland Security signaled a major shift in the federal government’s approach to cybersecurity risk and responsibility. 

In their Foreign Affairs article Stop Passing the Buck on Cybersecurity, Easterly and Goldstein make a strong case for IT and software organizations to accept greater liability for cyberattacks, emphasizing that, “Consumers and businesses alike expect that cars and other products they purchase from reputable providers will not carry risk of harm. The same should be true of technology products. This expectation requires a fundamental shift of responsibility. Technology providers and software developers must take ownership of their customers’ security outcomes rather than treating each product as if it carries an implicit caveat emptor.”

This is the strongest position the U.S. government has taken thus far, made even more noteworthy by tech’s seemingly laissez-faire approach of sacrificing security for the sake of innovation. From the standpoint of policymakers and leadership, it is no longer acceptable to ship software with the assumption that the end user is responsible for inspecting or ensuring the quality of the goods they receive, a stance we wholeheartedly agree with.

Assessing Cybersecurity Risk, Responsibility and Legal Liability

As we mentioned in a previous blog post, there was already a shift brewing to hold organizations more responsible for the outcomes of cybersecurity attacks. In the wake of high-profile attacks like NotPetya and SUNBURST, governments across the globe have responded with new regulatory guidance and best practices aimed at improving cybersecurity resilience.  

We’ve also seen the growing power that legislation like GDPR can have over organizations involved in attacks where end-user data is compromised. In these cases, an indemnity clause, like those often found in a EULA, does little to absolve an organization of liability.

Easterly and Goldstein’s point of view, offering cybersecurity recommendations and policy for the U.S. government, marks another important step in the evolution of liability for organizations producing software.

What Does Cybersecurity Liability Look Like?

While Easterly and Goldstein call out the need for change without saying exactly what that change should be, we can extrapolate some of the expectations from the examples they use. Their analogies to other industries help align expectations to policy and offer a soft reveal of policy to come.

Our first glimpse of these expectations comes in this quote: “Whether automobiles or other sectors such as aviation or medical devices, it took crisis to force people to focus on the need for additional safety measures. Such a safety crisis is already here in the cyber-realm, and now is the time to address it.”

It is unlikely these examples were chosen at random. And, at least in the case of automobile manufacturing, the recent Atlantic Council paper on open source as infrastructure uses the automobile analogy (among others) as well.

In the wake of the attacks we mentioned previously, as well as ongoing vulnerabilities and near-daily headlines highlighting attacks that compromise user data, it’s clear we are in a crisis today. Based on the sentiment that “Such a safety crisis is already here in the cyber-realm, and now is the time to address it,” we can assume Easterly and Goldstein believe this, too.

Building Products That are Secure by Default and Design

While it took a crisis to motivate change, these industries didn’t change until government action and regulation were put in place. The three industries cited as examples (automobile, aviation and medical devices) are all heavily regulated in the U.S. and other countries.

These regulations aren’t taken lightly; they are most often designed to maximize end-user safety. They also assign liability, making it unacceptable to expect end users or consumers to be responsible for the safety of the products these industries produce.

Easterly and Goldstein use two terms to describe the core foundations that will drive new recommendations and regulations: software products should be “secure by default” and “secure by design.”

They also assert that it is the responsibility of the U.S. government to define these standards: “It will not be easy to make these changes and convince companies to build and deliver more secure products, but the U.S. government can start by defining specific attributes of technology products that are secure by default and secure by design.”

Though the article does not establish what those standards will be, we can once again look to the automotive analogy to better understand the expectations. “Secure-by-default products have strong security features—akin to seatbelts and airbags—at the time of purchase, without additional costs,” they said.

What About a Software Bill of Materials (SBOM)?

SBOMs are perhaps the most important and dramatic shift mandated by the U.S. Executive Order on Cybersecurity.

The executive order’s call to include a software bill of materials, to better secure and track the quality of the software supply chain, has led organizations to double down. Almost every business engaged with the U.S. government has prioritized checking that box: shipping a product with a list of all the components used to build it.

What this article and the previous quotes point out, though, is that checking that box will no longer be acceptable. That list of parts may very well be used to build a case for liability where a product was shipped with known vulnerabilities. That surely isn’t secure by default or secure by design.
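To make this concrete, here is a minimal sketch, in Python and using only the standard library, of the kind of check that turns an SBOM from a box-ticking artifact into something actionable: reading a CycloneDX-style JSON SBOM and flagging any listed component that matches a known-vulnerable version. The hard-coded vulnerability list and the file path are illustrative assumptions; a real check would query a vulnerability database such as OSV or the NVD.

```python
import json
import sys

# Illustrative assumption: a tiny, hard-coded set of known-vulnerable
# (coordinate, version) pairs. In practice this would come from a
# vulnerability database, not a literal in the script.
KNOWN_VULNERABLE = {
    ("org.apache.logging.log4j:log4j-core", "2.14.1"),
    ("org.apache.logging.log4j:log4j-core", "2.15.0"),
}


def flag_vulnerable_components(sbom_path):
    """Read a CycloneDX-style JSON SBOM and return components on the known-vulnerable list."""
    with open(sbom_path) as handle:
        sbom = json.load(handle)

    findings = []
    for component in sbom.get("components", []):
        group = component.get("group", "")
        name = component.get("name", "")
        coordinate = f"{group}:{name}" if group else name
        version = component.get("version", "")
        if (coordinate, version) in KNOWN_VULNERABLE:
            findings.append(f"{coordinate}@{version}")
    return findings


if __name__ == "__main__":
    for finding in flag_vulnerable_components(sys.argv[1]):
        print(f"Known-vulnerable component shipped: {finding}")
```

Run against the SBOM you already ship, a check like this is the difference between a parts list sitting in the glove compartment and a parts list someone actually reads.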

This is something we’ve talked about as well. In our piece on the shifting landscape of software supply chain attacks, we also drew analogies to the automotive manufacturing industry. Specifically, it would be unacceptable to assume that placing a list of parts in the glove compartment of every vehicle would do anything to help identify and address safety and security issues!

In contrast, through government recommendations and regulations, the automotive industry, like most other manufacturing industries, provides capabilities such as intelligent recalls. These systems are designed to protect end users, not hold them responsible for understanding the quality of parts in the products they purchase. In short, while providing an SBOM will remain a requirement, it will no longer be the only requirement.

Solving Cybersecurity Risk and Liability Today

For most organizations, though, this is a daunting task. Nearly every week there appears to be a new vulnerability associated with open source software. However, our research shows these vulnerabilities account for only a small portion of the risk associated with open source: around 4%, to be specific.

Last year, in our 2022 Software Supply Chain report, we showed data indicating that 96% of components downloaded with a known vulnerability had a non-vulnerable version available. We also saw that, months and even more than a year after the Log4shell disclosure, nearly 30% of Log4j downloads were still of vulnerable versions of the framework.

This indicates that the problem isn’t with open source itself, or with shipping products that later become vulnerable; it is a problem of consuming low-quality parts. We believe these are the types of activities new legislation will target. Luckily, tools, best practices and processes are available to address this ahead of any recommendation or regulatory change, as sketched below.
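One such practice is a consumption-time policy gate: before a dependency enters a build, compare the requested version against a minimum acceptable version and block anything below the floor. The sketch below, again in Python, shows the idea; the component coordinate and version floor are illustrative assumptions, and a real gate would draw its policy from a curated vulnerability feed and run inside the build tool or repository manager rather than as a standalone script.

```python
# Illustrative assumption: a policy mapping component coordinates to the
# minimum version a build is allowed to consume.
MINIMUM_SAFE_VERSIONS = {
    "org.apache.logging.log4j:log4j-core": (2, 17, 1),
}


def parse_version(version):
    """Turn a dotted version string like '2.14.1' into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split(".") if part.isdigit())


def is_allowed(coordinate, version):
    """Return False when the requested version falls below the policy floor."""
    floor = MINIMUM_SAFE_VERSIONS.get(coordinate)
    if floor is None:
        return True  # No policy entry for this component; allow by default.
    return parse_version(version) >= floor


if __name__ == "__main__":
    requested = [
        ("org.apache.logging.log4j:log4j-core", "2.14.1"),  # vulnerable to Log4Shell
        ("org.apache.logging.log4j:log4j-core", "2.20.0"),
    ]
    for coordinate, version in requested:
        verdict = "ALLOW" if is_allowed(coordinate, version) else "BLOCK"
        print(f"{verdict}: {coordinate}@{version}")
```

The point isn’t the ten lines of code; it’s that blocking a known-bad version before it is consumed is far cheaper than defending the decision to ship it afterward.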

Understanding the Changes Ahead

There’s a lot more in Easterly and Goldstein’s article, including establishing secure operating practices for software organizations (Spoiler: there are tools, best practices and processes available for this as well). We highly recommend reading it to form your own conclusions and better understand the changes ahead.

Organizations should be preparing now for the inevitability that the products they produce must be secure by default and secure by design. It is a duty of care they owe their end users.

Easterly and Goldstein aren’t shy about calling this out directly:

“Building on this progress will require U.S. agencies to impose increasingly stringent secure-by-default and secure-by-design requirements in the federal procurement process, which will help prompt market changes toward creating a safer cyberspace ecosystem.”

Today, the nature of attacks and cybersecurity risk goes well beyond compromised end-user data: national infrastructure and the lives of consumers are also at risk. It’s time for organizations to take responsibility for that risk.
