Save Open Source: The Impending Tragedy of the Cyber Resilience Act

By: Dirk-Willem van Gulik, VP, Public Affairs, ASF

TL;DR

Software, including open source, is becoming regulated the world over. This lengthy blog post explains the background to the Cyber Resilience Act in the European Union: what is good about it, what its flaws are, and its likely negative impact on open source. It also explains the arcane process by which the act moves through the EU system, to help you understand the timeline and how to effect change.

If you are looking for a more spoken introduction – Mike Milinkovich at Eclipse gave a very up-to-date and lucid presentation that covers the same ground. If you are more into short calls to action – then try those of GitHub, CNLL (in French), and the Linux Foundation, or the more comprehensive response of the wider industry.

Background

Although the IT industry is still small compared to other large industries and sectors, over the past decades it has become crucial to society. It is now common to see large events in the software and IT industry in the news. And, more often than not, it’s a story triggered by some sort of disaster: a misconfiguration, a bug, or criminals and state actors that “got in” apparently too easily. Poor IT practices now also affect the major industries, from energy and transport to manufacturing, finance, democratic processes, and good government.

Because of this, societies and various governing bodies have certainly taken notice and, as a result, all sorts of software regulation and legislation are being prepared around the world.

History

Engineering history suggests that such regulation is a perfectly normal development. In the late 1800s, the mechanical industry saw incredible growth thanks, in part, to the invention of the steam engine. But as this industry grew, so did the number of accidents with exploding steam boilers – often flattening half of a given town.

After the explosion of the steamship Sultana in 1865, which saw 1,167 people killed, pressure was placed on the industry in the United States. This resulted in the creation of the American Boiler Manufacturers Association (ABMA) to start self-regulation of the industry. It took several hundred more such explosions, and a particularly expensive one in 1905 at a shoe factory in Brockton, Massachusetts, before government policies intervened.

Interestingly, it was not the ABMA that responded to the 1905 disaster, but a group of five engineers: members of the American Society of Mechanical Engineers (ASME), a professional organization of individuals rather than companies. These people wrote the first version of the Boiler Code, which the Massachusetts legislature endorsed shortly thereafter.

In many ways, these engineers, these individual volunteers, “scratched an itch” to solve the problem – much like we do today in open source at the ASF and, for example, at the Internet Engineering Task Force (which sets the standards for the Internet). It was the professional community that solved the problem: not their employers, the industry, or the ABMA.

Situation today

There is currently a lot of legislation in process in almost all parts of the world, with the US and the European Union slightly ahead (and with plenty of coordination between the policy makers of the various countries).

In this blog post we’ll focus on just one for now: the Cyber Resilience Act (CRA) in the EU, as that is “first” from a timeline perspective.

It is by no means the most important piece of legislation. At the ASF we gauge the EU’s Product Liability Directive (which introduces “strict liability” for software), US Executive Order 14028 (“Improving the Nation’s Cybersecurity”), and the US “Securing Open Source Software Act of 2023” as perhaps having an even larger impact.

This may be especially true as the US legislation could set standards for that nation through the National Institute of Standards and Technology (NIST), which typically moves faster than the EU’s standards development (and thus may well end up setting the global standard).

Is This Something That Can Even Be Done?

In day-to-day practice, software developers rarely need to consider regulation (unless they work in some specific field, say medical, aerospace, finance, or nuclear). Open source licenses (on our downstream outflow) and committer license agreements (on our inflow) tend to have far-reaching disclaimers. And we often equate code to codified knowledge or speech.

However, in actual practice things are not that simple. For example, here at the ASF we have had, over the years, to file paperwork to let the Bureau of Industry and Security (BIS) in the United States know the exact location of cryptographic code that we make available for download [https://infra.apache.org/crypto.html]. And code distributed by the ASF cannot be exported (or re-exported) to certain destinations or to people on a certain list.

Cyber Resilience Act

In the EU the Cyber Resilience Act (CRA) is now making its way through the law-making processes (and due for a key vote on July 19, 2023). This act will apply to a wide swath of software (and hardware with embedded software) in the EU. The intent of this regulation is good (and arguably long overdue): to make software much more secure.

The act attempts to do this in a number of ways. The most important is that the CRA will require the market to apply industry good practice to security when designing, building, releasing, and maintaining software. At the most basic level, the CRA formalizes what is by and large already policy at the ASF: manage your bugs and accept, triage, and fix security vulnerabilities. It pairs this with good governance practices, such as registering CVEs when appropriate, writing release notes, and decent versioning (and, in fairness, some of these we should further formalize and improve).

The CRA will also attempt to ensure that any and all software in the European market meets some sort of minimum level of security: by fairly simple self-certification documented in a CE conformity declaration or, for more critical software such as a firewall or a secure cryptographic key enclave, by an actual “real” certification and audit by an external, regulated, notified body. The CRA will also define a number of processes to monitor compliance in the market.

EU policy makers recognize that these “industry best practices” are not yet well defined (within the industry in general, the ASF is the exception, not the rule) — and a lot of the CRA relies on the international standards organizations to create the standards one can use to audit one’s own project (self-certification) or that can be used by external auditors.

There is also an expectation that significant vulnerabilities will get special treatment – and that these will get reported early. More on that later.

Impact on Open Source

If you’ve followed the various blogs and letters, the open source foundations have focused a lot of effort on refining the current wording of the CRA to make open source software “exempt”: i.e., to have the CRA apply only once the code leaves the open source commons, and then continue to apply throughout the entire commercial supply chain – and to stop the CRA from applying when something, e.g. a security fix, comes back and enters the commons again.

By and large, these efforts have not been successful. Successive versions of the documents changed considerably – but not around this specific policy issue. 

To understand why, representatives of the ASF (together with OpenSSL) spoke directly to the EU on July 7 – the first time we were actually able to interact with lawmakers in a meaningful way.

From this conversation, we learned that the policy makers are very aware that open source is crucial to the IT industry — both for “production” and innovation. And, because of this, they want to avoid killing the goose that lays the golden eggs.

On the other hand, the EU lawmakers also realise that open source often makes up 95% or more of the software stack on which a typical European Small and Medium Enterprise (SME) operates, or which it licenses. And it is that entire stack for which the SME, as the party that places it on the market, is liable.

From what we understand, the policy makers assume that these process improvements (and (self-)certification) are costly: on the order of 25% in added cost overhead. This is based on similar regulation recently introduced in the medical sector and on the CRA impact assessment (any proposed EU law needs to have its likely impact documented in economic terms).

So, looking at the whole stack of an SME (i.e., 95% open source, 5% secret sauce), for most European SMEs this extra effort over the full 100% would amount to several times their own engineering effort and hence would not be feasible. Whereas, so the thinking at the EU goes, certifying just the 5 or 10% of the code they build on top of the open source stack is a lot more achievable.
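A back-of-the-envelope sketch of that reasoning (the ~25% overhead figure comes from the impact assessment; the assumption that this overhead scales linearly with the share of the stack being certified is ours, purely for illustration):

```python
# Back-of-the-envelope sketch; assumes the ~25% compliance overhead
# scales linearly with the share of the stack that must be certified.
overhead = 0.25      # compliance cost, as a fraction of engineering effort
own_share = 0.05     # the SME's "secret sauce": ~5% of the total stack

# Certifying the full 100% stack, paid for by a team that only
# engineers 5% of it:
full_stack = overhead * 1.00 / own_share     # = 5.0x their engineering effort

# Certifying only their own code on top of the open source stack:
own_only = overhead * own_share / own_share  # = 0.25x their engineering effort

print(f"full stack: {full_stack:.1f}x; own code only: {own_only:.2f}x")
```

Under those assumptions, certifying the full stack costs an SME five times its entire engineering effort, versus a quarter of it for its own code alone – which is the EU’s rationale for pushing the obligations upstream.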

So, for this reason, the policy makers1 have made it crystal clear to the ASF that they intend to have the CRA apply to open source foundations. The current exceptions for open source cover only pure hobbyists, code that is not used in real life, and things such as mirrors and package repositories like NPM or Maven Central. The way they achieve this is by a presumption of commercial intent whenever the software is used anywhere in a commercial environment.

EU process and the CRA’s current state

A piece of EU legislation is generally drafted by the European Commission (which also prepares things such as impact studies). It is then discussed in Parliament, generally in smaller committees. These committees prepare reports, and the legislation ultimately goes to a plenary session of the Parliament for voting2.

For the CRA the main committees are LIBE, IMCO, and ITRE.

The first, LIBE (Committee on Civil Liberties, Justice and Home Affairs) — where things such as ‘free speech’ are discussed — declined to produce a report. Next, IMCO, the Committee on the Internal Market and Consumer Protection, looked at what is important for consumers and the internal market. It produced a report that was fed into ITRE.

ITRE, the Committee on Industry, Research and Energy, has since produced a consensus document that is expected to be discussed publicly the week of July 17, 2023, and to get its final committee endorsement (committees generally do not vote on things when there is consensus).

Once this completes, the proposal goes to the European Parliament for voting. Depending on how controversial or consensual it is at that time, there may, or may not, be discussion and a free vote.

In the meantime, the third body of the EU – the Council – also prepares its version of the Act. The Council essentially consists of the relevant ministers of each country, who look at it from a national perspective. The three versions (Commission, Parliament, and Council) are then discussed, behind closed doors, in the trilogues – which yield the final version that becomes law.

State of Play

Right now all parties in the lawmaking process are said to have reached rough consensus – and two of them shared with the ASF their opinion that there is no controversy. Also, copies of the various consensus documents have leaked, so we know that they are not far apart, and we can now start to analyze them.

The problems with the CRA for the industry 

The current definitions3 are such that the CRA applies to the ASF, all of its (volunteer) developers, and all our output. And, as the ASF understands from its meeting with policy makers, this was intentional.

There are quite a few concerns with the CRA, but the following are probably the top ones for the ASF community.

No concept of a commons distinct from the commercial market; it is an all-in approach: The first issue is that the CRA takes a binary, all-or-nothing approach. You are either in or you are out. And when you are in, what is applied to you is, essentially, what needs to be applied to a full-blown commercial product that is sold to consumers.

While open source can come close to that (e.g., Apache NetBeans or Apache Zeppelin – albeit not sold), open source generally is not part of that commercial setting. Instead it may be managed as a piece of shared knowledge, a ‘commons’ – much like, for example, academic papers or reference blueprints. The CRA does not acknowledge this, and hence applies in ‘full’ (as opposed to, for example, applying just the elements of the CRA that could make sense in that context, such as good vulnerability handling, versioning, and SBOMs).
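To illustrate what such a commons-appropriate element looks like: a software bill of materials (SBOM) is just structured metadata about what a release contains. A minimal sketch in the CycloneDX JSON format, built here as a Python dict (the component shown is only an illustrative example, not a statement about any real product):

```python
import json

# Minimal SBOM sketch in the CycloneDX JSON format: one library
# component, identified by name, version, and package URL (purl).
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "version": 1,
    "components": [
        {
            "type": "library",
            "name": "commons-text",
            "version": "1.10.0",
            "purl": "pkg:maven/org.apache.commons/commons-text@1.10.0",
        }
    ],
}

print(json.dumps(sbom, indent=2))
```

Producing and publishing this kind of inventory alongside a release is cheap, useful to every downstream consumer, and requires no knowledge of how the code will eventually be used – unlike full product certification.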

The CRA would regulate open source projects unless they have “a fully decentralized development model.” However, a project where a “corporate” employee has commit rights would not be exempt (regardless, potentially, of the upstream collaboration having little or nothing to do with their employer’s commercial product). And some projects, like the venerable OpenSSL project, have an even more complex model.

This turns the win-win of open source on its head. If corporate maintainers are effectively banned, corporations may pull back from allowing their employees to maintain projects, harming the open source innovation ecosystem and, ironically, undermining its resilience and its role as a significant economic and growth generator (€9bn per year according to the EU impact assessment).

It also makes it very hard to see who in the ASF community would do the extra (self-)certification work that the ASF would need to do.

The net effect of this is actually quite broad. To give an example from the Recitals4 (10a) – and there are many such examples:

Similarly, where the main contributors to free and open-source projects are developers employed by commercial entities and when such developers or the employer can exercise control as to which modifications are accepted in the code base, the project should generally be considered to be of a commercial nature.

Here the lack of a transactional connection between those contributors and their commercial employers is problematic. For example, the developer could be an airline pilot employed by a commercial airline (i.e. a commercial entity) who contributes to open source in their spare time: this part of the policy would make that contribution ‘commercial’. And at the ASF, the main contributors (committers) are of course able to exercise a level of control over what goes into a codebase5.

What makes matters worse is that the types of open source organizations most affected are exactly those that, today, tend to have very mature security processes, with vulnerabilities getting triaged, fixed, and disclosed responsibly, with CVEs to match. It is generally further downstream – with the companies that place the product on the market – that the CRA needs to drive significant improvement. It now risks doing the reverse.

The CRA affects projects that are entirely volunteer-led and -driven (such as at the ASF), where no one company has any influence on what the project does and releases. Any project where an employee of a commercial entity has commit rights is affected.

This leads to the problem that both commercial companies and open source projects will need to be much more careful about which committers can work on code, what funding they take, and which patches they can accept.

In the certification regime there is a strong assumption that (self-)certification of modules is ‘transitive’: i.e., that if you build something from certified modules, you only have to certify the few ‘extra’ things you have done yourself. Unfortunately this is not true in general. Certification is very much about showing how you, as the final, liable organization, have made sure that what you delivered is fit for the purpose you delivered it for, in the specific setting at your customer – information that is not available ‘upstream’ at the open source organizations that self-certified the building blocks.

The core of certification is to ascertain that what you release is suitably secure for its intended purpose: specifically, that you have done your security by design, mapped out your threat actors, vectors, and risks, and then made reasonable engineering compromises based on risk.

Unfortunately, in open source we often have no idea how our software is going to be used. And, as we’ve learned (the hard way) over the past decade, it is key for the good governance of our shared commons that we do not discriminate against, or otherwise limit, uses in our licenses (in fact, that is part of the open source definition).

Some of the obligations are virtually impossible to meet: for example, there is an obligation to “deliver a product without known exploitable vulnerabilities”. This is an almost impossible bar, especially as the open source authors neither know, nor have control over, how their code is integrated downstream.
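To make the distinction concrete: checking a release’s own direct dependencies against a public vulnerability database is routine and automatable, as the sketch below against the OSV database (https://osv.dev) shows – the package and version are arbitrary examples. What no upstream author can automate is knowing whether any hit is actually exploitable in a given downstream integration.

```python
import json
import urllib.request

# Sketch: query the public OSV database for known vulnerabilities in
# one pinned dependency. "Known" is easy to check; whether a hit is
# *exploitable* depends on the downstream integration, which the
# upstream author cannot see.
query = {
    "package": {"name": "org.apache.commons:commons-text", "ecosystem": "Maven"},
    "version": "1.9",
}
req = urllib.request.Request(
    "https://api.osv.dev/v1/query",
    data=json.dumps(query).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    vulns = json.load(resp).get("vulns", [])

for v in vulns:
    print(v["id"], "-", v.get("summary", "(no summary)"))
```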

The next problem is around standards. The CRA refers to a large number of yet-to-be-written international standards (generally assumed to be created at CEN-CENELEC). The IT industry in general, and open source in particular, does not have a great track record of working with these standards bodies — in part because almost all key internet standards (including those used at the ASF) are maintained at the IETF and W3C. In fact, it is not uncommon for the bylaws of these standards organizations not to allow open source organizations to be members in any meaningful way.

The CRA requires the disclosure of serious unpatched and exploited vulnerabilities to ENISA (an EU institution) within a timeline measured in hours – before they are fixed. This is the opposite of industry best practice: responsible disclosure once a fix or workaround is available.

Not only does this too-early reporting distract from getting a fix out; for international communities it is easy to run afoul of other countries insisting on the same information or, worse, prohibiting such sharing – thus breaking the very core of the fair and equitable reporting culture that open source relies on.

And, as this information is only useful to ENISA if it is then widely shared, it becomes rational for organizations to choose the prudent, globally ‘fair’ option and take the easy way out: ensure they never hear about such vulnerabilities in the first place. Or the opposite: simply make things public right before the (first) reporting deadline rolls over, i.e., before they are fixed.

So this is yet another example where, with all its good intentions, the CRA may end up accomplishing the exact opposite.

An Effective CRA

Looking at the IT industry in Europe now, one can observe that it is generally not open source (especially that coming from the likes of the ASF) that is the root cause of the sorry state of security in the IT industry. Quite the contrary.

Most SMEs in Europe, in contrast, rarely update their dependencies and are generally not well versed in dealing with security issue reports. And if (regular) updates at the ASF create even more (re)certification work for them, they may become even slower to pick up our updates and security fixes.

However, there is also a lot in the CRA that is feasible, and that we know is likely to be effective – including at the level of open source organizations such as the ASF.

In fact, we do most of this already today: good triage of vulnerability reports, responsible disclosure, registering CVEs, and being careful with version numbers. And to this we apply good governance, with board reporting by the projects, and the occasional project moved to the Attic when its time has come.

The problem is more that the CRA also piles on a whole range of requirements that threaten the very fragile “win-win” of open source contributions and our commons, go against industry good practices, or are downright impossible; i.e., it tries to treat the open source commons identically to the commercial sector.

In fact, the USA appears to realise this, and is taking the path, with NIST, of working with the industry to document these existing good practices.

And to some extent it appears that the US is closer to the historical, engineer- and individual-led ASME process that produced the Boiler Code, while the EU seems to be more on a path of asking the manufacturers rather than the experts.

The Internet routes around issues like this

There is of course an elephant in the room: the well-oiled mechanism by which “the Net interprets censorship as damage and routes around it” (John Gilmore).

We saw that mechanism come into action in the ’90s, when the USA tried to regulate cryptographic software and only “export strength” cryptography could leave the US. That led to a lot of the cryptographic industry and its staff leaving the US, physically and legally, and a move of that industry from the USA to Europe – from where the companies would simply import their code back into the USA, or ship it from Europe, unencumbered by the USA’s BXA rules, to the rest of the world. It took over two decades for this to normalize (and we still have vestiges of that at the ASF).

So, as the ASF, we also need to factor in the risk that our communities may split over the CRA – especially if our European communities are not able to muster enough capacity and capability to implement the CRA at the ASF.

Timeline and actions

The week of July 17, 2023 will see the ITRE vote. This is the parliamentary committee that recommends to the Members of the European Parliament how to vote. Once that is done, the trilogues will likely start after the Summer 2023 recess. If the consensus between the three power holders holds (as it appears to for now), this process may conclude as early as December.

So, in the very short term, one can reach out to the MEPs of ITRE. It generally helps if these messages are polite, sent by a party with some political or economic standing (e.g. a CEO or an SME organization), and tuned to your local setting: addressed to a parliamentarian of your own country, in your own language, and mindful of the political position of the party they represent. As the regulation of open source is intentional, and there is also a lot of common sense and good (open source) practice in the CRA, the expectation is that we are past the point where asking for a blanket exemption is productive.

At the ASF we expect to focus on the Council version (as its text generally ‘wins’, and right now it is a bit better than the ITRE consensus text). For this we can use your help: in particular, if you can help us get the executives of larger SMEs in your country engaged and willing to explain the impact at a national level (just contact the VP of Public Affairs; dirkx(at)apache(dot)org).

  1. i.e. the people at the European Commission (DG-Connect) and the Rapporteurs (the people at the European Parliament) ↩︎
  2. See https://www.consilium.europa.eu/en/council-eu/decision-making/ordinary-legislative-procedure/ for the details. ↩︎
  3. Both in the Council and in the ITRE consensus documents (the latter includes the IMCO input). ↩︎
  4. This is the section of the Act that explains and sets the context for the remainder of the document. It is here that the intent is documented. ↩︎
  5. A much better version would be:

    Similarly, where the main contributors to free and open-source projects are developers employed by commercial entities and when these commercial entities can exercise control as to which modifications are accepted in the code base, the project should generally be considered to be of a commercial nature. 

    But these suggestions have not been taken up – the intention is to cover open source foundations.
    ↩︎