Background
I’m going to assume that you’re in some way familiar with the state of the world regarding cyberattacks. It’s a big, scary problem and isn’t going away anytime soon.
Those trying to effect change are using legislation as the big stick, with various types of carrots as encouragement (and some go-to-jail, do-not-pass-go style sticks too).
The software industry certainly needs ‘encouragement’. It’s not our fault that bad guys are exploiting our software, but whether you like it or not, we developers are at the forefront of the cybercrime battle. A battle we’re mostly not equipped to fight - yet.
For a very long time, the IT industry has shielded developers from the gory details of cyberattacks. Initially, it was to prevent discovered vulnerabilities from being exploited by copycat attackers, and later, because dealing with vulnerabilities frequently impacted delivery.
By handing off the primary responsibility for security, we have been slowly de-educating our developers.
The cyberattack landscape has changed dramatically over the last 10 years, but the software industry has been operating as if the Equifax hack hadn’t happened, or at the very least, couldn’t happen to them. Software legislation and some of those related sticks have been a rude wake-up call for many organisations.
That 4 am feeling
That wake-up call has been more like a bucket of water in the face for many: the left-field emergence of AI as a potent business differentiator and disruptor, and as a significant weapon in the hands of the bad guys, has added more legal concerns for anyone involved in the software business. And as is often said, everyone’s in the software business now.
Digging Deeper
Although there are many discrete legislative or regulatory activities underway, they can be distilled into some essential common elements:
Proof
Evidence, audit trails - call it what you will, but the underlying principle is that the decisions made about the software you use, the software you ship, those AI models, the data, and so on have to be recorded, kept safe, and protected from tampering.
From the first selection of a dependency (or any software artefact) by the developer through the software supply chain and onward to the consumer, recording the decisions made is becoming imperative.
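To make that concrete, here is a minimal sketch of what recording such a decision could look like. It is not a standard; the log path, field names, and example values are all hypothetical. The idea is simply that each dependency choice is appended to a hash-chained log, so the record of who decided what, and when, cannot be quietly rewritten later.

    # Illustrative sketch only: append a dependency-selection decision to a
    # hash-chained log file so that any later tampering can be detected.
    # The log path, field names, and example values are hypothetical.
    import datetime
    import hashlib
    import json
    import pathlib

    LOG = pathlib.Path("decision-log.jsonl")

    def record_decision(component, version, reason, decided_by):
        prev_hash = "0" * 64
        if LOG.exists():
            lines = LOG.read_text().splitlines()
            if lines:
                prev_hash = json.loads(lines[-1])["entry_hash"]
        entry = {
            "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "component": component,
            "version": version,
            "reason": reason,
            "decided_by": decided_by,
            "prev_hash": prev_hash,  # chains this entry to the previous one
        }
        entry["entry_hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        with LOG.open("a") as f:
            f.write(json.dumps(entry) + "\n")

    record_decision(
        "requests", "2.32.3",
        "HTTP client; maintenance activity and CVE history reviewed",
        "dev-team-a",
    )

In practice, this role is increasingly filled by SBOM formats and signed attestations rather than home-grown logs; the point of the sketch is only that the decision, the decider, and the time are captured somewhere tamper-evident.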
Responsibility
There are legal and moral responsibility elements here. Shipping safe and secure software that will not cause harm is the legal responsibility of commercial software producers and the ethical responsibility of all software creators. It’s no longer reasonable (if it ever was) to create software that could be easily exploited.
Inadvertent design flaws and coding errors aside, creating software that ships insecure by default, or that implies that “using this software safely is the consumer’s problem”, is no longer acceptable.
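As a small, hypothetical illustration of the secure-by-default idea (the names and settings below are not from any particular framework): make the safe options the defaults, so that weakening them requires an explicit, reviewable decision rather than the other way around.

    # Hypothetical illustration of secure-by-default configuration:
    # the safe choices are the defaults; opting out has to be explicit.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ServerConfig:
        verify_tls: bool = True                 # certificate checks on by default
        debug_endpoints: bool = False           # diagnostics not exposed by default
        admin_password: str | None = None       # no built-in default credentials
        allowed_origins: tuple[str, ...] = ()   # nothing allowed until listed

    # The insecure variant has to be asked for explicitly, in code that can be reviewed:
    dev_only = ServerConfig(verify_tls=False, debug_endpoints=True)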
Accountability
For things to change, governments need to create incentives, set out consequences and provide details on how to gain one and avoid the other. However, they need to choose the target for this attention wisely; otherwise nothing happens, or worse, unintended consequences occur.
Right now, and this may change and become more nuanced, the focus is on those who have a commercial interest in the software being created or shipped, as well as those who influence it. In essence, if you have a commercial software product (even if you give it away for free or make it open-source), then your organisation is accountable.
If there is a connection between that software and your business model, then you have some accountability. How liable you are is complicated and will require your lawyers to assess.
If you contribute to an open-source project in your own time, then there is no accountability. If you are paid to contribute to an open-source project, then your organisation is theoretically accountable and liable; it’s going to depend on how much influence you have. See, I told you it was complicated.
Due Diligence
There is no get-out-of-jail-free card available for anyone. Whether you are legally accountable or not, we’re all in this together. Whether it’s an open source project or a commercial offering, the software world we live in is fundamentally interconnected. Without open source, there would be almost no commercial offerings, and the technology innovation that drives the 21st Century would stall.
To keep the momentum and steer towards safer software, we have to take responsibility for our part in the software supply chain and do the right thing.
In this case, the right thing means applying due diligence to the software we consume, the ways we create new offerings and what we ship.
It’s no longer acceptable to claim that a failing in your software is the fault (and hence the responsibility) of the creators of a component you downloaded from an open source repo - unless you have applied reasonable efforts to evaluate the component and can prove it.
The same applies to any component that said component requires. It is your responsibility to assess what’s in your product.
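As one small, standard-library-only example of where that assessment starts in a Python project: enumerate everything actually installed in the environment, including the transitive dependencies you never chose directly, before applying any deeper review (licence, provenance, known vulnerabilities, maintenance) to each item on the list.

    # List every package installed in the current Python environment, including
    # transitive dependencies, as the raw input for a due-diligence review.
    from importlib.metadata import distributions

    for dist in sorted(distributions(), key=lambda d: d.metadata["Name"].lower()):
        requires = dist.requires or []
        print(f"{dist.metadata['Name']}=={dist.version}  "
              f"(declares {len(requires)} dependencies of its own)")

Listing is only the first step; tools that map such inventories against vulnerability databases and produce SBOMs exist for most ecosystems, but the responsibility for acting on what they report stays with you.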
Next Steps
If you take a step back and look at what I’ve laid out, you’ll see that most of this work can be achieved through a solid software engineering approach. There are parts, such as the due diligence element, that go beyond this, but mostly it’s about rediscovering software engineering and helping our development teams think more wisely about their choices.
Does this impact productivity or the pace of innovation? That might be the initial perception, but my experience and that of many others has been that in life, slow is smooth and smooth is fast.
REFLEX
It is an educational attempt to help developers rediscover software engineering in this new world of AI, bad guys, and legislation. It attempts to add the due diligence, evidence, and audit processes needed, and to teach developers about the bad guys in ways they will understand, so that they can play an essential part in creating the robust and safe software we need.
REFLEX is what it says. It attempts to instil the right kind of developer muscle memory in how software is created: a way to make that thinking a habit, not just a checklist.