There's a new Draft for Comments available on the NIST website of a publication entitled Systems Security Engineering: Cyber Resiliency Considerations for the Engineering of Trustworthy Secure Systems.
You can download the PDF file from the above page. Quoting from the announcement:
"Certain types of advanced threats have the capability to breach our critical systems, establish a presence within those systems (often undetected), and inflict immediate and long-term damage to the economic and national security interests of the Nation."
I think we need to take these threats seriously and ensure that the systems under our control are as secure as we can make them. The publication itself quotes Carl Landwehr in the February 2015 issue of Communications of the ACM:
"This whole economic boom in cybersecurity seems largely to be a consequence of poor engineering."
Be careful out there!
This sounds like an Organisation justifying its existence.
"This whole economic boom in cybersecurity seems largely to be a consequence of poor engineering."
I disagree - it's about a lack of knowledge/understanding, and about educating engineers in best practices - and I'm no better or worse than my fellow designers!
"....we must develop trustworthy, secure IT components, services, and systems that are cyber resilient. Cyber resilient systems are those systems that have required security safeguards "built in" as a foundational part of the system architecture and design; and moreover, display a high level of resiliency which means the systems can withstand an attack, and continue to operate even in a degraded or debilitated state --carrying out mission-essential functions."
Fine words - you design a system that's 100% resilient and we'll prove it isn't 100%!
>Fine words - you design a system which is 100% resilient and
>we'll prove it isn't 100%!
Well, 100% is a fine goal, but it will always be aspirational. Think of it as a never-ending creativity contest between a remote adversary and your engineering.
I do imagine that NIST is under a lot of pressure to address demonstrated flaws in control system security, and the resiliency approach adds an additional layer of thinking onto the issue -- a sort of belt-and-suspenders (uh, that's "braces" to you Brits) approach.
What happens if there is a security breach? Are other systems in place to prevent catastrophic damage? Can we detect and recover quickly?
In some ways, this is just good engineering -- eliminating single points of failure in critical systems, except that the "single point" has to be viewed as the collection of all things that can be remotely compromised. An interesting problem.
Cybersecurity is all about defense in depth--layers of protection rather than assuming one defense is foolproof protection. Poor engineering has often compromised what should have been easy layers--hardcoded passwords on embedded systems for example.
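To make the hardcoded-password point concrete, here's a minimal sketch (hypothetical names; `DEVICE_PASSWORD` is just an illustrative variable) of the easy layer so often skipped: refusing to fall back to a built-in default credential and instead requiring one to be provisioned at deployment time.

```python
import os

def get_device_password() -> str:
    """Fetch the device credential from the environment at startup,
    rather than baking a default password into the firmware image.

    DEVICE_PASSWORD is a hypothetical variable name for illustration;
    a real system might use a secrets manager or secure element instead.
    """
    password = os.environ.get("DEVICE_PASSWORD")
    if not password:
        # Fail closed: better to refuse to start than to ship a
        # well-known default credential that every attacker can look up.
        raise RuntimeError("DEVICE_PASSWORD not provisioned; "
                           "refusing to fall back to a default")
    return password
```

The design choice here is simply to fail closed: a device that won't boot without a provisioned credential is annoying; a fleet of devices all sharing "admin/admin" is a botnet waiting to happen.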
You also have to know what your level of acceptable risk is--just like in functional safety, there is no 100%, there is a judgement call on what level of risk is acceptable. Standards groups can provide guidance on what levels of risk are acceptable for different industries and processes.
It would at least be nice to raise the cybersecurity bar from high-school kids to nation-state-level advanced persistent threats wherever we can.