Subject: Re: [EXTERNAL] Re: EPICS Software Supply Chain Risk Management (SSCRM)
From: Gedare Bloom via Tech-talk <tech-talk at aps.anl.gov>
To: Joshua Einstein-Curtis <joshec at radiasoft.net>
Cc: "tech-talk at aps.anl.gov" <tech-talk at aps.anl.gov>
Date: Wed, 12 Jul 2023 09:28:55 -0600
Just to tack on, in case I missed it: the workshop also brought up the NIST recommendations in SP 800-82 (https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-82r2.pdf). We can talk all we want about "security" and "safety", but until we define what those really mean in each context, the discussion is another problem altogether. Especially with the phrase "safety-critical" -- formally verified code does not mean the communication network between systems meets the same level of safety. I really should read those ICALEPCS reports, though.

EPICS not being a source of malware is one thing, but what could happen to an EPICS system if malware is on the network is another. Just saying one has a "trusted developer" is already pushing the risk off onto whoever validated that developer in the first place -- "where the trust falls", if you will. That's one reason InCommon is so interesting to me: it is a pay-to-verify identity system at the organizational level.

Buying a commercial solution outright (EPICS or otherwise) shifts the risk and responsibility for what happens when there is a failure onto the commercial partner. Like the DO-254 certification process -- as long as you know the requirements, it can be certified... until something happens that wasn't in the requirements.

- Josh EC

On Tue, Jul 11, 2023 at 6:48 AM Evans, Richard K. (GRC-H000) via Tech-talk <tech-talk at aps.anl.gov> wrote:

Hi Shen,
Oh, yes. We’ll do all those things too, as appropriate for each system, if that time comes. As I mentioned to Jonathan, my aim in this post was only to share the NASA GRC-ATF rationale for getting EPICS on the agency’s “Assessed and Cleared” list. This has nothing to do with how well suited it is for any specific use; rather, it is focused on explaining why we have confidence that it won’t be a source of malware from a development point of view (hence the whole Trusted Developer/Trusted Repository model).
In other words, we have to show that it is safe for the Agency to install at all before we can even begin to discuss whether it is capable of being used to mitigate/control a given hazard inherent in the system that needs to be controlled. That said, I’m encouraged by your response here. It sounds like if/when we get to that point, there is indeed a path for using it in situations that require safety-critical controls.
Thank you!
/Rich
From: Shen, Guobao <gshen at anl.gov>
Sent: Tuesday, July 11, 2023 8:41 AM
To: Evans, Richard K. (GRC-H000) <richard.k.evans at nasa.gov>; Jonathan Jacky <jon.p.jacky at gmail.com>
Cc: tech-talk at aps.anl.gov; S Banerian <banerian at uw.edu>
Subject: Re: [EXTERNAL] Re: EPICS Software Supply Chain Risk Management (SSCRM)
Rich,
To be compliant with the cybersecurity requirements, you might also consider designing the system as a whole together with your IT infrastructure.
For example, you could deploy a firewall dedicated to your instrument control network to isolate your whole system from the public network, from your campus network, and from your office network (in case you have a dedicated firewall for that), or further adopt other methods (e.g., MFA) to restrict access to your facility's instrument control network, which could reduce the risk.
Together with the other technical solutions mentioned, you might be able to satisfy the needs.
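[Editor's note] A minimal sketch of the segmentation idea described above, in Python for illustration only. The subnets, the MFA-protected gateway host, and the rule logic are assumptions standing in for a real firewall policy, not anything specific to a particular site; it only shows the shape of an allow-list that limits EPICS Channel Access (default ports 5064/5065) to a dedicated control network and one gateway host.

```python
# Hypothetical illustration of the "isolate the instrument control network" idea.
# The subnets and gateway address below are assumptions, not a real site policy;
# in practice this rule would live in the dedicated firewall, not in Python.
import ipaddress

ALLOWED_SOURCES = [
    ipaddress.ip_network("10.10.0.0/24"),  # assumed instrument control network
    ipaddress.ip_network("10.20.0.5/32"),  # assumed MFA-protected gateway host
]
CA_PORTS = {5064, 5065}  # default EPICS Channel Access ports


def is_allowed(src_ip: str, dst_port: int) -> bool:
    """Return True if traffic to a Channel Access port comes from an allowed subnet."""
    if dst_port not in CA_PORTS:
        return True  # outside the scope of this rule set
    addr = ipaddress.ip_address(src_ip)
    return any(addr in net for net in ALLOWED_SOURCES)


if __name__ == "__main__":
    # Office-network traffic to CA would be dropped; control-network traffic passes.
    print(is_allowed("192.168.50.7", 5064))  # False
    print(is_allowed("10.10.0.42", 5064))    # True
```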
Thanks,
Guobao