EPICS Controls Argonne National Laboratory

Experimental Physics and
Industrial Control System


Subject: Re: Safely proceeding machine learning applications
From: Joshua Einstein-Curtis via Tech-talk <tech-talk at aps.anl.gov>
To: tech-talk at aps.anl.gov
Date: Tue, 29 Aug 2023 10:40:07 -0600
Tong,

I think about this a lot when working with our ML models -- and the best I can come up with are guidelines similar to those in any safety-critical design:

- Don't let your controller even have the ability to output something that might damage anything

I am not a fan of relying on access controls as any sort of primary safeguard, since those sit outside the purview of the controller itself. If a controller has the capability to damage something (in the PPS or MPS sense), that alone feels like a huge risk; seeing PID loops go wrong in RF really highlights that. On the flip side, I love access controls for mitigating possible configuration errors -- having something pop up if you write the wrong PV by mistake is critical. But where that is controlled and who configures it is an interesting question. I'd rather run a pva/ca proxy on the same machine as the controller and build the access controls right into it.
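One concrete way to keep the controller from ever being able to output something damaging is a guard layer that clamps and rate-limits every setpoint before it leaves the optimizer. A minimal sketch in Python -- the function name, limits, and units are hypothetical, and this is independent of any particular EPICS client library:

```python
# Hypothetical guard layer between an ML optimizer and the machine:
# every proposed setpoint is clamped to hard limits and rate-limited,
# so the optimizer physically cannot request a damaging value.

def guard_setpoint(proposed, current, lo, hi, max_step):
    """Clamp a proposed setpoint to [lo, hi] and limit the change per write."""
    # Hard limits first: never leave the safe operating envelope.
    clamped = min(max(proposed, lo), hi)
    # Rate limit second: never step further than max_step per iteration.
    step = clamped - current
    if abs(step) > max_step:
        clamped = current + (max_step if step > 0 else -max_step)
    return clamped

# Example: an optimizer proposes 9.3 while the hard limit is 5.0 and the
# current value is 1.0; the guard moves at most 0.2 per write.
safe_value = guard_setpoint(9.3, current=1.0, lo=0.0, hi=5.0, max_step=0.2)
```

The point is that the guard, not the model, owns the limits -- the ML code can be as audacious as it likes and the worst it can do is crawl toward the envelope one bounded step at a time.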

I'd love to hear other people's thoughts -- this would be a great topic at a workshop.

Josh EC

On Tue, Aug 29, 2023 at 9:52 AM Zhang, Tong via Tech-talk <tech-talk at aps.anl.gov> wrote:

Dear Colleagues,

 

Machine learning applications in accelerator controls are indeed gaining popularity, and there are exciting developments in progress. However, concerns persist regarding equipment protection, particularly when dealing with black-box ML models that may make risky decisions, especially during optimization iterations.

 

When it comes to ML model generation, training on archived data is a viable approach. During the application phase, however, these models may still make risky decisions; even when trained with live data, the risk remains.

 

As far as I know, leveraging the Channel Access security configuration is a sound strategy for managing PV write permissions at a granular level, covering individuals, groups, and workstations. This level of control means the ML code's write permissions can be finely tuned. Still, I wonder: is this approach fully secure?
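For reference, the kind of per-user, per-host write restriction described above is configured through an Access Security (ACF) file loaded on the IOC. A minimal sketch -- the UAG, HAG, and ASG names below are hypothetical examples, not from any real facility:

```
# Restrict writes on ML-touched PVs to one account on one workstation.
UAG(ml_users) { mloptimizer }          # user allowed to write
HAG(ml_hosts) { mlserver01 }           # host allowed to write from
ASG(ML_WRITE) {
    RULE(1, READ)                      # anyone may read
    RULE(1, WRITE) {                   # writes only from ml_users on ml_hosts
        UAG(ml_users)
        HAG(ml_hosts)
    }
}
```

Records opt into the group via their ASG field, e.g. field(ASG, "ML_WRITE") in the database. Note that this protects against misconfigured or unauthorized clients, not against an authorized ML client writing a technically permitted but physically unwise value -- which is why it complements, rather than replaces, device-side protection.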

 

Absolutely, incorporating the machine protection system as the primary safeguard on the device side is crucial. Your valuable insights/experience on this subject are greatly appreciated.

 

Thanks,

Tong

 

--

Tong Zhang, Ph.D. (he/him)

Controls Physicist

Facility for Rare Isotope Beams,

Michigan State University

 


Replies:
Re: Safely proceeding machine learning applications Pete Jemian via Tech-talk
References:
Safely proceeding machine learning applications Zhang, Tong via Tech-talk
