
Subject: Re: Channel connection performance
From: Matt Newville via Tech-talk <tech-talk at aps.anl.gov>
To: "Hu, Yong" <yhu at bnl.gov>
Cc: "Veseli, Sinisa" <sveseli at anl.gov>, "tech-talk at aps.anl.gov" <tech-talk at aps.anl.gov>
Date: Tue, 26 Jan 2021 17:28:51 -0600
You might find the write-up at http://pyepics.github.io/pyepics/advanced.html#strategies-for-connecting-to-a-large-number-of-pvs useful too. The Python objects created there add only minimal overhead: most of the time is spent waiting for the initial connection of the channel, and the CA calls shown there are nearly exactly what you would see with C and libCA.
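
For anyone who wants to try that pattern directly, a minimal sketch with pyepics might look roughly like the following (the PV names, timeout values, and result handling here are illustrative, not taken from the linked write-up):

import time
import epics

# Placeholder PV names; replace with your own list (e.g. read from a file).
pvnames = ['X%d' % i for i in range(1, 1001)]

t0 = time.time()
# Creating the PV objects starts all of the CA searches in the background
# without blocking, so the connections proceed essentially in parallel.
pvs = [epics.PV(name, connection_timeout=5.0) for name in pvnames]

# Only now wait for the connections and fetch the values; nearly all of the
# elapsed time is spent here, waiting on the network rather than in Python.
values = {pv.pvname: pv.get(timeout=5.0) for pv in pvs}
print('connected and read %d PVs in %.3f s' % (len(pvs), time.time() - t0))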

As Sinisa and Yong Hu found, connecting to and getting 1000 PVs can be as fast as 0.25 seconds, but probably not much faster than that.


On Tue, Jan 26, 2021 at 5:11 PM Hu, Yong via Tech-talk <tech-talk at aps.anl.gov> wrote:

On an ordinary Linux server, I did a few tests using cothread, a Python binding for EPICS V3 Channel Access, and got results similar to Sinisa's:

 

# First, get a list of PVs (e.g. from a file). Connecting to 1269 PVs (which come from many different IOCs) takes about 0.3 seconds

>>> len(pvs)

1269

>>> pvs[:10]

['ACC-TS{EVG-AcTrig}Bypass-Sel', 'ACC-TS{EVG-AcTrig}Divider-SP', 'ACC-TS{EVG-AcTrig}Phase-SP', 'ACC-TS{EVG-AcTrig}SyncSrc-Sel', 'ACC-TS{EVG-Dbus:0}Map-Sel', 'ACC-TS{EVG-Dbus:0}MapConv-Sel_', 'ACC-TS{EVG-Dbus:0}Omsl-FOut', 'ACC-TS{EVG-Dbus:0}Src-Sel', 'ACC-TS{EVG-Dbus:1}Map-Sel', 'ACC-TS{EVG-Dbus:1}MapConv-Sel_']

>>> from cothread.catools import connect

>>> import time

>>> t0=time.time(); results = connect(pvs, cainfo=True, throw=False); t1=time.time(); print(t1-t0)

0.299956083298

 

# Another test, with 38380 PVs: connection takes less than 7 seconds

>>> len(pvs)

38380

>>> t0=time.time(); results = connect(pvs, cainfo=True, throw=False); t1=time.time(); print(t1-t0)

6.67425608635

 

http://controls.diamond.ac.uk/downloads/python/cothread/
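
A self-contained version of the test above might look like this, assuming a file named pv_list.txt (the file name is just an example) with one PV name per line:

import time
from cothread.catools import connect

# Read the PV names, one per line, skipping blank lines.
with open('pv_list.txt') as f:
    pvs = [line.strip() for line in f if line.strip()]

t0 = time.time()
# throw=False returns a per-PV result instead of raising on the first
# failure; cainfo=True requests the full channel information, as above.
results = connect(pvs, cainfo=True, throw=False)
t1 = time.time()
print('connect() on %d PVs took %.3f s' % (len(pvs), t1 - t0))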

 

 

From: Tech-talk <tech-talk-bounces at aps.anl.gov> on behalf of "tech-talk at aps.anl.gov" <tech-talk at aps.anl.gov>
Reply-To: "Veseli, Sinisa" <sveseli at anl.gov>
Date: Monday, January 25, 2021 at 3:48 PM
Cc: "tech-talk at aps.anl.gov" <tech-talk at aps.anl.gov>
Subject: Re: Channel connection performance

 

Hi,

 

If you are already using pvaClientCPP, you could also take a look at the MultiChannel class (examples can be found here: https://github.com/epics-base/exampleCPP/tree/master/exampleClient/src)

 

 

Here is how long it takes on my machine to connect and retrieve values for 1000 CA channels using pvapy (both client and IOC are running on the same machine):

 

>>> import time

>>> from pvaccess import *

>>> cList = ['X%s' % i for i in range (1,1001)]

>>> c = MultiChannel(cList, CA)

>>> t0 = time.time(); pv=c.get(); t1=time.time() ; print(t1-t0)

0.21088194847106934

>>>
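
For comparison, a sketch of the one-channel-at-a-time approach that MultiChannel avoids, using pvapy's Channel class with the same placeholder names; the point is only that the per-channel connection waits add up instead of overlapping:

import time
from pvaccess import Channel, CA

cList = ['X%s' % i for i in range(1, 1001)]

t0 = time.time()
values = []
for name in cList:
    # Each channel is created, connected and read before the next one
    # starts, so every connection latency is paid in full, in sequence.
    c = Channel(name, CA)
    values.append(c.get())
print('sequential gets on %d channels took %.3f s' % (len(cList), time.time() - t0))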

 

Sinisa

 

 

--

Siniša Veseli

Scientific Software Engineering & Data Management

Advanced Photon Source

Argonne National Laboratory

(630)252-9182


From: Tech-talk <tech-talk-bounces at aps.anl.gov> on behalf of Michael Davidsaver via Tech-talk <tech-talk at aps.anl.gov>
Sent: Monday, January 25, 2021 2:19 PM
To: Axel Terfloth <terfloth at itemis.de>
Cc: tech-talk at aps.anl.gov <tech-talk at aps.anl.gov>
Subject: Re: Channel connection performance

 

On 1/25/21 11:00 AM, Axel Terfloth via Tech-talk wrote:
> Hello,
>
> I'm new to EPICS and am taking a closer look at it for evaluation purposes. To get a better understanding I started to dig deeper by implementing some tests based on the C++ examples. From a functional point of view everything I have tried works nicely so far, and performance is generally very good in most of my test scenarios. Nevertheless, one issue I came across is the time required to connect to a channel via PVA. In this scenario I set up 4000 process variables and implement one client that does high-frequency put and get access to a quarter (1000) of those process variables. On my machine (which runs all processes for now) it takes about 250 ms for a single channel connection. Once all channels are connected, everything is really fast. Nevertheless, connecting 1000 PVs takes more than 6 minutes. I currently connect channel by channel using the PvaClient API in my tests. Is there a more efficient way to connect to a large set of channels? I would appreciate any hint.
Can you be more specific about what your application is doing?  A common source
of slowdown with CA or PVA clients is user code that performs network
operations in sequence.  These operations can be issued in parallel, which can give a nice speedup.


You might want to investigate my new PVA library (hopefully a replacement for
the current one), which makes it easy to run many operations in parallel.

https://mdavidsaver.github.io/pvxs/

e.g. the equivalent of 'pvget'

https://github.com/mdavidsaver/pvxs/blob/d0e82744d30256e36585b51af2704cc01a938e18/tools/get.cpp#L109-L132


If this isn't an option, there is the 'pvac' wrapper API, which provides a
similar abstraction, though with more verbose syntax.

http://epics-base.github.io/pvAccessCPP/

e.g. parallel get operations

http://epics-base.github.io/pvAccessCPP/examples_getme.html




