Subject: EPICS channel access performance test
From: tanushyam bhattacharjee <[email protected]>
To: [email protected]
Date: Tue, 10 Jul 2012 00:10:49 +0800 (SGT)
We want to measure Channel Access performance for an integer data type using the "catime" tool. On a linux-x86 platform we got the result we expected, which can be summarized as follows:

(i) The plot of "No. of PVs" vs. "Average Time" looks like a bathtub curve.
(ii) The initial slope shows that the fixed network overhead dominates the per-PV cost.
(iii) The middle plateau shows the optimized performance of CA once the number of PVs is high.
(iv) The final slope shows that the CPU is overloaded by the number of PVs; from this point we can derive the threshold of optimized CA performance for an IOC.

However, when we run the same test on ARM9 and FPGA-MicroBlaze platforms, the final slope never appears in the data: the curve stays on the plateau indefinitely, even though the IOC CPU is overloaded. Can anybody explain the reason?

Tanushyam Bhattacharjee, Shantanoo Sahoo
Variable Energy Cyclotron Centre
Kolkata, India
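For readers unfamiliar with the bathtub shape being described, here is a toy model of the three regimes (falling slope, plateau, rising slope). All the coefficients (`net_overhead`, `per_pv_cost`, `cpu_capacity`) are made-up illustrative numbers, not measurements from catime or any real IOC:

```python
def avg_time_per_pv(n, net_overhead=100.0, per_pv_cost=1.0, cpu_capacity=10000):
    """Toy average round-trip time per PV for a batch of n PVs (arbitrary units)."""
    # Regime (ii): the fixed network/setup overhead is amortized over n PVs,
    # so average time falls as n grows.
    t = net_overhead / n + per_pv_cost
    # Regime (iv): once the IOC CPU saturates, each extra PV adds queueing
    # delay, so average time rises again.
    if n > cpu_capacity:
        t += (n - cpu_capacity) * 0.001
    return t

# The curve falls, flattens (regime iii), then rises again:
small = avg_time_per_pv(10)      # overhead-dominated
mid = avg_time_per_pv(5000)      # plateau
large = avg_time_per_pv(20000)   # CPU-saturated
```

The question then is why, on ARM9 and MicroBlaze, the measured data never leave the plateau even at PV counts where the CPU is known to be saturated.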