Subject: Re: Jenkins test failures on macOS
From: "Johnson, Andrew N. via Core-talk" <core-talk at aps.anl.gov>
To: Michael Davidsaver <mdavidsaver at gmail.com>
Cc: EPICS core-talk <core-talk at aps.anl.gov>
Date: Tue, 4 Aug 2020 19:51:51 +0000
On Aug 4, 2020, at 11:14 AM, Michael Davidsaver <mdavidsaver at gmail.com> wrote:

On 8/4/20 9:09 AM, Johnson, Andrew N. wrote:
On Aug 3, 2020, at 9:37 PM, Michael Davidsaver <mdavidsaver at gmail.com <mailto:mdavidsaver at gmail.com>> wrote:

On 8/3/20 6:07 PM, Johnson, Andrew N. wrote:
No. I get non-zero milli- and microseconds in my time stamps now, but (counting after the decimal point) digits 7 through 9 of a %.9f seconds value are always zero.

Ok, then this seems likely to be no worse than Windows.
Out of curiosity, have you checked (eg. with epicsTimerTest)
to see what the minimum delta actually is?

In the Github issue yesterday I explained that on my laptop reading the time using the Mach clock_get_time() API takes about 1.6 microseconds and gives nanosecond precision, whereas using clock_gettime() takes only about 58 nanoseconds but gives microsecond precision (as averaged over 100000 readings by epicsTimeTest). Which is better?
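
For illustration only, a rough standalone sketch (an assumption, not the epicsTimeTest code; macOS-specific and needing 10.12 or later for clock_gettime()) of how the per-call cost of the two APIs could be compared:

    /* Assumed benchmark sketch: average the per-call cost of the Mach
     * clock_get_time() and the POSIX clock_gettime() over N samples.
     * Compile on macOS, e.g. cc -O2 clockbench.c */
    #include <stdio.h>
    #include <time.h>
    #include <mach/clock.h>
    #include <mach/mach.h>

    static double secsNow(void)
    {
        /* Monotonic clock, used only to time the loops themselves */
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return ts.tv_sec + ts.tv_nsec * 1e-9;
    }

    int main(void)
    {
        enum { N = 100000 };    /* same sample count as quoted above */
        clock_serv_t cclock;
        mach_timespec_t mts;
        struct timespec ts;
        double t0, t1;

        host_get_clock_service(mach_host_self(), CALENDAR_CLOCK, &cclock);

        t0 = secsNow();
        for (int i = 0; i < N; i++)
            clock_get_time(cclock, &mts);
        t1 = secsNow();
        printf("clock_get_time(): %.1f ns/call\n", (t1 - t0) / N * 1e9);

        t0 = secsNow();
        for (int i = 0; i < N; i++)
            clock_gettime(CLOCK_REALTIME, &ts);
        t1 = secsNow();
        printf("clock_gettime():  %.1f ns/call\n", (t1 - t0) / N * 1e9);

        mach_port_deallocate(mach_task_self(), cclock);
        return 0;
    }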

I was referring to the smallest non-zero delta in the time
values returned by consecutive calls.  aka. is it really
microsecond precision?

I don’t see where epicsTimerTest is measuring that at all. There is something like that in epicsTimeTest for characterizing the monotonic time routines but not for the wall-clock time (and the calculations in the crossCheck() measurements are a bit strange).

All the time-stamps I now see from the IOC have 6 varying digits after the decimal point, then 3 zeros. The code in database/test/std/rec/simmTest now displays time-stamp differences to 9 decimal places and the last 3 of those are always 0; the smallest non-zero value I can find in those differences is 0.000001000 sec, so yes I believe it is really giving clock times with microsecond resolution.
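
A minimal standalone check of the same kind (an assumed sketch, not the simmTest code): take consecutive clock_gettime(CLOCK_REALTIME) readings and report the smallest non-zero difference between them, printed to 9 decimal places as above.

    /* Assumed sketch: smallest non-zero difference between consecutive
     * clock_gettime(CLOCK_REALTIME) readings.  With a clock that only
     * advances in microsecond steps this should report 0.000001000. */
    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        enum { N = 100000 };        /* arbitrary sample count */
        struct timespec prev, now;
        long long minDelta = -1;    /* smallest non-zero delta, in ns */

        clock_gettime(CLOCK_REALTIME, &prev);
        for (int i = 0; i < N; i++) {
            clock_gettime(CLOCK_REALTIME, &now);
            long long delta = (now.tv_sec - prev.tv_sec) * 1000000000LL
                + (now.tv_nsec - prev.tv_nsec);
            if (delta > 0 && (minDelta < 0 || delta < minDelta))
                minDelta = delta;
            prev = now;
        }
        printf("smallest non-zero delta: %.9f sec\n", minDelta * 1e-9);
        return 0;
    }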

- Andrew


-- 
Complexity comes for free, simplicity you have to work for.


