> From: [email protected]
> 1. the time we want to use (something like local time, UTC, GPS time, TAI)
> 2. the representation of this time in a computer/FPGA/PLC/IOC/...
1. Time for humans
2. Time for machines
Possibly create a small library/network service to convert from one to the other.
For 2, you probably will have to use several timing sources.
> Regarding 1.:
>
> In my eyes using daylight savings time is calling for trouble. Think
> about an archiver that has to deal with gaps and getting samples for the
> same hour twice. This is also about all the clients, scripts, and
> operators that need to keep this in mind.
Do you know how this is solved in the archiver applications? (EPICS timestamps)
Better yet, do you know how the archiver is used?
I believe operators do not care so much about time accuracy.
They navigate the archiver looking for events.
"When was this changed?"
"What happened then?"
To look for events, operators use approximate time.
"Give me the data for the past 10 days"
Then they find the event and zoom on it.
What matters to them is that the data can be correlated (i.e., when the event occurred, you have an accurate state of the machine).
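
To make the "samples for the same hour twice" problem concrete, here is a rough sketch (Python; the fall-back date and offsets are for US Eastern time in 2013, since Michigan came up):

# When clocks fall back, the same local wall-clock label covers two
# different instants, so "give me 01:00-02:00 local" matches two hours
# of archived data.
from datetime import datetime, timedelta, timezone

EDT = timezone(timedelta(hours=-4))  # Eastern Daylight Time
EST = timezone(timedelta(hours=-5))  # Eastern Standard Time

# On 2013-11-03 the clocks fall back at 02:00, so 01:30 local occurs twice.
first  = datetime(2013, 11, 3, 1, 30, tzinfo=EDT)
second = datetime(2013, 11, 3, 1, 30, tzinfo=EST)

print(first.astimezone(timezone.utc))   # 2013-11-03 05:30:00+00:00
print(second.astimezone(timezone.utc))  # 2013-11-03 06:30:00+00:00

If the archiver stores UTC (or another monotonic scale) and converts to local time only for display, the ambiguity goes away.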
> Using local (winter) time might be ok for the EPICS environment itself.
Winter time? No idea it was that cold in Michigan! Come to California ;-)
> But it will be difficult to tell which of two firmware files generated
> in different time zones is newer. And we probably do want to use the
> same time for everything...
Given an infinite amount of money, time, and resources... yes. Otherwise, that's wishful thinking.
In practice, you need to look at the timing requirements of each system.
Are there issues with synchronization or accuracy? What about jitter, drift, etc.?
How reliable/trustworthy does your timing source need to be?
For accuracy and reliability, don't build an infrastructure on software/network-based clocks; use hardware.
> If you want to get rid of leap seconds you can go for GPS time [1] or
> International Atomic Time (TAI, [2]). Both do not have leap seconds and
> thus slowly diverge from UTC (right now GPST = UTC + 16 s, TAI = UTC +
> 35 s).
> TAI and GPST seem to be ideal to me for an accelerator control system.
> They can be easily derived with high precision from GPS clocks. The
> downside is that operators have to work with different times on their
> GUI and their watch.
GPS doesn't work indoors. How will you route your timing signals? Network? Dedicated copper wires?
How you bring the timing signals to the equipment should also be taken into consideration.
Are your operators interested in exact timing (where 16 s is an issue)? Not at my site, a light source.
(Most EPICS facilities are light sources BTW. JLAB is the only other nuclear facility that uses EPICS. Find your peer there!)
But I can tell you that if they were, I would not ask them to do mental exercises.
The first thing they will ask you is to fix it; if you can't, they will wonder why you were picked for that job!
(solution: maybe use a conversion utility?)
To me, this is a classic human-computer interface issue.
The high-level software is designed for humans (think productivity).
The low-level software is designed for machines (think reliability/accuracy).
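
The kind of conversion utility I have in mind can start very small. A rough sketch in Python; the 16 s and 35 s offsets are the values quoted above, which are only valid until the next leap second is announced, so a real tool would read them from a table:

# Display-side conversion: machine time in GPS time or TAI, human time
# (UTC/local) only at the GUI level.  Offsets valid as of today; they
# change whenever a leap second is inserted.
from datetime import datetime, timedelta, timezone

GPS_MINUS_UTC = timedelta(seconds=16)   # GPST = UTC + 16 s
TAI_MINUS_UTC = timedelta(seconds=35)   # TAI  = UTC + 35 s

def gpst_to_utc(t):
    """Convert a naive datetime on the GPS time scale to UTC."""
    return (t - GPS_MINUS_UTC).replace(tzinfo=timezone.utc)

def tai_to_utc(t):
    """Convert a naive datetime on the TAI scale to UTC."""
    return (t - TAI_MINUS_UTC).replace(tzinfo=timezone.utc)

# A machine timestamp of 2013-07-01 12:00:16 GPST is shown to the
# operator as 2013-07-01 12:00:00 UTC:
print(gpst_to_utc(datetime(2013, 7, 1, 12, 0, 16)))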
> Regarding 2:
>
> It seems like most of our current systems use POSIX timestamps or EPICS
> timestamps for internal representation of time. Unfortunately they both
> come with several issues:
>
> 1. 32 bit POSIX timestamps will overflow in 2038 [3], which is within the
> lifetime of the facility.
> 2. POSIX as well as EPICS timestamps do not provide a representation for
> leap seconds.
> 3. Leap seconds make it much more difficult to calculate the time
> elapsed between two timestamps (which is what nuclear physicists usually
> do when working with their data).
Based on your description, I would say that you need to have different timing systems.
Once, I was asked about monitoring the on/off/standby states of our modulators.
I first modified the EPICS driver and implemented the feature in the IOC.
But the IOC needs to be up. Because we wanted statistics for each month, the developer needs to make sure the timing is working.
The computer needs to be properly synched, etc. There are too many ways to break it.
I would always get timings, but I was not sure of their accuracy (is the error 0%, 1%, or 20%?).
OK, then we moved to an external timing source (a piece of hardware whose sole function is to keep track of time). That worked, but the best solution was to ask the modulator supplier to include the feature in its modulators' PLCs.
If your end users (and not operators) need exact timing information, how long are your experiments?
What levels of accuracy and precision are needed?
> We need to make sure we have correct
> leap-second data available for all software that calculates time
> differences. This seems to be practically impossible to me.
Hardware-based timestamps?
But only you know your users and applications.
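
To make the elapsed-time problem concrete, a rough sketch (the single table entry is the June 2012 leap second; a real implementation needs the complete, regularly updated list, which is exactly the maintenance burden you describe):

# POSIX time pretends leap seconds do not exist, so a naive difference
# between two UTC timestamps that straddle one is short by one second
# per inserted leap second.
from datetime import datetime, timezone

LEAP_SECONDS_UTC = [
    # leap second inserted at the end of 2012-06-30
    datetime(2012, 7, 1, tzinfo=timezone.utc),
]

def elapsed_seconds(t0, t1):
    """Physically elapsed seconds between two UTC timestamps."""
    naive = (t1 - t0).total_seconds()
    inserted = sum(1 for leap in LEAP_SECONDS_UTC if t0 < leap <= t1)
    return naive + inserted

t0 = datetime(2012, 6, 30, 23, 59, 59, tzinfo=timezone.utc)
t1 = datetime(2012, 7, 1, 0, 0, 1, tzinfo=timezone.utc)
print((t1 - t0).total_seconds())  # 2.0 -> what POSIX-style arithmetic gives
print(elapsed_seconds(t0, t1))    # 3.0 -> 23:59:59, 23:59:60, 00:00:00, 00:00:01

With hardware timestamps on a scale without leap seconds (TAI, GPS time), the plain subtraction is already correct.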
> There are still some open questions:
> What effects do different timestamps have on network connections that
> cross the office/accelerator network boundary? Some things that come to
> my mind:
Here you will have many problems.
Why do you want your OS to have the same timing requirement as the data collected by your end-user applications?
Use a standard NTP server, and when you collect your data use timestamps from another source (hardware?).
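
Roughly the separation I have in mind (a sketch only; read_hardware_timestamp() is a hypothetical placeholder for whatever your timing hardware provides):

# Keep the two concerns separate: the NTP-disciplined OS clock is only
# used for logs and bookkeeping, while archived data carry a timestamp
# taken from a dedicated timing source.
import time
from dataclasses import dataclass

@dataclass
class Sample:
    value: float
    hw_timestamp: float  # machine time scale, from the timing hardware
    os_timestamp: float  # NTP-disciplined OS clock, bookkeeping only

def read_hardware_timestamp():
    # Hypothetical placeholder: stands in the monotonic clock; a real
    # system would read a timestamp latched by the timing hardware
    # (event receiver, GPS receiver, ...).
    return time.monotonic()

def acquire(value):
    return Sample(value=value,
                  hw_timestamp=read_hardware_timestamp(),
                  os_timestamp=time.time())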
> What else do we need to keep in mind? Can anyone provide some experience
> with running a control system on a time that is different from local
> time/UTC?
In our case, we have a control room which can "attach" to different machines in different time zones. (We are unique that way.)
Because our remote machines use their respective local time zones, we have to go through a few mental exercises of subtracting or adding hours in our remote archivers. It could have been implemented differently (force everyone to use a universal time), but we decided to put the burden on ourselves and not on our users/clients/operators.
> Martin
Comments?
--
Emmanuel