
Subject: Re: Maximum archival rate
From: Kay-Uwe Kasemir <[email protected]>
To: Terry Cornall <[email protected]>
Cc: [email protected]
Date: Thu, 29 Mar 2007 09:05:09 -0400

On Mar 29, 2007, at 02:04, Terry Cornall wrote:


Hi all.
I was wondering if anyone has a feeling (or a measured result) for the maximum or typical number of values per second that can reasonably be archived with a modern PC architecture (and, if so, an indication of the type of computer used).


I'm trying to dimension the archive requirements for all 14 beamlines of the Australian Synchrotron and am wondering if I can do it
with one PC or if I need many. Of course, sampling or monitoring rates will play a big part, but I also need to know what is possible per archive engine.


Thanks,

Terry Cornall M.Eng.Sc  B.Sc.
Beamlines Control Systems Engineer

Australian Synchrotron Project

Hi:

This naive question doesn't have as simple an answer as you might hope.

One archive engine can usually collect about 10000 samples/second.
For example, monitor 1000 channels all updating at 10Hz.
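
As a rough, illustrative sanity check, you could tally the expected monitor
traffic against that budget. The channel counts and update rates below are
made up, not from any real beamline; replace them with your own inventory:

import math

ENGINE_BUDGET = 10_000  # samples/second one engine can usually sustain

# Hypothetical inventory: archived channels per beamline and their
# average monitored update rate in Hz.
beamlines = {
    "beamline_A": (800, 1.0),
    "beamline_B": (300, 10.0),
    "beamline_C": (1500, 0.1),
}

total = sum(channels * hz for channels, hz in beamlines.values())
print(f"Expected load: {total:.0f} samples/s")
print(f"Engines needed: {math.ceil(total / ENGINE_BUDGET)}")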

Problem 1:
You won't know what you're missing.
The Channel Access protocol can go into "flow control",
skipping a few samples, but it's really hard to find out
if/when/how that's happening.
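
One thing you can do on the client side is compare observed and expected
monitor rates. The sketch below assumes the pyepics Python client and a
made-up PV name and rate; a shortfall only hints that updates were dropped,
it can't prove flow control was the cause:

import time
import epics

PV_NAME = "SR00:EXAMPLE:SIGNAL"   # hypothetical PV name
EXPECTED_HZ = 10.0                # rate the IOC is supposed to post at
WINDOW = 30.0                     # seconds to observe

count = {"n": 0}

def on_update(pvname=None, value=None, **kw):
    # Called for every monitor update that actually arrives.
    count["n"] += 1

pv = epics.PV(PV_NAME, auto_monitor=True, callback=on_update)
time.sleep(WINDOW)

observed = count["n"] / WINDOW
print(f"{PV_NAME}: expected ~{EXPECTED_HZ} Hz, observed {observed:.1f} Hz")
if observed < 0.9 * EXPECTED_HZ:
    print("Updates appear to be dropped (flow control or network loss?)")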

Problem 2:
The amount of data. One 'double' sample uses a little over
20 bytes for the timestamp, status, severity, and value.
For many values, the data file structure and index add
relatively little to that, but at 10000 samples/second you'll
still get about 20 GB per day. How do you intend to back that up?
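
That figure follows directly from the per-sample size and the ~10000
samples/second above; a quick worked version of the arithmetic:

BYTES_PER_SAMPLE = 20          # timestamp, status, severity, value
SAMPLES_PER_SECOND = 10_000    # the per-engine figure from above

per_day = BYTES_PER_SAMPLE * SAMPLES_PER_SECOND * 86_400
print(f"~{per_day / 1e9:.1f} GB/day, ~{per_day * 30 / 1e12:.2f} TB/month")
# roughly 17 GB/day before index and file-structure overhead,
# i.e. "about 20 GB per day" in round numbers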

At the SNS, we run about 70 different engines.
They are typically separated by subsystem,
and often restarted daily to limit the amount of
possible data loss in case a sub-archive gets corrupted.
Some store only a few values each second,
others store the value of the IOC clock each second
(don't ask me why), others store about
1000 values each second, including waveforms.
The total amount of data for Feb. 2007 is about 150 GB.

When you try to allow access to 'all', or as much as possible,
somebody has to periodically create indices.
This is mostly automated, but when there is a problem,
for example when one index reaches the 2 GB file size limit
and things need to be reorganized, that process takes a lot
of time. The index mechanism certainly needs some improvement,
but even after we eventually get that,
moving those amounts of data over the network takes many days.
If somebody tells you that disks are cheap, please ask
that person to take care of your archiving, then run
as fast as you can.

-Kay


Replies:
Re: Maximum archival rate Maren Purves
References:
Maximum archival rate Terry Cornall
