
Subject: RE: Real time video with epics
From: Aladwan Ahed <[email protected]>
To: Hunt Steven <[email protected]>
Cc: "'Kay-Uwe Kasemir'" <[email protected]>, Emmanuel Mayssat <[email protected]>, Tech-talk <[email protected]>
Date: Thu, 17 Nov 2005 12:04:11 +0100

Hi All,

 

On Wednesday, November 16, 2005 10:36 PM, Steven Hunt wrote:

 

> The original firewire support for Epics I wrote (a long time ago), used
> periodic record scanning.  This has the nasty effect that you have to
> scan at twice the frame rate to be sure you miss nothing.

 

The latest version of the software uses a non-blocking poll mechanism: when we read from the video1394 DMA ring buffer, the caller does not wait if no frame is ready. Moreover, we set a flag (drop_frames) that makes the caller discard all older frames in the buffer and take the latest frame captured. This guarantees that each call returns the most recently captured frame.
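In pseudo-C the pattern looks roughly like the sketch below. It is only an illustration: grab_frame_poll() and release_frame() are hypothetical stand-ins for the capture library's calls, not the real video1394/libdc1394 API, and with drop_frames enabled the driver itself already discards stale frames, so one successful poll yields the newest image.

  /* Minimal sketch of the non-blocking "latest frame only" read.
   * grab_frame_poll() and release_frame() are hypothetical stand-ins
   * for the capture library's calls, not the real video1394/libdc1394
   * API.  With drop_frames enabled the driver discards stale frames,
   * so one successful poll already yields the newest image. */

  #include <stddef.h>
  #include <string.h>

  typedef struct {
      unsigned char *data;   /* pixel buffer owned by the DMA ring */
      size_t         size;   /* number of bytes in the frame       */
  } frame_t;

  /* Hypothetical capture-library wrappers (assumptions). */
  extern int  grab_frame_poll(frame_t *out);  /* 0 = frame ready, <0 = none */
  extern void release_frame(frame_t *f);      /* return buffer to the ring  */

  /* Called from record processing: copy the newest frame, never block. */
  int read_latest_frame(unsigned char *dest, size_t max_bytes, size_t *nbytes)
  {
      frame_t f;

      if (grab_frame_poll(&f) < 0)
          return -1;                  /* no new frame: keep previous data */

      *nbytes = (f.size < max_bytes) ? f.size : max_bytes;
      memcpy(dest, f.data, *nbytes);  /* copy out before releasing buffer */
      release_frame(&f);
      return 0;
  }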

 

 

> but you can transfer the parameters
> (beam size and position for instance), and only send the image at a
> lower rate.

 

That is what we do too, but we also do more processing, which explains the high CPU load on the video server PC (2.8 GHz, 1 GB RAM, typical load 80%): an online centroid-finding algorithm, background subtraction, averaging, horizontal and vertical distribution/profile calculation, and region-of-interest selection, in addition to other calculations such as the maxima and the standard deviation. All of these values are usually available at 10 Hz. Recently a scientist has asked for a 2D Gaussian fit to be implemented, and the list will continue to grow. Steve, I am really wondering whether you run your video server on a dedicated IOC; can you give more information about the performance?
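As a rough illustration of the per-frame statistics, here is a sketch of the profile and centroid step only; the function name, the 8-bit mono row-major layout and the interface are assumptions, and the real code also does background subtraction, averaging, ROI handling, etc.

  /* Sketch: horizontal/vertical projections (profiles) and the
   * intensity-weighted centroid of one 8-bit mono frame.
   * Names and image layout are assumptions, not our actual code. */

  #include <stddef.h>

  void image_profiles_and_centroid(const unsigned char *img,
                                   size_t width, size_t height,
                                   double *xprofile,   /* length: width  */
                                   double *yprofile,   /* length: height */
                                   double *xc, double *yc)
  {
      double sum = 0.0, sx = 0.0, sy = 0.0;
      size_t x, y;

      for (x = 0; x < width;  x++) xprofile[x] = 0.0;
      for (y = 0; y < height; y++) yprofile[y] = 0.0;

      for (y = 0; y < height; y++) {
          for (x = 0; x < width; x++) {
              double v = img[y * width + x];
              xprofile[x] += v;          /* project onto the x axis */
              yprofile[y] += v;          /* project onto the y axis */
              sum += v;
              sx  += v * (double)x;
              sy  += v * (double)y;
          }
      }

      /* Intensity-weighted centroid; guard against an all-dark frame. */
      *xc = (sum > 0.0) ? sx / sum : 0.0;
      *yc = (sum > 0.0) ? sy / sum : 0.0;
  }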

 

The whole setup we are using costs around USD 1400 (Point Grey Flea camera, SUSI Pundit PC). Since Linux RH 7.3 is not a real-time OS, we might consider Linux RT if real-time operation becomes a requirement.

 

 

>> On Wednesday, November 16, 2005 9:01 PM, Kay-Uwe Kasemir wrote:

 

>> But under vxWorks, especially after increasing the vxWorks 'tick'
>> clock rate, one can usually simply change the menuScan.dbd file
>> and add e.g. ".05 second" and voila:
>> "epics can ... process .. at a maximum rate of 20Hz".

 

In our setup the Linux system tick is 100 Hz. We modified menuScan.dbd to add different scan rates, such as 15 Hz, 30 Hz, and 50 Hz. I managed to process 15 frames/s with our camera (1024x768); when I select a region of interest, I can process all 30 frames the camera is able to capture, which is similar to choosing a camera with a lower resolution.
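For reference, the menuScan.dbd change amounts to adding extra periodic choices, roughly as sketched below; the choice identifiers follow the base 3.14 convention and may differ in your version, and the resulting periods are still quantized to the 10 ms system tick.

  # menuScan.dbd sketch: the stock periodic choices plus faster ones
  # (".0666 second" ~ 15 Hz, ".0333 second" ~ 30 Hz, ".02 second" = 50 Hz).
  # Choice identifiers are the base 3.14 convention; check your version.
  menu(menuScan) {
      choice(menuScanPassive,      "Passive")
      choice(menuScanEvent,        "Event")
      choice(menuScanI_O_Intr,     "I/O Intr")
      choice(menuScan10_second,    "10 second")
      choice(menuScan5_second,     "5 second")
      choice(menuScan2_second,     "2 second")
      choice(menuScan1_second,     "1 second")
      choice(menuScan_5_second,    ".5 second")
      choice(menuScan_2_second,    ".2 second")
      choice(menuScan_1_second,    ".1 second")
      choice(menuScan_0666_second, ".0666 second")
      choice(menuScan_0333_second, ".0333 second")
      choice(menuScan_02_second,   ".02 second")
  }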

 

>> Just like ADC driver/device support can be written to process
>> records on "I/O Intr" whenever the ADC receives a trigger,
>> you can write your frame grabber support to trigger record
>> processing whenever the frame grabber gets a new image.
>> The IOC application developer guide has details on the
>> "I/O Intr" mechanism. The EPICS "Event" mechanism might also
>> work.

 

We now trigger the camera externally to synchronize it with the laser source, so implementing the "I/O Intr" method of grabbing the frames is the better choice.
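A sketch of what that device support could look like is below; scanIoInit/scanIoRequest come from dbScan.h and get_ioint_info is the standard device support hook, but the structure names and the frame_ready() entry point are assumptions, not our actual code.

  /* Sketch of "I/O Intr" scanning for a frame-grabber waveform record.
   * scanIoInit/scanIoRequest are the dbScan.h interface; everything
   * else (names, frame_ready hook) is an assumption for illustration. */

  #include <dbScan.h>
  #include <dbCommon.h>
  #include <waveformRecord.h>
  #include <devSup.h>
  #include <epicsExport.h>

  static IOSCANPVT frame_scan;

  /* Called once during device support initialization. */
  static long init(int after)
  {
      if (!after)
          scanIoInit(&frame_scan);
      return 0;
  }

  /* Hand the scan list to records that use SCAN="I/O Intr". */
  static long get_ioint_info(int cmd, dbCommon *prec, IOSCANPVT *ppvt)
  {
      *ppvt = frame_scan;
      return 0;
  }

  /* Called from the capture thread whenever a new frame is available:
   * every record attached to frame_scan is processed once. */
  void frame_ready(void)
  {
      scanIoRequest(frame_scan);
  }

  /* read_wf() would copy the latest frame into prec->bptr and set NORD. */
  static long read_wf(waveformRecord *prec)
  {
      return 0;   /* sketch only */
  }

  struct {
      long      number;
      DEVSUPFUN report, init, init_record, get_ioint_info, read;
  } devWfFrameGrabber = { 5, NULL, (DEVSUPFUN)init, NULL,
                          (DEVSUPFUN)get_ioint_info, (DEVSUPFUN)read_wf };
  epicsExportAddress(dset, devWfFrameGrabber);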

 

>> The next problem:
>> When records process at a high rate _and_ there are
>> channel access clients which subscribed to updates from those
>> records, every time the records process, data is sent to those
>> clients (ignoring some ADEL/MDEL details).
>> In the case of the SNS LLRF, those records include waveforms.
>> All is OK as long as only one set of EDM screens is displaying
>> those waveforms.

 

To overcome this problem we thought about using compression algorithms, but the CA protocol does not allow the data blocks of waveforms to be resized dynamically (at run time). This implies that images transferred from the EPICS server to the client application have a fixed size, which in turn limits the usefulness of image compression algorithms. I hope this limitation will disappear with V4.
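For context, the image waveform record looks something like the sketch below (record name, DTYP and sizes are examples only): NELM is fixed when the IOC initializes, so a compressed image cannot shrink the array a client receives, it can only leave part of the array unused.

  # Sketch of an image waveform record (names/sizes are examples only).
  # NELM fixes the element count at IOC initialization; NORD only
  # reports how many elements were actually filled by the last frame.
  record(waveform, "CAM1:IMAGE") {
      field(DESC, "Raw camera image")
      field(SCAN, "I/O Intr")
      field(DTYP, "FrameGrabber")      # hypothetical device support name
      field(NELM, "786432")            # 1024 x 768, 8-bit mono
      field(FTVL, "UCHAR")
  }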

 

Ahed

 

