Dirk,
I tried a form of that where the first (horizontal) record used "I/O Intr" and was forward-linked to the vertical record. I would get one value in each waveform, then I would receive the error about "surplus input".
Tony
Hello Anthony,

In principle you can also use "I/O Intr" for the two longin records. They should trigger for each line, but I cannot guarantee this.

record (longin, "$(P)$(R)TBT_Horizontal_tmp")
{
    field (DTYP, "stream")
    field (INP,  "@spark.proto getTBT_X_line $(PORT)")
    field (SCAN, "I/O Intr")
    field (FLNK, "$(P)$(R)TBT_Horizontal")
}

Similar for vertical.

Terminator = "\n";

# only trigger reading
getTBT_XY { out "TBT_XY 3"; }

# processes the I/O Intr records once for each input line
getTBT_X_line { in "%i %*i"; }
getTBT_Y_line { in "%*i %i"; }

Unfortunately EPICS isn't very good at multi-dimensional arrays.

Dirk
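A sketch of the "similar" vertical record, assuming the same naming as above (getTBT_Y_line, TBT_Vertical):

record (longin, "$(P)$(R)TBT_Vertical_tmp")
{
    field (DTYP, "stream")
    field (INP,  "@spark.proto getTBT_Y_line $(PORT)")
    field (SCAN, "I/O Intr")
    field (FLNK, "$(P)$(R)TBT_Vertical")
}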
On 13.04.2015 14:59, Anthony Pietryla wrote:
Hi Dirk,
Thank you for your reply. Unfortunately, there are many more than three rows to read, up to 256k samples/channel. I used a value of three for my debugging.
One suggestion I received was to read everything into one huge array and use an aSub record to parse the values.
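One possible shape of that approach, as a rough sketch only: the protocol entry getTBT_All, the record names TBT_All/TBT_Parse, and the routine name parseTBT are made up here, and it assumes the end of the reply can be detected by a read timeout rather than by the "\n" between rows, with TBT_Horizontal/TBT_Vertical as plain Soft Channel waveforms in this scheme.

# hypothetical protocol entry: read the whole interleaved reply into one waveform
getTBT_All {
    InTerminator = "";   # no line terminator; input ends on read timeout (assumption)
    Separator = " ";     # assumed to match the whitespace (including newlines) between values
    out "TBT_XY 3";
    in "%d";
}

record (waveform, "$(P)$(R)TBT_All")
{
    field (DTYP, "stream")
    field (INP,  "@spark.proto getTBT_All $(PORT)")
    field (FTVL, "LONG")
    field (NELM, "524288")               # 2 columns x 262144 rows
    field (FLNK, "$(P)$(R)TBT_Parse")
}

record (aSub, "$(P)$(R)TBT_Parse")
{
    field (SNAM, "parseTBT")             # hypothetical C routine that de-interleaves X and Y
    field (INPA, "$(P)$(R)TBT_All")      # interleaved data
    field (FTA,  "LONG")
    field (NOA,  "524288")
    field (INPB, "$(P)$(R)TBT_All.NORD") # number of elements actually read
    field (FTB,  "ULONG")
    field (FTVA, "LONG")
    field (NOVA, "262144")
    field (OUTA, "$(P)$(R)TBT_Horizontal PP")
    field (FTVB, "LONG")
    field (NOVB, "262144")
    field (OUTB, "$(P)$(R)TBT_Vertical PP")
}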
Thanks, Tony
On Apr 13, 2015, at 3:54 AM, Dirk Zimoch <[email protected] <mailto:[email protected]>> wrote:
Hello Anthony,
For waveform records StreamDevice repeats the format for each element and optionally reads a separator in between. While this allows reading a row, it cannot read a column. It probably also does not work with redirection (I have never tried).
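As an illustration of that repetition, a minimal hypothetical protocol entry (the name getRow is made up) that fills a waveform record from a single whitespace-separated row could look like this:

getRow {
    Separator = " ";   # read between elements
    in "%d";           # converter applied once per array element until the terminator
}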
This makes it a bit complicated to read your device. But hopefully not impossible. You need a few more records.
I understand that the reply always consists of 3 rows. Then I would try the following:
Have two longin records, one for X and one for Y. Redirect to these records as you do now, but do it 3 times:
Terminator = "\n";

getTBT_XY {
    out "TBT_XY 3";
    in " %(\$1TBT_Horizontal_tmp)d %(\$1TBT_Vertical_tmp)d";
    in " %(\$1TBT_Horizontal_tmp)d %(\$1TBT_Vertical_tmp)d";
    in " %(\$1TBT_Horizontal_tmp)d %(\$1TBT_Vertical_tmp)d";
}
This processes the two longin records 3 times each. For the waveforms, use two compress records and add a FLNK from the longin records to the compress records:
record (longin, "$(P)$(R)TBT_Horizontal_tmp")
{
    field (FLNK, "$(P)$(R)TBT_Horizontal")
}
record (compress, "$(P)$(R)TBT_Horizontal")
{
    field (INP,  "$(P)$(R)TBT_Horizontal_tmp")
    field (ALG,  "Circular Buffer")
    field (NSAM, "3")
}
The same for Vertical.
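A sketch of the vertical pair, following the same pattern as the horizontal records above:

record (longin, "$(P)$(R)TBT_Vertical_tmp")
{
    field (FLNK, "$(P)$(R)TBT_Vertical")
}

record (compress, "$(P)$(R)TBT_Vertical")
{
    field (INP,  "$(P)$(R)TBT_Vertical_tmp")
    field (ALG,  "Circular Buffer")
    field (NSAM, "3")
}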
I am not quite sure about the compress record settings; you may need to experiment a bit. You may also want to write to the .RES field of the compress records before starting. In that case you can use additional seq records.
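One possible shape for that reset, sketched with a made-up record name:

# hypothetical seq record: writing to RES clears the compress buffers before a new read
record (seq, "$(P)$(R)TBT_Reset")
{
    field (DOL1, "0")
    field (LNK1, "$(P)$(R)TBT_Horizontal.RES")
    field (DOL2, "0")
    field (LNK2, "$(P)$(R)TBT_Vertical.RES")
}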
Hope this helps.
Dirk
On 09.04.2015 23:01, Anthony Pietryla wrote:
I have a device which uses the SCPI protocol and returns sets of waveforms in columns, where I can request how many rows to return. For instance, I can query the device for 3 values of position data and it will return 2 columns (X & Y) and 3 rows. I want to read the columns into separate waveform records. I am having problems with the protocol file, for which I need help.
The database records and the protocol file are listed below. When I process the record, only the first values (first row of data) are written to each record. With asyn debugging turned on, the console shows all three sets of data:
epics> 2015/04/09 13:52:36.877 L0 wrote TBT_XY 3
2015/04/09 13:52:36.878 L0 read 2000 6000 7143 4286 4286 1429
caget shows only the first value stored:
caget -# 5 B:Spark:TBT_Horizontal
B:Spark:TBT_Horizontal 5 3333 0 0 0 0
caget -# 5 B:Spark:TBT_Vertical
B:Spark:TBT_Vertical 5 -1111 -3333 0 0 0
******************* The protocol file has:
getTBT_XY {
    Separator = "\n";
    out "TBT_XY 3";
    in " %(\$1TBT_Horizontal)d %(\$1TBT_Vertical)d";
    ExtraInput = Ignore;
}
************** The database records are:
record(waveform, "$(P)$(R)TBT_Horizontal") {
    field(DESC, "TBT Horizontal")
    field(DTYP, "stream")
    field(INP,  "@spark.proto getTBT_XY($(P)$(R)) $(PORT)")
    field(NELM, "262144")
    field(EGU,  "um")
    field(FTVL, "LONG")
}

record(waveform, "$(P)$(R)TBT_Vertical") {
    field(DESC, "TBT Vertical")
    field(DTYP, "Soft Channel")
    field(NELM, "262144")
    field(EGU,  "um")
    field(FTVL, "LONG")
}
******************** When I comment out the "ExtraInput = Ignore;" line in the protocol file, I get error messages:
2015/04/09 15:53:58.170 L0 wrote TBT_XY 3
2015/04/09 15:53:58.170 L0 asynOctetBase interrupt
2015/04/09 15:53:58.171 L0 read 3333 -1111 -3333 -5556 -2000 0
2015/04/09 15:53:58.271 L0 B:Spark:TBT_Horizontal: 29 bytes surplus input "<0a> -3333 -5556<0a> -200..."
2015/04/09 15:53:58.271 L0 B:Spark:TBT_Horizontal: after 13 bytes: " 3333 -1111"
caget -# 5 B:Spark:TBT_Horizontal
B:Spark:TBT_Horizontal 5 3333 0 0 0 0
caget -# 5 B:Spark:TBT_Vertical
B:Spark:TBT_Vertical 5 -1111 -3333 0 0 0
-- Anthony Pietryla Principal Controls Engineer Advanced Photon Source Argonne National Laboratory
Phone: 630-252-7430 Fax: 630-252-6123