Hello,
I was running through a particular binary file containing data taken on all 4 channels of the DRS4 and printing out the value of the first time sample for each channel, per event (a sketch of this check follows the code below). While doing so, I noticed that some of these times were negative. For this dataset, channel 1 was chosen as the reference channel (the default in Stefan's DRS4 macro). From my understanding, the calibration procedure described in the DRS4 manual and shown in the code below is supposed to sync the timing of each channel relative to sample 0. However, this does not appear to be the case: when I print out the time value of the first sample, only channel 1's sample 0 is set to 0. The first sample of the other channels is nonzero, and most often negative.
Even stranger, when I test another 4-channel dataset with the same code, the issue does not appear: the first time sample of every waveform on every channel is set to 0, as should be the case.
My question is therefore whether the timing calibration can vary from dataset to dataset. My initial thought was that it should not, but I have two different datasets taken on the same set of channels that give different timing calibration results. Are there any circumstances under which this behavior can happen?
for (ch=0 ; ch<5 ; ch++) {
   // read channel header ("C001" ... "C004"); anything else means the
   // next event header has been reached
   i = fread(hdr, sizeof(hdr), 1, f);
   if (i < 1)
      break;
   if (hdr[0] != 'C') {
      // event header found
      fseek(f, -4, SEEK_CUR);
      break;
   }
   chn_index = hdr[3] - '0' - 1;
   fread(voltage, sizeof(short), 1024, f);

   for (i=0 ; i<1024 ; i++) {
      // convert data to volts
      waveform[chn_index][i] = (voltage[i] / 65536. - 0.5);

      // calculate time for this cell
      for (j=0,time[chn_index][i]=0 ; j<i ; j++)
         time[chn_index][i] += bin_width[chn_index][(j+eh.trigger_cell) % 1024];
   }
}
// align cell #0 of all channels
t1 = time[0][(1024-eh.trigger_cell) % 1024];
for (ch=1 ; ch<4 ; ch++) {
   t2 = time[ch][(1024-eh.trigger_cell) % 1024];
   dt = t1 - t2;
   for (i=0 ; i<1024 ; i++)
      time[ch][i] += dt;
}