Guillaume Blanchard wrote: |
Hello,
I am designing a DAQ board with a DRS4 + AD9222 and an FPGA to monitor them.
Do I have to change the default value of O-OFS?
Is a simple low-pass filter (series resistor + capacitor) on each AD9222 input enough to limit the noise?
I am planning to use the (DRS4, AD9222, FPGA) group as both a trigger and a digitizing system (as shown in the DRS4 datasheet). The DRS4 will be running at 5 GHz with 8 active channels.
So each channel will have a time depth of 1024 x 1/(5 GHz) = 204.8 ns. So, in order to miss nothing, the ADC latency plus the trigger decision time must be less than 204.8 ns, am I correct?
This leads me to use the 65 MHz version of the AD9222 on my board, as this converter has an 8-clock-cycle latency, i.e. 123 ns, leaving me 81 ns for the trigger decision. Is that right?
Cordially,
G.Blanchard
|
As Stefan pointed out, your timing constraints are quite tight. In those 81 ns you also need to deserialize the AD9222 output. Unless you implement some really fancy input comparison logic, this will consume another 1-2 ADC clock cycles. Perhaps you should first verify that your FPGA design can actually do its job within those 81 ns. In our system we sample at only 1-2 GHz and have enough margin to implement really complex triggers in the FPGA, but the total latency (ADC plus FPGA deserialization) still comes to 250 ns.
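To make the budget concrete, here is a minimal back-of-the-envelope sketch in Python, using only numbers from this thread. The 2-clock deserialization figure is just the estimate above; the real number has to come from your actual FPGA design:

DRS4_RATE_HZ = 5e9        # DRS4 sampling speed (5 GHz)
DRS4_DEPTH = 1024         # sampling cells per channel
ADC_CLOCK_HZ = 65e6       # AD9222, 65 MSPS grade
ADC_PIPELINE_CLOCKS = 8   # AD9222 pipeline latency (datasheet)
DESER_CLOCKS = 2          # assumed FPGA deserialization latency

window_ns = DRS4_DEPTH / DRS4_RATE_HZ * 1e9         # 204.8 ns
adc_ns = ADC_PIPELINE_CLOCKS / ADC_CLOCK_HZ * 1e9   # ~123.1 ns
deser_ns = DESER_CLOCKS / ADC_CLOCK_HZ * 1e9        # ~30.8 ns

trigger_budget_ns = window_ns - adc_ns - deser_ns
print(f"sampling window : {window_ns:6.1f} ns")
print(f"ADC pipeline    : {adc_ns:6.1f} ns")
print(f"deserialization : {deser_ns:6.1f} ns")
print(f"trigger budget  : {trigger_budget_ns:6.1f} ns")  # ~50.9 ns

With deserialization included, the ~81 ns of headroom shrinks to roughly 50 ns, which is why it is worth verifying the trigger logic's latency before committing to the board design.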
Depending on the application, you do need a low-pass filter, not only because of the noise but also in order to trigger reliably. With fast PMTs, for example, you will not see all pulses at full amplitude if the bandwidth is 50 MHz and you are only sampling at 65 MSPS.
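For the filter question, the -3 dB cutoff of a first-order series-R, shunt-C low-pass is f_c = 1/(2*pi*R*C). The component values in this sketch are illustrative assumptions only, not recommendations; the effective bandwidth also depends on your source impedance and the AD9222 input:

import math

def rc_cutoff_hz(r_ohm: float, c_farad: float) -> float:
    """-3 dB cutoff of a first-order RC low-pass (series R, shunt C)."""
    return 1.0 / (2.0 * math.pi * r_ohm * c_farad)

NYQUIST_HZ = 65e6 / 2  # 32.5 MHz at 65 MSPS

# Illustrative candidate values, not recommendations:
for r, c in [(50, 100e-12), (100, 47e-12), (33, 150e-12)]:
    fc = rc_cutoff_hz(r, c)
    side = "below" if fc < NYQUIST_HZ else "above"
    print(f"R={r:>4} ohm, C={c*1e12:>4.0f} pF -> fc = {fc/1e6:5.1f} MHz ({side} Nyquist)")

Note the trade-off: a single RC pole rolls off slowly, so it only partially suppresses aliasing, while pushing the cutoff too low distorts exactly the fast pulses you want to trigger on.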
Hannes