We're using an Analog Discovery 1 as a delayed-trigger oscilloscope. We route the incoming trigger to the digital pattern generator, delay it by a user-selectable amount, and feed the result into analog scope channel two as a trigger. This lets us sweep the analog scope's 8192-sample buffer along a repetitive trace and capture more samples, one buffer at a time.
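For reference, this is roughly how we drive it from the WaveForms SDK Python bindings. It's a minimal sketch, not our production code: the DIO pin, delay, pulse width, and trigger level are placeholders, and it assumes DIO 0 is wired externally to scope channel 2.

    from ctypes import *
    import sys

    # Load the WaveForms runtime (library name differs per platform)
    if sys.platform.startswith("win"):
        dwf = cdll.dwf
    elif sys.platform == "darwin":
        dwf = cdll.LoadLibrary("/Library/Frameworks/dwf.framework/dwf")
    else:
        dwf = cdll.LoadLibrary("libdwf.so")

    hdwf = c_int()
    dwf.FDwfDeviceOpen(c_int(-1), byref(hdwf))

    # Pattern generator: arm on external trigger 1, wait the selected delay,
    # then emit a short pulse on DIO 0 (wired to scope channel 2).
    delay_s = 1e-3                                            # placeholder delay
    dwf.FDwfDigitalOutTriggerSourceSet(hdwf, c_ubyte(11))     # trigsrcExternal1
    dwf.FDwfDigitalOutWaitSet(hdwf, c_double(delay_s))        # the Wait state in question
    dwf.FDwfDigitalOutRunSet(hdwf, c_double(1e-6))            # 1 us pulse
    dwf.FDwfDigitalOutRepeatSet(hdwf, c_int(1))
    dwf.FDwfDigitalOutEnableSet(hdwf, c_int(0), c_int(1))     # enable DIO 0
    dwf.FDwfDigitalOutIdleSet(hdwf, c_int(0), c_int(1))       # idle low
    dwf.FDwfDigitalOutCounterInitSet(hdwf, c_int(0), c_int(1), c_int(0))  # start high
    dwf.FDwfDigitalOutCounterSet(hdwf, c_int(0), c_int(0), c_int(0))      # hold the level
    dwf.FDwfDigitalOutConfigure(hdwf, c_int(1))

    # Scope: trigger on the delayed pulse arriving at channel 2.
    dwf.FDwfAnalogInTriggerSourceSet(hdwf, c_ubyte(2))        # trigsrcDetectorAnalogIn
    dwf.FDwfAnalogInTriggerChannelSet(hdwf, c_int(1))         # channel 2 (0-indexed)
    dwf.FDwfAnalogInTriggerTypeSet(hdwf, c_int(0))            # trigtypeEdge
    dwf.FDwfAnalogInTriggerLevelSet(hdwf, c_double(1.5))      # volts, placeholder
    dwf.FDwfAnalogInTriggerConditionSet(hdwf, c_int(0))       # rising edge

Sweeping the delay between acquisitions is then just a loop that calls FDwfDigitalOutWaitSet with a new value and reconfigures.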
Our question is about the accuracy of the digital pattern generator's wait state.
Is the wait delay determined by counting down from a clock?
If so, which one? The 100 MHz master clock?
Does that mean you have implemented a 43- or 44-bit counter in the system, in order to reach one day (86400 seconds) with 10 ns precision?
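Quick arithmetic behind that bit count (plain Python, nothing device-specific; this is our own back-of-envelope estimate, not a claim about the actual hardware):

    import math

    ticks = 86400 / 10e-9               # one day in 10 ns ticks = 8.64e12
    bits = math.ceil(math.log2(ticks))  # smallest counter width that can hold it
    print(bits)                         # prints 43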
We'd really appreciate any information on this topic. Right now we're not relying on the wait state out of uncertainty about how it's determined; if we could use wait, it would make our lives and our code nicer.
Thanks.