
Is it possible to run Analog Discovery 3 at low sample rates (e.g., 100 kS/s)?


Jim Luby

Question

Hello,

I'm mentoring undergraduate students in underwater acoustics and would like to use my recently acquired Analog Discovery 3 module as a digitizer. I opened up the Logger capability within WaveForms and see that the minimum sample rate is presently set to 50 MS/s. I'm wondering if this is a hardware limit (i.e., the board simply cannot sample at lower rates) or whether it might be possible in a future software release to run the board at sample rates as low as 100 kS/s. Ideally, I'd like to use the module to digitize two analog channels coherently (i.e., simultaneous sample/hold). Even better would be the capability to trigger the module and have it collect two channels of data for a fixed period of time. This would allow me and my students to use the module as a two-channel A/D converter. I've looked for affordable (in an education environment) USB multi-channel A/D modules, but most cost several thousand dollars and are out of reach of our budget.

Thanks!

Jim Luby


16 answers to this question



Hi @Jim Luby,

I'm not sure where you're seeing the 50 MS/s within the Logger specifically (perhaps the overall system frequency selection?), but within the Scope tool you can easily set the sample rate by expanding the time dropdown on the right-hand side and picking your desired rate. The gear at the top of the screenshot, next to HoldOff, lets you choose whether those samples are decimated or averaged down to the chosen rate.

[screenshot: Scope time options with the sample rate selection]

In terms of recording both channels for a set amount of time on a trigger, you can use either variant of Record mode: change the Mode dropdown from Repeated to Record to record to the host computer's RAM, or use the "Rec." option next to Export to record directly to a file. Then choose the Config option to set the length of time and the trigger position. The trigger itself is set in the top toolbar, where you choose the source and condition; you can trigger on something like a rising or falling edge, or a pulse of some length on one of your analog inputs, or trigger off the start of one of the Wavegen channels or the Logic Analyzer. The number of samples, time base, and rate dictate how long you record.

[screenshot: Record mode configuration and trigger settings]

What do you mean by the digitizing of the signals? Are you referring to getting Measurements from the signals (such as the measured frequency of the signal shown on screen), or are you looking for the Persistence view option? (I'm only familiar with sample/hold phrasing from spectrum analysis.)

Let me know if you have any questions.

Thanks,
JColvin


Hi @Jim Luby

Each of the device's hardware resources can be used by one instrument at a time. For instance, the oscilloscope can be used by the Scope, Spectrum, Voltmeter, Logger, Tracer, Network, or Impedance Analyzer interface. The last-used instrument takes control of the oscilloscope resource and the others display a busy status. Each of these instruments lets you configure the oscilloscope; no parameter is ported from one instrument to another.

The oscilloscope channels have identical delays by design; however, this can be affected by the cabling. In the Scope interface each channel can be shifted to compensate for different delays. In the Impedance Analyzer, the open/short compensation should minimize or eliminate the wiring effects.


Hi @zygot,

To the best of my understanding, the rate that the ADC IC itself runs at is equal to the selected system clock frequency (which for the Analog Discovery 3 is adjustable between 50 MHz and 125 MHz, unlike the Analog Discovery 2, which was fixed at 100 MHz).

The sampling frequency of the instrument itself can either be set by the user or is automatically selected by WaveForms based on the time base; the available sampling frequencies are natural divisions of the overall system clock frequency (so 50 MHz, 33.3 MHz, 25 MHz, etc. for a 100 MHz system clock, or 62.5 MHz, 31.25 MHz, 15.625 MHz, etc. for a 125 MHz system clock). The overall system frequency, and thus the clock fed to the ADC IC, still operates at the higher rate, so the full number of samples is collected even if the user chooses a lower sampling frequency. The fact that the IC still runs at the higher rate despite the user's sampling selection is what allows for software-enhanced resolution (edit: possible because the ADC data is stored at 16-bit resolution), giving 15-bit resolution when the user sample rate is half of the system frequency and 16 bits when it is one quarter or lower, but that's neither here nor there.
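To make the divider relationship concrete, here's a quick sketch (my own illustration, not Digilent code, and assuming a simple integer divider) of the rates such a scheme would produce from each selectable system clock:

```python
# Hypothetical integer-divider model of the available sample rates.
def available_rates(sys_clk_hz, max_div=10):
    """Sample rates reachable by dividing the system clock by an integer."""
    return [sys_clk_hz / n for n in range(1, max_div + 1)]

for clk in (100e6, 125e6):
    rates = [f"{r / 1e6:.3f} MS/s" for r in available_rates(clk, max_div=4)]
    print(f"{clk / 1e6:.0f} MHz clock -> {rates}")
```

The 125 MHz values JColvin lists (62.5, 31.25, 15.625 MHz) are powers-of-two divisions, so the actual divider choices WaveForms exposes may be a subset of this.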

As for what happens to those extra samples, users can choose, for each oscilloscope channel, whether to decimate (record only every Nth A/D conversion and "throw away" the rest), average the samples taken (the default option), use min/max (the Help within WaveForms describes this as "each two samples will be calculated as the minimum and maximum value of conversion results", though I'm not certain how to interpret that), or use Full Scale, which operates like average but uses the user-selected software range instead of the physical hardware input voltage range (which can help improve vertical resolution).
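A toy model of the first two policies (my own sketch of the described behavior, not the WaveForms implementation) shows the difference:

```python
import numpy as np

def decimate(samples, n):
    return samples[::n]                   # keep every Nth conversion, discard the rest

def block_average(samples, n):
    trimmed = samples[: len(samples) // n * n]
    return trimmed.reshape(-1, n).mean(axis=1)  # each output is the mean of N conversions

x = np.arange(12, dtype=float)            # stand-in for raw ADC conversions
print(decimate(x, 4))                     # [0. 4. 8.]
print(block_average(x, 4))                # [1.5 5.5 9.5]
```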

Let me know if you have any questions.

Thanks,
JColvin

Edit July 26 2023: added clarification details on the resolution increase at lower sample rates with regards to system frequency


Hi @Jim Luby

With Decimate, every Nth sample (N = system/ADC frequency divided by the rate) is stored; there is no low-pass filter involved.
Averaging N samples eliminates or greatly reduces aliasing but causes a bit of damping.
With the AD3, ADP3X50, and similar devices, in-device Filter channels are available.

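A synthetic illustration of the aliasing point above (my own sketch, not Digilent's implementation, assuming a 1 MS/s raw rate downsampled by 10 to 100 kS/s):

```python
import numpy as np

fs, n = 1_000_000, 10                     # raw "ADC" rate and downsampling factor
t = np.arange(4000) / fs
tone = np.sin(2 * np.pi * 90_000 * t)     # 90 kHz tone, above the new 50 kHz Nyquist

dec = tone[::n]                           # decimate: 90 kHz aliases to 10 kHz
avg = tone.reshape(-1, n).mean(axis=1)    # average: moving-average response attenuates it

print(np.abs(dec).max())                  # close to 1: the alias survives at full amplitude
print(np.abs(avg).max())                  # around 0.1: largely filtered out
```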


@JColvin

Thank you so much for providing such a detailed, helpful response to my question. I'm a very new user of the Analog Discovery 3 device, having received it just three days ago. I do have another question that I'm hoping you can shed some light on. Specifically, do the various "apps" within WaveForms interact, or can I view them as independent? For example, do parameter settings in the Scope app impose limitations (for lack of a better word) on other apps such as the Spectrum analyzer?

As to my use of the phrase "digitizing of the signals," I meant analog-to-digital conversion. I'm especially interested in knowing whether, if I were to digitize two channels of data with the device, there would be any time offset between the channels introduced by the sampling scheme. I work with acoustic systems (e.g., sonar), and we often use channel-to-channel processing methods for such things as estimating the direction of arrival (DOA) of received signals. If the digitizing scheme were to introduce a delay between two channels of acoustic data, it would appear as if the signal were arriving from a different angle than it actually is. So, in short, I'd like to know (and will test once I'm more familiar with the Discovery 3) whether, if I were to feed an identical signal into two channels and then cross-correlate them, the peak of the cross-correlation function would be at zero time delay.
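The test described above can be sketched with synthetic data (my own illustration, no hardware involved): digitize the "same" signal on two channels, one with a known skew, and locate the cross-correlation peak. A peak at lag 0 means coherent sampling.

```python
import numpy as np

rng = np.random.default_rng(0)
sig = rng.standard_normal(8192)              # broadband stand-in for an acoustic signal

def channel_delay(ch1, ch2):
    """Samples by which ch2 lags ch1; 0 means no inter-channel offset."""
    xc = np.correlate(ch2, ch1, mode="full")
    return np.argmax(xc) - (len(ch1) - 1)

print(channel_delay(sig, sig))               # 0: identical channels, peak at zero lag
print(channel_delay(sig, np.roll(sig, 3)))   # 3: second channel delayed by 3 samples
```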

Thanks again for your very thoughtful response.

Best regards,

Jim


For the original AD1, Digilent provided an excellent write-up of the hardware design that gave users a chance to figure this kind of information out for themselves. I have been unable to find equivalent information about the AD3. In recent years Digilent has been moving toward more opacity, hiding details that matter for certain applications.

Perhaps someone at Digilent will point you to an AD3 theory-of-operation document that will suffice. In general, Digilent products are geared toward general-purpose use cases. Sometimes someone who knows the right questions for their application posts them to the forum. Thanks for doing that. I'm looking forward to the subsequent exchange. Details, details... can't do much without covering the details.

Edited by zygot
3 hours ago, JColvin said:

Analog Discovery 3 is adjustable between 50 MHz and 125 MHz, unlike the Analog Discovery 2 which was set at 100 MHz and unadjustable).

 

3 hours ago, JColvin said:

The overall system frequency, and thus the clock fed to the ADC IC, still operates at the higher frequency, so the full amount of samples is collected even if a user chooses a smaller sampling frequency. The fact that the IC still operates at the higher frequency despite the user sampling selection is what allows for software enhanced resolution (15 bits when the user sample rate is half of the system frequency, 16 bits when the user sample rate is one quarter or lower of the system frequency), but that's neither here nor there.

A fixed ADC clock makes sense for some ADC architectures. A limited but variable ADC clock makes sense for some architectures, but not others. Most ADC architectures are not designed to support a very wide range of clock frequencies and still meet specifications.

Depending on how decimation is performed, I'd agree that increasing bit resolution is possible. Of course, one has to figure in other ADC specs like ENOB and the front-end design. To save memory resources, I'd think that decimation would be done in hardware, prior to storage. So your explanation is a bit confusing.

If my ADx gets samples out of the ADC at 100 MHz, and the instrument is set for 100 kHz sampling, are you saying that the 16K or 32K sample buffer is filled at 100 MS/s and then software does the decimation? That would seem to mean the number of samples for a given observation time varies wildly with the instrument sample rate. At 100 MS/s, collecting a full buffer of 16384 samples gives an observation time of 163.84 µs. At 100 kS/s, the observation time would be the same 163.84 µs but you'd only have 16 samples to show for it. I must be missing something.
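Working the arithmetic in that scenario (my numbers, assuming a 100 kS/s instrument setting against a 100 MS/s raw ADC rate, per the original question):

```python
buffer_len = 16384
adc_rate = 100e6                          # samples per second out of the ADC

obs_time = buffer_len / adc_rate          # time to fill the buffer with raw samples
print(obs_time * 1e6)                     # 163.84 microseconds

user_rate = 100e3                         # 100 kS/s instrument sample rate
samples_in_window = obs_time * user_rate  # samples left if decimation happens afterward
print(samples_in_window)                  # about 16 samples
```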

Is there a reason why the only reply from Digilent (and thank you for that) has to be conditioned with ambiguous qualifications? These devices are great for a lot of use cases. I can't see any reason not to be open about basic details of operation, as with the AD1.

Edited by zygot

The reason that I'm being ambiguous is because I don't know the underlying hardware or software well enough to be able to clarify if the decimation is done in hardware or software (or your other query on the buffer filling up when the ADC is running full speed); I'm hoping that Attila will clarify when it is regular working hours in his time zone.

14 hours ago, JColvin said:

The reason that I'm being ambiguous is because I don't know the underlying hardware or software well enough to be able to clarify if the decimation is done in hardware or software (or your other query on the buffer filling up when the ADC is running full speed); I'm hoping that Attila will clarify when it is regular working hours in his time zone.

I have high regard for your honesty and forthright posts; hence, your quibbling was a positive aspect of your reply. My comment was intended more as a "why doesn't someone who can give an official, technically informative answer reply to this thread?" point.


@attila

Hi Attila - Thank you for your earlier helpful response. I've been following the exchange between @zygot and @JColvin regarding A/D sampling, etc. In a recent post, @JColvin mentions the option to decimate, i.e., "only record Nth A/D conversion and 'throw away' the rest". Am I correct that low-pass filtering is applied prior to throwing out samples so as to prevent aliasing? Thanks again! - Jim

28 minutes ago, attila said:

With Decimate, every Nth sample (N = system/ADC frequency divided by the rate) is stored; there is no low-pass filter involved.
Averaging N samples eliminates or greatly reduces aliasing but causes a bit of damping.

So you use what's sometimes referred to as a "sum and dump" filter for decimation. Correct?

5 hours ago, attila said:

decimate = every Nth ADC conversion

So there is no ability for the ADx to increase bit width through decimation, as @JColvin suggested.

By summing samples and scaling the result you can effectively increase bit width. If you throw away samples you can't. Why not take advantage of decimation?

With "sum and dump" filtering you can use the accumulated value to implement integration, such as measuring energy over a symbol period. If you scale the accumulated value appropriately, you get an average, but with the benefit of increased resolution. There's nothing wrong with throwing away samples, but then you don't get any advantage from decimation.
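The accumulate-and-scale idea can be sketched with synthetic data (my own illustration, not the ADx hardware): with noise dithering the input, averaging N conversions and scaling resolves levels finer than one ADC code, while keeping every Nth conversion stays stuck on the integer code grid.

```python
import numpy as np

rng = np.random.default_rng(1)
true_level = 0.3                          # input level sitting between two ADC codes
noise = rng.normal(0, 0.5, 4096)          # dither spanning roughly one code width
codes = np.round(true_level + noise)      # quantized conversions on a 1-LSB grid

decimated = codes[::64]                           # throw away 63 of every 64 conversions
averaged = codes.reshape(-1, 64).mean(axis=1)     # sum 64 conversions, scale by 1/64

print(np.abs(decimated - true_level).mean())      # at least 0.3 LSB error per kept sample
print(np.abs(averaged - true_level).mean())       # well under one LSB
```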

Edited by zygot
