
USB-2637 measurement error when measuring high-impedance inputs


Question

Posted

Hello there,

We are using the USB-2637 and are seeing measurement errors when a voltage source has a somewhat high impedance, for example 10k. The error shows up as a reading that is lower than the value measured with a DMM.

The voltages being measured come from voltage dividers with resistors in the 10k to 20k ohm range.

Our observation is that voltages with a low source impedance, say <= 100 ohms, show no observable error, while voltages with around 10k of source impedance read slightly lower than the DMM value.

We suspect that the sample-and-hold circuit on the USB-2637 is causing the voltage to droop, due to the interaction of the source resistance with the internal sample-and-hold operation. Please see the attached image for an oscilloscope capture taken with a 10k resistor connected in series with a 9.1 V source.
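To put rough numbers on this suspicion, here is a quick settling-time estimate. The multiplexer/sample-and-hold capacitance below is an assumed value, not a datasheet figure:

```python
import math

# Rough settling estimate for the suspected sample-and-hold droop.
# R_SOURCE comes from our divider; C_INPUT is an ASSUMED value for the
# multiplexer/sample-and-hold input capacitance, not a datasheet number.
R_SOURCE = 10_000        # ohms (10k series resistor from the scope capture)
C_INPUT = 100e-12        # farads (assumed ~100 pF input capacitance)

tau = R_SOURCE * C_INPUT  # RC time constant: 1 us with these values

# Time for the input to settle within 1 LSB of a 16-bit converter:
# the error decays as exp(-t/tau), so t > tau * ln(2**16) ~= 11.1 * tau
t_settle = tau * math.log(2 ** 16)
print(f"tau = {tau * 1e6:.2f} us, 16-bit settling ~= {t_settle * 1e6:.1f} us")
```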

Can you confirm that this is expected behaviour?

Can you offer any solutions, either in the circuit or in the setup of the USB-DAQ, that could help improve the accuracy of the measurement?

 

Thank you so much!

[Attachment: Image.jfif (oscilloscope capture)]

3 answers to this question

Recommended Posts

Posted

The USB-2637 was designed to work with low-impedance signals; the best accuracy is achieved when the source impedance is kept below 100 ohms. The main reason is the internal capacitance of the analog multiplexer front end. Each time the board switches to a different channel, a small charge is carried over on this capacitance to the next channel. A low-impedance source dissipates the charge quickly, whereas a high-impedance source does not: the higher the impedance, the longer the RC time constant. One solution is to buffer the signal to convert the high impedance to a low one. Another is to enable a channel connected to ground immediately after each channel with a high-impedance signal; the grounded channel dissipates the charge before it can be passed on to the next high-impedance channel.
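For anyone scripting this, here is a minimal sketch of the grounded-channel trick using MCC's mcculw Python wrapper. The board number, the channel wiring (CH1 and CH3 tied to AGND), and channel/gain-queue support on the USB-2637 are assumptions to verify against the Universal Library documentation:

```python
from ctypes import POINTER, c_ushort, cast

from mcculw import ul
from mcculw.enums import ScanOptions, ULRange

BOARD_NUM = 0            # board number assigned in InstaCal (assumption)
RATE = 1000              # per-channel sample rate, S/s
POINTS_PER_CHAN = 100

# Queue order: each high-impedance input (CH0, CH2) is followed by a
# channel wired to AGND (CH1, CH3), so the grounded channel dissipates
# the multiplexer charge before the next high-impedance channel is read.
chan_list = [0, 1, 2, 3]
gain_list = [ULRange.BIP10VOLTS] * len(chan_list)
ul.a_load_queue(BOARD_NUM, chan_list, gain_list, len(chan_list))

total_count = POINTS_PER_CHAN * len(chan_list)
memhandle = ul.win_buf_alloc(total_count)
try:
    # With a queue loaded, the scan follows the queue order rather
    # than the low_chan..high_chan range arguments.
    ul.a_in_scan(BOARD_NUM, 0, len(chan_list) - 1, total_count,
                 RATE, ULRange.BIP10VOLTS, memhandle, ScanOptions.FOREGROUND)
    raw = cast(memhandle, POINTER(c_ushort))
    volts = [ul.to_eng_units(BOARD_NUM, ULRange.BIP10VOLTS, raw[i])
             for i in range(total_count)]
finally:
    ul.win_buf_free(memhandle)
```

A slower per-channel rate also gives each grounded channel more time to discharge the multiplexer capacitance before the next high-impedance reading.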

Posted

Thank you for the response! There is a lot of actionable feedback there!

We are wondering about this point: "the higher the impedance, the longer the time constant." Is there a way to increase the sampling period to accommodate a longer time constant?

Posted

If the software is DAQami, DASYLab, TracerDAQ, or LabVIEW, the time between channels is determined by the sample rate. If you run at the maximum aggregate sample rate, the time between channels is 1 µs, or 1 / 1 MHz. For example, suppose eight channels are in use and the per-channel sample rate is 1000 S/s. In that scenario the A/D is sampling eight times faster than the per-channel rate, so the time between channels is 1/8000 s, or 125 µs. If you've written your own program, the same holds as long as the BURSTMODE scan option is not used; BURSTMODE packs the channel samples together so that the channel-to-channel time is 1 µs.
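To make the arithmetic concrete, here is the channel-timing calculation as a small sketch (the 1 MS/s maximum aggregate rate is taken from the description above):

```python
# Channel-to-channel timing for a multiplexed scan on this board.
MAX_AGGREGATE_RATE = 1_000_000   # 1 MS/s maximum aggregate A/D rate

channels = 8
per_channel_rate = 1000          # S/s per channel, as set in software

# Without BURSTMODE, the A/D walks the channel list at the aggregate rate:
aggregate_rate = channels * per_channel_rate   # 8000 S/s
t_between = 1.0 / aggregate_rate               # 125 us between channels

# With BURSTMODE, the channels are sampled back to back at the maximum rate:
t_burst = 1.0 / MAX_AGGREGATE_RATE             # 1 us between channels

print(f"no BURSTMODE: {t_between * 1e6:.0f} us between channels")
print(f"BURSTMODE:    {t_burst * 1e6:.0f} us between channels")
```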
