
Linux UL - error in differential voltage with USB-1608G


matt_sgf

Question

Hi

I'm using a USB-1608G and I have connected a voltage source in the differential configuration. I installed the Linux driver.

I adapted the sample code 'AIn.c' to work for differential voltages by setting:

    inputMode=AI_DIFFERENTIAL;
    range=BIP1VOLTS;

However, the voltage reading is low: if I apply 1.0V, I read around 0.998V. This uses the ulAIn function.

If I plug the device into a Windows laptop and read the voltage using DAQami, I get 1.0V.

Also, I adapted the sample code 'AInScan.c', which uses the ulAInScan function, to read differential voltages, and this gave approximately 1.0V.
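
For reference, the two calls boil down to something like the sketch below. This is condensed from the stock examples rather than my exact files; the device discovery and error handling are minimal, and the scan length and rate are just placeholder values.

    /* Sketch: read channel 0 differentially on the ±1 V range, first with
     * ulAIn (as in AIn.c) and then with a short finite ulAInScan (as in
     * AInScan.c), so the two paths can be compared on the same wiring.
     * Build with something like: gcc this_file.c -luldaq -o ain_compare
     */
    #include <stdio.h>
    #include "uldaq.h"

    #define SCAN_SAMPLES 100            /* placeholder scan length */

    int main(void)
    {
        DaqDeviceDescriptor descriptors[10];
        unsigned int numDevs = 10;
        DaqDeviceHandle handle;
        UlError err;

        AiInputMode inputMode = AI_DIFFERENTIAL;
        Range range = BIP1VOLTS;
        int channel = 0;
        double value = 0.0;

        double scanData[SCAN_SAMPLES] = { 0.0 };
        double rate = 1000.0;           /* placeholder sample rate, S/s */
        ScanStatus status = SS_RUNNING;
        TransferStatus xferStatus;

        /* Open the first USB device found (assumes the USB-1608G is the only one). */
        err = ulGetDaqDeviceInventory(USB_IFC, descriptors, &numDevs);
        if (err != ERR_NO_ERROR || numDevs == 0)
        {
            printf("No DAQ device found\n");
            return 1;
        }
        handle = ulCreateDaqDevice(descriptors[0]);
        err = ulConnectDaqDevice(handle);

        /* Single software-paced reading, the same call AIn.c makes. */
        err = ulAIn(handle, channel, inputMode, range, AIN_FF_DEFAULT, &value);
        printf("ulAIn:     %.6f V (err=%d)\n", value, err);

        /* Short hardware-paced scan of the same channel, the same call AInScan.c makes. */
        err = ulAInScan(handle, channel, channel, inputMode, range, SCAN_SAMPLES,
                        &rate, SO_DEFAULTIO, AINSCAN_FF_DEFAULT, scanData);
        while (err == ERR_NO_ERROR && status == SS_RUNNING)
            err = ulAInScanStatus(handle, &status, &xferStatus);

        /* Average the scan buffer so the two numbers are directly comparable. */
        double sum = 0.0;
        for (int i = 0; i < SCAN_SAMPLES; i++)
            sum += scanData[i];
        printf("ulAInScan: %.6f V (mean of %d samples, err=%d)\n",
               sum / SCAN_SAMPLES, SCAN_SAMPLES, err);

        ulDisconnectDaqDevice(handle);
        ulReleaseDaqDevice(handle);
        return 0;
    }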

Is there a problem with the ulAIn voltage reading, or am I missing a required setting? Why does ulAInScan give a different (and correct) number?

Thanks

Matt


6 answers to this question


Hello,

I did the same here, switching AIn.c to AI_DIFFERENTIAL and BIP1VOLTS. I used a voltage calibrator set to 1.0000 volts for my test voltage. If I let the program run for a few seconds, the measurement toggles between 0.999 and 1.000. I got slightly better results by modifying the program to use lowChan = 0 and highChan = 0 and attaching my test signal to channel 0. I didn't see any appreciable difference using DAQami or the AInScan example.

Best regards,

John


Hi

I've just tried again. I made the edits to the example (see attached). I applied 1.0V and the value still reads low.

I am using channel 0 with a 100k resistor between the low input and ground, as instructed.

I also edited the Python example and got the same result. I'm using openSUSE 15.4.

Can you suggest anything I might be missing?

Thanks

AIn_dif.c


I'm not able to duplicate the lower measurement using ULDaq 1.2.1. I ran the AIn_Average.c program, which returns a consistent 0.9998V. I also have a 6.5-digit multimeter on my voltage source output, which reads 0.99997V.
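
If it helps, the averaging in that program amounts to a loop around ulAIn, roughly like this sketch (the device setup follows the stock uldaq examples, and the reading count is just an illustrative value, not the exact contents of AIn_Average.c):

    /* Sketch: average a burst of back-to-back ulAIn readings from one
     * differential channel and report the mean.
     * Build with something like: gcc this_file.c -luldaq -o ain_average_sketch
     */
    #include <stdio.h>
    #include "uldaq.h"

    #define NUM_READINGS 200    /* illustrative; use whatever settles your display */

    int main(void)
    {
        DaqDeviceDescriptor descriptors[10];
        unsigned int numDevs = 10;
        DaqDeviceHandle handle;
        UlError err;
        double reading = 0.0;
        double sum = 0.0;
        int i;

        /* Open the first USB device found. */
        err = ulGetDaqDeviceInventory(USB_IFC, descriptors, &numDevs);
        if (err != ERR_NO_ERROR || numDevs == 0)
            return 1;
        handle = ulCreateDaqDevice(descriptors[0]);
        ulConnectDaqDevice(handle);

        /* Take the readings back to back and accumulate them. */
        for (i = 0; i < NUM_READINGS; i++)
        {
            err = ulAIn(handle, 0, AI_DIFFERENTIAL, BIP1VOLTS, AIN_FF_DEFAULT, &reading);
            if (err != ERR_NO_ERROR)
                break;
            sum += reading;
        }

        if (i > 0)
            printf("Average of %d readings: %.5f V\n", i, sum / i);

        ulDisconnectDaqDevice(handle);
        ulReleaseDaqDevice(handle);
        return 0;
    }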

DAQami doesn't have an averaging feature. Instead, I wrote a Windows C program similar to AIn_Average.c, and it returns a consistent 0.9959V. The lower value may have to do with the counts-to-voltage conversion function using a float type instead of the double used by ULDaq.

 


I figured out why I was getting inconsistent results: the low-side input reference resistor connection was loose. I'm using a 47k ohm resistor from the low-side input to ground, primarily because my voltage calibrator is isolated from the system ground. Now I get good results using AIn or AInScan, on Linux or Windows: a 200-reading average of 0.9999V, consistently. I suspect that when you switch from your Windows computer to the Linux box, your system grounding changes ever so slightly.
