
Configure DT9844 for 2 Channel Analog Input


Grata

Question

We have attempted to configure a DT9844 to read 2 separate SE analog input channels and display the output as a function of time using LabVIEW.
The read rate is set to 300 kHz.
On AI0 we have an input with V~=2.6V (constant).
On AI1 we have an input V nominally at 2.27V that can increase to ~2.8V.
Measuring the V at the AI terminals with an accurate precision DMM, we have confirmed that the V on AI0 is steady at 2.603V and that AI1 ranges from 2.27 to 2.7V.
The V measured by the DT9844 on AI1 seems to be accurate with respect to the DMM to within 1mV.

However, as the V changes on AI1, a V change is produced on our AI0 display.  We suspect there is an AI configuration error.  Our sample code is shown below and attached.  Any suggestions of where the configuration error is would be greatly appreciated.

[Attached: screenshot of the LabVIEW block diagram; DT_nCH_nSAMP_Cont.vi]


2 answers to this question

Recommended Posts


Hello @Grata.

Thank you for the screen capture and LabVIEW vi example.  I was able to reproduce the issue with the DT9844 and LabVIEW 2023, as well as with Data Translation's QuickDAQ application.  Additionally, I tested at a slower sampling rate of 10 kHz, input channels configured in differential and single-ended modes, with the input signals swapped, and with a disabled channel between the two signal channels.  For all these test cases, when one channel's signal source varied full scale (+/- 10 V), then the other input channel's reading changed by +/- 4 mV.  This appears to be a hardware bug, not a user configuration error.
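For reference, the coupling observed in these tests can be expressed as a ratio in decibels. This is a quick sketch, not part of the original test procedure; the ±10 V aggressor swing and ±4 mV victim change are the figures reported above:

```python
import math

# Observed values from the tests described above
aggressor_swing_v = 10.0   # one channel's source varied full scale, +/- 10 V
victim_change_v = 0.004    # the other channel's reading changed by +/- 4 mV

# Crosstalk ratio in decibels: 20 * log10(victim / aggressor)
crosstalk_db = 20 * math.log10(victim_change_v / aggressor_swing_v)
print(f"Observed crosstalk: {crosstalk_db:.1f} dB")  # about -68 dB
```

Expressing the observation this way makes it easy to compare directly against the board's published crosstalk specification.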

At this time, I do not have a solution or workaround, but I will log this issue as a hardware bug for our engineers to investigate.

Regards,

Fausto



Hello @Grata.

Please short all unused analog input channels to the analog ground terminals on the DT9844, and also connect the 'Amplifier Low' terminal to an 'Analog Ground' terminal.  Run your LabVIEW application and let me know whether the fixed-voltage input channel now shows less of a voltage change.

Also note that the system accuracy specification for a 50 Ohm source impedance at a 500 kHz scan rate is +/-0.01 % (2 mV), and the crosstalk specification is -60 dB at 100 kHz (1 mV).
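As a sanity check on those spec numbers, here is a minimal sketch of the arithmetic. The 20 V span used for the accuracy figure and the 1 V reference for the crosstalk figure are my assumptions, chosen so the results match the 2 mV and 1 mV values quoted in the specification:

```python
# System accuracy: +/-0.01 % of span (assuming the +/-10 V range, i.e. 20 V span)
full_scale_span_v = 20.0
accuracy_pct = 0.01
accuracy_v = full_scale_span_v * accuracy_pct / 100
print(f"Accuracy: {accuracy_v * 1000:.1f} mV")    # 2.0 mV

# Crosstalk: -60 dB (assuming the spec is referenced to a 1 V signal)
crosstalk_db = -60.0
ref_v = 1.0
crosstalk_v = ref_v * 10 ** (crosstalk_db / 20)
print(f"Crosstalk: {crosstalk_v * 1000:.1f} mV")  # 1.0 mV
```

Under these assumptions, a larger input swing would scale the crosstalk proportionally, which is worth keeping in mind when comparing bench measurements against the datasheet figures.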

