F. Y.

Members · 3 posts
  1. Hello. Both analog outputs of my USB-1808X have started to show an offset of about -0.003 V. In the DAQami software, for example, I set AOUT1 to 0, but the two connected input channels read -0.003 V. I can confirm with external multimeters that this is an output issue. (See the first screenshot.)

     There's another problem that I think is related: after running the mcculw example scripts such as ULAIO01.py, the output voltage stays at the last data point. For example, if the last data point of the output signal from ULAIO01.py before I quit the program is -4 V, then AOUT1 stays at -4 V even when no program is running at all (confirmed with multimeters), and I cannot clear it unless I unplug the USB. That significantly affects data acquisition, because the next time I use the device the input terminals read a peak of -4 V at the beginning instead of 0 V. (See the second screenshot.)

     I'm using the official software and Python examples, so I'm not sure how that could happen. In short, my questions are: how do I calibrate or reset the terminals to their defaults? And how do I do that in Python so the device is released completely every time a script finishes, with the output terminals back at 0? Something like the sketch below is what I'm after. Thank you.
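     This is a rough cleanup sketch of what I mean, using documented mcculw calls (the board number 0 and the ±10 V output range are my assumptions, not verified against my setup):

     ```python
     from mcculw import ul
     from mcculw.enums import ULRange, FunctionType

     BOARD_NUM = 0  # assumption: the device is configured as board 0 in InstaCal

     def zero_outputs_and_release():
         try:
             # Stop a background analog-output scan that may still be running.
             ul.stop_background(BOARD_NUM, FunctionType.AOFUNCTION)
             # Force both analog outputs back to 0 V.
             for channel in (0, 1):  # AOUT0 and AOUT1
                 ul.v_out(BOARD_NUM, channel, ULRange.BIP10VOLTS, 0.0)
         finally:
             # Release the board so the next run starts from a clean state.
             ul.release_daq_device(BOARD_NUM)

     zero_outputs_and_release()
     ```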
  2. Hello @Fausto Yes, both CH0H/CH0L and CH1H/CH1L are measuring external signals from a battery. AOUT0 outputs my 50 kHz signal profile (the blue line) to the battery, and CH0H/CH0L and CH1H/CH1L read its signals back. I have also tried connecting AOUT0 directly to CH0H/CH0L as the picture shows, without any other external load, but the random delay persists. The exact time at which the device starts measuring the signal seems random. I wonder whether there's a function or parameter to set a trigger, instead of simply executing a_in_scan() and a_out_scan() back to back; see the sketch below.

     Edit: to clarify, there's no delay between the two input channels. The delay is between AOUT0 and the inputs: a random amount of time passes between the start of the output and the start of the input measurement.
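     This is the kind of triggered setup I'm imagining (untested; the board number, the trigger wiring, and whether the USB-1808X can route a suitable start signal to its trigger input are all assumptions on my part):

     ```python
     from mcculw import ul
     from mcculw.enums import ScanOptions, TrigType, ULRange

     BOARD_NUM = 0
     RATE = 50000            # 50 kHz per channel
     POINTS_PER_CHAN = 160000

     # Arm the input scan to wait for a rising edge on the external
     # trigger input instead of starting immediately.
     ul.set_trigger(BOARD_NUM, TrigType.TRIG_POS_EDGE, 0, 0)

     # Two differential channels (CH0 and CH1), scaled to volts.
     buf = ul.scaled_win_buf_alloc(POINTS_PER_CHAN * 2)
     ul.a_in_scan(BOARD_NUM, 0, 1, POINTS_PER_CHAN * 2, RATE,
                  ULRange.BIP10VOLTS, buf,
                  ScanOptions.BACKGROUND | ScanOptions.EXTTRIGGER |
                  ScanOptions.SCALEDATA)

     # ...start a_out_scan() here; the armed input scan should only
     # begin when the trigger edge arrives, so both start on one event.
     ```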
  3. Hi, I have a USB-1808X and I'm trying to output an analog signal that contains around 160,000 data points at 50 kHz via the AOUT0 pin (so the duration of the signal is about 3.2 s). A current signal should be read back simultaneously at the same frequency via CH0H/CH0L, as well as a voltage signal via CH1H/CH1L. Everything is fine except that there's always a delay (~30 data points) at the beginning of the measured signals. A constant delay alone would not be a problem, since I could shift the measurements forward or backward in Python to compensate. The problem is that it isn't constant: every time I run the code, the delay can be anything from 25 to 40 points, which makes it impossible to compensate precisely. Is that a hardware limitation, or can we do anything better in the programming? I'm using ul.a_in_scan() followed by ul.a_out_scan() to do the task, with the BACKGROUND scan option; a sketch of the sequence is below. I believe you can ignore the voltage measurement plot (the green line) in the following screenshots; the current signal is easier to analyze in my case. The x-axis is the number of data points. Thank you.
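     For reference, this is roughly the sequence I'm running (a simplified sketch of my script; the board number and buffer handling are filled in as assumptions):

     ```python
     from mcculw import ul
     from mcculw.enums import ScanOptions, ULRange

     BOARD_NUM = 0
     RATE = 50000         # 50 kHz per channel
     NUM_POINTS = 160000

     # Input: CH0 (current) and CH1 (voltage), read in the background.
     in_buf = ul.scaled_win_buf_alloc(NUM_POINTS * 2)
     ul.a_in_scan(BOARD_NUM, 0, 1, NUM_POINTS * 2, RATE,
                  ULRange.BIP10VOLTS, in_buf,
                  ScanOptions.BACKGROUND | ScanOptions.SCALEDATA)

     # Output: the 160,000-point profile on AOUT0, also in the background.
     # (In the real script the waveform is copied into out_buf first.)
     out_buf = ul.win_buf_alloc(NUM_POINTS)
     ul.a_out_scan(BOARD_NUM, 0, 0, NUM_POINTS, RATE,
                   ULRange.BIP10VOLTS, out_buf, ScanOptions.BACKGROUND)

     # The software gap between these two calls is where the variable
     # 25-40 point offset at the start of the measurement seems to arise.
     ```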