The computer is an M1 MacBook Pro running macOS Ventura, and the program is Python 3. Wiring is all 50-ohm BNC coax cable terminated in bare wires for the screw terminals (< 6 inches of bare wire at the DAQ side). There are several source meters and signal generators (Keithley or similar benchtop test-and-measurement equipment) on the same bench. The USB connection is routed through a hub that also handles various Ethernet and other USB connections for automating other parts of our system.
Nothing was different about the code or setup the time that it worked. As far as I can tell the success was random.
I have since tried turning down the frequencies and found that it works consistently at the maximum AI scan speed combined with a lower output scan frequency: zero failures below a ~100 kHz output rate, and less frequent failures at 200-400 kHz.
I actually don't need a high scan frequency on the analogue output, so I changed the code to control the analogue output with a_out and to use d_out_scan with an int buffer (instead of daq_out_scan with a float buffer). I still encounter failures at 500 kHz, but it appears stable at a 400 kHz digital output scan speed, and that performance is acceptable for my purposes. I was going to cobble together a version of the code that could be shared publicly, but with this approach I think my problem is effectively resolved.
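For reference, here is a minimal sketch of the approach I landed on. The uldaq calls are written from memory and guarded so the pattern-building part runs even without the hardware attached; the channel number, port, voltage, and square-wave pattern are placeholders for my actual setup:

```python
# Build the digital output pattern as a list of port values; the simple
# bit-0 square wave here is a placeholder for my real output sequence.
SAMPLES = 1000
OUT_RATE = 400_000  # 400 kHz digital output scan rate (stable in my tests)

def make_pattern(n_samples, period=2):
    """Toggle bit 0 of the port every `period` samples."""
    return [1 if (i // period) % 2 == 0 else 0 for i in range(n_samples)]

pattern = make_pattern(SAMPLES)

try:
    from uldaq import (get_daq_device_inventory, DaqDevice, InterfaceType,
                       DigitalPortType, DigitalDirection, ScanOption,
                       DOutScanFlag, Range, AOutFlag, create_int_buffer)

    dev = DaqDevice(get_daq_device_inventory(InterfaceType.USB)[0])
    dev.connect()

    # Static analogue level via a_out -- no AO scan needed in my case.
    dev.get_ao_device().a_out(0, Range.BIP10VOLTS, AOutFlag.DEFAULT, 1.25)

    # Digital output scan from an int buffer instead of daq_out_scan
    # with a float buffer.
    dio = dev.get_dio_device()
    dio.d_config_port(DigitalPortType.AUXPORT, DigitalDirection.OUTPUT)
    buf = create_int_buffer(1, SAMPLES)
    buf[:] = pattern
    dio.d_out_scan(DigitalPortType.AUXPORT, DigitalPortType.AUXPORT,
                   SAMPLES, OUT_RATE, ScanOption.CONTINUOUS,
                   DOutScanFlag.DEFAULT, buf)
except Exception:
    pass  # no uldaq installed or no device attached
```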
I do have a follow-up question on input/output timing. I am triggering the input scan from one of the DIO output pins, and I would like the clocks to be synchronized such that, if I run the output scan at 400 kHz and the input scan at 200 kHz, point N of the input sequence stays temporally aligned with point 2N of the output scan over many seconds. Are the input and output clocks synchronized by design, or do I need to use OCLKO to drive ICLKI to ensure this?
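To be concrete about the alignment I'm after, this is the relationship I'd want to hold (pure arithmetic, assuming both scans start on the same trigger edge and the two clocks don't drift relative to each other; if they derive from independent oscillators, this equality would slowly break down, which is exactly what I want to rule out):

```python
OUT_RATE = 400_000  # output scan rate, Hz
IN_RATE = 200_000   # input scan rate, Hz

def t_in(n):
    """Timestamp of input sample n after the trigger, in seconds."""
    return n / IN_RATE

def t_out(m):
    """Timestamp of output sample m after the trigger, in seconds."""
    return m / OUT_RATE

# With ideal, synchronized clocks, input sample N lines up with output
# sample 2N, and stays aligned over many seconds of scanning:
for n in (0, 1, 1_000_000):  # 1e6 samples is ~5 s into the scan at 200 kS/s
    assert t_in(n) == t_out(2 * n)
```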
Also, what is the relationship between the ICLKI frequency and the input rate? That is, if I tell the DAQ I want a 200 kS/s input scan rate but provide an arbitrary external input clock frequency (e.g. 400 kHz), what is the result? Does the input clock expect a specific frequency?