Question
banksey255
Regarding the input task buffer size, the ULx for NI LabVIEW help states: "When you acquire a continuous number of samples (the ULx Timing VI's sample mode input is set to ContinuousSamples), ULx calculates the buffer size based on the number of channels, the sample rate, and the device's packet size requirements." How, precisely, is this calculated?
I ask because we are getting the following error, despite calling the ULx Read VI approximately 50 to 100 times per second and asking it to read all of the samples currently available in the buffer:
Error 29 occurred at ULx Read (Analog 2D DBL NChan NSamp).vi
Possible reason(s):
Measurements: Samples in the buffer were overwritten before they were read.
Reading the data more frequently, or specifying a fixed number of samples to read instead of reading all available samples might correct the problem.
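For what it's worth, the fixed-read strategy the error message suggests is easy to size from the numbers above. The sketch below is just that arithmetic in C; the 10 kHz rate and 50 reads per second are our figures, not anything ULx computes.

```c
/* Illustration only: sizing a fixed per-call read so the consumer keeps
   pace with a continuous acquisition. Figures are from this post. */
#include <stdio.h>

int main(void)
{
    const double sample_rate_hz   = 10000.0; /* per channel */
    const double reads_per_second = 50.0;    /* our worst case */

    /* Reading this many samples per channel on each ULx Read call
       consumes exactly what the device produces between calls. */
    const long samples_per_read = (long)(sample_rate_hz / reads_per_second);

    printf("fixed read size: %ld samples per channel\n", samples_per_read);
    return 0;
}
```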
In NI DAQmx, to which the ULx driver appears to provide a near-identical interface, the driver allocates an input buffer at least as large as the value of the samples per channel attribute/property. In this respect the ULx and DAQmx drivers appear to differ in behaviour, and, based on the ULx driver help, I apparently have no control over the buffer size. Am I mistaken?
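For reference, this is roughly what that looks like in the NI-DAQmx C API, where DAQmxCfgInputBuffer lets you override the automatically chosen input buffer size; the rate and buffer figures below are placeholders, and ULx does not appear to expose an equivalent call.

```c
/* Sketch of the DAQmx behaviour described above. With continuous sampling,
   sampsPerChanToAcquire acts as a buffer-size hint, and DAQmxCfgInputBuffer
   can override it explicitly. Figures are placeholders; error handling
   is omitted for brevity. */
#include <NIDAQmx.h>

int32 configure_input_buffer(TaskHandle task)
{
    /* Continuous acquisition at 10 kHz on the task's channels. */
    DAQmxCfgSampClkTiming(task, "", 10000.0, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, 100000);

    /* Explicitly size the input buffer: 100000 samples per channel
       (10 s of data at this rate). */
    return DAQmxCfgInputBuffer(task, 100000);
}
```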
Edit: I have since discovered that the buffer created by ULx for LabVIEW appears to hold 1048576 elements. Since we are using a 10 kHz sample rate with eight input channels, the buffer should be able to hold about 13 seconds of data, which makes me question whether the buffer size is the problem.
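Spelling that estimate out (1048576 is the buffer size we observed; the channel count and rate are ours):

```c
/* Check of the headroom estimate above: 1048576 total buffer elements
   shared across 8 channels sampled at 10 kHz per channel. */
#include <stdio.h>

int main(void)
{
    const double buffer_elements = 1048576.0; /* observed ULx buffer size */
    const double channels        = 8.0;
    const double rate_hz         = 10000.0;  /* per channel */

    /* 1048576 / (8 * 10000) is roughly 13.1 seconds of headroom. */
    printf("buffer headroom: %.1f s\n", buffer_elements / (channels * rate_hz));
    return 0;
}
```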
In case it is relevant, we are attempting to stream continuously from three USB-1808X devices at once using three separate tasks. To reiterate, we are reading each task 50 to 100 times per second. I came across another post indicating that reading from multiple devices can be problematic: https://forum.digilent.com/topic/26880-how-do-i-get-two-analog-inputs-from-different-devices/#comment-81595. If so, is this only an issue with ULx for LabVIEW? That is, if I used LabVIEW to call UL for Windows or UL for .NET directly, would this problem not occur? And if so, which is higher performance, UL for Windows or UL for .NET? It appears UL for .NET calls UL for Windows (CBW) under the hood, which suggests it would be better to call the latter from LabVIEW.
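In case it helps frame that last question: UL for Windows is the C API declared in cbw.h, and calling it from LabVIEW would go through a Call Library Function Node. Below is a minimal sketch, from memory, of a continuous background scan with that API; the board number, range, poll count, and buffer length are assumptions, and I have not tried this against three USB-1808X devices at once.

```c
/* Hedged sketch: continuous background analog input via UL for Windows
   (cbw.h), polling for new data. Board number, range, poll count, and
   buffer size are assumptions, not verified against a USB-1808X. */
#include <stdio.h>
#include <windows.h>
#include "cbw.h"

int main(void)
{
    const int  board    = 0;                /* assumed board number */
    const int  low_chan = 0, high_chan = 7; /* eight channels, as above */
    long       rate     = 10000;            /* requested per-channel rate, Hz */
    const long count    = 1048576;          /* buffer elements, as observed */

    HGLOBAL buf = cbWinBufAlloc(count);
    if (buf == NULL) return 1;

    /* BACKGROUND | CONTINUOUS starts the scan and returns immediately,
       leaving the driver to fill buf as a circular buffer. */
    int err = cbAInScan(board, low_chan, high_chan, count, &rate,
                        BIP10VOLTS, buf, BACKGROUND | CONTINUOUS);
    if (err != NOERRORS) return err;

    short status;
    long  cur_count, cur_index;
    for (int i = 0; i < 100; ++i) {
        /* cur_index reports how far the driver has written into buf;
           a reader would copy out everything since its previous visit. */
        cbGetStatus(board, &status, &cur_count, &cur_index, AIFUNCTION);
    }

    cbStopBackground(board, AIFUNCTION);
    cbWinBufFree(buf);
    return 0;
}
```

If ULx is ultimately just layering this API, then in principle a Call Library Function Node wrapper should behave the same way, which is partly why I am asking whether the multi-device issue is specific to ULx.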
Edited by banksey255