Question
Sai Mouli
Hello everyone,
Good Day,
I am facing a major problem in my project, where I am building a GUI around the MCC 118 voltage measurement instrument. My objective is to log every data point measured on each voltage channel, store the data, and export it to an Excel file. The most crucial requirement is equal time intervals between consecutive samples: for example, sample 1 read at 33 ms, sample 2 at 33.2 ms, sample 3 at 33.4 ms, and so on. I do not know whether the MCC 118 actually provides samples at equal intervals.

As you may know, a_in_scan_read() does not deliver samples one at a time but a batch of samples per call. For example, if I call a_in_scan_read() in a loop at a scan rate of 1000 samples/s for 10 seconds, I expect 10,000 samples per channel in total, but the first iteration might return 30 samples per channel, the second 55, the third 40, and so on until the 10,000-sample limit is reached.

My approach was to measure the time between the beginning and end of each iteration and divide it by the number of samples read in that iteration, on the assumption that this would give an equal time gap between consecutive samples. However, this only works to some extent. The results look like this:
Iteration 1: 30 samples, estimated interval 1.5 ms
Iteration 2: 40 samples, estimated interval 1.7 ms
Iteration 3: 56 samples, estimated interval 1.85 ms
...
As you can see, the estimated interval varies with the batch size. I could instead take the time difference between the start and end of the entire loop rather than timing each iteration, but I am not sure that is valid, because within each iteration I also store and concatenate the data received from the channels before moving on to the next iteration, and some computation time is lost there. Previously, after storing the data, I also plotted it live on canvas figures; since the results showed non-uniform intervals between samples, I removed that part of the process. Now I am only reading and storing data, but the results are still the same.
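To make this concrete, here is a stripped-down sketch of the kind of loop I am running (this is not the attached function; SCAN_RATE, CHANNEL_MASK, hat and so on are placeholder names, only one channel is scanned, and the per-channel storage and GUI updates are left out):

```python
# Minimal sketch of the read loop; placeholder names, single channel only.
import time
from daqhats import mcc118, OptionFlags

SCAN_RATE = 1000                        # requested samples/second per channel
DURATION = 10                           # seconds
CHANNEL_MASK = 0b00000001               # channel 0 only, to keep the sketch short
CHANNEL_COUNT = 1
TOTAL_SAMPLES = SCAN_RATE * DURATION    # 10,000 samples per channel

hat = mcc118(0)                         # board at address 0 (assumption)
hat.a_in_scan_start(CHANNEL_MASK, TOTAL_SAMPLES, SCAN_RATE, OptionFlags.DEFAULT)

data = []                               # concatenated samples (single channel here)
count = 0
while count < TOTAL_SAMPLES:
    iter_start = time.perf_counter()
    result = hat.a_in_scan_read(-1, timeout=5.0)    # -1 = read whatever is available
    batch = len(result.data) // CHANNEL_COUNT       # samples per channel this iteration
    data.extend(result.data)                        # the storing/concatenating step
    iter_end = time.perf_counter()

    if batch > 0:
        est_interval = (iter_end - iter_start) / batch  # my per-sample estimate; it varies
        count += batch

hat.a_in_scan_stop()
hat.a_in_scan_cleanup()
```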
Could someone suggest the best possible way to achieve equal time intervals between sample points? This is crucial in signal processing applications. What I need to know is:
Is there any way to accurately log the time of each sample read from the MCC 118? Or,
is there a computational method accurate enough to reconstruct the time interval between samples manually?
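To show what I mean by the second question: one idea is to number each sample as it arrives and derive its timestamp from its index and the scan rate, rather than from wall-clock measurements around a_in_scan_read(). This is only a sketch and rests on an assumption I have not verified, namely that the board's pacer clock spaces samples evenly at the rate it reports (a_in_scan_actual_rate() in the daqhats library, if I read the docs correctly); make_timestamps() is a hypothetical helper, not part of my code:

```python
# Index-based timestamps: assumes the pacer spaces samples at exactly 1/actual_rate.
def make_timestamps(first_index, batch_size, actual_rate):
    """Timestamps (seconds since scan start) for one batch of per-channel samples."""
    return [(first_index + i) / actual_rate for i in range(batch_size)]

# Usage inside the read loop from the sketch above:
#   actual_rate = hat.a_in_scan_actual_rate(CHANNEL_COUNT, SCAN_RATE)
#   ts = make_timestamps(count, batch, actual_rate)   # call before count += batch
```

If that assumption holds, the uneven batching of a_in_scan_read() would not matter, because each timestamp depends only on the sample's position in the scan, not on when the batch happened to be returned.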
I have attached the loop function I am using in my program; please open it in Notepad.
Some key notes:
-> stime: the time at which the scan begins
-> click_stop: simply stops the process
-> all other data-related variables are data storage parameters
-> D0-D7: global flags indicating whether the user has enabled each channel (0 = selected, 1 = not selected); see the sketch below
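To illustrate the last point, the D0-D7 flags amount to something like the following (an illustration of the convention only, not the code in the attachment):

```python
# D0-D7 convention: 0 = channel enabled by the user, 1 = not selected.
D0, D1, D2, D3, D4, D5, D6, D7 = 0, 0, 1, 1, 1, 1, 1, 1   # example: CH0 and CH1 enabled

channel_mask = 0
for ch, flag in enumerate((D0, D1, D2, D3, D4, D5, D6, D7)):
    if flag == 0:                              # 0 means the channel is selected
        channel_mask |= 1 << ch

channel_count = bin(channel_mask).count("1")   # needed to split result.data per channel
```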
I would really like to know whether I am making a mistake somewhere. I hope I was clear enough; please let me know.
Thank You,
[Attachment: LOOP — the loop function referenced above]