stevepqr

  1. @JRys Another update: taking on board your OS comment, I took another clean Pi, installed Debian Lite, and then only the installs required for daqhats. I modified the continuous_scan example so that it more closely resembles my code and got the same result, or very nearly: the average delay is 104ms with the occasional (but quite regular) 52ms thrown in. Could it be something in the setup? It's fairly unmodified from the original continuous_scan; the only things I have changed are scan_rate=1280 and read_request_size=128...
  2. @JRys I can't find an obvious way to reply directly to your comment! I appreciate that there will be a timing issue between the onboard timers and the loop in my program; my issue is the size of the discrepancy and its inconsistency. My program is cut back now so that it contains only a scan_status() and a scan_read(), along with a single printf so I can see what's going on (see the attached loop code image). You can see that I loop around scan_status() until it returns >128. The attached image shows that I am reading 128 samples every time (at 1280Hz this should be a 100ms loop time); the first number in the second line is the number of samples in the buffer returned by scan_status(), and the second number is the time taken to go round the loop in microseconds.

     I don't believe there is anything in my code that could be slowing down the process, and I find it hard to believe that anything in the OS is causing the delay. You can see from the short run of data in the attachment that there is an approximate correspondence between the size of the buffer returned by scan_status() and the length of the delay/loop time. So, given that the loop time due to my code is as short as it can be, what is causing scan_status() to always return significantly more than 128 samples (201, 151, 169, 176, 188 in my attachment for the first few) when I'm polling it in a very fast loop? At this speed I would expect it to return 128, 129, maybe 130 every time.

     //Edit In summary: scan_status() is not returning the number of samples I expect, and I believe this is contributing to the variation in the loop timings. I put clock_gettime() calls before and after the scan_status() loop; the difference shows that ALL of the variation in the main loop timing is coming from scan_status() (see attached ScanStatus Loop Timing.png; the scan_status() loop is always only a couple of hundred microseconds less than the main loop time). //End Edit

     As I said, I think you are right that my program and the OS will have an effect, but I believe this effect would be much smaller than what I am seeing, maybe <1ms variation in the timings. The main problem is that the scan_status() loop is not stopping at 128, because scan_status() doesn't return the correct number of samples every time it is called. Regards Stevepqr
  3. I have a peculiar problem with the MCC172. I'm reading continuously at 1280Hz using a slightly modified version of the continuous_scan.c example. I read 128 samples every loop; at 1280Hz I should get a read every 100ms, but I'm not: mostly it's around 105ms, but occasionally (every 8-9 loops) it will read in 55ms. I don't appear to be losing any samples, but the reported timing is off. I'm using the Pi clock_gettime() to record a timestamp every loop, and other than this timestamp there is no code in my loop apart from the mcc scan_read function. Additionally (or perhaps connected), if I check the buffer using the mcc scan_status function to ensure I have 128 samples before I read, it does not return a continuously increasing number of samples but returns (for example) 4, 4, 4, 95, 95, 95, 95, 160. I would have expected to receive back 4, 16, 32, 55, 100, 116, 129 or something similar (if that makes any sense!). The loop time without the mcc scan_read is around 2-3ms, so I know my loop is fast enough. Any ideas?