belowdabridge

Posts posted by belowdabridge

  1. So, if I set the data rate in InstaCal (or with BoardConfig.SetAdDataRate()) to 60 and scan 16 channels, the board gets just over 3 full scans internally per second. (Is that data stored and available, or only read when I ask for it?) Then the sample rate, the rate I ask AInScan() for, can be set to 32? Does that get 32 readings in 1 second, or take 0.5 seconds to read the 16 channels?

    Put another way: when I run AInScan(), does it go channel by channel, reading each one at the 60 Hz data rate (one reading per channel)? Or does the sample rate need to be set to less than the data rate divided by the number of channels?

    What happens if the sample rate is too high? Does it try to read data that isn't ready yet?
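
    The arithmetic I'm assuming is below; this is just a sketch of my mental model (the 60 and 16 are the numbers above), not something I've confirmed in the docs:

        // My assumed timing model: the A/D converts at dataRate samples/sec and the
        // channels share it through the multiplexer, so a full pass over all channels
        // can't happen faster than dataRate / channelCount.
        int dataRate = 60;                                        // A/D data rate from InstaCal / BoardConfig.SetAdDataRate()
        int channelCount = 16;                                    // channels in the scan
        double maxScansPerSec = (double)dataRate / channelCount;  // ~3.75 full scans per second
        Console.WriteLine($"Max full scans/sec ≈ {maxScansPerSec:F2}");
        // If that's right, a requested per-channel rate of 32 couldn't be honored;
        // roughly 3 per channel would be the ceiling.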

    Thanks,

    dave

  2. Jeffery, yes, USB-2416, with the rate as set in the C# program; but now I need to know: how are they different?

    This is part of the underlying problem: I haven't found documentation that explains what controls what.

    Please correct me if I'm wrong, but there is a rate at which the ADC scans channel to channel (and it seems this isn't necessarily all channels at a fixed rate; is it possible to scan channels 0-3 and 7-11 at 60 scans/sec but scan all 16 on every 6th scan, thus 10 scans/sec?).

    I expect that, given a scan rate, the buffer settling time and conversion time are determined (and at that low a scan rate, I'd expect both to be at their maximum).

    Then there's the question of averaging.  Is that done in the USB-2416? 

    My actual problem is how to get the best readings I can once a second. (I'm using C#'s timer now and it's not keeping time; I suspect that's because it counts millisecond ticks but not while something else is happening, though I don't think I'm doing enough work to miss its interrupt.)

    So when I call AInScan(), what happens before it returns?

    My stated test requirement is one reading per second, but I want to keep the HMI updated during the test. (And the processor should be idle most of the time; the program does very little.) A sketch of what I'm considering is below.
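
    What I'm considering, in case it helps frame the question: start the scan with ScanOptions.Background so the call returns right away, and poll for completion from the timer. The Globals names are the ones from my program, and this is only my reading of the UL help, not something I've verified on the USB-2416:

        // Sketch only: run the scan in the background so the once-a-second timer
        // handler stays short and the HMI keeps updating.
        int rate = 10;
        MccDaq.ScanOptions opts = MccDaq.ScanOptions.ScaleData | MccDaq.ScanOptions.Background;
        ULStat = Globals.DaqBoardAnalog.AInScan(Globals.LowChannelNumber, Globals.HighChannelNumber,
            Globals.AINumPoints, ref rate, Globals.ChRange, Globals.AIMhandle, opts);

        // Later (e.g., on the next timer tick), check whether the scan has finished:
        short status;
        int curCount, curIndex;
        ULStat = Globals.DaqBoardAnalog.GetStatus(out status, out curCount, out curIndex,
            MccDaq.FunctionType.AiFunction);
        if (status != MccDaq.MccBoard.Running)
        {
            // Scan is done; copy the buffer out with MccDaq.MccService.ScaledWinBufToArray().
        }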

    (Oh, thanks for the quick replies. I'd have gotten back to this one sooner, but it was still "wandering around the net" when I checked at 8 last night.)

    Thanks,

    dave

  3. Using UL for .NET (C#), I use ALoadQueue() and AInScan(). My question is: given a Rate of 10 or 60, what actually happens, and what is the timing?

    As I understand it, there's the input buffer settling time (is that fixed, or can I lengthen it?) and the number of samples averaged (which affect precision and jitter). Then there's the question of when the scan returns: does it grab a current "last read" value, or start a read and wait 1/Rate seconds to get the values?

    If I scan fewer channels, does it return quicker?

    My current problem is that, scanning 16 channels with Rate at 10, C#'s one-second tick (and I read once per tick) is coming at ~1.6 sec.
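
    To narrow that down, here's how I plan to time the call itself; a minimal sketch using System.Diagnostics.Stopwatch around the same AInScan() call, with the Globals names from my program:

        // Time the AInScan() call to see how much of the ~1.6 s is the call itself blocking.
        var sw = System.Diagnostics.Stopwatch.StartNew();
        int rate = 10;
        ULStat = Globals.DaqBoardAnalog.AInScan(Globals.LowChannelNumber, Globals.HighChannelNumber,
            Globals.AINumPoints, ref rate, Globals.ChRange, Globals.AIMhandle, MccDaq.ScanOptions.ScaleData);
        sw.Stop();
        // The rate argument is passed by ref, so (as I understand it) the driver hands
        // back the rate it actually used.
        Console.WriteLine($"AInScan blocked for {sw.ElapsedMilliseconds} ms, actual rate = {rate}");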

    Thanks,

    dave

  4. Jeffery, thanks; that (sorta) got me looking in the right place.

    I have a "sorta working" program.  ( Visual C# 2019 )  After checking that my boards are attached, I run:

        // Allocate a scaled Windows buffer; Globals.AINumPoints is 16, one point per input channel.
        Globals.AIMhandle = MccDaq.MccService.ScaledWinBufAllocEx(Globals.AINumPoints);

    then

        AInOptions = MccDaq.ScanOptions.ScaleData;

        short[] channels = { 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15 };
        MccDaq.Range[] ranges = {
            MccDaq.Range.Bip1Volts,     MccDaq.Range.Bip1Volts,     MccDaq.Range.Bip1Volts,     MccDaq.Range.Bip1Volts,      // channels 0-3
            MccDaq.Range.BipPt078Volts, MccDaq.Range.BipPt078Volts, MccDaq.Range.BipPt078Volts, MccDaq.Range.BipPt078Volts,  // channels 4-7
            MccDaq.Range.Bip20Volts,    MccDaq.Range.Bip20Volts,    MccDaq.Range.Bip20Volts,    MccDaq.Range.Bip20Volts,     // channels 8-11
            MccDaq.Range.BipPt625Volts, MccDaq.Range.BipPt625Volts, MccDaq.Range.BipPt625Volts, MccDaq.Range.BipPt625Volts   // channels 12-15
        };

        ULStat = Globals.DaqBoardAnalog.ALoadQueue(channels, ranges, 16);

    then (this part usually runs once a second):

        int Rate = 10;

        // I had a problem here: Globals.ChRange was set to MccDaq.Range.Bip10Volts, and that
        // caused the clipping. (?? As I read the doc, the queue should dominate ??)
        ULStat = Globals.DaqBoardAnalog.AInScan(lowChan: Globals.LowChannelNumber, highChan: Globals.HighChannelNumber,
            numPoints: Globals.AINumPoints, rate: ref Rate, range: Globals.ChRange,
            memHandle: Globals.AIMhandle, options: AInOptions);

    then

        // Copy the scaled readings out of the Windows buffer into a managed array.
        numpoints = Globals.AINumPoints;
        ADDataV = new double[numpoints];
        AIFirstPoint = 0;

        ULStat = MccDaq.MccService.ScaledWinBufToArray(Globals.AIMhandle, ADDataV, AIFirstPoint, numpoints);

    to bring the data into an array.
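
    One thing I should add (not shown above) is checking ULStat after each call instead of ignoring it; a minimal sketch of what I have in mind, something like:

        // Report any UL error instead of silently continuing.
        if (ULStat.Value != MccDaq.ErrorInfo.ErrorCode.NoErrors)
        {
            Console.WriteLine($"UL error {ULStat.Value}: {ULStat.Message}");
        }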

    This seems to work, but if the queue setting doesn't actually get some channels read at the higher resolution, I may be hurt on accuracy. (Though I can probably live with 1 mV errors.)

    My problems now (see the attached file): under "current" I'm reporting volts across a 0.002 Ohm resistor, so nearly 90 A; the temp readings are thermocouples in air. The volts readings are OK to start. "Ground resistance" is the unconverted (my bug) voltage across a series sense resistor. The file shows one of 4 test nests; the other 3 were not powered.
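
    The current numbers come from the voltage across that 0.002 Ohm shunt; for reference, the conversion is just Ohm's law (a trivial sketch, where measuredShuntVolts is a placeholder for the value read from that channel):

        double measuredShuntVolts = 0.18;                      // placeholder: the reading from the shunt channel
        double shuntOhms = 0.002;                              // shunt value from above
        double currentAmps = measuredShuntVolts / shuntOhms;   // e.g., 0.18 V across the shunt ≈ 90 A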

    Readings here and on the other nests were good until the ground resistance dropped (checking with my meter, it seems it really did drop, from 40+ MOhms to ~1 kOhm); this sent the voltage across the sense resistor too high. I unplugged the connector for those channels (at about line 43 of the file), and then things got weird: the voltage readings went down even though I still saw 13 V with my meter at the connector, the voltage readings on the other nests went up to several volts (they had been 0), and the temperature readings went up (my meter showed ~70 F).

    Can the open channels (I unplugged them so as not to put 24 V on the inputs) cause other channels to read wrong?

    Thanks,

    dave

    dataTK_test_2022-12-13T16-55-49_Nest_1.csv

  5. I've got a USB-2416-4AO. Using DAQami, if I feed it 13 V, it reads 13 V. In my program, I'm getting 9.999... V when I feed it 13 V.

    My program initializes the unit with

        short[] channels = { 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15 };
        MccDaq.Range[] ranges = {
            MccDaq.Range.Bip1Volts,     MccDaq.Range.Bip1Volts,     MccDaq.Range.Bip1Volts,     MccDaq.Range.Bip1Volts,      // channels 0-3
            MccDaq.Range.BipPt078Volts, MccDaq.Range.BipPt078Volts, MccDaq.Range.BipPt078Volts, MccDaq.Range.BipPt078Volts,  // channels 4-7
            MccDaq.Range.Bip20Volts,    MccDaq.Range.Bip20Volts,    MccDaq.Range.Bip20Volts,    MccDaq.Range.Bip20Volts,     // channels 8-11
            MccDaq.Range.BipPt1Volts,   MccDaq.Range.BipPt1Volts,   MccDaq.Range.BipPt1Volts,   MccDaq.Range.BipPt1Volts     // channels 12-15
        };

        ULStat = Globals.DaqBoardAnalog.ALoadQueue(channels, ranges, 16);

    I get sane readings for the other channels, but when I have 13 V on channel 8, it's read as 9.999..., which makes sense for a 10 V range.
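
    For reference, this is a sketch of the scan call I make for these channels (the rate value and the 0/15 channel numbers are just the ones I've been using; Globals.DaqBoardAnalog and Globals.AIMhandle are from my program):

        // If the range argument here overrides the loaded queue, passing Bip10Volts
        // would explain a reading clipped at ~10 V (see the AInScan comment in my other post above).
        int rate = 10;
        ULStat = Globals.DaqBoardAnalog.AInScan(lowChan: 0, highChan: 15, numPoints: 16,
            rate: ref rate, range: MccDaq.Range.Bip10Volts, memHandle: Globals.AIMhandle,
            options: MccDaq.ScanOptions.ScaleData);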

    What am I missing?

    Thanks,

    dave

     

  6. Is there any documentation for the BERGtools package? I'm building a C# program and would like to use parts of it. (Is it free to use in a one-off program for commercial use? That is, used on one station in a plant.)

    Thanks,

    dave
