Getting measurements messed up on Windows and Linux machines


Kodeeswaran S

Question

Hi Team, 

I am using a Digilent Analog Discovery 3 for my device testing and am trying to capture 100 ms of data from the D2 channel.

In my test setup, one Digilent device is connected to a remote system (a Linux machine) and another is connected locally to my Windows machine. On both machines I am trying to measure 100 ms of data from channel 2.

My sample rate is 1024 Hz, which I set in a Python script. For this sample rate my total capture is 15.96 s.

On the Windows machine the number of samples per signal is 204, and I am able to verify that this corresponds to 100 ms. On the Linux machine it is also 204 samples, but there I am not able to verify it as 100 ms.

The only difference between the Linux and Windows machines is the internal frequency: on the Linux machine it is 50 MHz, and on the Windows machine it is 100 MHz.

Attached are the CSV files from both machines.

Please help resolve and explain this issue.

Attachments: output_100msec_Linux.csv, output_windows.csv




Hi @attila

I have used the WF_SDK from GitHub, as shown below.

from WF_SDK import device, logic, pattern, error   # import instruments

import matplotlib.pyplot as plt   # needed for plotting
from time import sleep            # needed for delays
import csv
"""-----------------------------------------------------------------------"""

class test_logic_capture:

   

    def test_logic_capture_data(self,Max_count=0,Channel=0,Sampling_rate=0,Sample_size=0):
        
        try:
            DIO_IN = Channel
            DIO_OUT = 0
            print("test")
            print("Max_count -->", Max_count)
            print("Channel -->", Channel)
            print("Sampling_rate -->", Sampling_rate)
            print("Sample_size -->", Sample_size)

            # connect to the device
            device_data = device.open()

            """-----------------------------------"""
            data = []
            # initialize the logic analyzer with default settings
            logic.open(device_data, Sampling_rate , Sample_size)

            # set up triggering on DIO0 falling edge
            logic.trigger(device_data, enable=True, channel=Channel, rising_edge=False)

            print("sampling frequency ------------->",logic.data.sampling_frequency)
            # generate a 100KHz PWM signal with 30% duty cycle on a DIO channel
            # pattern.generate(device_data, channel=DIO_OUT, function=pattern.function.pulse, frequency=100e03, duty_cycle=30)
            count = 0 
            # sleep(1)    # wait 1 second
            while count < Max_count:
                # record a logic signal on a DIO channel
                buffer = logic.record(device_data, channel=DIO_IN)
                count = count + 1
                print(count)
                # limit the stored data size to the 16 ki (16384) sample buffer
                length = min(len(buffer), 16384)
                buffer = buffer[0:length]
                data.append(buffer)
                # print(data)
            print(len(data))
            # print(data)
            buffer = test_logic_capture.flat(data)
            print(len(buffer))
            # generate buffer for time moments
            time = []
            # start_time = 0.1
            total_scale_sec = count * (Sample_size/Sampling_rate)
            division_ms_per_div = 1000
            total_sample_size = count * length
            division_sec_per_div = division_ms_per_div / 1000
            num_divisions = total_scale_sec / division_sec_per_div
            print(num_divisions)
            time_increment = total_scale_sec / total_sample_size

            time = [i * time_increment for i in range(total_sample_size)]
            # print(time)
            # for index in range(count*length):
            #     time.append(index / 50000)   # Time is in usec
                # time.append(index  / 50000)   # Time is in sec
            print(len(time))
            # print(time)

            with open("output.csv", "wt", newline='') as file:
                writer = csv.writer(file)
                writer.writerow(["column1", "column2"])
                for row in zip(time, buffer):
                    writer.writerow(row)
            # plot
            plt.rcParams["figure.figsize"] = (100, 50)
            plt.plot(time, buffer)
            plt.xlabel("time [sec]")
            plt.ylabel("logic value")
            plt.yticks([0, 1])
            # plt.show()
            plt.savefig('data_log_working.png')
            # reset the logic analyzer
            logic.close(device_data)

            # reset the pattern generator
            pattern.close(device_data)

            """-----------------------------------"""

            # close the connection
            device.close(device_data)

        except error as e:
            print(e)
            # close the connection
            device.close(device_data)


    @staticmethod
    def flat(lis):
        flatList = []
        # Iterate with outer list
        for element in lis:
            if isinstance(element, list):
                # if the element is a list, iterate through the sublist
                for item in element:
                    flatList.append(item)
            else:
                flatList.append(element)
        return flatList



Hi @Kodeeswaran S

Looking at your CSV files: on Windows the first period is 206 samples long, about 0.201 s according to column1, and on Linux it is 409 samples, about 0.400 s.
The sample step in column1 is 0.000979 s in both files, yet you mention different internal frequencies on Windows and Linux.
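A quick way to check those numbers (a minimal sketch; the 206/409 sample counts and the 0.000979 s step are the values read from the attached CSV files):

```python
# Sketch: check the first-period durations in the two CSV files.
# The sample counts (206, 409) and the 0.000979 s step are the
# values read from the attached files; column1 is the time axis in seconds.
step = 0.000979              # time step between rows of column1, same in both files

windows_period = 206 * step  # first period on the Windows capture
linux_period = 409 * step    # first period on the Linux capture

print(f"Windows first period: {windows_period:.3f} s")  # ~0.202 s
print(f"Linux first period:   {linux_period:.3f} s")    # ~0.400 s
```

So the two captures describe periods that differ by a factor of two, even though both files claim the same time step.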

I don't recommend using this 'WF SDK'.
logic.record performs simple separate captures that should not be concatenated...

See the real WaveForms SDK manual and examples.

The AD3's first/default configuration has 16 ki samples allocated for the Logic Analyzer, so you can use acqmodeSingle to capture up to this number of samples; you don't have to use acqmodeRecord.

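As a rough sizing check (a sketch only, assuming the 100 MHz internal clock and the 16 ki default Logic Analyzer buffer mentioned above), a 100 ms capture at 1024 Hz fits easily in a single acqmodeSingle acquisition:

```python
# Sketch: size a single (acqmodeSingle) Logic Analyzer capture.
# Assumes the AD3 default configuration: 100 MHz internal clock and
# a 16 ki (16384) sample Logic Analyzer buffer, as noted above.
INTERNAL_CLOCK_HZ = 100e6
BUFFER_SAMPLES = 16384

desired_rate = 1024                                # target sample rate [Hz]
divider = int(INTERNAL_CLOCK_HZ // desired_rate)   # digital-in clock divider
actual_rate = INTERNAL_CLOCK_HZ / divider          # rate actually achieved
samples_for_100ms = round(0.100 * actual_rate)     # samples spanning 100 ms

print(f"divider = {divider}")                       # 97656
print(f"actual rate = {actual_rate:.3f} Hz")        # ~1024.003 Hz
print(f"samples for 100 ms = {samples_for_100ms}")  # ~102, well under 16384
```

Note that the achieved rate depends on the internal clock through the divider, so two machines reporting different internal clocks will end up with different effective sample rates for the same requested rate.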



Hi @attila

When I read the internal frequency on the Linux machine and the Windows machine using the same code base, the reported frequencies differ between the two machines. I don't know why it responds like that. Do you have any suggestions or feedback on this issue?

