
Kodeeswaran S

Members
  • Posts

    7
  • Joined

  • Last visited

  1. Hi @attila, thanks for sharing the console window. I will check on my side and get back to you in a couple of days.
  2. Hi @attila, thanks for your immediate response. Could you please share the console output of both machines?
  3. Hi @attila, when I read the internal frequency on the Linux machine and the Windows machine with the same code base on both, the frequencies differ between the two machines. I don't know why it responds like that. Do you have any suggestions or feedback on this issue?
  4. Hi @attila, the actual issue is that I am not changing the frequency on either machine. By default, the frequencies differ between the Linux and Windows machines.
  5. Hi @attila, have you gone through the script? Any update? Thank you.
  6. Hi @attila, I have used the WF_SDK below from GitHub.

```python
from WF_SDK import device, logic, pattern, error  # import instruments
import matplotlib.pyplot as plt  # needed for plotting
from time import sleep  # needed for delays
import csv


class test_logic_capture:

    def test_logic_capture_data(self, Max_count=0, Channel=0, Sampling_rate=0, Sample_size=0):
        try:
            DIO_IN = Channel
            DIO_OUT = 0
            print("Max_count -->", Max_count)
            print("Channel -->", Channel)
            print("Sampling_rate -->", Sampling_rate)
            print("Sample_size -->", Sample_size)

            # connect to the device
            device_data = device.open()

            data = []

            # initialize the logic analyzer with the requested rate and buffer size
            logic.open(device_data, Sampling_rate, Sample_size)

            # set up triggering on the capture channel, falling edge
            logic.trigger(device_data, enable=True, channel=Channel, rising_edge=False)
            print("sampling frequency -->", logic.data.sampling_frequency)

            # generate a 100 kHz PWM signal with 30% duty cycle on a DIO channel
            # pattern.generate(device_data, channel=DIO_OUT, function=pattern.function.pulse,
            #                  frequency=100e03, duty_cycle=30)

            count = 0
            # sleep(1)  # wait 1 second
            while count < Max_count:
                # record a logic signal on a DIO channel
                buffer = logic.record(device_data, channel=DIO_IN)
                count += 1
                print(count)

                # limit displayed data size
                length = len(buffer)
                if length > 16000:
                    length = 16000
                buffer = buffer[0:length]
                data.append(buffer)

            print(len(data))
            buffer = test_logic_capture.flat(data)
            print(len(buffer))

            # build the time axis: total capture time spread evenly over all samples
            total_scale_sec = count * (Sample_size / Sampling_rate)
            division_ms_per_div = 1000
            total_sample_size = count * length
            division_sec_per_div = division_ms_per_div / 1000
            num_divisions = total_scale_sec / division_sec_per_div
            print(num_divisions)
            time_increment = total_scale_sec / total_sample_size
            time = [i * time_increment for i in range(total_sample_size)]
            print(len(time))

            # dump time/value pairs to CSV
            with open("output.csv", "wt", newline='') as file:
                writer = csv.writer(file)
                writer.writerow(["column1", "column2"])
                for row in zip(time, buffer):
                    writer.writerow(row)

            # plot
            plt.rcParams["figure.figsize"] = (100, 50)
            plt.plot(time, buffer)
            plt.xlabel("time [sec]")
            plt.ylabel("logic value")
            plt.yticks([0, 1])
            plt.savefig('data_log_working.png')

            # reset the logic analyzer and the pattern generator
            logic.close(device_data)
            pattern.close(device_data)

            # close the connection
            device.close(device_data)

        except error as e:
            print(e)
            # close the connection
            device.close(device_data)

    @staticmethod
    def flat(lis):
        # flatten a list of (possibly nested) lists into a single flat list
        flatList = []
        for element in lis:
            if type(element) is list:
                for item in element:
                    flatList.append(item)
            else:
                flatList.append(element)
        return flatList
```
  7. Hi Team, I am using a Digilent Analog Discovery 3 device for my device testing and trying to capture 100 ms from the D2 channel. In my setup, one Digilent device is connected to a remote system (a Linux machine) and another is connected locally to my Windows machine; on both machines I am trying to measure 100 ms of data from channel 2. My sample rate is 1024, which I set in the Python script, and for this sample rate my total capture is 15.96 s. On the Windows machine the number of samples per signal is 204 and I am able to verify that it is 100 ms, whereas on the Linux machine it is also 204 but I am not able to verify it as 100 ms. The only difference between the Linux and Windows machines is the internal frequency: 50 MHz on the Linux machine and 100 MHz on the Windows machine. I have attached CSV files from both machines. Please help me resolve and understand this issue. output_100msec_Linux.csv output_windows.csv
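For what it's worth, the figures quoted in the last post can be sanity-checked with a few lines of arithmetic (my own calculation, not output from the device or WF_SDK): at a requested rate of 1024 Sa/s, a true 100 ms window would hold roughly 102 samples, so 204 observed samples would imply a window closer to 199 ms unless the effective sample rate differs from the requested one.

```python
# Sanity check of the numbers quoted above (plain arithmetic, not WF_SDK output).
sample_rate_hz = 1024      # rate requested in the Python script
target_window_s = 0.100    # the 100 ms window to capture on D2

# Samples a true 100 ms window should contain at the requested rate:
expected_samples = sample_rate_hz * target_window_s   # 102.4

# Window implied by the 204 samples actually observed, if the rate is 1024 Hz:
observed_samples = 204
implied_window_s = observed_samples / sample_rate_hz  # 0.19921875 s

print(expected_samples, implied_window_s)
```

Since both machines report 204 samples yet only one capture can be verified as 100 ms, comparing the `logic.data.sampling_frequency` value that the script already prints on each machine would show whether the effective rates actually match.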
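On the internal-frequency difference itself, here is a purely hypothetical sketch (my illustration, not WF_SDK or Analog Discovery 3 code) of why two machines with different base clocks, 50 MHz vs 100 MHz as reported in this thread, could end up with slightly different effective sample rates, assuming the hardware derives the rate by dividing the base clock by an integer:

```python
# Hypothetical model (not WF_SDK code): effective sample rate obtained by
# integer division of an internal base clock. The divider is rounded to the
# nearest integer, so the achievable rate depends on the base clock.
def effective_rate(base_clock_hz, requested_hz):
    divider = round(base_clock_hz / requested_hz)  # integer clock divider
    return base_clock_hz / divider

# Example: for a 3 kHz request the two base clocks round differently.
print(effective_rate(100e6, 3e3))  # slightly above 3000 Hz
print(effective_rate(50e6, 3e3))   # slightly below 3000 Hz
```

Under this model some requested rates round to the same effective rate on both base clocks and others do not, which is why printing `logic.data.sampling_frequency` on both machines is the reliable way to see the real values.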