hamster Posted November 21, 2017

Last night I measured the speed of RF waves in a generic 10 m TV coax using the AD2, a socket, and two resistors.

Why?: I'm trying to build a cheap colinear antenna for receiving 1090 MHz broadcasts from planes. To do this I need to know the "velocity factor" of the cable.

The setup:
- Connect the AD2's wave output and the first scope channel (the reference channel) to one end of a 330 ohm resistor.
- Connect the other end of the 330 ohm resistor, the second scope channel, and one end of a 100 ohm resistor to the center pin of the socket.
- Connect the other end of the 100 ohm resistor, plus the AD2's ground connection, to the shell/ground connection of the socket.

Running the test: Without the cable plugged into the connector, run the Network Analyzer from 1 MHz to 10 MHz - it should show a pretty much flat line. Then connect the cable and test again. There will be a 'dip' somewhere around 5 or 6 MHz.

What is going on: The 330 ohm and 100 ohm resistors form a divider whose source impedance (the two resistors in parallel) is about 77 ohms, matching the 75 ohm impedance of the coax cable. Because the far end of the cable is open, it acts as an 'open stub': any signal injected into the cable travels to the end and is reflected back. The source and reflected signals interfere, and where the reflection destructively interferes with the source signal the "dip" is seen. The bottom of the dip occurs where the cable is 1/4 of the signal's wavelength - the reflection travels half a wavelength there and back, so it arrives 180 degrees out of phase and cancels much of the source signal, making the measured signal much weaker.

Results: For a 10 m (about 33 ft) cable the dip was at 5.634 MHz. With the cable being a quarter wavelength, the full wavelength is 40 m. That gives a speed of propagation of 5.634 MHz * 40 m = 225,360,000 m/s - about 75% of the speed of light in a vacuum, i.e. a velocity factor of roughly 0.75.
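If anyone wants to check the arithmetic, here's a small Python sketch of the numbers above (the cable length and dip frequency are the measured values from this test; everything else is standard physics):

```python
# Velocity-factor arithmetic for the open-stub measurement described above.
C_VACUUM = 299_792_458  # speed of light in a vacuum, m/s

# Source impedance seen by the cable: the 330 ohm and 100 ohm resistors
# in parallel (the generator's own output impedance is ignored here).
r_source = (330 * 100) / (330 + 100)
print(f"Source impedance: {r_source:.1f} ohm")  # ~76.7 ohm, close to 75 ohm coax

# At the dip the cable is a quarter wavelength long,
# so the full wavelength is four times the cable length.
cable_len_m = 10.0
f_dip_hz = 5.634e6
wavelength_m = 4 * cable_len_m        # 40 m
v_prop = f_dip_hz * wavelength_m      # propagation speed in the cable, m/s
vf = v_prop / C_VACUUM                # velocity factor

print(f"Propagation speed: {v_prop:,.0f} m/s")  # 225,360,000 m/s
print(f"Velocity factor:   {vf:.2f}")           # ~0.75
```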
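And as a hypothetical follow-on for the colinear antenna itself: coax colinear designs are typically built from half-wave sections of cable, so the measured velocity factor sets the section length. This is only a starting-point calculation, not a cut-to-the-millimetre answer:

```python
# Hypothetical application: sizing half-wave coax sections for a
# 1090 MHz colinear, using the velocity factor measured above.
C_VACUUM = 299_792_458  # m/s
vf = 0.75               # measured velocity factor of this cable
f_hz = 1090e6           # target frequency

wavelength_in_cable = vf * C_VACUUM / f_hz   # wavelength inside the coax, m
half_wave_section = wavelength_in_cable / 2  # length of one section, m
print(f"Half-wave section length: {half_wave_section * 1000:.0f} mm")  # ~103 mm
```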