davetyp Antenna-Theory.com Newbie
Joined: 29 Dec 2013 Posts: 1
Posted: Sun Dec 29, 2013 6:24 am Post subject: Array factor versus bandwidth?
Hello all,
I have an array of terminated folded dipoles serving as the antenna portion of an HF radio telescope.
Each dipole is 30' wide. There are eight TFDs, arranged in two square arrays along a north-south line. The dipole wires are oriented north-south and east-west.
I used TFDs because I make my observations in the frequency range from 17 to 33 MHz; that is, I needed something with a nice fat bandwidth. In the HF band, TFDs are easier and cheaper to build than conical log spiral antennas or helices.
I arranged them into square arrays so I can produce RCP and LCP outputs using 90 degree hybrids.
All of which works great -- as long as I don't try to steer the beam away from zenith.
When I beam-steer the array using the time-delay method, the response of the array to point-source illumination changes with frequency in a periodic fashion. For example, when the beam is steered 30 degrees off zenith to the west (and not steered at all in the north-south direction), I see 2 or 3 "nulls" for every 1 MHz change in frequency. The nulls are closer together at lower frequencies and farther apart at higher frequencies.
These nulls produce obvious bands in a spectrum display -- the typical waterfall kind of display.
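To make the question concrete, here is a minimal numpy sketch of the array-factor calculation I have in mind. The geometry is a placeholder (the 15 m east-west spacing and the two-column layout are assumptions for illustration, not my actual array): for a fixed residual time delay between elements, the pattern nulls recur in frequency with a constant spacing of 1/(delay spread).

```python
import numpy as np

# Toy model of a time-delay-steered array's response to a point source
# versus frequency.  The geometry below is assumed for illustration only.
c = 3.0e8           # speed of light, m/s
d = 15.0            # assumed east-west spacing between columns, metres
n_cols = 2          # assumed number of east-west columns
theta_steer = np.deg2rad(30.0)  # beam steered 30 degrees west of zenith
theta_src = 0.0                 # point source at zenith for this example

# Residual delay per column: geometric delay minus the steering delay.
x = d * np.arange(n_cols)
dtau = x * (np.sin(theta_src) - np.sin(theta_steer)) / c  # seconds

# Array-factor magnitude across the observing band.  For a fixed delay
# spread T, the nulls repeat in frequency with spacing 1/T.
f = np.linspace(17e6, 33e6, 4001)   # 17-33 MHz
af = np.abs(np.exp(2j * np.pi * f[:, None] * dtau[None, :]).sum(axis=1))

span = np.ptp(dtau)  # delay spread across the columns
print("delay spread: %.1f ns -> null spacing %.1f MHz"
      % (1e9 * span, 1e-6 / span))
print("deepest null in band: %.2e (peak %.2f)" % (af.min(), af.max()))
```

With these toy numbers the delay spread is only 25 ns, which would put the nulls 40 MHz apart; nulls every fraction of a MHz, as I observe, would correspond to delay differences on the order of microseconds, which is part of why I'm unsure whether the array factor alone explains it.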
The big question is: is this behavior the array factor at work, or is it something to do with the wideband 90 degree hybrids?
Thanks for any input or advice. I'm happy to provide more detail, but I don't want to wear out my welcome in my first post.
--
Dave