drive-point current phase explanation

From: larry goldstein <larry_at_email.domain.hidden>
Date: Thu, 21 Dec 1995 14:27:19 +0500

I'm trying to understand why the relative phase of the currents on two identical, electrically isolated antennas in free space is not determined by the distance between them. I am using the following NEC model:

* 2 identical half-wave resonant dipole antennas, each with five segments
* the two antennas are parallel to each other
* the distance, d, between antenna centers is exactly 1000 wavelengths, to avoid any mutual coupling effect
* the first antenna (transmit) is excited with a 1 V voltage source (a sketch of such a deck follows)
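
For reference, a minimal NEC-2 deck along these lines (a sketch only, not my exact input: the 299.8 MHz frequency, chosen so lambda is about 1 m, the 0.001 m wire radius, and the nominal 0.5 m dipole length are illustrative values):

    CM two parallel half-wave dipoles, centers 1000 wavelengths apart
    CE
    GW 1 5    0.0 0.0 -0.25    0.0 0.0 0.25  0.001
    GW 2 5 1000.0 0.0 -0.25 1000.0 0.0 0.25  0.001
    GE 0
    FR 0 1 0 0 299.8 0
    EX 0 1 3 0 1.0 0.0
    XQ
    EN

The voltage source sits on segment 3, the center of the five-segment transmit dipole (tag 1); the receive dipole (tag 2) is left unloaded, and its drive-point current is read from its center segment in the output.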

Shouldn't the phase of the current at the drive point of the receive antenna lag the phase of the current at the drive point of the transmit antenna by 2*pi*d/lambda?
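
(Checking that arithmetic: with d = 1000*lambda, 2*pi*d/lambda = 2000*pi radians, exactly 1000 full cycles, i.e. 0 degrees modulo 360.)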

I would expect the phases of the drive-point currents on both antennas to be the same in this model, since the separation distance, d, is an integer multiple of a wavelength. However, I find that the phase of the current at the center of the receive antenna is about -94 degrees relative to the phase of the current at the center of the transmit antenna.

Can anyone explain this?

Received on Thu Dec 21 1995 - 17:39:00 EST
