NEC-LIST: GPS

From: Peter Fuks <peter_at_email.domain.hidden>
Date: Mon, 09 Sep 2002 14:06:08 +0200

Hi all,
I have been doing some experiments with a GPS antenna. One set of
measurements was made at a test site with successively larger
attenuation applied to the incoming signal. The other set was made in
different types of forests. The goal was to be able to correlate the
type of forest with a certain degree of attenuation. The quantities
measured were the standard deviation from the mean position, the average
number of satellites used, the percentage of time when no position fix
could be made, and the average SNR. The average SNR was derived after
correlation and despreading. The standard deviation seems almost
uncorrelated with the other parameters and remained at 7-13 meters for
all measurements. The average number of satellites used stayed well
above five, four being the minimum required for a fix. At the test site,
the receiver could calculate a position all of the time as long as the
SNR stayed above 10 dB.
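
For concreteness, the per-measurement statistics were computed roughly
as in the little Python sketch below. The record layout and the names
are made up for illustration, not our actual logging code; each epoch
yields either a fix or None:

import math

def summarize(records):
    """records: one entry per epoch, either None (no fix) or a
    tuple (east_m, north_m, num_sats, snr_db)."""
    fixes = [r for r in records if r is not None]
    outage_pct = 100.0 * (len(records) - len(fixes)) / len(records)

    # Standard deviation of the horizontal position about its mean.
    me = sum(r[0] for r in fixes) / len(fixes)
    mn = sum(r[1] for r in fixes) / len(fixes)
    var = sum((r[0] - me) ** 2 + (r[1] - mn) ** 2
              for r in fixes) / len(fixes)
    pos_std_m = math.sqrt(var)

    avg_sats = sum(r[2] for r in fixes) / len(fixes)

    # Average the SNR in the linear domain, then convert back to dB.
    lin = sum(10 ** (r[3] / 10.0) for r in fixes) / len(fixes)
    avg_snr_db = 10.0 * math.log10(lin)

    return pos_std_m, avg_sats, outage_pct, avg_snr_db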
Now comes the part that surprised me. When measurements were made in the
forest, the receiver was unable to compute a position even at a much
higher average received SNR. The lowest SNR was 13.6 dB, and the time
outage for that measurement was 8%.
To get continuous positioning for the whole measurement period we had to
move to a deciduous forest, where the SNR rose to 16 dB. Measurements in
a mixed forest and between the trees in the spruce forest were also
made, but the positioning was not continuous there either. The outage
was 5% at 14.2 dB for the mixed forest, and 2% at 16.6 dB for the spruce
forest.

Why does the receiver need a higher SNR when it is in a forest?
Especially, it seems, when the forest contains spruce.
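
One thought: maybe the average SNR hides what is really happening. In
the forest the signal is presumably not steadily attenuated but
fluctuates as it scatters off trunks and branches, so the receiver could
be seeing short deep fades that drag it below its tracking threshold
even when the average looks fine. Below is a minimal Monte Carlo sketch
of that idea in Python. The 10 dB threshold matches our test-site
result, but the 14 dB mean and the Rician K factor are pure guesses for
illustration:

import math, random

THRESHOLD_DB = 10.0   # loss-of-fix threshold we saw at the test site
MEAN_SNR_DB = 14.0    # roughly a forest average from our measurements
K_FACTOR = 3.0        # Rician K (direct-to-scattered power); a guess
EPOCHS = 100000

def rician_power(k):
    """One sample of received power (mean 1) under Rician fading."""
    s = math.sqrt(k / (k + 1.0))            # line-of-sight amplitude
    sigma = math.sqrt(1 / (2 * (k + 1.0)))  # scattered part, per axis
    x = random.gauss(s, sigma)
    y = random.gauss(0.0, sigma)
    return x * x + y * y

mean_lin = 10 ** (MEAN_SNR_DB / 10.0)
below = sum(
    10 * math.log10(mean_lin * rician_power(K_FACTOR)) < THRESHOLD_DB
    for _ in range(EPOCHS)
)
print("steady %.1f dB signal: 0%% of epochs below threshold"
      % MEAN_SNR_DB)
print("faded signal, same mean: %.1f%% of epochs below threshold"
      % (100.0 * below / EPOCHS))

A steady signal at that mean SNR never drops below 10 dB, while the
faded one with the same mean spends a noticeable fraction of its epochs
there, which at least resembles the outages we saw. Spruce, with its
dense needles, might simply fade the signal harder than deciduous
canopy. But that is speculation on my part.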

Any ideas?
Peter

-- 
The NEC-List mailing list <nec-list_at_gweep.ca>
http://www.gweep.ca/mailman/listinfo.cgi/nec-list