My TSC3 controller displays precision values when I'm using my R8 receiver connected to a real-time network. It will show values such as H = 9mm, V = 15mm.

Is there any documentation explaining how these values are calculated? More generally, is there documentation that explains how all of the precision values are calculated (e.g., total station values like resection points, averaged points, etc.)?

Thank you.

This is what we used to provide back in the support days. I'm pretty sure nothing has changed.

The precision values are determined from the following:

horizontal precision = HDOP * RMS * 3.0

vertical precision = VDOP * RMS * 3.0

where HDOP and VDOP are the horizontal and vertical dilution of precision, respectively. RMS (root mean square) is the solution RMS for the L1 phase observation in meters, and is the radius of the error ellipse within which approximately 70% of position fixes will be found. The precision values are scaled to a ~99% confidence level by multiplying by a factor of three.
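A minimal sketch of those two formulas, assuming example HDOP, VDOP, and RMS values (not taken from any actual receiver output):

```python
CONFIDENCE_SCALE = 3.0  # scales the ~70% RMS figure to ~99% confidence


def precision_mm(dop, rms_m, scale=CONFIDENCE_SCALE):
    """Precision in millimeters from a DOP value and a solution RMS in meters."""
    return dop * rms_m * scale * 1000.0


# Example (hypothetical) values:
hdop, vdop = 1.2, 1.8   # dilution of precision, horizontal and vertical
rms_m = 0.0025          # L1 phase solution RMS in meters

h = precision_mm(hdop, rms_m)  # 1.2 * 0.0025 * 3.0 * 1000 = 9.0 mm
v = precision_mm(vdop, rms_m)  # 1.8 * 0.0025 * 3.0 * 1000 = 13.5 mm
```

With those example inputs the horizontal value comes out at 9 mm, which is the same order as the H = 9 mm shown on the controller in the question.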

An example of the RMS calculation:

An RMS of 60 in the Trimble Access controller software means 60 millicycles = 0.060 cycles of the L1 carrier. The wavelength of L1 is approximately 0.19 m.

That is: 0.19 m x 0.060 = 0.0114 m = 11.4 mm, expressed as a distance.
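The millicycles-to-distance conversion above can be sketched as follows (the function name and the 0.19 m wavelength rounding are mine, not from any Trimble API):

```python
L1_WAVELENGTH_M = 0.19  # approximate GPS L1 carrier wavelength in meters


def rms_millicycles_to_mm(rms_millicycles):
    """Convert an RMS reported in millicycles of L1 to a distance in millimeters."""
    cycles = rms_millicycles / 1000.0           # e.g. 60 millicycles -> 0.060 cycles
    return cycles * L1_WAVELENGTH_M * 1000.0    # cycles -> meters -> millimeters


rms_millicycles_to_mm(60)  # 0.060 cycles * 0.19 m = 0.0114 m = 11.4 mm
```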