I am trying to determine Relative Positional Precision (RPP) per the ALTA / NSPS Land Title Survey standard, from a network adjustment in Trimble Business Center (TBC v 3.70). I've found that the Project Settings below get me most of the way:
Network Adjustment > Covariance Display > Horizontal:
Precision = Ratio (note my PPM results are 3.28 times larger - see note at end)
Propagated Linear Error (E) = Canadian (semi-major axis of relative error ellipse)
Constant term (C) = 0 (don't enter the standard's 0.07 ft constant here - this "C" doesn't behave the same way)
Network Adjustment > Covariance Display > General:
Restrict to observed lines = No (evaluate all combinations of points)
Default Standard Errors > Confidence Level Display > Scale to confidence level = 95% (so the standard error ellipse is multiplied by ~2.45)
From the adjustment report, I copy the covariance portion into a spreadsheet (note that Excel misinterprets the 1:1000 ratio format as some other type of number, so those cells must be pre-formatted as text), convert the precisions from ratio back to linear distance, and compare them with the allowable 0.07 ft + 50 ppm (e.g. for two points 600 ft apart, allowable = 0.07 ft + 50/1,000,000 x 600 ft = 0.10 ft).
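The conversion-and-compare step can be sketched in a few lines of Python; the ratio string format and the sample values are hypothetical, but the arithmetic matches the worked example above:

```python
# Sketch of the ratio-to-linear RPP check described above, assuming the
# report export gives a precision ratio string like "1:45000" and the
# distance between the two points in feet. Sample values are hypothetical.

ALLOWABLE_CONSTANT_FT = 0.07   # ALTA/NSPS constant term
ALLOWABLE_PPM = 50.0           # ALTA/NSPS ppm term

def ratio_to_linear_ft(ratio_str: str, distance_ft: float) -> float:
    """Convert a '1:X' precision ratio back to a linear value in feet."""
    denom = float(ratio_str.split(":")[1])
    return distance_ft / denom

def allowable_rpp_ft(distance_ft: float) -> float:
    """Allowable RPP: 0.07 ft plus 50 ppm of the distance between the points."""
    return ALLOWABLE_CONSTANT_FT + ALLOWABLE_PPM / 1_000_000 * distance_ft

def passes_rpp(ratio_str: str, distance_ft: float) -> bool:
    return ratio_to_linear_ft(ratio_str, distance_ft) <= allowable_rpp_ft(distance_ft)

# Worked example from the text: two points 600 ft apart.
print(allowable_rpp_ft(600.0))        # 0.07 + 50/1,000,000 x 600 = 0.10 ft
print(passes_rpp("1:45000", 600.0))   # 600/45000 = 0.0133 ft <= 0.10 ft
```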
The conversion step is tedious. Given that the ALTA / NSPS standard has been around for a while, it would be nice to see an RPP test incorporated into TBC's network adjustment, or at least an option for TBC to report precision as a linear value instead of a ratio or PPM.
Finally, for the same network adjustment, TBC reports precision in PPM (parts per million) format as 3.28 times larger than the ratio results. The ratio results appear correct: I duplicated them in the old Trimble Geomatics Office (TGO v 1.63), and the TGO precisions are the same in both ratio and PPM format. It looks like TBC is incorrectly applying a unit conversion to what should be unitless PPM values (3.28 is suspiciously close to 3.2808, the number of feet per meter).
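A quick sanity check of the suspected bug: scaling a correctly computed ppm value by the meters-to-feet factor reproduces the size of the discrepancy I'm seeing. The 1:45000 sample ratio is hypothetical:

```python
# Sanity check: a unitless ppm value mistakenly scaled by the feet-per-meter
# factor comes out ~3.28x too large, matching the TBC vs TGO discrepancy.
# The sample ratio (1:45000) is hypothetical.

FT_PER_M = 3.280839895  # international feet per meter

correct_ppm = 1_000_000 / 45_000          # 1:45000 expressed as ppm, ~22.2
suspect_ppm = correct_ppm * FT_PER_M      # what TBC appears to report, ~72.9

print(round(correct_ppm, 1), round(suspect_ppm, 1))
print(round(suspect_ppm / correct_ppm, 2))  # 3.28
```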