After several years of explaining the difference between the precision a TSC3 user sees in the field and the precision the office software shows in the files, and after reading other posts about confidence intervals and how differently they are represented in TBC and Access, I have found yet another way confusion has been added to this situation.
You can now pick the precision your results will show: TBC 5.3 for (sometimes) better-looking results, or TBC 5.2 or earlier for less precise ones. On top of that, whether the job comes from a TSC3 running Access 2017.20 or a TSC7 running Access 2020.00 also changes the outcome.
TBC 5.2 with a 2017.20 file showing 0.0191 m; TBC 5.3 showing 0.0135 m at 95% CI
Raw TSC3 result
TBC 5.2 with TSC7 2020.00 emulator-created data showing 0.0245 m; TBC 5.3 showing 0.0173 m at 95% CI
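For what it's worth, the 5.2 and 5.3 numbers differ by the same factor in both cases: about 1.41, which is roughly the square root of 2. That suggests both versions are scaling the same underlying standard error but applying different confidence factors; which factor is "correct" is exactly the question. The values below are the ones from my screenshots; the sqrt(2) interpretation is only my guess. A quick check:

```python
import math

# Precision values as reported in the screenshots above, in metres:
# (TBC 5.2 result, TBC 5.3 result) for each data-collector file.
pairs = {
    "TSC3 / Access 2017.20 file": (0.0191, 0.0135),
    "TSC7 / Access 2020.00 file": (0.0245, 0.0173),
}

for name, (v52, v53) in pairs.items():
    ratio = v52 / v53
    # In both files the ratio comes out close to sqrt(2) ~ 1.414,
    # hinting at two different confidence-scaling conventions.
    print(f"{name}: 5.2/5.3 ratio = {ratio:.3f} "
          f"(sqrt(2) = {math.sqrt(2):.3f})")
```

Both ratios land within a few thousandths of 1.414, which is too consistent to be coincidence, so the two versions are almost certainly applying different scale factors to the same raw precision rather than computing genuinely different results.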
So which is correct?
According to TBC version 5.3, the 2020.00 data is shown correctly.
According to the Access 2017.20 manual, TBC version 5.2 shows it correctly.
Do I now have to process 2017.20 data only in version 5.2 and 2020.00 data only in version 5.3?
What do I do with a project that used both data collectors?