Computers Can Do Math!

Today, in a meeting, we discussed separating raw data from its presentation. It is such a simple concept, but explaining the purpose of each layer turned out to be more complex than I expected. So I’m dedicating this quarter’s Automation Corner to “Computers Can Do Math!”

In the meeting, we were discussing PTB’s DCC, the Digital Calibration Certificate, an XML document created by Germany’s national metrology institute. The idea behind the XML document is to digitize calibration results, moving the calibration certificate from paper to a digital, computer-readable format. Truly a bold move for the slow-moving metrology industry!

The problem hinges on the presentation layer of the DCC. More to the point, the entire XML schema focuses on the presentation layer without carrying the complete underlying data that is part of the calibration.

The case in point is measuring the length of a gauge block. A typical report indicates the measured error of the gauge block. For example, the report might say “-3.5 µin,” indicating that the gauge block is short by 3.5 microinches.

The problem is that the measurement data is missing! The report simplified the data, but it didn’t provide the actual measurement and details needed to better understand how the lab concluded that the gauge block was, in fact, short.

The data has been over-simplified, reporting only the error and uncertainties. What was the nominal value? Was it tested against the nominal value of 5 inches or against the length measurement made last year, 4.999999 inches? Was it measured once or multiple times? If it was measured multiple times, what were the individual measurements?

A better way to present the measurement data would be to include all relevant information about the measurement, i.e., the work performed by the calibration lab. This data should include the nominal value, upper and lower test limits that reference a specification, and the measured value or values, along with the measurement uncertainties. It should also include any additional information related to the measurement conditions.
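As a sketch of what such a complete record might look like, here is a small, hypothetical data structure for a gauge-block measurement. The field names and figures are my own illustration, not taken from the DCC schema:

```python
# Hypothetical, minimal record of a "complete" length measurement.
# Field names and values are illustrative only, not the DCC schema.
from dataclasses import dataclass, field

@dataclass
class LengthMeasurement:
    nominal_in: float         # nominal value, inches
    lower_limit_in: float     # lower test limit, per the referenced spec
    upper_limit_in: float     # upper test limit, per the referenced spec
    readings_in: list         # every individual measured value, inches
    uncertainty_in: float     # expanded measurement uncertainty, inches
    conditions: dict = field(default_factory=dict)  # e.g. temperature

    @property
    def measured_in(self) -> float:
        """Mean of the individual readings."""
        return sum(self.readings_in) / len(self.readings_in)

    @property
    def error_in(self) -> float:
        """Reported error = measured value minus nominal value."""
        return self.measured_in - self.nominal_in

# A 5-inch gauge block measured three times, each reading slightly short:
block = LengthMeasurement(
    nominal_in=5.0,
    lower_limit_in=5.0 - 8e-6,
    upper_limit_in=5.0 + 8e-6,
    readings_in=[4.9999966, 4.9999965, 4.9999964],
    uncertainty_in=2e-6,
    conditions={"temperature_C": 20.0},
)
```

With a record like this, the error on the certificate is derived from stated data rather than being the only number available.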

Yes, a computer can calculate the measured value: assuming a 5-inch gauge block that measures 3.5 µin short, the measured length works out to 4.9999965 inches. But that is a calculation, not the calibration provider stating that they measured a certified value. I know it is splitting hairs, but if an error is made in the calculation, who is at fault, and where did the error occur? This is the real problem: without the actual measurement data in the report, tracking down a math error or a simple typo becomes a real task!
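For illustration, the reconstruction a reader is forced to perform can be sketched in a few lines. The function name and unit constant are my own; only the figures come from the example above:

```python
# Reconstructing the measured value when the certificate reports only
# the error. Helper name and constant are illustrative assumptions.
MICROINCH = 1e-6  # inches per microinch

def measured_from_error(nominal_in: float, error_uin: float) -> float:
    """Measured value = nominal value + reported error."""
    return nominal_in + error_uin * MICROINCH

value = measured_from_error(5.0, -3.5)  # 5-inch block, -3.5 µin error
```

The point is not that this arithmetic is hard; it is that the certificate, not the reader, should state the measured value.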

Simple calculations, such as measured value minus nominal value, can be performed by the report engine. Better yet, both values are then known, and the math is auditable. Issues like rounding errors can be evaluated and, if needed, corrected.
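A minimal sketch of such an auditable calculation, using Python’s standard `decimal` module so the rounding rule is explicit rather than hidden in floating point. The 0.1 µin quantization step is an assumption of mine, not a DCC requirement:

```python
# A report engine computing the error itself, with the rounding step
# spelled out and therefore auditable. The 0.1 µin step is assumed.
from decimal import Decimal, ROUND_HALF_EVEN

def report_error_uin(measured_in: str, nominal_in: str) -> Decimal:
    """Error in microinches, rounded to 0.1 µin, round-half-even."""
    error_in = Decimal(measured_in) - Decimal(nominal_in)
    error_uin = error_in * Decimal("1e6")
    return error_uin.quantize(Decimal("0.1"), rounding=ROUND_HALF_EVEN)

print(report_error_uin("4.9999965", "5"))  # prints -3.5
```

Because the inputs and the rounding mode are both recorded, anyone auditing the certificate can reproduce the reported error exactly.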

The digital transition in metrology demands a complete overhaul of conventional, opaque data presentation. We must move beyond outdated paper or simplistic digital methods to present comprehensive measurement data. This includes integrating the final value with critical context: uncertainty, environmental conditions, equipment traceability, and procedures used.

Crucially, documentation—from planning to reporting—must be straightforward yet auditable. We need to adopt standardized digital protocols, like digital calibration certificates and blockchain logging, to ensure data integrity, permanence, and full traceability. Establishing this robust digital chain of custody guarantees transparency and confidence in metrological results.