Thoughts on Metrology Software Life Cycles

I was having a long conversation the other day with a fellow software engineer. One of the topics we talked about for hours was software life cycles and how metrology software is not like most software. One big difference is the typical life cycle: most software applications have a short one, usually less than five years, but many of the metrology applications we are running today are more than twenty years old. That is four times the natural life expectancy of software.

For those of us who have been in this industry for years, there are a few names that stand out. One good example is Hewlett Packard's Rocky Mountain Basic (RMB). Born in the late 1970s, it was introduced as a tool for scientists and engineers to control instrumentation, and it was later embedded in the instrumentation hardware itself. I remember how cool it was to write custom software that would run right on the measurement hardware. And still today, when I visit calibration labs, I see Rocky Mountain Basic procedures running and being developed.

So why is it that very few software applications running on my computer are more than five years old, while the software we are running in the lab is ancient? Why do companies rewrite, rethink and drastically update software? And why does this not happen as often with metrology software?

All of this got me thinking, "What is the life cycle of metrology software?" If I were to write something new today, how long would it be used? And most importantly, what could I do today, in designing my software, that would impact its longevity? These are all difficult questions, but nonetheless they should be part of our design paradigm.

What is the life expectancy of metrology software? To answer this question, I think we have to take a good hard look at the life expectancy of the hardware we are running. Taking a step back and looking at the total picture, it becomes obvious that it is the measurement hardware's longevity, not the software itself, that extends the life expectancy of the software. I am amazed to still see calibration labs using standards that are 20 years old. When I ask a calibration lab why it is using such old standards, the answer is usually related to historical data or accuracy.

New standards are being created every day, but I have noticed a trend: the greater the accuracy of a standard, the longer its life expectancy. There are many examples; just look around your calibration lab. Most labs will have a 57xxA, 3458A, 8510C, or 8902A, all standards engineered and designed more than 20 years ago. In many labs today, they are still the primary traceable standards.

There is a direct relationship between the long life cycle of metrology software and the accuracy of the lab standards. Most software is directly tied to the computer it runs on; when we replace the computer, we install all new software. But in the metrology world, we will install old software on a new computer because it makes good measurements.

So when designing our software, we must plan for the supercomputers of the future. As Moore's Law continues to hold true, we know the computing power controlling our measurement hardware will be orders of magnitude greater in just a few years. We have to factor in the speed of tomorrow's computers when designing the software, and then find the balance between measurement accuracy and speed.

Speed itself is a double-edged sword. A faster computer can make the software run too fast, introducing errors into the measurements. We need to resolve those speed issues before they show up as measurement errors. Adding a wait statement is one solution, but then how much processor time is going to be wasted waiting for time to pass?
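One way to keep the timing correct on whatever computer comes next is to synchronize on the instrument rather than on the clock. The sketch below is a minimal illustration in Python, assuming a SCPI instrument driven through PyVISA; the resource address and the INIT/FETC? commands are placeholders that vary by instrument.

```python
import pyvisa

def measure_when_ready(resource="GPIB0::22::INSTR"):
    """Trigger a measurement and wait on the instrument, not the clock."""
    rm = pyvisa.ResourceManager()
    inst = rm.open_resource(resource)
    inst.timeout = 10_000  # ms; a safety ceiling, not a tuned delay

    inst.write("INIT")        # start the measurement (command is instrument-specific)
    inst.query("*OPC?")       # IEEE 488.2: returns "1" only when the operation completes
    return float(inst.query("FETC?"))  # fetch the result (instrument-specific)
```

Unlike a wait statement tuned for today's processor, the *OPC? query blocks inside the I/O call until the hardware itself reports completion, so the same code keeps its timing on a computer ten times faster and wastes no cycles spinning.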

Metrology as a science is well established, with a bright, long-lasting future. When choosing a language, we should look for the same qualities: languages that are widely used and accepted across industries. Cross-platform support will also maximize longevity. Languages like C#, Java and LabVIEW have all been widely used and accepted and, at present, show no signs of losing ground.

Organizing and architecting software for both today's use and tomorrow's unknowns is extremely difficult. I have found that avoiding specifics and thinking in abstract layers leads to better software design. The more specifics you can push down to the lowest possible level, the more robust your code will be. And the more robust your software is, the greater the chance it will still be running long after you are gone.
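To make the layering concrete, here is a small sketch in Python; the class names and the READ? command are hypothetical, chosen only to show the shape. The measurement procedure talks to an abstract voltmeter, and every bus- and instrument-specific detail is pushed down into one low-level class, so swapping the standard, the bus, or the driver library never touches the procedure itself.

```python
from abc import ABC, abstractmethod

class Voltmeter(ABC):
    """Abstract layer: only what the procedure needs, nothing more."""

    @abstractmethod
    def read_volts(self) -> float:
        ...

class Gpib3458A(Voltmeter):
    """Lowest layer: the one place bus and command specifics live."""

    def __init__(self, session):
        self._session = session  # e.g. an open PyVISA session (an assumption)

    def read_volts(self) -> float:
        return float(self._session.query("READ?"))  # instrument-specific command

def average_reading(meter: Voltmeter, samples: int = 10) -> float:
    """The procedure depends only on the abstraction, so it can outlive
    any particular computer, bus, or instrument."""
    return sum(meter.read_volts() for _ in range(samples)) / samples
```

Twenty years from now, supporting a new standard means writing one more small Voltmeter subclass; the procedures built on top keep running unchanged.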