Calibrating Low-Temperature Radiation Thermometers

By Peter Saunders

The advent of low-cost handheld radiation thermometers, or infrared (IR) thermometers, has led to a proliferation of non-contact temperature measurement in the food, building, and low-temperature processing industries. These thermometers typically measure temperatures in the range –50 °C to 500 °C using uncooled thermopiles that detect radiation in the 8–14 µm spectral range (or similar). However, these instruments are not as simple to use or to calibrate as they first appear, due to systematic effects that are present in almost all measurements. This article provides information relating to the calibration of these “low-temperature” IR thermometers.

Because the detectors in these instruments are uncooled, radiation emitted by the detector itself must be considered in the calibration process. The emissivity setting on the thermometer, often fixed at a value of 0.95, and any radiation reflected from the surroundings must also be taken into account. As a consequence of these systematic effects, calibration methods are more complicated than for contact thermometers or for high-temperature IR thermometers. The expected reading, even on a perfect IR thermometer, does not necessarily match the reading of the reference thermometer. This article describes the nature of the systematic effects and outlines a procedure for determining the corrections required during calibration.
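
To illustrate why even a perfect thermometer's displayed temperature can differ from the reference temperature, the sketch below evaluates one commonly used form of the measurement equation, in which the detected signal is the band radiance from the source plus reflected radiation from the surroundings, minus the detector's own emission, divided by the instrumental emissivity setting, with the detector term added back. This is a minimal illustration only: the function names, the ideal 8–14 µm top-hat spectral response, and the example temperatures are assumptions made here, not the procedure given in the full article.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

# Radiation constants for spectral radiance (SI units)
C1 = 1.191042972e-16  # 2*h*c^2, W·m^2·sr^-1
C2 = 1.4387769e-2     # h*c/k, m·K

def band_radiance(T, lam1=8e-6, lam2=14e-6):
    """Blackbody radiance integrated over an assumed ideal 8-14 um band (W·m^-2·sr^-1)."""
    integrand = lambda lam: C1 / (lam**5 * (np.exp(C2 / (lam * T)) - 1.0))
    value, _ = quad(integrand, lam1, lam2)
    return value

def expected_reading(T_source, T_surroundings, T_detector,
                     eps_surface=1.0, eps_setting=0.95):
    """Temperature (K) an ideal thermometer would display under the assumed model.

    Assumed measurement equation: the net signal is the surface emission plus
    reflected radiation from the surroundings, minus the detector's own
    emission, divided by the instrumental emissivity setting, with the
    detector emission added back.
    """
    S = (eps_surface * band_radiance(T_source)
         + (1.0 - eps_surface) * band_radiance(T_surroundings)
         - band_radiance(T_detector)) / eps_setting + band_radiance(T_detector)
    # Invert the band-radiance function to recover the displayed temperature
    return brentq(lambda T: band_radiance(T) - S, 150.0, 1000.0)

# Example: blackbody reference at 100 °C, laboratory and detector at 23 °C.
# The displayed value exceeds 373.15 K because the emissivity setting is below 1.
print(expected_reading(373.15, 296.15, 296.15))
```

In practice, the band-radiance function is commonly represented by an interpolation equation such as the Sakuma–Hattori form rather than by direct integration of Planck's law, but the direct integral keeps this sketch self-contained.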