Thermometer Calibration

Calibrating thermometers is the process of verifying that a thermometer provides correct temperature measurements. This is done by comparing the thermometer’s readings either with a reference standard (for example, a temperature fixed point) or with a calibrated reference thermometer in a known temperature source (comparison calibration). The aim of calibration is to identify any errors or deviations and, if necessary, correct them so that the thermometer provides accurate measurements.

Methods for calibrating thermometers

There are essentially two methods for calibrating thermometers: calibration by the comparison method and calibration at temperature fixed points.

Calibrating thermometers by the comparison method

The calibration of thermometers by the comparison method is based on comparing the thermometer to be calibrated with an already calibrated reference thermometer. This principle rests on the zeroth law of thermodynamics. Although it was formulated after the other laws of thermodynamics, it was given the name “zeroth law” because of its fundamental importance.

The zeroth law states that if two systems are each in thermal equilibrium with a third system, they are also in thermal equilibrium with each other. Applied to calibration, this means: if the reference thermometer and the thermometer to be calibrated are both in thermal equilibrium with the calibration bath, they are at the same temperature, and their readings can be compared directly. However, this only holds if a state of equilibrium actually exists, which is not always the case in practice.
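
As a minimal sketch of how a comparison calibration is typically evaluated (the readings and the tolerance below are invented example values, not real calibration data), the deviation of the thermometer under test from the reference can be computed at each calibration point:

```python
# Hypothetical comparison-calibration evaluation. The readings and the
# tolerance are invented example values, not real calibration data.

# Reference readings and device-under-test (DUT) readings, taken after both
# thermometers have reached thermal equilibrium with the calibration bath.
calibration_points = [
    # (reference reading in °C, DUT reading in °C)
    (0.010, 0.05),
    (50.002, 50.11),
    (100.005, 100.18),
]

tolerance = 0.15  # permissible deviation in °C (assumed)

for ref, dut in calibration_points:
    deviation = dut - ref    # error of the DUT at this calibration point
    correction = -deviation  # value to add to future DUT readings
    status = "PASS" if abs(deviation) <= tolerance else "FAIL"
    print(f"ref {ref:8.3f} °C | DUT {dut:8.3f} °C | "
          f"deviation {deviation:+.3f} °C | correction {correction:+.3f} °C | {status}")
```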

Calibrating thermometers at temperature fixed points

When calibrating thermometers at temperature fixed points, the temperature standard is not a calibrated reference thermometer but a so-called temperature fixed point. These fixed points serve as the defining temperatures of the ITS-90 temperature scale and as reference temperatures for the calibration of thermometers.

In the range from -189.3442°C (triple point of argon) to 961.78°C (freezing point of silver), the International Temperature Scale of 1990 (ITS-90) defines nine fixed points. These fixed points are thermodynamic equilibrium states during phase transitions of pure substances.

An example of a fixed point is the triple point of water, at which highly pure water coexists in its solid, liquid and gaseous states at 0.01°C. This state can be realized in a water triple point cell and used to calibrate thermometers at this defined temperature.
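
For orientation, the nine fixed points in this range can be listed compactly. The sketch below records their ITS-90 temperatures and includes a small helper (purely illustrative, not part of any standard) that picks the fixed points falling inside a desired calibration range:

```python
# ITS-90 defining fixed points between the triple point of argon and the
# freezing point of silver (temperatures in °C).
ITS90_FIXED_POINTS = {
    "triple point of argon":      -189.3442,
    "triple point of mercury":     -38.8344,
    "triple point of water":         0.01,
    "melting point of gallium":     29.7646,
    "freezing point of indium":    156.5985,
    "freezing point of tin":       231.928,
    "freezing point of zinc":      419.527,
    "freezing point of aluminium": 660.323,
    "freezing point of silver":    961.78,
}

def fixed_points_in_range(t_min: float, t_max: float) -> list[str]:
    """Return the fixed points that lie inside a desired calibration range."""
    return [name for name, t in ITS90_FIXED_POINTS.items() if t_min <= t <= t_max]

# Example: fixed points usable for a 0 °C ... 250 °C calibration range.
print(fixed_points_in_range(0.0, 250.0))
```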

How Often Should a Thermometer Be Calibrated?

There isn’t a one-size-fits-all answer to this question. The frequency of calibrating thermometers generally depends on various factors:

  1. Purpose of the Thermometer: A thermometer used in critical applications, such as in the medical or food industry, may need to be calibrated more frequently than a basic household thermometer.
  2. Accuracy Requirements: Some processes demand very high temperature precision. In such cases, regular calibration is essential.
  3. Environmental Conditions: Thermometers exposed to extreme conditions (e.g., very high temperatures) may be more prone to inaccuracies and thus should be checked more often.
  4. Previous Calibration Results: If recent calibrations showed little to no deviations, the time between calibrations might be extended. However, if significant deviations are noted in each calibration, the frequency should be increased.

As a general rule, critical thermometers should be calibrated at least once a year. In some industries or for specific applications, more frequent calibrations, such as every three or six months, might be necessary. It’s always a good idea to include regular checks and calibrations in the maintenance schedule to ensure the thermometer operates correctly.
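
As a purely illustrative sketch of such a schedule decision (the intervals and the decision rule below are assumptions made for this example, not requirements from any standard), the next calibration interval could be chosen from how the last deviation compares with the permissible error:

```python
# Illustrative interval-adjustment rule. All limits and intervals here are
# assumptions made for the example, not values prescribed by any standard.

def next_interval_months(current_months: int, deviation_c: float, tolerance_c: float) -> int:
    """Suggest the next calibration interval based on the last result."""
    ratio = abs(deviation_c) / tolerance_c
    if ratio <= 0.5:
        # Well within tolerance: the interval may cautiously be extended.
        return min(current_months + 3, 12)
    if ratio <= 1.0:
        # Within tolerance but close to the limit: keep the current interval.
        return current_months
    # Out of tolerance: shorten the interval and investigate the instrument.
    return max(current_months // 2, 3)

print(next_interval_months(12, deviation_c=0.02, tolerance_c=0.10))  # -> 12
print(next_interval_months(12, deviation_c=0.15, tolerance_c=0.10))  # -> 6
```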

Standards and guidelines can also guide the determination of calibration cycles. For instance, DIN EN ISO/IEC 17025 is an international standard setting out the requirements for the competence of testing and calibration laboratories. If a laboratory is accredited to this standard, it indicates that it possesses the technical competence and has established a management system ensuring consistent and valid results.

Calibration of measuring instruments, including thermometers, is a crucial part of this standard. Some key points regarding calibration from DIN EN ISO/IEC 17025 include:

  1. General Requirements: Laboratories must ensure that all equipment potentially influencing the results is calibrated and/or qualified.
  2. Intervals: The standard does not prescribe specific calibration intervals. Instead, laboratories should use their risk management to determine calibration frequency.
  3. Traceability: Calibrations should be traceable to national or international standards.
  4. Records: Laboratories must maintain records of calibrations, including details about the method, the operator, environmental conditions, confirmation intervals, results, and any deviations.

A laboratory accredited to DIN EN ISO/IEC 17025 must have clear policies and procedures for calibrating its equipment. However, the exact frequency of calibration is determined by the laboratory itself, based on its risk management and the specific requirements of its accreditation.
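
To make the record-keeping point more concrete, here is a minimal sketch of what a calibration record might capture. The structure and field names are this article’s illustration only; the standard defines what must be recorded, not how it is stored:

```python
# Minimal sketch of a calibration record. Field names are illustrative;
# DIN EN ISO/IEC 17025 specifies what must be recorded, not this structure.
from __future__ import annotations

from dataclasses import dataclass, field
from datetime import date

@dataclass
class CalibrationRecord:
    instrument_id: str
    method: str                    # e.g. "comparison in a stirred liquid bath"
    operator: str
    calibration_date: date
    ambient_temperature_c: float   # environmental conditions during calibration
    ambient_humidity_percent: float
    reference_standard: str        # traceability to national/international standards
    results: dict[float, float] = field(default_factory=dict)  # reference °C -> deviation °C
    next_due: date | None = None   # confirmation interval

record = CalibrationRecord(
    instrument_id="PT-0042",
    method="comparison in a stirred liquid bath",
    operator="J. Doe",
    calibration_date=date(2024, 5, 2),
    ambient_temperature_c=23.1,
    ambient_humidity_percent=45.0,
    reference_standard="reference thermometer, traceable to the national standard",
    results={0.01: 0.03, 100.0: 0.05},
)
print(record.instrument_id, record.results)
```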

Tips for calibrating thermometers successfully

Tip 1: A thermometer only measures its own temperature

The statement “A thermometer measures only its own temperature” expresses a fundamental principle of thermometer calibration.

When you use a thermometer to measure the temperature of something, be it the air, a liquid or a solid body, what you are actually measuring is how warm or cold THE THERMOMETER itself is. The thermometer reaches thermal equilibrium with the medium it is measuring. This means that it assumes the same temperature as the medium.

A simple example is a mercury thermometer. The mercury in the thermometer expands and contracts depending on how hot or cold it is. If you immerse it in warm water, the mercury expands because it becomes WARMER. If you immerse it in cold water, it contracts because it becomes COLDER. In both cases, the thermometer is actually measuring how warm or cold THE MERCURY is, not the water directly. But because the mercury quickly reaches thermal equilibrium with the water, the thermometer effectively shows the temperature of the water.

The same principle applies to digital thermometers, Pt100 resistance thermometers, thermocouples and others. They all react to temperature changes by changing their own temperature and then displaying or measuring this value.

It is important to note that for an accurate temperature measurement, the thermometer and the object to be measured must have enough time to reach thermal equilibrium. Otherwise, the measurement could be inaccurate.
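
The approach to thermal equilibrium is often described by a first-order model, T(t) = T_medium + (T_start − T_medium) · e^(−t/τ). The sketch below uses an assumed time constant τ to show how the remaining error shrinks with waiting time:

```python
# First-order model of a thermometer approaching thermal equilibrium.
# The time constant tau is an assumed example value, not a measured one.
import math

def sensor_temperature(t_seconds: float, t_start: float, t_medium: float, tau: float) -> float:
    """Temperature the sensor itself has reached after t_seconds in the medium."""
    return t_medium + (t_start - t_medium) * math.exp(-t_seconds / tau)

t_start, t_medium, tau = 23.0, 80.0, 10.0  # °C, °C, seconds (assumed values)

for t in (5, 10, 30, 60):
    reading = sensor_temperature(t, t_start, t_medium, tau)
    print(f"after {t:3d} s: sensor at {reading:6.2f} °C (error {reading - t_medium:+.2f} °C)")
```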

Tip 2: Thermocouples always measure a temperature difference

A thermocouple is made of two different metals joined together at one end. If there is a different temperature at this junction (called the “measuring point” or “hot junction”) than at the other end of the two metals (called the “reference junction” or “cold junction”), a voltage is created between these two points. This voltage is called thermo-voltage and depends on the temperature difference between the two ends and the specific material properties of the two metals.

This means that a thermocouple always measures the temperature difference between the measuring point and the reference junction. To determine the absolute temperature at the measuring point, the temperature at the reference junction must be known. Often the reference junction is held at a known temperature (e.g. 0°C in the case of a so-called external reference junction) or, as with mini-connectors for example, the room temperature at the connector is used as the reference.

A thermocouple therefore does not directly measure an absolute temperature, but a temperature difference between two points. To obtain an absolute temperature measurement, the temperature at one of the two points must be known.
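
This difference principle can be illustrated with a deliberately simplified linear model: the measured voltage corresponds to T_hot − T_ref, so the known reference-junction temperature has to be added back. The constant Seebeck coefficient below is only a rough assumption; real thermocouple evaluation uses the standardized, non-linear reference functions for the respective thermocouple type:

```python
# Simplified cold-junction compensation. The constant Seebeck coefficient is
# an approximation for illustration; real evaluation uses the standardized
# reference functions (polynomials) for the thermocouple type.
SEEBECK_UV_PER_C = 41.0  # approx. sensitivity of a type K thermocouple near room temperature

def hot_junction_temperature(measured_voltage_uv: float, reference_junction_c: float) -> float:
    """The thermocouple voltage reflects only the temperature DIFFERENCE."""
    delta_t = measured_voltage_uv / SEEBECK_UV_PER_C  # T_hot - T_ref
    return reference_junction_c + delta_t             # add the known reference temperature

# The same measured voltage gives different absolute temperatures,
# depending on the reference-junction temperature:
print(hot_junction_temperature(2050.0, reference_junction_c=0.0))   # ~50 °C
print(hot_junction_temperature(2050.0, reference_junction_c=23.0))  # ~73 °C
```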

Tip 3: Resistance thermometers always measure too warm

Resistance thermometers, often called Pt100 or Pt1000 (where the number indicates the nominal resistance in ohms at 0°C), use the temperature-dependent resistance of a metal, usually platinum, to measure temperatures. When an electric current flows through a resistor, the resistor is heated (Joule heating). The dissipated electrical power is P = I^2 × R, where I is the current flowing through the resistor and R is its resistance.

In resistance thermometers, a measuring current is sent through the platinum resistor to measure the resistance (and therefore the temperature). But it is precisely this measuring current – especially if it is too high – that can cause the temperature detector to heat up considerably. This heating then leads to a wrong measurement result, because the sensor becomes warmer than the actual environment that is to be measured. As a result, the resistance thermometer indicates a temperature that is too high.

For precise measurements or applications, the heating caused by the measuring current must be taken into account and compensated.
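
A rough worked example of this self-heating effect (the measuring current and the self-heating coefficient are assumed values; the coefficient depends on the sensor design and on how well it is thermally coupled to the medium):

```python
# Self-heating estimate for a Pt100. The measuring current and the
# self-heating coefficient are assumed example values.
R_AT_0C = 100.0               # nominal Pt100 resistance at 0 °C in ohms
CURRENT_A = 1.0e-3            # measuring current: 1 mA (assumed)
SELF_HEATING_K_PER_MW = 0.2   # K of self-heating per mW dissipated (assumed)

power_w = CURRENT_A ** 2 * R_AT_0C      # P = I^2 * R
power_mw = power_w * 1000.0
temperature_error_k = power_mw * SELF_HEATING_K_PER_MW

print(f"dissipated power: {power_mw:.3f} mW")                      # 0.100 mW
print(f"sensor reads about {temperature_error_k:.3f} K too warm")  # 0.020 K
```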

About the Author

Thomas Klasmeier has been working as a metrologist and engineer for over 20 years, specializing in precise temperature measurement. As an entrepreneur, he runs a temperature calibration laboratory and manufactures precision thermometers.

Moreover, he is passionate about sharing his knowledge. He regularly speaks at seminars and professional conferences, where he presents and discusses his expertise. He is also the author of the handbook “Temperature”.