How accurately can the camera measure an absolute temperature?
Most FLIR Thermography cameras have a specified accuracy of ±2 °C (±3.6 °F) or ±2% of reading, whichever is greater, for a blackbody target (emissivity ≈ 1).
For example, for objects at 100 °C or lower, a reading off a blackbody can fall anywhere from 98 °C to 102 °C and still be within specification. Similarly, for objects above 100 °C, say 200 °C, the reading could vary between 196 °C and 204 °C.
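The "±2 °C or ±2% of reading, whichever is greater" rule can be sketched as a small helper. This is an illustrative function (the name and parameters are my own, not from any FLIR SDK):

```python
def accuracy_band(reading_c, abs_spec_c=2.0, pct_spec=0.02):
    """Return (low, high) bounds for a blackbody reading under a
    '±abs_spec_c or ±pct_spec of reading, whichever is greater' spec.

    Note: the percentage is applied to the reading in °C, matching
    the worked examples in the text."""
    tol = max(abs_spec_c, pct_spec * reading_c)
    return (reading_c - tol, reading_c + tol)

print(accuracy_band(100.0))  # (98.0, 102.0) — absolute spec dominates
print(accuracy_band(200.0))  # (196.0, 204.0) — percentage spec dominates
```

At 100 °C the two terms are equal (2 °C vs. 2% of 100 °C); above that, the percentage term takes over.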
Some science cameras, such as the SC660, are specified to ±1 °C or ±1% of reading.
This means that any camera, under any environmental conditions (within specification), at any time, will give a reading within the accuracy specification.
However, a particular camera under constant environmental conditions will show a statistical measurement repeatability that is much better than this, typically close to the NETD value. The same applies when you compare adjacent pixels, provided your targets are optically resolved. This means that much higher accuracies can be achieved by comparing values against a known reference source in the image scene.
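The comparison against an in-scene reference can be sketched as follows. Since the camera's absolute offset is nearly common to the whole image at a given moment, measuring a known source lets you subtract that offset from the target reading. The temperature values below are invented for illustration:

```python
# Known reference source placed in the scene (e.g. a calibrated blackbody).
true_ref_c = 50.0

# Hypothetical camera readings: both carry the same ~+1.6 °C absolute offset,
# but differ only by per-pixel noise (~NETD) relative to each other.
measured_ref_c = 51.6
measured_target_c = 87.4

# Subtract the common offset estimated from the reference.
offset = measured_ref_c - true_ref_c
corrected_target_c = measured_target_c - offset
print(round(corrected_target_c, 1))  # 85.8
```

The residual error of the corrected value is then dominated by the camera's repeatability (close to NETD) rather than its absolute accuracy specification.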
What happens when measuring real objects with emissivities less than one?
When measuring materials with emissivities less than one, in other words real everyday objects, the measurement error increases. A simple rule of thumb is shown below:
                    Camera accuracy specification
Adjusted accuracy = -----------------------------
                              Emissivity
So let’s take our 100 °C object again, but this time with an emissivity of 0.7. The adjusted accuracy would then be approximately:
                    ±2 °C (±3.6 °F)
Adjusted accuracy = ---------------- = ±2.9 °C (±5.1 °F)
                          0.7
For specific applications such as fever screening, an averaging function is implemented that makes an absolute accuracy specification unnecessary. The averaging function establishes a reference value from a series of "normal" readings, and raises an alarm if a single reading differs from that reference by more than a set limit.
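The averaging-and-alarm idea can be sketched as a small class. The class name, window size, and alarm limit below are assumptions for illustration, not FLIR's actual implementation:

```python
class ScreeningAverager:
    """Sketch of an averaging alarm: keep a running average of recent
    'normal' readings as the reference; alarm when a new reading
    exceeds that reference by more than a fixed limit."""

    def __init__(self, limit_c=1.0, window=10):
        self.limit_c = limit_c    # alarm threshold above the reference
        self.window = window      # how many normal readings to average
        self.normals = []

    def check(self, reading_c):
        """Return True (alarm) if reading exceeds the reference + limit;
        otherwise fold the reading into the running reference."""
        if self.normals:
            ref = sum(self.normals) / len(self.normals)
            if reading_c - ref > self.limit_c:
                return True  # alarm — do not contaminate the reference
        self.normals.append(reading_c)
        self.normals = self.normals[-self.window:]
        return False

s = ScreeningAverager(limit_c=1.0)
for t in [36.8, 36.9, 37.0, 36.7]:
    s.check(t)               # build the reference from normal readings
print(s.check(38.4))         # True — well above the ~36.85 °C reference
```

Because the alarm compares readings against other readings from the same camera under the same conditions, it relies on the camera's repeatability (near NETD) rather than its absolute accuracy.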