Calibration Accuracy
Calibration accuracy is the main component of an
accuracy specification. It quantifies the deviation of the
individual sensor readings in the equilibrium state from a
high-precision reference at the time of calibration (which
is understood to be approximately the time of sale). Major
causes of tolerance on calibration accuracy are in-batch
variation (e.g. homogeneity of conditions within the
calibration chamber), batch-to-batch variation, precision
of the calibration reference, and stability of the sensors.
Calibration accuracy is measured against a dew point
mirror, a high-precision reference, so the user should be
able to reproduce the value.
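
As a rough illustration of how such a deviation against the reference could be evaluated, the following Python sketch compares hypothetical equilibrium readings of a small batch with a dew point mirror reference value. All numbers and variable names are invented for illustration and do not reflect Sensirion's actual calibration procedure.

    # Minimal sketch (illustrative only): per-sensor calibration deviation
    # against a dew point mirror reference at one calibration point.
    import statistics

    # Hypothetical equilibrium readings of a small batch [%RH] and the value
    # reported by the dew point mirror reference at the same point [%RH].
    sensor_readings_rh = [50.3, 49.8, 50.6, 50.1, 49.7]
    reference_rh = 50.0

    # Calibration deviation per sensor: reading minus reference.
    deviations = [reading - reference_rh for reading in sensor_readings_rh]

    print("per-sensor deviation [%RH]:", deviations)
    print("batch mean deviation [%RH]:", statistics.mean(deviations))
    print("batch std dev (k=1) [%RH]:", statistics.stdev(deviations))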
Sensirion specifies calibration accuracy with two
different parameters:
Typical accuracy: The above-mentioned variation of the
measured deviation against the reference may be
characterized by an average value and a coverage
factor k (k=1 corresponds to one standard deviation σ in
the case of a normal distribution). For typical accuracy
tolerances at a certain log point, Sensirion understands
that, for a sample such as a batch, the average value ±2k
lies inside the specified limits. In other words, 95% of
the sensors measure within these typical limits.
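
The relation between the coverage factor and the 95% statement can be sketched as follows, assuming hypothetical values for the batch mean deviation, the standard deviation (k=1), and the specified typical limit. The sketch checks whether the mean ±2k lies inside the limit and estimates the share of sensors inside it under a normal distribution.

    # Minimal sketch (illustrative only): coverage factor k and typical limits.
    from statistics import NormalDist

    mean_dev = 0.1        # hypothetical batch mean deviation [%RH]
    sigma = 0.6           # hypothetical standard deviation, i.e. k = 1 [%RH]
    typical_limit = 1.5   # hypothetical specified typical accuracy [+/- %RH]

    # Condition from the specification: average value +/- 2k inside the limits.
    within_spec = (abs(mean_dev) + 2 * sigma) <= typical_limit

    # Share of sensors inside the limits, assuming a normal distribution.
    dist = NormalDist(mu=mean_dev, sigma=sigma)
    share_inside = dist.cdf(typical_limit) - dist.cdf(-typical_limit)

    print("mean +/- 2k inside typical limits:", within_spec)
    print(f"estimated share of sensors inside limits: {share_inside:.1%}")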