The earliest of these methods, the temperature correlation method, seeks to predict the smoke concentration at detector activation by relating the concentration to the temperature at the detector. This method is derived from the experimental work of Heskestad and Delichatsios in the 1970s [4], who proposed a correlation between the temperature rise at the smoke detector and the amount of smoke at the detector location for a given fuel.
For example, Heskestad and Delichatsios determined that for a particular fuel, an 11.1 °C temperature rise at the smoke
detector could be correlated to activation of that smoke detector. This detector activation methodology was originally
based in part on the fact that early fire models could more accurately predict the thermal layer than the smoke layer. However, considerable criticism of the accuracy of this correlation, particularly due to its fuel dependency, has
been published in the peer-reviewed literature [5–12].
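To illustrate the logic of the temperature correlation method, the sketch below checks a modeled temperature history at the detector location against a fuel-dependent rise threshold, using the 11.1 °C example value quoted above. The function name, data, and ambient value are hypothetical, and a real analysis would use output from a fire model rather than a hand-written time series.

```python
def activation_time(times, temps, ambient, rise_threshold=11.1):
    """Return the first time at which the temperature rise at the
    detector reaches the correlation threshold, or None if it never
    does. All temperatures in degrees C, times in seconds.

    Illustrative only: the 11.1 C default is the fuel-specific example
    value from Heskestad and Delichatsios; other fuels require other
    thresholds, which is the fuel dependency criticized in the text.
    """
    for t, temp in zip(times, temps):
        if temp - ambient >= rise_threshold:
            return t
    return None

# Hypothetical detector temperature history (e.g., from a zone model)
times = [0, 10, 20, 30, 40, 50]                # s
temps = [20.0, 22.0, 25.5, 29.0, 33.0, 38.0]   # deg C at the detector
print(activation_time(times, temps, ambient=20.0))  # -> 40
```

Note that the sketch predicts activation purely from temperature, with no smoke quantity in the calculation at all, which is exactly the weakness discussed next: the method stands or falls on how tightly smoke density tracks the thermal layer for the fuel in question.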
A fundamental flaw of the temperature correlation method is the sometimes weak relationship between the development of smoke density in a fire and the development of the thermal layer. Despite this shortcoming, the temperature correlation method remains in use in some segments of the fire safety community almost 30 years after its introduction, because it is easy to implement with any fire model [5–12].