Rater 2 was a research psychologist specializing in transport safety, with extensive experience in the application of the SRK framework to occupational accidents. To evaluate the reliability of the coding system, 40 randomly selected occurrences analyzed by Rater 1 were coded independently by Rater 2, and the level of agreement between the two raters was assessed. Cohen's Kappa for the coding of errors was 0.68, which, according to the guidelines of Landis and Koch (1977), represents a substantial level of inter-rater agreement.

The time when each error occurred was placed into 1 h bins centered
on the hour. So, for example, the time bin 01:00 h included all errors that occurred in the period 00:31–01:30 h.
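A minimal sketch of this binning rule is given below for illustration, assuming error times are available as "HH:MM" strings; the function name hour_bin and the input format are assumptions, not part of the original coding procedure.

```python
from datetime import datetime

def hour_bin(time_str: str) -> int:
    """Assign a time to the 1 h bin centered on the hour.

    Minutes 31-59 roll forward to the next hour, so the 01:00 h bin
    covers 00:31-01:30 h, as described in the text.
    """
    t = datetime.strptime(time_str, "%H:%M")
    return t.hour if t.minute <= 30 else (t.hour + 1) % 24

# Both ends of the 01:00 h bin fall into bin 1
print(hour_bin("00:31"), hour_bin("01:30"))  # -> 1 1
```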