The students’ answers were analyzed by the two authors of the study. The degree of concordance (interrater agreement) was evaluated with Cohen’s kappa (κ) coefficient and, as can be seen in Table 3, the level of agreement achieved ranged from substantial to almost perfect on the Landis and Koch (1977) scale. In addition, the intrarater reliability κ coefficient was calculated for the main researcher by comparing the original scoring with a rescoring of the responses three weeks later. This analysis yielded an average κ of 0.88 across all questions, which is satisfactory at a 95 % confidence level.
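For reference, Cohen’s κ is the standard chance-corrected measure of agreement between two raters:

\[ \kappa = \frac{p_o - p_e}{1 - p_e} \]

where \(p_o\) is the observed proportion of agreement and \(p_e\) the proportion of agreement expected by chance. On the Landis and Koch (1977) scale, values of 0.61–0.80 are conventionally interpreted as substantial agreement and values of 0.81–1.00 as almost perfect agreement.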