The students’ answers were analyzed by the two researchers. Interrater agreement was evaluated with Cohen’s kappa (κ) coefficient and, as Table 3 shows, the levels of agreement ranged from substantial to almost perfect on the Landis and Koch (1977) scale. In addition, the intrarater reliability κ coefficient was calculated for the main researcher by comparing the original scores with a rescoring of the responses three weeks later. This analysis yielded an average κ of 0.88 across all questions, which is satisfactory at the 95% confidence level.
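For reference, Cohen’s κ compares the observed agreement between raters with the agreement expected by chance:

\[
\kappa = \frac{p_o - p_e}{1 - p_e},
\]

where \(p_o\) is the observed proportion of agreement and \(p_e\) is the proportion of agreement expected by chance given each rater’s marginal frequencies. On the Landis and Koch (1977) scale, values of 0.61–0.80 denote substantial agreement and 0.81–1.00 almost perfect agreement, so the intrarater value of 0.88 falls within the latter band.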
It should be noted that, as anticipated given the characteristics of the two participating groups, no significant differences were found between the results obtained by the students in the two sections of the physics course; we therefore treated them as a single student sample. Table 4 summarizes the results obtained in situations I, II, III, and