In the content analysis, reliability was determined by calculating the overall agreement between coders. After the interview transcripts were completed, an interview coding key was prepared by considering the interview questions and identifying the response options that cover the answers to those questions. To test the reliability of the interview coding key, five interview transcripts were chosen at random and duplicated. These transcripts were then evaluated by the researchers using the interview coding key; evaluation consisted of marking the appropriate option in the key. To check the consistency of the markings in the interview coding key, all answers to the questions were compared one by one. After this step, the interview coding key was given its final form. Each researcher read the interview transcript forms independently and assigned, for each question, the option that covered the answer. The answers to the corresponding questions were then compared, and the markings were classified as agreements or disagreements: if the researchers marked the same option for an answer, it was counted as an agreement; if they marked different options, it was counted as a disagreement. In this research, the inter-rater agreement percentage formula was used to determine the reliability of the content analysis. The agreement percentage formula is given below:
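As an illustration only, the agreement-percentage calculation described above (agreements divided by agreements plus disagreements, times 100) can be sketched in Python. The coder markings used here are hypothetical, not data from this study:

```python
def agreement_percentage(codings_a, codings_b):
    """Percentage agreement between two coders' option markings.

    codings_a, codings_b: lists of the options each coder marked,
    aligned question by question.
    """
    if len(codings_a) != len(codings_b):
        raise ValueError("both coders must mark the same questions")
    # An agreement is the same option marked for the same question.
    agreements = sum(a == b for a, b in zip(codings_a, codings_b))
    disagreements = len(codings_a) - agreements
    return agreements / (agreements + disagreements) * 100

# Hypothetical markings for ten questions by two researchers
coder1 = ["A", "B", "B", "C", "A", "D", "C", "B", "A", "A"]
coder2 = ["A", "B", "C", "C", "A", "D", "C", "B", "A", "B"]
print(agreement_percentage(coder1, coder2))  # 80.0
```

With eight matching markings out of ten, the sketch reports 80% agreement; percentages at or above a chosen threshold (commonly 70 to 80%) are usually taken to indicate acceptable coding reliability.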