Of the 357 responses received, 12 were not usable due to incomplete or improper data entry. Coding of the free responses was conducted by three researchers to ensure reliability. To address content validity, previous research was used to build a coherent justification for the themes and codes (Creswell, 2009). Concurrent validity was verified through the recurrence of themes across respondents. Intercoder reliability was checked, and the data were compared against the code definitions throughout the coding process (Creswell, 2009). A Krippendorff's alpha = 0.83 was obtained as an overall measure of intercoder reliability, where 0.8 is considered acceptable for most research (Lombard et al., 2002). In several instances the researchers questioned the reliability of particular results; these concerns are noted in the results.
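For reference, Krippendorff's alpha expresses agreement as one minus the ratio of observed to expected disagreement; the sketch below restates that standard definition (the symbols $D_o$ and $D_e$ follow Krippendorff's usual notation and are not introduced elsewhere in this paper):

\[
\alpha = 1 - \frac{D_o}{D_e}
\]

where $D_o$ is the disagreement actually observed among the coders and $D_e$ is the disagreement expected by chance, so that $\alpha = 1$ indicates perfect agreement and $\alpha = 0$ indicates agreement no better than chance. The obtained value of 0.83 therefore exceeds the 0.8 threshold cited above.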