Keywords

intercoder reliability assessment, qualitative research, quality assurance


Authors

  1. Burla, Laila
  2. Knierim, Birte
  3. Barth, Jürgen
  4. Liewald, Katharina
  5. Duetz, Margreet
  6. Abel, Thomas

Abstract

Background: High intercoder reliability (ICR) is required in qualitative content analysis for assuring quality when more than one coder is involved in data analysis. However, the literature lacks standardized procedures for ICR assessment in qualitative content analysis.


Objective: To illustrate how ICR assessment can be used to improve codings in qualitative content analysis.


Methods: Key steps of the procedure are presented, drawing on data from a qualitative study on patients' perspectives on low back pain.


Results: First, a coding scheme was developed using a comprehensive inductive and deductive approach. Second, 10 transcripts were coded independently by two researchers, and ICR was calculated. The resulting kappa value of .67 can be regarded as satisfactory to good. Moreover, varying agreement rates helped to identify problems in the coding scheme. Low agreement rates, for instance, indicated that the respective codes were defined too broadly and needed clarification. In a third step, the results of the analysis were used to improve the coding scheme, leading to consistent and high-quality results.
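To illustrate the calculation step, the following sketch computes Cohen's kappa for two coders who have independently assigned one code per text segment. The codes and data are hypothetical examples, not the study's material; the formula itself is the standard one (observed agreement corrected for chance agreement from the coders' marginal frequencies).

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two coders' nominal codes on the same segments."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    # Observed agreement: share of segments both coders coded identically.
    p_o = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Expected chance agreement from each coder's marginal code frequencies.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codings of ten segments by two coders.
coder1 = ["pain", "coping", "pain", "work", "coping",
          "pain", "work", "pain", "coping", "work"]
coder2 = ["pain", "coping", "pain", "work", "pain",
          "pain", "work", "pain", "coping", "coping"]
print(round(cohens_kappa(coder1, coder2), 2))  # → 0.69
```

A kappa near .67, as reported above, indicates substantial agreement beyond chance while still leaving room to refine individual codes.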


Discussion: The quantitative approach of ICR assessment is a viable instrument for quality assurance in qualitative content analysis. Kappa values and close inspection of agreement rates help to estimate and increase the quality of codings. This approach facilitates good practice in coding and enhances the credibility of the analysis, especially when large samples are interviewed, different coders are involved, and quantitative results are presented.
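The "close inspection of agreement rates" mentioned above can be made concrete with a per-code breakdown: for each code, the share of segments assigned that code by both coders among segments assigned it by either coder. A low rate flags a code that may be defined too broadly. This is an illustrative sketch with hypothetical codes and data, not the authors' procedure.

```python
def per_code_agreement(codes_a, codes_b):
    """For each code, agreement = segments where both coders assigned it,
    divided by segments where at least one coder assigned it."""
    rates = {}
    for c in set(codes_a) | set(codes_b):
        both = sum(a == c and b == c for a, b in zip(codes_a, codes_b))
        either = sum(a == c or b == c for a, b in zip(codes_a, codes_b))
        rates[c] = both / either
    return rates

# Hypothetical codings of ten segments by two coders.
coder1 = ["pain", "coping", "pain", "work", "coping",
          "pain", "work", "pain", "coping", "work"]
coder2 = ["pain", "coping", "pain", "work", "pain",
          "pain", "work", "pain", "coping", "coping"]
for code, rate in sorted(per_code_agreement(coder1, coder2).items()):
    print(f"{code}: {rate:.2f}")
```

In this toy data, "coping" shows the lowest agreement, so its definition in the coding scheme would be the first candidate for clarification.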