SAGE Journal Articles

(20.1) Li, S., & Seale, C. (2007). Learning to do qualitative data analysis: An observational study of doctoral work. Qualitative Health Research, 17(10), 1442-1452.

Abstract
Using examples from written assignments and supervisory dialogues, the authors report a longitudinal observational case study of a doctoral research project, focusing on the teaching and learning of qualitative data analysis on a project that involved coding and analysis of nursing talk. Written drafts contain concrete exemplars illustrating the problems and solutions discussed in supervisions. Early problems include the difficulty of knowing where to start with coding, ambiguities in the definition of codes, inaccurate reporting and recording of data, failure to distinguish researcher and actor categories, and overinterpretation of evidence. Solutions to these problems required their accurate identification, communication of practical solutions, and care in the interactional management of delivery and receipt of feedback. This detailed analysis informs readers of sources of validity, rigor, and, eventually, creativity in carrying out a social research project. It also assists in explicating an apprenticeship model for the learning of research skills.

(20.2) Yeh, C. J., & Inman, A. G. (2007). Qualitative data analysis and interpretation in counseling psychology: Strategies for best practices. The Counseling Psychologist, 35(3), 369-403.

Abstract
This article presents an overview of various strategies and methods of engaging in qualitative data interpretations and analyses in counseling psychology. The authors explore the themes of self, culture, collaboration, circularity, trustworthiness, and evidence deconstruction from multiple qualitative methodologies. Commonalities and differences that span across approaches are explored. Implications for how researchers address qualitative data analysis and interpretation in counseling psychology training and research are discussed.

(20.3) Moret, M., Reuzel, R., van der Wilt, G. J., & Grin, J. (2007). Validity and reliability of qualitative data analysis: Interobserver agreement in reconstructing interpretative frames. Field Methods, 19(1), 24-39.

Abstract
Many authors have discussed criteria for assessing the quality of qualitative studies. However, relatively few have presented the results of using criteria for validity of qualitative studies. We investigated the quality of reconstructing interpretative frames, a method for analyzing interview transcripts. The aim of this method is to describe a person’s perspective, distinguishing between perceived problem definitions, proposed solutions, empirical background theories, and normative preferences. Based on this description, one should be able to estimate this person’s cooperation on implementing specific changes in his or her practice. In this article, we assessed the interobserver reliability of this analytical method as an indicator of its rigor. Six analysts reconstructed interpretative frames on the basis of verbatim transcripts of three interviews. The analysts agreed only moderately about the issues identified and which problems should be prioritized. However, they showed remarkable unanimity as to the estimates of the respondents’ cooperation on proposed solutions.

(20.4) Caracelli, V. J., & Greene, J. C. (1993). Data analysis strategies for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 15(2), 195-207.

Abstract
Four integrative data analysis strategies for mixed-method evaluation designs are derived from and illustrated by empirical practice: data transformation, typology development, extreme case analysis, and data consolidation/merging. The appropriateness of these strategies for different kinds of mixed-method intents is then discussed. Where appropriate, such integrative strategies are encouraged as ways to realize the full potential of mixed-methodological approaches.

(20.5) Bazeley, P., & Kemp, L. (2012). Mosaics, triangles, and DNA: Metaphors for integrated analysis in mixed methods research. Journal of Mixed Methods Research, 6(1), 55-72. doi:10.1177/1558689811419514

Abstract
Metaphors used to describe the process of integration of analyses in mixed methods research are analyzed to determine various ways in which researchers think and write about integration. By examining the metaphors used and through examples of the application of each metaphor, the authors clarify the integrative processes they point to. The authors conclude this analysis by identifying from these metaphors eight principles to guide the effective integration of analyses in mixed methods research.