SAGE Journal Articles
Click on the following links. Please note these will open in a new window.
Journal Article 1: Liu, M., & Cernat, A. (2016). Item-by-item versus matrix questions. Social Science Computer Review, 1–17.
Abstract: While the choice of matrix versus item-by-item questions has received considerable attention in the literature, it is still unclear in what situation one is better than the other. Building upon previous findings, this study expands this line of research by examining whether the difference between the two question types is moderated by the number of response options. Through a web survey experiment, this study compares matrix and item-by-item questions with 2, 3, 4, 5, 7, 9, and 11 response options. Additionally, we investigate the impact of the device used to complete the survey on data quality. The results show that straightlining and response time are similar between the two question types across all response lengths, but item nonresponse tends to be higher for matrix than item-by-item questions, especially among mobile respondents. Also, measurement models reveal measurement equivalence between the two question types when there are fewer than seven response options. For matrices with 9 or 11 response options, analyses reveal substantial differences compared to item-by-item questions.
Journal Article 2: Moy, P., & Murphy, J. (2016). Problems and prospects in survey research. Journalism & Mass Communication Quarterly, 93, 16–37.
Abstract: Over the last few decades, survey research has witnessed a number of developments that have affected the quality of data that emerge using this methodology. Using the total survey error (TSE) approach as a point of departure, this article documents chronic challenges to data quality. With the aim of facilitating assessments of data quality, this article then turns to best practices in the disclosure of survey findings based on probability and nonprobability samples. Finally, (p)reviewing the use of technology and social media, it provides an overview of the opportunities and challenges for survey research today.
Journal Article 3: Vogl, S. (2013). Telephone versus face-to-face interviews. Sociological Methodology, 43, 133–177.
Abstract: Usually, semistructured interviews are conducted face-to-face, and because of the importance of personal contact in qualitative interviews, telephone interviews are often discounted. Missing visual communication can make a telephone conversation appear less personal and more anonymous but can also help prevent some distortions and place the power imbalance between adult interviewer and (child) respondent into perspective.
In this article, telephone and face-to-face interviews are compared in order to analyse the general applicability of telephone interviews and their peculiarities when researching children. The data consist of 112 semistructured interviews with 56 children aged 5, 7, 9, and 11, conducted in Germany. Each child was interviewed twice: once on the telephone and once face-to-face. By triangulating qualitative and quantitative analytical steps, both interview modes are compared from a number of perspectives. The results showed very little difference between the two modes of interview and therefore challenge the reluctance to conduct semistructured telephone interviews, both in qualitative research and with children. Depending on the research question, relevant distinctions could be the lower interviewer involvement, the lower number of opinions and suggestions stated by respondents, and fewer signs of tension and tension release in telephone interviews.
Journal Article 4: Goforth, A. N., Rennie, B. J., Hammond, J., & Schoffer Closson, J. K. (2015). Strategies for data collection in social skills group interventions: A case study. Intervention in School and Clinic, 51, 170–177.
Abstract: For many practitioners in schools and clinics, collecting data to show the effectiveness of an intervention is probably one of the most important yet challenging components of intervention implementation. This article provides practitioners with an example case study of how data can be organized and collected to determine the effectiveness of a social skills group intervention. Techniques to establish individual and group goals, determine a method to collect data, collect baseline data, and monitor progress are described. Challenges and practical strategies are discussed, and useful and specific suggestions are provided.