Investigation of alternative interface designs for long-list questions – the case of a computer-assisted survey in Germany
This study addresses questionnaire design challenges in cases where questions involve a large number of response options. Traditionally, these long-list questions are asked in either open-ended or closed-ended formats. However, alternative interface designs that combine both formats are emerging in computer-assisted surveys. To investigate the trade-offs of these alternative designs, a split-ballot experiment was conducted with a) a long list of radio buttons, b) a search tree (a nested list of response options), and c) a combo box (a combination of a text box and a drop-down box). Using the question on the highest educational qualification attained from the Innovation Sample of the German Socio-Economic Panel, we investigated which interface design best supports respondents and enhances measurement quality. The findings indicate that combo boxes reduce response burden and increase measurement detail, whereas search trees and long lists reduce post-coding efforts.
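To make the combo box design concrete, the sketch below shows one possible browser implementation in TypeScript, using the standard HTML datalist mechanism (a text input whose drop-down suggestions come from an attached option list). The element id and the option labels are illustrative placeholders, not the SOEP-IS education items.

// A minimal combo box sketch: respondents can either pick a suggested
// option from the drop-down or type a free-text answer that is not on
// the list, combining closed-ended and open-ended formats.
const input = document.createElement("input");
input.setAttribute("list", "edu-options"); // link the input to the datalist below
input.placeholder = "Start typing your qualification...";

const datalist = document.createElement("datalist");
datalist.id = "edu-options";
for (const label of [
  "Lower secondary certificate",
  "Upper secondary certificate",
  "University degree",
]) {
  const option = document.createElement("option");
  option.value = label; // each option becomes a drop-down suggestion
  datalist.appendChild(option);
}

document.body.append(input, datalist);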
What about the Less IT Literate? A Comparison of Different Postal Recruitment Strategies to an Online Panel of the General Population
Even though the proportion of individuals who are not equipped to participate in online surveys is steadily decreasing, many surveys face an under-representation of individuals who do not feel IT literate enough to participate. Using experimental data from a probability-based online panel, we study which recruitment survey mode strategy performs best in recruiting less IT-literate persons for an online panel. The sampled individuals received postal invitations to complete the recruitment survey in a self-completion mode. We experimentally varied four recruitment survey mode strategies: one online mode strategy, two sequential mixed-mode strategies, and one concurrent mode strategy. We find that the recruitment survey mode strategies have a major effect on the sample composition of the recruitment survey, but the differences between the strategies vanish once respondents are asked to continue with the panel online.
The impact of technological change on survey nonresponse and measurement
This dissertation addresses the imbalance between technological advancements and human adaptation to technology in the context of survey research, raising the question of whether survey research lags behind or is too far ahead of its respondents. The four papers that constitute this dissertation therefore deal with respondents' ability to use technology and with how these technological abilities are associated with nonresponse and measurement issues in computer-assisted surveys.
Although adaptation to technology is increasing, the findings indicate that subgroups of respondents remain reluctant to participate in online surveys, resulting in nonresponse bias. In addition, the measurement of long-list questions can be improved by alternative interface designs.
By tackling the imbalance between technological advancements and human adaptation to technology in survey research, this dissertation demonstrates that survey practitioners still have to be cautious when adapting survey designs to technological advancements, because these advancements might be too far ahead of their respondents.
Mobile web surveys
This guide is for survey practitioners who want to conduct web surveys that account for mobile devices, or a mobile-only web survey. The guide outlines strategies for handling multiple devices, and mobile devices in particular, with regard to web survey design and data quality. Furthermore, it addresses issues of questionnaire design and raises questions that can help in deciding between survey software options.
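As one concrete illustration of such a strategy, a survey page can render responsively instead of shipping a separate mobile version. The following TypeScript sketch rests on our own assumptions (the 600px breakpoint and the CSS class names are not taken from the guide):

// Switch between a stacked (mobile) and a grid (desktop) question layout
// depending on the current viewport width.
const mobileQuery = window.matchMedia("(max-width: 600px)");

function applyLayout(): void {
  document.body.classList.toggle("stacked-layout", mobileQuery.matches);
  document.body.classList.toggle("grid-layout", !mobileQuery.matches);
}

applyLayout(); // set the layout once on page load
mobileQuery.addEventListener("change", applyLayout); // react to resizing or rotation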
Preparation of survey data
This guide focuses on the data preparation phase, which starts after data collection and ends before data analysis. This first assessment of the “raw” survey data is crucial, since data preparation can affect the quality of the data positively or negatively. After an overview of the different types of errors, the guide discusses the remedies and the issues related to these editing procedures.
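To illustrate what such editing procedures can look like, here is a minimal TypeScript sketch of two common checks, a range check and a cross-variable consistency check; the variable names, bounds, and rules are our own illustrative assumptions, not taken from the guide.

interface RawRecord {
  id: string;
  age: number | null;       // reported age in years
  birthYear: number | null; // reported year of birth
}

// Range check: flag implausible values rather than silently deleting them.
function rangeCheck(r: RawRecord): string[] {
  const flags: string[] = [];
  if (r.age !== null && (r.age < 0 || r.age > 120)) flags.push("age out of range");
  return flags;
}

// Consistency check: the reported age should roughly match the year of birth.
function consistencyCheck(r: RawRecord, surveyYear: number): string[] {
  const flags: string[] = [];
  if (r.age !== null && r.birthYear !== null) {
    const impliedAge = surveyYear - r.birthYear;
    if (Math.abs(impliedAge - r.age) > 1) flags.push("age/birth year mismatch");
  }
  return flags;
}

// Example: a record reporting age 35 but a birth year implying age 33.
const rec: RawRecord = { id: "0001", age: 35, birthYear: 1990 };
console.log([...rangeCheck(rec), ...consistencyCheck(rec, 2023)]); // ["age/birth year mismatch"]

Flagging rather than overwriting keeps the editing step transparent and reversible, which matters because, as the guide notes, these procedures can affect data quality in either direction.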
Modeling group-specific interviewer effects on survey participation using separate coding for random slopes in multilevel models
Although interviewers play an important role in survey participation, the literature is sparse on how face-to-face interviewers differentially affect specific groups of sample units. In this paper, we demonstrate how an alternative parametrization of the random components in multilevel models, so-called separate coding, delivers valuable insights into differential interviewer effects for specific groups of sample members. Using the example of a face-to-face recruitment interview for a probability-based online panel, we detect small interviewer effects on survey participation for non-Internet households, whereas we find sizable interviewer effects for Internet households. Based on the proposed variance decomposition, we derive practical guidance for survey practitioners to address such differential interviewer effects.
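To illustrate the parametrization, here is a minimal sketch in our own notation (not necessarily the paper's). Let $y_{ij}$ indicate participation of sample unit $i$ assigned to interviewer $j$, and let $d_{ij} = 1$ for Internet households and $0$ otherwise. Standard coding would specify a random interviewer intercept plus a random slope on $d_{ij}$; separate coding instead gives each group its own random effect:

% Separate coding: one group-specific random interviewer effect per group
\[
\operatorname{logit} \Pr(y_{ij} = 1)
  = \beta_0 + \beta_1 d_{ij}
  + u_j^{(\mathrm{net})}\, d_{ij}
  + u_j^{(\mathrm{non})}\, (1 - d_{ij}),
\]
\[
u_j^{(\mathrm{net})} \sim \mathcal{N}\bigl(0, \sigma^2_{\mathrm{net}}\bigr),
\qquad
u_j^{(\mathrm{non})} \sim \mathcal{N}\bigl(0, \sigma^2_{\mathrm{non}}\bigr).
\]

Under this coding, $\sigma^2_{\mathrm{net}}$ and $\sigma^2_{\mathrm{non}}$ can be read off directly as the interviewer variances for Internet and non-Internet households, instead of being recovered from an intercept variance, a slope variance, and their covariance (a covariance between the two group-specific effects can still be estimated; it is omitted here for simplicity).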