
    Assessing Trends and Decomposing Change in Nonresponse Bias: The Case of Bias in Cohort Distributions

    Full text link
    Survey research continues to be confronted with increasing nonresponse rates. In this context, several methodological advances have been made to stimulate participation and avoid bias. Yet, despite the growing number of tools and methods for dealing with nonresponse, little is known about whether nonresponse biases follow trends similar to those of nonresponse rates and what mechanisms (if any) drive changes in bias. Our article focuses on biases in cohort distributions, one of the key variables in the social sciences, in the U.S. and German general social surveys from 1980 to 2012. To supplement our cross-national comparison of these trends, we decompose changes into within-cohort change (WCC) and between-cohort change. We find that biases in cohort distributions have remained relatively stable, and at a relatively low level, in both countries. Furthermore, WCC (i.e., survey climate) accounts for the major part of the change in nonresponse bias.
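    The abstract does not spell out the decomposition; a common form for separating within-cohort change from cohort replacement is a Firebaugh-style linear decomposition, sketched here under the assumption that the bias measure can be written as a share-weighted sum over cohorts (the notation is illustrative, not the article's):

```latex
% Hedged sketch: linear decomposition of change in a cohort-weighted quantity
% between two survey years. w_c is cohort c's share, y_c the cohort-specific
% component; bars denote averages over the two time points.
\Delta \bar{Y}
  = \underbrace{\sum_{c} \bar{w}_{c}\,\Delta y_{c}}_{\text{within-cohort change (WCC)}}
  + \underbrace{\sum_{c} \bar{y}_{c}\,\Delta w_{c}}_{\text{between-cohort change}}
```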

    Understanding Respondents' Attitudes Toward Web Paradata Use

    Full text link
    The collection and use of paradata is gaining in importance, especially in web surveys. From a research ethics perspective, respondents should be asked for their consent to the collection and use of web paradata. In this context, a positive attitude toward paradata use has been deemed a prerequisite for respondents' willingness to share their paradata. The present study aimed to identify factors affecting respondents' attitudes toward paradata use. Our findings revealed that adequately informing survey respondents about what paradata are and why they are used was an important determinant of their attitudes toward paradata use. Moreover, we found that respondents with a positive attitude toward the survey were more likely to have a favorable opinion of paradata use. Our findings suggest that a thorough understanding of the factors that contribute to a positive attitude toward paradata use provides the basis for improved paradata consent procedures, which in turn will increase rates of consent to paradata use and help attenuate the risk of consent bias in web surveys.

    Early and Late Participation during the Field Period: Response Timing in a Mixed-Mode Probability-Based Panel Survey

    Full text link
    Respondents' reluctance to participate in surveys has long drawn the attention of survey researchers. Yet little is known about what drives a respondent's decision to answer the survey invitation early or late in the field period. Moreover, we still lack evidence on response timing in longitudinal surveys. That is, the questions of whether response timing is a stable respondent characteristic and what, if anything, affects change in response timing across interviews remain open. We relied on data from a mixed-mode general population panel survey collected between 2014 and 2016 to study the stability of response timing across 18 panel waves and the factors that influence the decision to participate early or late in the field period. Our results suggest that the factors affecting response timing differ between the mail and web modes. Moreover, we found that experience with prior panel waves affected the respondent's decision to participate early or late. Overall, the present study advocates treating response timing as a metric variable and, consequently, reflecting this in modeling strategies.

    Good questions, bad questions? A Post-Survey Evaluation Strategy Based on Item Nonresponse

    Get PDF
    In this paper, we discuss a three-step strategy to evaluate data quality in terms of item nonresponse and to identify potentially flawed questions. We provide an example with several data sets from a large-scale social scientific study to illustrate the application of the strategy and to highlight its benefits. In survey research, it is common practice to test questions ex ante, for example by means of cognitive pretesting. Nevertheless, it is necessary to check respondents' response behavior throughout the questionnaire to evaluate the quality of the collected data. Articles addressing item nonresponse mostly focus on individuals or specific questions; shifting the focus to the questionnaire as a whole seems a fruitful addition for survey methodology. This shift in perspective enables us to identify problematic questions ex post and to adjust the questionnaire or research design before re-applying it in further studies, or to assess the data quality of a study. The need for such checks may arise from shortcomings or failures during cognitive pretesting or from unforeseen events during data collection. Furthermore, the results of this ex post analysis may become an integral part of data quality reports.
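    A first screening step of this kind can be as simple as computing item nonresponse rates per question and flagging outliers. A minimal sketch (the data layout and the 10% threshold are illustrative assumptions, not the paper's procedure):

```python
# Minimal sketch: flag questions with unusually high item nonresponse.
# `responses` holds one row per respondent and one column per question;
# missing answers are coded as NaN. The 10% threshold is an assumption.
import pandas as pd

def flag_items(responses: pd.DataFrame, threshold: float = 0.10) -> pd.Series:
    """Return item nonresponse rates for questions exceeding `threshold`."""
    inr = responses.isna().mean()  # share of missing answers per item
    return inr[inr > threshold].sort_values(ascending=False)
```

    Flagged items can then be inspected qualitatively, for example for sensitive wording or filtering errors, before the questionnaire is fielded again.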

    A Note on How Prior Survey Experience With Self-Administered Panel Surveys Affects Attrition in Different Modes

    Full text link
    Attrition poses an important challenge for panel surveys. In these surveys, respondents' decisions about whether to participate in reinterviews are affected by their participation in prior waves of the panel. However, in self-administered mixed-mode panels, the way respondents experience a survey differs between the mail mode and the web mode. Consequently, this study investigated how respondents' prior experience with the characteristics of a survey, such as the length, difficulty, interestingness, sensitivity, and diversity of the questionnaire, affects their informed decision about whether to participate again. We found that the length of a questionnaire seems to be so important to respondents that they base their participation decision on this characteristic regardless of the mode. Our findings also suggest that the difficulty and diversity of questionnaires are readily accessible information that respondents use in the mail mode when deciding whether to participate again, whereas these characteristics have no effect in the web mode. In addition, privacy concerns have an impact in the web mode but not in the mail mode.

    Relying on External Information Sources When Answering Knowledge Questions in Web Surveys

    Full text link
    Knowledge questions are frequently used in survey research to measure respondents' topic-related cognitive ability and memory. However, in self-administered surveys, respondents can search external sources for additional information to answer a knowledge question correctly. In this case, the knowledge question instead measures accessible and procedural memory. Depending on what the knowledge question aims to capture, the validity of this measure is therefore limited. Thus, in this study, we conducted three experiments in a web survey to investigate the effects of task difficulty, respondents' ability, and respondents' motivation on the likelihood of searching external sources for additional information, a form of over-optimizing response behavior when answering knowledge questions. We found that respondents who are highly educated and more interested in a survey are more likely to invest additional effort to answer knowledge questions correctly. Most importantly, our data showed that, for these respondents, a more difficult question design further increases the likelihood of over-optimizing response behavior.

    Risk of Nonresponse Bias and the Length of the Field Period in a Mixed-Mode General Population Panel

    Get PDF
    Survey researchers are often confronted with the question of how long the field period should be. A longer fielding time might lead to greater participation, yet it requires survey managers to devote more of their time to data collection. To facilitate the decision about the length of the field period, we investigated whether a longer fielding time reduces the risk of nonresponse bias, in order to judge whether field periods can be ended earlier without endangering the performance of the survey. Using data from six waves of a probability-based mixed-mode (online and mail) panel of the German population, we analyzed whether the risk of nonresponse bias decreases over the field period by examining how day-by-day coefficients of variation develop during fieldwork. We then determined the optimal cut-off point for each mode after which data collection can be terminated without increasing the risk of nonresponse bias, and found that these cut-off points differ by mode. Our study complements prior research by extending the investigation of the risk of nonresponse bias to panel data and, in particular, to mixed-mode surveys. Our proposed method of using coefficients of variation to assess whether the risk of nonresponse bias decreases significantly with each additional day of fieldwork can aid survey practitioners in finding the optimal field period for their mixed-mode surveys.
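    Such a coefficient of variation can be computed from auxiliary variables known for the whole sample. A minimal sketch of the day-by-day computation (the data layout and column names are illustrative assumptions, not the study's exact indicator):

```python
# Minimal sketch: day-by-day coefficient of variation of subgroup response
# rates, a common indicator of nonresponse bias risk. `frame` holds one row
# per sampled unit; `response_day` is the fieldwork day on which the unit
# responded (NaN for nonrespondents), `group` an auxiliary variable known
# for respondents and nonrespondents alike. Column names are assumptions.
import pandas as pd

def daily_cv(frame: pd.DataFrame, group_col: str = "group",
             day_col: str = "response_day") -> pd.Series:
    """Coefficient of variation of subgroup response rates per fieldwork day."""
    last_day = int(frame[day_col].max())  # NaNs (nonrespondents) are ignored
    cvs = {}
    for d in range(1, last_day + 1):
        responded = frame[day_col] <= d                     # cumulative respondents
        rates = responded.groupby(frame[group_col]).mean()  # subgroup response rates
        cvs[d] = rates.std(ddof=1) / rates.mean()
    return pd.Series(cvs, name="cv")
```

    Once this series flattens out, additional fieldwork days no longer reduce the between-group variation in response rates, which is the intuition behind the mode-specific cut-off points described above.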

    Using Google Trends Data to Learn More About Survey Participation

    Get PDF
    As response rates continue to decline, the need to learn more about the survey participation process remains an important task for survey researchers. Search engine data may be one possible source for learning about what information some potential respondents look up about a survey when they are making a participation decision. In the present study, we explored the potential of search engine data for learning about survey participation and how it can inform survey design decisions. We drew on freely available Google Trends (GT) data to learn about the use of Google Search with respect to our case study: participation in the Family Research and Demographic Analysis (FReDA) panel survey. Our results showed that some potential respondents were using Google Search to gather information on the FReDA survey. We also showed that the additional data obtained via GT can help survey researchers to discover topics of interest to respondents and geographically stratified search patterns. Moreover, we introduced different approaches for obtaining data via GT, discussed the challenges that come with these data, and closed with practical recommendations on how survey researchers might utilize GT data to learn about survey participation.
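    The abstract does not name specific tooling; one common route to GT data is the unofficial pytrends package. A minimal sketch (the keyword, locale, and timeframe are illustrative assumptions):

```python
# Minimal sketch using the unofficial pytrends package (pip install pytrends);
# the keyword, locale, and timeframe below are illustrative assumptions.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="de-DE", tz=60)  # German locale, UTC offset in minutes
pytrends.build_payload(kw_list=["FReDA"],
                       timeframe="2021-01-01 2021-12-31", geo="DE")

interest = pytrends.interest_over_time()   # relative search volume over time
by_region = pytrends.interest_by_region()  # geographic search patterns
related = pytrends.related_queries()       # what else searchers looked for
```

    Note that GT reports values normalized to a 0-100 scale relative to the query's own peak rather than absolute search counts, one of the data challenges the authors discuss.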

    The Survival Duration of Firms in Germany: An East-West Comparison

    Full text link
    This thesis examines the survival duration of firms in East and West Germany. It takes an open perspective on organizations and derives relevant mechanisms of firm selection from the theoretical approach of organizational ecology (Hannan et al.). The central research question is whether survival durations vary between East and West Germany and which mechanisms govern the selection of firms out of their populations. The research hypotheses derived from the theoretical and historical framework are subjected to an empirical test. The data come from the IAB-Betriebspanel of the Institut für Arbeitsmarkt- und Berufsforschung in Nuremberg. Using non-parametric and parametric methods of event history analysis, variance in survival durations between East and West Germany can be demonstrated: East German firms in the cohorts under study show a significantly lower probability of closure than their West German counterparts. Selection mechanisms such as the liability of adolescence and density dependence can be demonstrated in accelerated failure time models. Furthermore, the results point to the relevance of considering populations of organizations as units of analysis.
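    Accelerated failure time models of this kind can be fit with standard survival analysis tooling. A minimal sketch using the lifelines package (the toy data and column names are illustrative assumptions, not the IAB-Betriebspanel's):

```python
# Minimal sketch: an accelerated failure time (AFT) model of firm survival
# with the lifelines package (pip install lifelines). The toy data and the
# East/West dummy are illustrative assumptions.
import pandas as pd
from lifelines import WeibullAFTFitter

df = pd.DataFrame({
    "duration": [3.0, 7.5, 12.0, 2.0, 9.0, 5.5, 11.0, 4.0],  # years observed
    "closed":   [1,   0,   0,    1,   1,   0,   1,    1],    # 1 = firm closed
    "east":     [1,   0,   1,    0,   1,   0,   0,    1],    # 1 = East Germany
})

aft = WeibullAFTFitter()
aft.fit(df, duration_col="duration", event_col="closed")
aft.print_summary()  # a positive `east` coefficient implies longer survival
```

    In the AFT parametrization, covariates stretch or compress survival time multiplicatively, which is why a lower closure probability for East German firms would show up as a positive coefficient on the East dummy.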