
    Research Note: Reducing the Threat of Sensitive Questions in Online Surveys?

    We explore the effect of offering an open-ended comment field in a Web survey to reduce the threat of sensitive questions. Two experiments were fielded in a probability-based Web panel in the Netherlands. For a set of 10 items on attitudes to immigrants, a random half of respondents were offered the opportunity to explain or clarify their responses, the hypothesis being that doing so would reduce the need to choose socially desirable answers, resulting in higher reported levels of prejudice. Across the two experiments, we find significant effects contrary to our hypothesis – the opportunity to comment decreased the level of prejudice reported, and longer comments were associated with more tolerant attitudes among those who were offered the comment field.
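
    A minimal sketch of the core comparison in such an experiment, assuming a flat data file with hypothetical columns got_comment_field (the randomized offer) and prejudice_score (a scale built from the 10 items); the file and column names are illustrative, not the study's actual materials:

    import pandas as pd
    from scipy import stats

    df = pd.read_csv("web_survey_experiment.csv")  # hypothetical extract

    # Respondents randomized to receive the open-ended comment field vs. not
    treated = df.loc[df["got_comment_field"] == 1, "prejudice_score"]
    control = df.loc[df["got_comment_field"] == 0, "prejudice_score"]

    # Welch's t-test: did the offer to comment shift reported prejudice?
    t, p = stats.ttest_ind(treated, control, equal_var=False)
    print(f"treated mean={treated.mean():.2f}, control mean={control.mean():.2f}, p={p:.3f}")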

    Some Methodological Uses of Responses to Open Questions and Other Verbatim Comments in Quantitative Surveys

    "The use of open-ended questions in survey research has a very long history. In this paper, building on the work of Paul F. Lazarsfeld and Howard Schuman, we review the methodological uses of open-ended questions and verbatim responses in surveys. We draw on prior research, our own and that of others, to argue for increasing the use of open-ended questions in quantitative surveys. The addition of open-ended questions - and the capture and analysis of respondents' verbatim responses to other types of questions - may yield important insights, not only into respondents' substantive answers, but also into how they understand the questions we ask and arrive at an answer. Adding a limited number of such questions to computerized surveys, whether self- or interviewer-administered, is neither expensive nor time-consuming, and in our experience respondents are quite willing and able to answer such questions." (author's abstract

    Using paradata to explore item level response times in surveys

    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/95020/1/rssa1041.pd
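
    The linked paper concerns item-level response times derived from paradata. A hedged sketch of one common derivation, assuming a long-format event log with one timestamped row per answered item (all field names are hypothetical):

    import pandas as pd

    para = pd.read_csv("paradata.csv", parse_dates=["timestamp"])  # hypothetical log
    para = para.sort_values(["respondent_id", "timestamp"])

    # Item-level response time = gap since the respondent's previous event
    para["response_time_s"] = (
        para.groupby("respondent_id")["timestamp"].diff().dt.total_seconds()
    )
    print(para.groupby("item_id")["response_time_s"].median())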

    Willingness of Online Panelists to Perform Additional Tasks

    People’s willingness to share data with researchers is the fundamental raw material for most social science research. So far, survey researchers have mainly asked respondents to share data in the form of answers to survey questions, but there is growing interest in using alternative sources of data. Less is known about people’s willingness to share these other kinds of data. In this study, we aim to: 1) provide information about the willingness of people to share different types of data; 2) explore the reasons for their acceptance or refusal; and 3) determine which variables affect the willingness to perform these additional tasks. We use data from a survey implemented in 2016 in Spain, in which around 1,400 panelists of the Netquest online access panel were asked about their hypothetical willingness to share different types of data: passive measurement on devices they already use; wearing special devices to passively monitor activity; being provided with measurement devices and then self-reporting the results; providing physical specimens or bodily fluids (e.g. saliva); and others. Open questions were used to follow up on the reasons for acceptance or refusal in the case of the use of a tracker. Our results suggest that the acceptance level is quite low in general, but there are large differences across tasks and respondents. The main reasons justifying both acceptance and refusal are related to privacy, security, and trust. Our regression models also suggest that we can identify factors associated with such willingness.
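
    The abstract mentions regression models for willingness; a minimal sketch of that kind of model, assuming a binary willingness indicator and a few hypothetical covariates (none of these variable names come from the study):

    import pandas as pd
    import statsmodels.formula.api as smf

    panel = pd.read_csv("netquest_willingness.csv")  # hypothetical extract
    # Logistic regression of stated willingness on respondent characteristics
    model = smf.logit("willing ~ age + C(gender) + privacy_concern", data=panel).fit()
    print(model.summary())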

    Willingness to use mobile technologies for data collection in a probability household panel

    We asked members of the Understanding Society Innovation Panel about their willingness to participate in various data collection tasks on their mobile devices. We find that stated willingness varies considerably depending on the type of activity involved: respondents are less willing to participate in tasks that involve downloading and installing an app, or where data are collected passively. Stated willingness also varies between smartphones and tablets, and between types of respondents: those who report higher concerns about the security of data collected with mobile technologies, and those who use their devices less intensively, are less willing to participate in mobile data collection tasks.

    Characteristics of physical measurement consent in a population-based survey of older adults

    BACKGROUND: Collecting physical measurements in population-based health surveys has increased in recent years, yet little is known about the characteristics of those who consent to these measurements. OBJECTIVE: To examine the characteristics of persons who consent to physical measurements across several domains, including demographic background, health status, resistance behavior toward the survey interview, and interviewer characteristics. RESEARCH DESIGN, SUBJECTS, AND MEASURES: We conducted a secondary data analysis of the 2006 Health and Retirement Study (HRS), a nationally representative panel survey of adults aged 50 and older. We performed multilevel logistic regressions on a sample of 7,457 respondents who were eligible for physical measurements. The primary outcome measure was consent to all physical measurements. RESULTS: Seventy-nine percent (unweighted) of eligible respondents consented to all physical measurements. In weighted multilevel logistic regressions controlling for respondent demographics, current health status, survey resistance indicators, and interviewer characteristics, the propensity to consent was significantly greater among Hispanic respondents matched with bilingual Hispanic interviewers, diabetics, and those who had visited a doctor in the past 2 years. The propensity to consent was significantly lower among younger respondents, those who have several Nagi functional limitations and infrequently participate in “mildly vigorous” activities, and those interviewed by black interviewers. Survey resistance indicators, such as the number of contact attempts and interviewer observations of resistant behavior in prior waves of the HRS, were also negatively associated with physical measurement consent. The propensity to consent was unrelated to prior medical diagnoses, including high blood pressure, cancer (excl. skin), lung disease, heart abnormalities, stroke, and arthritis, and to interviewer-respondent matching on race and gender. CONCLUSIONS: Physical measurement consent is not strongly associated with one’s health status, though the findings are somewhat mixed. We recommend that physical measurement results be adjusted for characteristics associated with the likelihood of consent, particularly functional limitations, to reduce potential bias. Otherwise, health researchers should exercise caution when generalizing physical measurement results to persons suffering from functional limitations that may affect their participation.
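
    One way to fit a multilevel logistic regression of consent with interviewer-level random intercepts in Python, loosely mirroring the analysis described above; column names are assumptions, and the study's actual specification (weights, full covariate set) is not reproduced here:

    import pandas as pd
    from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

    hrs = pd.read_csv("hrs_consent.csv")  # hypothetical extract
    # Random intercept for each interviewer; fixed effects for a few covariates
    vc = {"interviewer": "0 + C(interviewer_id)"}
    model = BinomialBayesMixedGLM.from_formula(
        "consented ~ age + n_contact_attempts + functional_limitations",
        vc,
        hrs,
    )
    result = model.fit_vb()  # variational Bayes fit
    print(result.summary())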

    Prenotification in web-based access panel surveys: the influence of mobile text messaging versus e-mail on response rates and sample composition

    To compare the effectiveness of different prenotification and invitation procedures in a web-based three-wave access panel survey over 3 consecutive months, we experimentally varied the contact mode in a fully crossed two-factorial design with (a) three different prenotification conditions (mobile short messaging service [SMS], e-mail, no prenotice) and (b) two “invitation and reminder” conditions (SMS, e-mail). A group with nearly complete mobile phone coverage was randomly assigned to one of these six experimental conditions. As expected, SMS prenotifications outperformed e-mail prenotifications in terms of response rates across all three waves. Furthermore, response rates for e-mail invitations outperformed those for SMS invitations. The combination of SMS prenotification and e-mail invitation performed best. The different experimental treatments did not have an effect on the sample composition of respondents between groups. (author's abstract)
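
    A minimal sketch of the fully crossed 3 x 2 random assignment described above (three prenotification conditions by two invitation modes, six cells); the sample size is a placeholder:

    import itertools
    import random

    prenotice = ["sms", "email", "none"]
    invitation = ["sms", "email"]
    cells = list(itertools.product(prenotice, invitation))  # 6 experimental cells

    panelists = [f"panelist_{i}" for i in range(1200)]  # hypothetical sample
    random.shuffle(panelists)
    # Balanced assignment: cycle through the six cells over the shuffled list
    assignment = {pid: cells[i % len(cells)] for i, pid in enumerate(panelists)}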

    Understanding Society: Minimizing selection biases in data collection using mobile apps

    The UK Household Longitudinal Study: Understanding Society has a programme of research and development that underpins innovations in data collection methods. One of our current focuses is on using mobile applications to collect additional data that supplement the data collected in annual interviews. To date, we have used mobile apps to collect data on consumer expenditure, well-being, anthropometrics, and cognition. In this article, we review the potential barriers to data collection using mobile apps, along with experimental evidence collected with the Understanding Society Innovation Panel on what can be done to reduce these barriers.

    Response of sensitive behaviors to frequent measurement

    We study the influence of frequent survey measurement on behavior. Widespread access to the Internet has made important breakthroughs in frequent measurement possible – potentially revolutionizing social science measurement of processes that change quickly over time. One key concern about using such frequent measurement is that it may influence the behavior being studied. We investigate this possibility using both a population-based experiment with random assignment to participation in a weekly journal for 12 months (versus no journal) and a large-scale population-based journal-keeping study with weekly measurement for 30 months. Results reveal that few of the measured behaviors are correlated with assignment to frequent measurement. Theoretical reasoning regarding the likely behavioral response to frequent measurement correctly predicts the domains most vulnerable to this possibility. Overall, however, we found little evidence of behavioral response to frequent measurement.
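
    A hedged sketch of the headline check in this design: compare each measured behavior between respondents randomized to weekly journaling and controls. This assumes binary behavior indicators and hypothetical column names:

    import pandas as pd
    from statsmodels.stats.proportion import proportions_ztest

    df = pd.read_csv("journal_experiment.csv")  # hypothetical extract
    for b in ["behavior_a", "behavior_b"]:  # placeholders for study outcomes
        treated = df.loc[df["assigned_journal"] == 1, b]
        control = df.loc[df["assigned_journal"] == 0, b]
        # Two-sample proportion test; few should differ if there is no reactivity
        stat, p = proportions_ztest(
            [treated.sum(), control.sum()], [len(treated), len(control)]
        )
        print(f"{b}: p={p:.3f}")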

    Increasing Participation in a Mobile App Study: The Effects of a Sequential Mixed-Mode Design and In-Interview Invitation

    Mobile apps are an attractive and versatile method of collecting data in the social and behavioural sciences. In samples of the general population, however, participation in app-based data collection is still rather low. In this paper, we examine two potential ways of increasing participation and reducing participation bias in app-based data collection: 1) inviting sample members to a mobile app study within an interview rather than by post, and 2) offering a browser-based follow-up to the mobile app. We use experimental data from Spending Study 2, collected on the Understanding Society Innovation Panel and on the Lightspeed UK online access panel. Sample members were invited to download a spending diary app on their smartphone or use a browser-based online diary to report all their purchases for one month. The results suggest that inviting sample members to an app study within a face-to-face interview increases participation rates but does not bring in different types of participants. In contrast, the browser-based alternative can both increase participation rates and reduce biases in who participates, if offered immediately once the app has been declined. We find that the success of using mobile apps for data collection hinges on the protocols used to implement the app.
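
    A minimal sketch of the participation-rate comparison the abstract describes, assuming a respondent-level file with hypothetical indicators for invitation mode, browser offer, and participation:

    import pandas as pd

    df = pd.read_csv("spending_study2.csv")  # hypothetical extract
    # Mean of a 0/1 participation flag = participation rate per protocol cell
    rates = df.groupby(["invite_mode", "browser_offered"])["participated"].mean()
    print(rates)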