166 research outputs found

    Research Note: Reducing the Threat of Sensitive Questions in Online Surveys?

    Get PDF
    We explore the effect of offering an open-ended comment field in a Web survey to reduce the threat of sensitive questions. Two experiments were fielded in a probability-based Web panel in the Netherlands. For a set of 10 items on attitudes to immigrants, a random half of respondents were offered the opportunity to explain or clarify their responses, under the hypothesis that doing so would reduce the need to choose socially desirable answers, resulting in higher reported levels of prejudice. Across the two experiments, we find significant effects contrary to our hypothesis: the opportunity to comment decreased the level of prejudice reported, and longer comments were associated with more tolerant attitudes among those who were offered the comment field.
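
    Purely as an illustrative sketch of the experimental comparison described above (a simple random split with a continuous prejudice scale; all values below are simulated, not the study's data):

    ```python
    # Hypothetical sketch: respondents are randomly assigned to receive (or
    # not) an open-ended comment field, and mean prejudice scores are
    # compared across the two arms.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    n = 1000  # hypothetical sample size

    # 0 = control (no comment field), 1 = treatment (comment field offered)
    treated = rng.integers(0, 2, size=n)

    # Simulated scores on a prejudice scale (e.g., a mean of 10 items); the
    # built-in effect mirrors the reported finding: lower prejudice scores
    # when a comment field is offered.
    prejudice = rng.normal(loc=3.0 - 0.2 * treated, scale=0.8, size=n)

    t, p = stats.ttest_ind(prejudice[treated == 1], prejudice[treated == 0])
    print(f"mean (comment field) = {prejudice[treated == 1].mean():.2f}")
    print(f"mean (control)       = {prejudice[treated == 0].mean():.2f}")
    print(f"t = {t:.2f}, p = {p:.4f}")
    ```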

    Some Methodological Uses of Responses to Open Questions and Other Verbatim Comments in Quantitative Surveys

    Get PDF
    "The use of open-ended questions in survey research has a very long history. In this paper, building on the work of Paul F. Lazarsfeld and Howard Schuman, we review the methodological uses of open-ended questions and verbatim responses in surveys. We draw on prior research, our own and that of others, to argue for increasing the use of open-ended questions in quantitative surveys. The addition of open-ended questions - and the capture and analysis of respondents' verbatim responses to other types of questions - may yield important insights, not only into respondents' substantive answers, but also into how they understand the questions we ask and arrive at an answer. Adding a limited number of such questions to computerized surveys, whether self- or interviewer-administered, is neither expensive nor time-consuming, and in our experience respondents are quite willing and able to answer such questions." (author's abstract

    Using paradata to explore item level response times in surveys

    Full text link
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/95020/1/rssa1041.pd

    Assessment of Innovations in Data Collection Technology for Understanding Society

    Get PDF
    In this brief assessment, Professor Mick Couper (Survey Research Center, University of Michigan) reviews some of the options for using new technologies for data collection in Understanding Society, with a primary focus on Web-based data collection. He briefly addresses the following areas: Web as the primary mode for all main instruments, in a sequential mixed-mode approach; Web as the secondary mode for all main instruments, in a sequential mixed-mode approach; Web as the primary mode for certain instruments, such as the youth self-completion survey; Web as the only or primary mode for special supplemental studies; Web as the only mode of data collection, using an online panel; other technologies such as smartphones, tablets, and social media; use of administrative records; and biomeasures. Each of these topics is addressed in the body of the report. Much of the review focuses on the use of the Web as the primary mode of data collection in a sequential mixed-mode design, as this is the approach currently under investigation in the Understanding Society Innovation Panel (IP5) and has the most potential, if successful, for yielding efficiencies in data collection. However, the existing research evidence is thin, and while there are some promising findings, there are also studies suggesting this approach might not be as effective as hoped. This argues for caution in proceeding down this path too quickly. The mixed-mode design planned for the next two Innovation Panels (IP5 and IP6) offers the best opportunity to gain much-needed evidence of direct relevance to Understanding Society, and Professor Couper's recommendation is to wait for the results of these studies before any decisions are made about changing data collection strategies for the Understanding Society mainstage. This conclusion is based on the fact that much is not yet known about how well the introduction of Web-based data collection will work, and that proceeding without such knowledge presents intolerable levels of risk for an important infrastructure study like Understanding Society.

    Willingness of Online Panelists to Perform Additional Tasks

    Get PDF
    People’s willingness to share data with researchers is the fundamental raw material for most social science research. So far, survey researchers have mainly asked respondents to share data in the form of answers to survey questions, but there is growing interest in using alternative sources of data. Less is known about people’s willingness to share these other kinds of data. In this study, we aim to: 1) provide information about people’s willingness to share different types of data; 2) explore the reasons for their acceptance or refusal; and 3) determine which variables affect the willingness to perform these additional tasks. We use data from a survey implemented in 2016 in Spain, in which around 1,400 panelists of the Netquest online access panel were asked about their hypothetical willingness to share different types of data: passive measurement on devices they already use; wearing special devices to passively monitor activity; being provided with measurement devices and then self-reporting the results; providing physical specimens or bodily fluids (e.g., saliva); and others. Open questions were used to follow up on the reasons for acceptance or refusal in the case of the use of a tracker. Our results suggest that the acceptance level is quite low in general, but there are large differences across tasks and respondents. The main reasons justifying both acceptance and refusal relate to privacy, security, and trust. Our regression models also suggest that we can identify factors associated with such willingness.
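
    A minimal sketch of the regression step mentioned above (modelling stated willingness as a function of respondent characteristics); the covariates, coefficients, and data here are invented for illustration, not the authors' specification:

    ```python
    # Hypothetical sketch: logistic regression of stated willingness to
    # share one type of data on illustrative respondent characteristics.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 1400  # roughly the panel size mentioned above

    df = pd.DataFrame({
        "age": rng.integers(18, 75, size=n),
        "privacy_concern": rng.integers(1, 6, size=n),    # 1-5 scale (invented)
        "trust_in_research": rng.integers(1, 6, size=n),  # 1-5 scale (invented)
    })
    # Simulated outcome: willingness falls with privacy concern and rises
    # with trust, echoing the reasons respondents gave in the open questions.
    logit_p = -1.0 - 0.5 * df["privacy_concern"] + 0.4 * df["trust_in_research"]
    df["willing"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

    model = smf.logit("willing ~ age + privacy_concern + trust_in_research",
                      data=df).fit()
    print(model.summary())
    ```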

    Willingness to use mobile technologies for data collection in a probability household panel

    Get PDF
    We asked members of the Understanding Society Innovation Panel about their willingness to participate in various data collection tasks on their mobile devices. We find that stated willingness varies considerably depending on the type of activity involved: respondents are less willing to participate in tasks that involve downloading and installing an app, or where data are collected passively. Stated willingness also varies between smartphones and tablets, and between types of respondents: respondents who report higher concerns about the security of data collected with mobile technologies, and those who use their devices less intensively, are less willing to participate in mobile data collection tasks.
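
    As a loose illustration of summarising stated willingness by task type (the task labels, rates, and sample size below are hypothetical, not the Innovation Panel's):

    ```python
    # Hypothetical sketch: long-format willingness data, one row per
    # respondent-task combination, summarised as a rate per task.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(7)
    # Invented tasks and base rates; app and passive tasks are set lower to
    # mirror the pattern reported above.
    base_rate = {"take photos": 0.55, "install app": 0.35, "passive tracking": 0.25}

    rows = []
    for respondent in range(500):  # hypothetical panel size
        for task, rate in base_rate.items():
            rows.append({"respondent": respondent,
                         "task": task,
                         "willing": rng.random() < rate})
    df = pd.DataFrame(rows)

    # Stated willingness rate per task type.
    print(df.groupby("task")["willing"].mean().sort_values(ascending=False))
    ```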

    The effects of personalized feedback on participation and reporting in mobile app data collection

    Get PDF
    Offering participants in mobile app studies personalized feedback on the data they report seems an obvious thing to do: participants might expect an app to provide feedback given their experiences with commercial apps, feedback might motivate more people to participate in the study, and participants might be more motivated to provide accurate data so that the feedback is more useful to them. However, personalized feedback might lead participants to change the behaviour being measured with the app, and implementing feedback is costly and constrains other design decisions for the data collection. In this paper, we report on an experimental study that tested the effects of providing personalized feedback in a one-month mobile app-based spending study. Based on the app paradata and responses to a debrief survey, participants appear to have reacted positively to the feedback. The feedback did not have the potential negative effect of altering the spending participants reported in the app. However, it also did not have the intended effect of increasing initial participation or ongoing adherence to the study protocol.
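
    A small sketch of how the participation comparison might be tested, assuming a two-arm design; the counts below are invented, not the study's results:

    ```python
    # Hypothetical sketch: two-proportion z-test comparing initial
    # participation between the feedback and no-feedback arms.
    from statsmodels.stats.proportion import proportions_ztest

    started = [210, 205]  # invented: participants who installed and started
    invited = [600, 600]  # invented: invitations per arm

    z, p = proportions_ztest(count=started, nobs=invited)
    print(f"participation: {started[0] / invited[0]:.1%} (feedback) "
          f"vs {started[1] / invited[1]:.1%} (no feedback)")
    print(f"z = {z:.2f}, p = {p:.3f}")  # a large p matches the null finding above
    ```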

    Characteristics of physical measurement consent in a population-based survey of older adults

    Get PDF
    BACKGROUND: Collecting physical measurements in population-based health surveys has increased in recent years, yet little is known about the characteristics of those who consent to these measurements. OBJECTIVE: To examine the characteristics of persons who consent to physical measurements across several domains, including demographic background, health status, resistance behavior toward the survey interview, and interviewer characteristics. RESEARCH DESIGN, SUBJECTS, AND MEASURES: We conducted a secondary data analysis of the 2006 Health and Retirement Study, a nationally representative panel survey of adults aged 50 and older. We performed multilevel logistic regressions on a sample of 7,457 respondents who were eligible for physical measurements. The primary outcome measure was consent to all physical measurements. RESULTS: Seventy-nine percent (unweighted) of eligible respondents consented to all physical measurements. In weighted multilevel logistic regressions controlling for respondent demographics, current health status, survey resistance indicators, and interviewer characteristics, the propensity to consent was significantly greater among Hispanic respondents matched with bilingual Hispanic interviewers, diabetics, and those who had visited a doctor in the past 2 years. The propensity to consent was significantly lower among younger respondents, those who have several Nagi functional limitations and infrequently participate in “mildly vigorous” activities, and those interviewed by black interviewers. Survey resistance indicators, such as the number of contact attempts and interviewer observations of resistant behavior in prior waves of the HRS, were also negatively associated with physical measurement consent. The propensity to consent was unrelated to prior medical diagnoses, including high blood pressure, cancer (excl. skin), lung disease, heart abnormalities, stroke, and arthritis, and to matching of interviewer and respondent on race and gender. CONCLUSIONS: Physical measurement consent is not strongly associated with one’s health status, though the findings are somewhat mixed. We recommend that physical measurement results be adjusted for characteristics associated with the likelihood of consent, particularly functional limitations, to reduce potential bias. Otherwise, health researchers should exercise caution when generalizing physical measurement results to persons suffering from functional limitations that may affect their participation.
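
    A hedged sketch of the modelling approach named above (multilevel logistic regression with respondents nested within interviewers), fitted here with statsmodels' variational Bayes mixed GLM on simulated data; the covariates and effect sizes are invented:

    ```python
    # Hypothetical sketch: consent modelled with an interviewer-level
    # random intercept, approximating a two-level logistic regression.
    import numpy as np
    import pandas as pd
    from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

    rng = np.random.default_rng(3)
    n_interviewers, per_iv = 50, 30
    iv = np.repeat(np.arange(n_interviewers), per_iv)
    iv_effect = rng.normal(0, 0.5, n_interviewers)[iv]  # interviewer effect

    df = pd.DataFrame({
        "interviewer": iv,
        "age": rng.integers(50, 90, size=iv.size),  # HRS covers ages 50+
        "diabetic": rng.integers(0, 2, size=iv.size),
    })
    # Simulated consent propensity: higher for diabetics, lower with age,
    # plus interviewer-level variation.
    logit_p = 1.0 + 0.3 * df["diabetic"] - 0.01 * (df["age"] - 50) + iv_effect
    df["consent"] = (rng.random(iv.size) < 1 / (1 + np.exp(-logit_p))).astype(int)

    model = BinomialBayesMixedGLM.from_formula(
        "consent ~ age + diabetic", {"interviewer": "0 + C(interviewer)"}, df)
    result = model.fit_vb()
    print(result.summary())
    ```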

    Prenotification in web-based access panel surveys: the influence of mobile text messaging versus e-mail on response rates and sample composition

    Full text link
    To compare the effectiveness of different prenotification and invitation procedures in a web-based three-wave access panel survey over 3 consecutive months, we experimentally varied the contact mode in a fully crossed two-factorial design with (a) three different prenotification conditions (mobile short messaging service [SMS], e-mail, no prenotice) and (b) two “invitation and reminder” conditions (SMS, e-mail). A group with nearly complete mobile phone coverage was randomly assigned to one of these six experimental conditions. As expected, SMS prenotifications outperformed e-mail prenotifications in terms of response rates across all three waves. Furthermore, e-mail invitations outperformed SMS invitations in terms of response rates. The combination of SMS prenotification and e-mail invitation performed best. The different experimental treatments did not have an effect on the sample composition of respondents between groups. (author's abstract)
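
    As an illustration of analysing such a fully crossed 3 x 2 design (prenotice mode x invitation mode); the counts below are invented, not the study's:

    ```python
    # Hypothetical sketch: response rates per experimental cell and a
    # chi-square test of independence between condition and response.
    import pandas as pd
    from scipy.stats import chi2_contingency

    cells = pd.DataFrame({
        "prenotice":  ["SMS", "SMS", "e-mail", "e-mail", "none", "none"],
        "invitation": ["SMS", "e-mail"] * 3,
        "responded":  [118, 142, 101, 125, 90, 112],  # invented counts
        "invited":    [200] * 6,                      # invented counts
    })
    cells["response_rate"] = cells["responded"] / cells["invited"]
    print(cells)

    table = [cells["responded"], cells["invited"] - cells["responded"]]
    chi2, p, dof, _ = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
    ```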