The Online Panels Benchmarking Study: a Total Survey Error comparison of findings from probability-based surveys and nonprobability online panel surveys in Australia
The pervasiveness of the internet has led online research, and particularly online research undertaken via nonprobability online panels, to become the dominant mode of sampling and data collection used by the Australian market and social research industry. There are broad-based concerns that the rapid increase in the use of nonprobability online panels in Australia has not been accompanied by an informed debate about the advantages and disadvantages of probability and nonprobability surveys. The 2015 Australian Online Panels Benchmarking Study was undertaken to inform this debate; we report on the findings from a single national questionnaire administered across three different probability samples and five different nonprobability online panels. This study enables us to investigate whether Australian surveys using probability sampling methods produce results different from Australian online surveys relying on nonprobability sampling methods, where accuracy is measured relative to independent population benchmarks. In doing so, we build on similar international research in this area, and discuss our findings as they relate to coverage error, nonresponse error, adjustment error and measurement error.
Building a probability-based online panel: Life in Australia™
Life in Australia™ was created to provide Australian researchers, policy makers, academics and businesses with access to a scientifically sampled cross-section of Australian resident adults at a lower cost than telephone surveys. Panellists were recruited using dual-frame landline and mobile random digit dialling. The majority of panellists choose to complete questionnaires online. Representation of the offline population is ensured by interviewing by telephone those panellists who cannot or will not complete questionnaires online. Surveys are conducted about once a month, covering a variety of topics, most with a public opinion or health focus. Full panel waves yield 2000 or more completed surveys. Panellists are offered a small incentive for completing surveys, which they can choose to donate to a charity instead. This paper describes how Life in Australia™ was built and maintained before the first panel refreshment in June 2018. We document the qualitative pretesting used to inform the development of recruitment and enrolment communications materials, and the pilot tests used to assess alternative recruitment approaches and the comparative effectiveness of these approaches. The methods used for the main recruitment effort are detailed, together with various outcome rates. The operation of the panel after recruitment is also described. We assess the performance of the panel compared with other probability surveys and nonprobability online access panels, and against benchmarks from high-quality sources. Finally, we assess Life in Australia™ from a total survey error perspective.
Current Knowledge and Considerations Regarding Survey Refusals: Executive Summary of the AAPOR Task Force Report on Survey Refusals
The landscape of survey research has arguably changed more significantly in the past decade than at any other time in its relatively brief history. In that short time, landline telephone ownership has dropped from some 98 percent of all households to less than 60 percent; cell-phone interviewing went from a novelty to a mainstay; address-based designs quickly became an accepted method of sampling the general population; and surveys via Internet panels became ubiquitous in many sectors of social and market research, even as they continue to raise concerns given their lack of random selection.
Among these widespread changes, it is perhaps not surprising that the substantial increase in refusal rates has received comparatively little attention. As we will detail, it was not uncommon for a study conducted 20 years ago to have encountered one refusal for every one or two completed interviews, while today experiencing three or more refusals for every one completed interview is commonplace. This trend has led to several concerns that motivate this Task Force. As refusal rates have increased, refusal bias (as a component of nonresponse bias) is an increased threat to the validity of survey results. Of practical concern are the efficacy and cost implications of enhanced efforts to avert initial refusals and to convert refusals that do occur. Finally, though no less significant, are the ethical concerns raised by the possibility that efforts to minimize refusals can be perceived as coercing or harassing potential respondents. Indeed, perhaps the most important goal of this document is to foster greater consideration by the reader of the rights of respondents in survey research.
Cross-Race Preferences for Same-Race Faces Extend Beyond the African Versus Caucasian Contrast in 3-Month-Old Infants
A visual preference procedure was used to examine preferences among faces of different races and ethnicities (African, Asian, Caucasian, and Middle Eastern) in Chinese 3-month-old infants exposed only to Chinese faces. The infants demonstrated a preference for faces from their own ethnic group. Alongside previous results showing that Caucasian infants exposed only to Caucasian faces prefer same-race faces (Kelly et al., 2005) and that Caucasian and African infants exposed only to native faces prefer same-race over other-race faces (Bar-Haim, Ziv, Lamy, & Hodes, 2006), the findings reported here (a) extend the same-race preference observed in young infants to a new race of infants (Chinese), and (b) show that cross-race preferences for same-race faces extend beyond the perceptually robust contrast between African and Caucasian faces.
Telephone survey methods: sampling, selection, and supervision
Volume 7; 157 p.; 21 cm
Encyclopedia of survey research methods
This book covers all major facets of survey research methodology, from selecting the sample design and the sampling frame, designing and pretesting the questionnaire, and data collection and data coding, to the thorny issues surrounding diminishing response rates, confidentiality, privacy, informed consent, and other ethical issues, as well as data weighting and data analysis.