703 research outputs found

    How Much is a Box? The Hidden Cost of Adding an Open-ended Probe to an Online Survey

    Probing questions, essentially open-ended comment boxes attached to a traditional closed-ended question, are increasingly used in online surveys. They give respondents an opportunity to share information that goes beyond what can be captured through standardized response categories. However, even when probes are non-mandatory, they can add to perceived response burden and incur a cost in the form of lower respondent cooperation. This paper seeks to measure this cost and reports on a survey experiment integrated into a short questionnaire on a German salary comparison site (N = 22,306). Respondents were randomly assigned to one of three conditions: a control without a probing question; a probe embedded directly into the closed-ended question; and a probe displayed on a subsequent page. For every meaningful comment gathered, the embedded design resulted in 0.1 break-offs and roughly 3.7 item missings for the closed-ended question. The paging design led to 0.2 additional break-offs for every open-ended answer it collected. Contrary to expectations, smartphone users were more likely to provide meaningful (albeit shorter) open-ended answers than those using a PC or laptop. However, smartphone use also amplified the adverse effects of the probe on break-offs and item non-response to the closed-ended question. Despite documenting this hidden cost, the paper argues that the value of the additional information gathered by probes can make them worthwhile. In conclusion, it endorses the selective use of probes as a tool to better understand survey respondents.

    How should we organize schooling to further children with migration background?

    Educational integration of children with a migration background is an important issue in the social sciences. Few studies quantify the educational disadvantage of immigrant children, and there has been no attempt to identify the institutional conditions of the education system that contribute to educational integration. Using data from five international student assessments, this study tries to fill that gap. First, Blinder-Oaxaca decompositions are used to allow a comparison of the (dis)integration of students with a migration background across countries and time. In a second step, (dis)integration is related to institutional characteristics of the schooling system. The study shows that early education, time in school, and central exams further integration, while social segregation of students among schools is detrimental to educational integration. Keywords: institution; integration; immigrant; PISA; TIMSS; education
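The abstract does not spell out the decomposition itself. As a minimal sketch (synthetic data, one covariate, not the study's actual model): a twofold Blinder-Oaxaca decomposition fits a separate regression per group and splits the mean outcome gap into a part explained by different covariate endowments and an unexplained part attributed to different coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: test scores for group A (natives) and group B (immigrant
# background), driven by one covariate (e.g. years of early education).
n = 1000
x_a = rng.normal(3.0, 1.0, n)             # group A covariate
x_b = rng.normal(2.0, 1.0, n)             # group B covariate (lower mean)
y_a = 50 + 5 * x_a + rng.normal(0, 2, n)
y_b = 45 + 5 * x_b + rng.normal(0, 2, n)

def ols(x, y):
    """Return (intercept, slope) of a simple least-squares fit."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0]

b_a = ols(x_a, y_a)
b_b = ols(x_b, y_b)

gap = y_a.mean() - y_b.mean()
# Twofold decomposition with group A's coefficients as the reference:
explained = (x_a.mean() - x_b.mean()) * b_a[1]                      # endowments
unexplained = (b_a[0] - b_b[0]) + x_b.mean() * (b_a[1] - b_b[1])    # coefficients

print(f"gap={gap:.2f} explained={explained:.2f} unexplained={unexplained:.2f}")
```

Because each OLS fit passes through its group means, the two components sum exactly to the raw gap, which is what makes the decomposition comparable across countries and waves.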

    High Performance Data Acquisition and Analysis Routines for the Nab Experiment

    Probes of the Standard Model of particle physics are pushing further and further into the so-called "precision frontier". In order to reach the precision goals of these experiments, a combination of elegant experimental design and robust data acquisition and analysis is required. Two experiments that embody this philosophy are the Nab and Calcium-45 experiments. These experiments probe our understanding of the weak interaction by examining the beta decay of the free neutron and of Calcium-45, respectively. They both aim to measure correlation parameters in the neutron beta decay alphabet, a and b. The parameter a, the electron-neutrino correlation coefficient, is sensitive to λ, the ratio of the axial-vector and vector coupling strengths in the decay of the free neutron. This parameter λ, in tandem with a precision measurement of the neutron lifetime τ, provides a measurement of the matrix element Vud from the CKM quark mixing matrix. The CKM matrix, as a rotation matrix, must be unitary. Probes of Vud and Vus in recent years have revealed tension in this unitarity at the 2.2σ level. The measurement of a via the decay of free cold neutrons serves as an additional method of extracting Vud that is sensitive to a different set of systematic effects, and as such is an excellent probe into the source of the deviation from unitarity. The parameter b, the Fierz interference term, appears as a distortion in the measured electron energy spectra from beta decay. This parameter, if non-zero, would indicate the existence of scalar and/or tensor couplings in the weak interaction, which according to the Standard Model is purely vector minus axial-vector. This is therefore a search for physics beyond the Standard Model (BSM). The Nab and Calcium-45 experiments probe these parameters with a combination of elegant experimental design and brute-force collection and analysis of large amounts of digitized detector data.
These datasets, particularly in the case of the Nab experiment, are anticipated to span multiple petabytes and will require high-performance online analysis and precision offline analysis routines in order to reach the experimental goals. Of particular note are the requirements for better than 3 keV energy resolution and an understanding of the uncertainty in the mean timing bias for the detected particles within 300 ps. Presented in this dissertation are an overview of the experiments and their design, a description of the data acquisition systems and analysis routines that have been developed to support the experiments, and a discussion of the data analysis performed for the Calcium-45 experiment.
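For reference, the textbook relations behind the physics case described above (standard neutron beta decay expressions, not formulas specific to this dissertation) connect the measured observables to λ and to the CKM unitarity test:

```latex
a = \frac{1-\lambda^2}{1+3\lambda^2}, \qquad \lambda \equiv \frac{g_A}{g_V}, \qquad
|V_{ud}|^2 \propto \frac{1}{\tau_n\,(1+3\lambda^2)}, \qquad
|V_{ud}|^2 + |V_{us}|^2 + |V_{ub}|^2 = 1,
```

where the last equation is the first-row unitarity condition under tension at 2.2σ. A non-zero Fierz term b distorts the electron spectrum through a multiplicative factor $(1 + b\,m_e/E_e)$, which is why it appears as a shape distortion rather than a rate change.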

    PolyFS Visualizer

    File systems, one of the most important operating systems topics, control how we store and access data and form a key part of a computer scientist's understanding of the underlying mechanisms of a computer. However, with their abstract concepts and lack of concrete learning aids, file systems are a confusing subject for students. Historically at Cal Poly, CPE 453, Introduction to Operating Systems, has been one of the most-failed classes in the computing majors, pointing to the need for better teaching and learning tools. Tools that give students concrete examples of abstract concepts could better prepare them for industry. The PolyFS Visualizer is a block-level file system visualization service built for the PolyFS and TinyFS file system design specifications currently used by some of the professors teaching CPE 453. The service allows students to easily view the blocks of their file system and see the metadata, each block's binary content, and the interlinked structure. Students can either compile their file system code with a provided block emulation library to build their disk on a remote server and use a visualization website, or place the file backing their file system directly into the visualization service to view it locally. This allows students to easily view, debug, and explore their implementation of a file system and understand how different design decisions affect its operation. The implementation includes three main components: a disk emulation library in C for compilation with student code, a Node.js back end to handle students' file systems and block operations, and a read-only visualization service. We conducted two surveys of students to determine the usefulness of the PolyFS Visualizer. Students responded that the PolyFS Visualizer helps with the PolyFS file system design project, and they offered several ideas for future features and expansions.
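To illustrate the kind of block-level view such a service exposes, here is a minimal sketch of reading fixed-size blocks from a disk image and parsing a superblock. The 256-byte block size, the magic number, and the superblock layout are all assumptions for illustration; the actual PolyFS/TinyFS specifications may differ.

```python
import struct

BLOCK_SIZE = 256     # assumed TinyFS-style block size (illustrative)
MAGIC = 0x44         # hypothetical magic byte marking a valid superblock

def read_block(image: bytes, n: int) -> bytes:
    """Return the n-th fixed-size block of a disk image."""
    return image[n * BLOCK_SIZE:(n + 1) * BLOCK_SIZE]

def describe_superblock(image: bytes) -> dict:
    """Parse a hypothetical superblock layout: type byte, magic, root-inode pointer."""
    sb = read_block(image, 0)
    block_type, magic, root = struct.unpack_from("<BBB", sb, 0)
    if magic != MAGIC:
        raise ValueError("not a valid file system image")
    return {"type": block_type, "magic": magic, "root_inode": root}

# Build a minimal two-block demo image: a superblock plus one empty block.
image = bytearray(2 * BLOCK_SIZE)
struct.pack_into("<BBB", image, 0, 1, MAGIC, 1)   # type=1, magic, root at block 1

info = describe_superblock(bytes(image))
print(info)   # {'type': 1, 'magic': 68, 'root_inode': 1}
```

A visualizer built on this idea walks the block pointers recovered from such parses to render the interlinked structure the abstract describes.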

    Developing silicon pixel detectors for LHCb: constructing the VELO Upgrade and developing a MAPS-based tracking detector

    The Large Hadron Collider beauty (LHCb) experiment is currently undergoing a major upgrade of its detector, including the construction of a new silicon pixel detector, the Vertex Locator (VELO) Upgrade. The challenges faced by the LHCb VELO Upgrade are discussed, and the design to overcome them is presented. VELO modules have been produced at the University of Manchester. The VELO modules use 55 μm pixels operating 5.1 mm from the beam without a beam pipe, an innovative silicon microchannel cooling substrate, and 40 MHz readout with a full detector bandwidth of 3 Tb/s. The module assembly process and the results of the associated R&D are presented. The mechanical and electronic tests are described. A grading scheme for each test is described, and the results are presented. The majority of the modules are of excellent quality, with 40 out of 43 of suitable quality for installation in the experiment. A full set of modules for the experiment has now been produced. The VELO Upgrade is read out into a data acquisition system based on an FPGA board. The architecture of the readout firmware for the VELO Upgrade's readout FPGA is presented, and the function of each block is described. Challenges arise from the design of the VeloPix front-end chip and the fully software trigger and real-time analysis paradigm. These challenges are discussed and their solutions briefly described. An algorithm for identifying isolated clusters is presented, and previously considered approaches are discussed. The current design uses around 83% of the available logic blocks and 85% of the available memory blocks. A complete version of the firmware is now available and is being refined. An ultimate version of the LHCb experiment, the LHCb Upgrade II, is being designed for the 2030s to fully exploit the potential of the high-luminosity LHC.
The Mighty Tracker is the proposed new combined-technology downstream tracker for Upgrade II, consisting of a silicon pixel inner region and a scintillating fibre outer region. A potential layout of the detector and modules is given. The silicon pixels will likely form the first LHC tracker based on radiation-hard HV-MAPS technology. Studies for the electronic readout system of the silicon inner region are reported. The total bandwidth and its distribution across the tracker are discussed. The numbers of key readout and FPGA DAQ boards are calculated. The detector's expected data rate is 8.13 Tb/s in Upgrade II conditions over a total of more than 46,000 front-end chips.

    The German Socio-Economic Panel Study (SOEP): Scope, Evolution and Enhancements

    After the introduction in Section 2, we very briefly sketch out current theoretical and empirical developments in the social sciences. In our view, they all point in the same direction: toward the acute and increasing need for multidisciplinary longitudinal data covering a wide range of living conditions and based on a multitude of variables from the social sciences, for both theoretical investigation and the evaluation of policy measures. Cohort and panel studies are therefore called upon to become truly interdisciplinary tools. In Section 3, we describe the German Socio-Economic Panel Study (SOEP), discussing recent improvements that approach this ideal and pointing out existing shortcomings. Section 4 concludes with a discussion of potential future issues and developments for SOEP and other household panel studies. Keywords: SOEP; household panel studies; survey design

    Developing a Silicon Pixel Detector for the Next Generation LHCb Experiment

    The second long shutdown of the LHC presents an opportunity for the LHCb experiment to upgrade its detector systems and switch to a fully software-triggered readout. Its first tracking layer, the VELO detector, is no exception and is undergoing an upgrade that increases the number of sensitive channels from 180 thousand silicon microstrips to about 41 million pixels. The new system will operate with zero-suppressed readout at 40 MHz, cooled by evaporative liquid CO2 in silicon microchannel plates. The VELO Upgrade will consist of 52 modules, placed around the beam pipe and built at the University of Manchester and Nikhef. The construction of the modules is a complex process consisting of a number of tight-tolerance steps, with the results verified both by metrology and by electrical and thermal performance testing. To store the data and track performance, a database has been developed that automatically analyses the uploaded values and computes the grades and quality of the individual steps and of the final modules. By the end of August 2021, 42 modules had been produced in Manchester, 37 of them of high quality with no issues present. Due to the harsh radiation environment, the sensors have to withstand a fluence of up to 1e16 1 MeV n_eq cm^-2 and still provide a good signal-to-noise ratio. A new method of charge collection scanning has been proposed, linking the commonly used voltage scan with a threshold scan and using the extrapolated tracking information to estimate the amount of collected charge. The simulation indicates that a scan of a subset of modules will take about 8 min, a feasible duration despite the impact on physics data taking. A further upgrade of LHCb is planned for Long Shutdown 4 of the LHC. This will operate at higher luminosities, leading to a significant increase in the pile-up of collisions from a single proton-proton bunch crossing.
For this reason, precise time stamping of O(50 ps) is to be added. This could be achieved in silicon detectors by using O(10) internal gain in the sensor. Simulations of the expected performance of a recently produced batch of sensors are presented. These characterise the anticipated performance of these O(50 μm) segmented devices in a test beam, quantifying the impact of charge sharing and the device response to an angular scan.