
    Operationalising a Threshold Concept in Economics: A Pilot Study Using Multiple Choice Questions on Opportunity Cost

    This paper addresses the emerging educational framework that envisions threshold concepts as mediators of learning outcomes. While the threshold concepts framework is highly appealing on a theoretical level, few researchers have attempted to measure threshold concept acquisition empirically. Achieving this would open a new arena for exploration and debate in the threshold concepts field, and provide potential results to inform teaching practice. We begin the process of operationalising threshold concepts in economics by attempting to measure students' grasp of the threshold concept of opportunity cost in an introductory economics class. We suggest two potential measures and correlate them with an array of ex ante and ex post variables, including students' expectations of success, prior misconceptions about economics and the work of economists, and actual success in the course. Results cast new light onto the factors that influence the acquisition of threshold concepts, the relationship between threshold concept acquisition and final learning outcomes, and the empirical viability of threshold concepts generally.

    Design of a universal text I/O module (Ontwerp van een universele tekst I/O-module)


    Using XDAQ in Application Scenarios of the CMS Experiment

    XDAQ is a generic data acquisition software environment that emerged from a rich set of use-cases encountered in the CMS experiment. These cover the deployment across multiple sub-detectors, the operation of different processing and networking equipment, and a distributed collaboration of users with different needs. The use of the software in various application scenarios has demonstrated the viability of the approach. We discuss two applications: the tracker local DAQ system for front-end commissioning and the muon chamber validation system. The description is completed by a brief overview of XDAQ.
    Comment: Conference CHEP 2003 (Computing in High Energy and Nuclear Physics), La Jolla, CA

    The EDGE-CALIFA Survey: Interferometric Observations of 126 Galaxies with CARMA

    We present interferometric CO observations, made with the Combined Array for Millimeter-wave Astronomy (CARMA) interferometer, of galaxies from the Extragalactic Database for Galaxy Evolution survey (EDGE). These galaxies are selected from the Calar Alto Legacy Integral Field Area (CALIFA) sample, mapped with optical integral field spectroscopy. EDGE provides good-quality CO data (3σ sensitivity before inclination correction, resolution ∼1.4 kpc) for 126 galaxies, constituting the largest interferometric CO survey of galaxies in the nearby universe. We describe the survey, its data characteristics and products, and then present initial science results. We find that the exponential scale lengths of the molecular, stellar, and star-forming disks are approximately equal, and that galaxies that are more compact in molecular gas than in stars tend to show signs of interaction. We characterize the molecular-to-stellar ratio as a function of Hubble type and stellar mass and present preliminary results on the resolved relations between the molecular gas, stars, and star-formation rate. We then discuss the dependence of the resolved molecular depletion time on stellar surface density, nebular extinction, and gas metallicity. EDGE provides a key data set to address outstanding topics regarding gas and its role in star formation and galaxy evolution, which will be publicly available on completion of the quality assessment.
    Authors: Bolatto, Alberto (University of Maryland, United States); Wong, Tony (University of Illinois at Urbana, United States); Utomo, Dyas (University of California at Berkeley, United States); Blitz, Leo (University of California at Berkeley, United States); Vogel, Stuart N. (University of Maryland, United States); Sánchez, Sebastián F. (Universidad Nacional Autónoma de México, Mexico); Barrera-Ballesteros, Jorge (Johns Hopkins University, United States); Cao, Yixian (University of Illinois, United States); Colombo, Dario (Max-Planck-Institut für Radioastronomie, Germany); Dannerbauer, Helmut (Universidad de La Laguna, Spain); García-Benito, Rubén (Instituto de Astrofísica de Andalucía, Spain); Herrera-Camus, Rodrigo (Max-Planck-Institut für extraterrestrische Physik, Germany); Husemann, Bernd (Max-Planck-Institut für Astronomie, Germany); Kalinova, Veselina (Max-Planck-Institut für Radioastronomie, Germany); Leroy, Adam K. (Ohio State University, United States); Leung, Gigi (Max-Planck-Institut für Astronomie, Germany); Levy, Rebecca C. (University of Maryland, United States); Mast, Damian (Observatorio Astronómico de la Universidad Nacional de Córdoba, Argentina; Consejo Nacional de Investigaciones Científicas y Técnicas, Centro Científico Tecnológico Conicet - Córdoba, Argentina); Ostriker, Eve (Princeton University, United States); Rosolowsky, Erik (University of Alberta, Canada); Sandstrom, Karin M. (University of California at San Diego, United States); Teuben, Peter (University of Maryland, United States); van de Ven, Glenn (Max-Planck-Institut für Astronomie, Germany); Walter, Fabian (Max-Planck-Institut für Astronomie, Germany)
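    The abstract above discusses the resolved molecular depletion time. As a hedged illustration only, that quantity is conventionally defined as t_dep = Σ_mol / Σ_SFR; the sketch below shows the unit bookkeeping with invented placeholder values, not EDGE-CALIFA measurements.

    ```python
    # Toy sketch (not survey code): depletion time t_dep = Sigma_mol / Sigma_SFR.
    # Input values below are illustrative placeholders.

    def depletion_time_gyr(sigma_mol, sigma_sfr):
        """Molecular depletion time in Gyr.

        sigma_mol : molecular gas surface density [M_sun / pc^2]
        sigma_sfr : star-formation-rate surface density [M_sun / yr / kpc^2]
        """
        # Convert sigma_sfr to M_sun / yr / pc^2 (1 kpc^2 = 1e6 pc^2),
        # then express the ratio of the two surface densities in Gyr.
        return sigma_mol / (sigma_sfr / 1e6) / 1e9

    print(depletion_time_gyr(10.0, 5e-3))  # 2.0 (Gyr), a typical disk value
    ```
    
    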

    The CMS Event Builder

    The data acquisition system of the CMS experiment at the Large Hadron Collider will employ an event builder which will combine data from about 500 data sources into full events at an aggregate throughput of 100 GByte/s. Several architectures and switch technologies have been evaluated for the DAQ Technical Design Report by measurements with test benches and by simulation. This paper describes studies of an EVB test-bench based on 64 PCs acting as data sources and data consumers and employing both Gigabit Ethernet and Myrinet technologies as the interconnect. In the case of Ethernet, protocols based on Layer-2 frames and on TCP/IP are evaluated. Results from ongoing studies, including measurements of throughput and scaling, are presented. The architecture of the baseline CMS event builder is then outlined. The event builder is organised into two stages with intelligent buffers in between. The first stage contains 64 switches performing a first level of data concentration by building super-fragments from the fragments of 8 data sources. The second stage combines the 64 super-fragments into full events. This architecture allows installation of the second stage of the event builder in steps, with the overall throughput scaling linearly with the number of switches in the second stage. Possible implementations of the components of the event builder are discussed and the expected performance of the full event builder is outlined.
    Comment: Conference CHEP0
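    The two-stage structure described in this abstract can be sketched in a few lines. This is a hedged toy model, not CMS code: stage one concatenates the fragments from each group of 8 sources into a super-fragment, and stage two joins the 64 super-fragments into a full event; fragment contents and sizes are placeholders.

    ```python
    # Toy sketch of two-stage event building (illustrative, not the CMS DAQ).
    GROUP_SIZE = 8    # data sources feeding one first-stage switch
    N_SWITCHES = 64   # first-stage switches, i.e. 64 * 8 = 512 toy sources

    def build_super_fragments(fragments):
        """Stage 1: merge fragments from each group of 8 sources."""
        assert len(fragments) == GROUP_SIZE * N_SWITCHES
        return [b"".join(fragments[i * GROUP_SIZE:(i + 1) * GROUP_SIZE])
                for i in range(N_SWITCHES)]

    def build_event(super_fragments):
        """Stage 2: combine the 64 super-fragments into one full event."""
        return b"".join(super_fragments)

    # One placeholder byte per source; the full event then has 512 bytes.
    fragments = [bytes([i % 256]) for i in range(GROUP_SIZE * N_SWITCHES)]
    event = build_event(build_super_fragments(fragments))
    print(len(event))  # 512
    ```

    The second stage only ever sees 64 inputs regardless of the source count, which is the property that lets it be installed in steps.
    
    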

    Ready, Set, BABY Live Virtual Prenatal Breastfeeding Education for COVID-19

    The COVID-19 pandemic has introduced unforeseen challenges in the delivery of lactation training, education, and skilled support worldwide. The World Health Organization (WHO) has developed global recommendations for the protection, promotion, and support of breastfeeding when COVID-19 is suspected or confirmed (World Health Organization, 2020). This interim guidance, which is grounded in the best available clinical evidence and epidemiology, brings attention to the importance of integrating breastfeeding education and skilled lactation support into the COVID-19 pandemic response (Gribble, 2018; UNICEF, 2020).

    Commissioning of the CMS High Level Trigger

    The CMS experiment will collect data from the proton-proton collisions delivered by the Large Hadron Collider (LHC) at a centre-of-mass energy up to 14 TeV. The CMS trigger system is designed to cope with unprecedented luminosities and LHC bunch-crossing rates up to 40 MHz. The unique CMS trigger architecture employs only two trigger levels. The Level-1 trigger is implemented using custom electronics, while the High Level Trigger (HLT) is based on software algorithms running on a large cluster of commercial processors, the Event Filter Farm. We present the major functionalities of the CMS High Level Trigger system as of the start of LHC beam operations in September 2008. The validation of the HLT system in the online environment with Monte Carlo simulated data and its commissioning during cosmic-ray data-taking campaigns are discussed in detail. We conclude with a description of the HLT operations with the first circulating LHC beams before the incident that occurred on 19 September 2008.
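    The two-level cascade described in this abstract, a fast hardware decision followed by a software filter, can be sketched as below. This is a hedged illustration only: the selection quantities and thresholds are invented placeholders, not CMS trigger algorithms.

    ```python
    # Toy sketch of a two-level trigger cascade (illustrative thresholds only).

    def level1_accept(event):
        """Fast, hardware-like decision on a coarse quantity."""
        return event["et"] > 20.0          # toy transverse-energy cut

    def hlt_accept(event):
        """Software decision using fuller reconstruction information."""
        return event["n_tracks"] >= 2 and event["et"] > 30.0

    events = [
        {"et": 15.0, "n_tracks": 3},   # rejected at Level-1
        {"et": 25.0, "n_tracks": 1},   # passes Level-1, rejected by the HLT
        {"et": 40.0, "n_tracks": 4},   # passes both levels
    ]
    kept = [e for e in events if level1_accept(e) and hlt_accept(e)]
    print(len(kept))  # 1
    ```

    The point of the cascade is that the cheap first decision shields the expensive software stage from the full bunch-crossing rate.
    
    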