121 research outputs found

    Teacher Candidates’ Conceptual Understandings of Mathematics Concepts

    Get PDF
    As universities strive to produce the best mathematics teachers possible through both graduate and undergraduate programs, teacher educators must constantly work towards helping teacher candidates create their own conceptual understanding of mathematics. This pilot study examined the effect teaching in a constructivist manner had on teacher candidates' conceptual understandings of the arithmetic mean and on their ability to transfer this knowledge into their instruction. Results indicated that teaching in a constructivist manner can have a positive impact on teacher candidates' understanding of the arithmetic mean, but their ability to transfer this new knowledge into their own instructional practices was inconsistent.

    The Relationship of Implicit Theories to Elementary Teachers’ Patterns of Engagement in a Mathematics-Focused Professional Development Setting

    Get PDF
    As elementary teachers aim to deepen their mathematical understandings, they engage in a relearning process that involves not only revisiting but also reconstructing their knowledge. To do so, meaningful engagement in immersion and practice-based experiences is required. This exploratory case study investigated the engagement patterns of two elementary teachers with varying implicit beliefs as they participated in a professional development program focused on relearning mathematics. Data were collected on the two participants in the form of video narratives, observation protocols, and interviews. Attention was given to their patterns of engagement in collaborative group settings as the participants moved through different phases of the professional development lesson. Results indicated that the engagement patterns of the two participants closely aligned with learning behaviors described in implicit beliefs theory. In this way, the results suggested an extension of the implicit theories model to the relearning context. Additional implications and future questions are provided.

    Precision measurements of the top quark mass from the Tevatron in the pre-LHC era

    Full text link
    The top quark is the heaviest of the six quarks of the Standard Model. Precise knowledge of its mass is important for imposing constraints on a number of physics processes, including interactions of the as yet unobserved Higgs boson. The Higgs boson is the only missing particle of the Standard Model, central to the electroweak symmetry breaking mechanism and the generation of particle masses. In this Review, experimental measurements of the top quark mass accomplished at the Tevatron, a proton-antiproton collider located at the Fermi National Accelerator Laboratory, are described. Topologies of top quark events and methods used to separate signal events from background sources are discussed. Data analysis techniques used to extract information about the top mass value are reviewed. The combination of the several most precise measurements performed with the two Tevatron particle detectors, CDF and DØ, yields a value of m_t = 173.2 ± 0.9 GeV/c².
    Comment: This version contains the most up-to-date top quark mass average.
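    The quoted 173.2 ± 0.9 GeV/c² comes from combining several measurements. As a toy illustration, independent measurements combine by inverse-variance weighting; the sketch below uses made-up input values, and the real Tevatron combination uses the BLUE method, which also accounts for correlated systematic uncertainties.

        import numpy as np

        # Toy illustration of combining independent mass measurements by
        # inverse-variance weighting. The actual Tevatron combination uses
        # the BLUE method, which also handles correlated systematics; the
        # input values below are purely illustrative, not real results.
        masses = np.array([173.0, 172.5, 174.0])  # hypothetical m_t, GeV/c^2
        sigmas = np.array([1.2, 1.5, 1.8])        # hypothetical uncertainties

        weights = 1.0 / sigmas**2
        m_comb = np.sum(weights * masses) / np.sum(weights)
        sigma_comb = 1.0 / np.sqrt(np.sum(weights))

        print(f"combined m_t = {m_comb:.1f} +/- {sigma_comb:.1f} GeV/c^2")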

    CMB observations from the CBI and VSA: A comparison of coincident maps and parameter estimation methods

    Full text link
    We present coincident observations of the Cosmic Microwave Background (CMB) from the Very Small Array (VSA) and Cosmic Background Imager (CBI) telescopes. The consistency of the full datasets is tested in the map plane and the Fourier plane, prior to the usual compression of CMB data into flat bandpowers. Of the three mosaics observed by each group, two are found to be in excellent agreement. In the third mosaic, there is a 2 sigma discrepancy between the correlation of the data and the level expected from Monte Carlo simulations. This is shown to be consistent with increased phase calibration errors on VSA data during summer observations. We also consider the parameter estimation method of each group. The key difference is the use of the variance window function in place of the bandpower window function, an approximation used by the VSA group. A re-evaluation of the VSA parameter estimates, using bandpower windows, shows that the two methods yield consistent results.
    Comment: 10 pages, 6 figures. Final version. Accepted for publication in MNRAS.
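    A minimal sketch of the kind of Monte Carlo consistency test described above, with synthetic data standing in for the VSA/CBI maps (the statistic, map sizes, and noise model are all illustrative assumptions, not the groups' actual analysis):

        import numpy as np

        rng = np.random.default_rng(0)

        # Measure the cross-correlation of two coincident maps and compare
        # it with the distribution from Monte Carlo realisations sharing
        # the same sky signal but with independent instrument noise.
        def cross_corr(map_a, map_b):
            a = map_a - map_a.mean()
            b = map_b - map_b.mean()
            return np.sum(a * b) / np.sqrt(np.sum(a**2) * np.sum(b**2))

        # toy "observed" maps: common signal plus independent noise
        signal = rng.normal(size=1000)
        obs = cross_corr(signal + rng.normal(size=1000),
                         signal + rng.normal(size=1000))

        # Monte Carlo realisations under the same model
        sims = np.array([
            cross_corr(s + rng.normal(size=1000), s + rng.normal(size=1000))
            for s in (rng.normal(size=1000) for _ in range(2000))
        ])

        # a |deviation| near 2 would correspond to the quoted 2 sigma
        deviation = (obs - sims.mean()) / sims.std()
        print(f"observed correlation deviates by {deviation:+.1f} sigma")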

    Introductory programming: a systematic literature review

    Get PDF
    As computing becomes a mainstream discipline embedded in the school curriculum and acts as an enabler for an increasing range of academic disciplines in higher education, the literature on introductory programming is growing. Although there have been several reviews that focus on specific aspects of introductory programming, there has been no broad overview of the literature exploring recent trends across the breadth of introductory programming. This paper is the report of an ITiCSE working group that conducted a systematic review in order to gain an overview of the introductory programming literature. Partitioning the literature into papers addressing the student, teaching, the curriculum, and assessment, we explore trends, highlight advances in knowledge over the past 15 years, and indicate possible directions for future research.

    Promoting mental health in small-medium enterprises: An evaluation of the "Business in Mind" program

    Get PDF
    Background: Workplace mental health promotion (WMHP) aims to prevent and effectively manage the social and economic costs of common mental illnesses such as depression. The mental health of managers and employees within small-medium enterprises (SMEs) is a neglected sector in occupational health research and practice, despite the fact that this sector is the most common work setting in most economies. The limited availability and propensity of SME staff to attend the face-to-face training/therapy or workshop-style interventions often seen in corporate or public sector work settings is a widely recognised problem. The 'Business in Mind' program employs a DVD mode of delivery that is convenient for SME managers, particularly those operating in regional and remote areas where internet delivery may not be optimal. The objective of the intervention program is to improve the mental health of SME managers, and to examine whether employees of managers whose mental health improves report positive change in their psychosocial work environment. The mechanisms via which we aim to improve managers' mental health are the development of their psychological capital (a higher-order construct comprised of hope, self-efficacy, resilience and optimism) and of their skills and capacities for coping with work stress.
    Methods/Design: The effectiveness of two versions of the program (self-administered and telephone-facilitated) will be assessed using a randomised trial with an active control condition (psychoeducation only). We aim to recruit a minimum of 249 managers and a sample of their employees. This design allows for 83 managers per group, as power analyses showed that this number would allow for attrition of 20% and still enable detection of an effect size of 0.5. The intervention will be implemented over a three-month period, and postal surveys will assess managers and employees in each group at baseline, intervention completion, and 6-month follow-up. The intervention groups (managers only) will also be assessed at 12- and 24-month follow-up to examine maintenance of effects. Primary outcomes are managers' levels of psychological capital (hope, resilience, self-efficacy and optimism), coping strategies, anxiety and depression symptoms, self-reported health, job satisfaction and job tension. Secondary outcomes are participating managers' subordinates' perceptions of manager support, relational justice, emotional climate and job tension. In order to provide an economic evaluation of the intervention, both employee and manager rates of absenteeism and presenteeism will also be assessed.
    Discussion: The intervention being trialled is expected to improve both primary and secondary outcomes. If proven efficacious, the intervention could be disseminated to reach a much larger proportion of the business community.
    Trial registration: Current Controlled Trials ISRCTN 62853520.
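    As a sanity check on the sample-size reasoning above, the following sketch reproduces the calculation under assumed values of alpha = 0.05 (two-sided) and 80% power, neither of which is stated in the abstract:

        import math
        from statsmodels.stats.power import TTestIndPower

        # Sample size per group to detect a standardised effect size of 0.5.
        # alpha = 0.05 (two-sided) and 80% power are assumptions here,
        # not values stated in the abstract.
        n_complete = TTestIndPower().solve_power(effect_size=0.5,
                                                 alpha=0.05, power=0.8)

        # Inflate recruitment so 20% attrition still leaves enough power.
        n_recruit = math.ceil(n_complete / (1 - 0.20))

        print(f"complete cases needed per group: {math.ceil(n_complete)}")
        print(f"recruit per group allowing 20% attrition: {n_recruit}")

    This yields roughly 64 completing managers per group and about 80 recruited per group, close to the 83 quoted; the protocol presumably used slightly different assumptions.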

    A chemical survey of exoplanets with ARIEL

    Get PDF
    Thousands of exoplanets have now been discovered with a huge range of masses, sizes and orbits: from rocky Earth-like planets to large gas giants grazing the surface of their host star. However, the essential nature of these exoplanets remains largely mysterious: there is no known, discernible pattern linking the presence, size, or orbital parameters of a planet to the nature of its parent star. We have little idea whether the chemistry of a planet is linked to its formation environment, or whether the type of host star drives the physics and chemistry of the planet's birth and evolution. ARIEL was conceived to observe a large number (~1000) of transiting planets for statistical understanding, including gas giants, Neptunes, super-Earths and Earth-size planets around a range of host star types, using transit spectroscopy in the 1.25–7.8 μm spectral range and multiple narrow-band photometric bands in the optical. ARIEL will focus on warm and hot planets to take advantage of their well-mixed atmospheres, which should show minimal condensation and sequestration of high-Z materials compared to their colder Solar System siblings. Such warm and hot atmospheres are expected to be more representative of the planetary bulk composition. Observations of these warm/hot exoplanets, and in particular of their elemental composition (especially C, O, N, S, Si), will allow the understanding of the early stages of planetary and atmospheric formation during the nebular phase and the following few million years. ARIEL will thus provide a representative picture of the chemical nature of the exoplanets and relate this directly to the type and chemical environment of the host star. ARIEL is designed as a dedicated survey mission for combined-light spectroscopy, capable of observing a large and well-defined planet sample within its 4-year mission lifetime. Transit, eclipse and phase-curve spectroscopy methods, whereby the signal from the star and planet are differentiated using knowledge of the planetary ephemerides, allow us to measure atmospheric signals from the planet at levels of 10–100 parts per million (ppm) relative to the star and, given the bright nature of targets, also allow more sophisticated techniques, such as eclipse mapping, to give a deeper insight into the nature of the atmosphere. These types of observations require a stable payload and satellite platform with broad, instantaneous wavelength coverage to detect many molecular species, probe the thermal structure, identify clouds and monitor the stellar activity. The wavelength range proposed covers all the expected major atmospheric gases, e.g. H2O, CO2, CH4, NH3, HCN and H2S, through to the more exotic metallic compounds, such as TiO, VO, and condensed species. Simulations of ARIEL performance in conducting exoplanet surveys have been performed – using conservative estimates of mission performance and a full model of all significant noise sources in the measurement – using a list of potential ARIEL targets that incorporates the latest available exoplanet statistics. The conclusion at the end of the Phase A study is that ARIEL – in line with the stated mission objectives – will be able to observe about 1000 exoplanets depending on the details of the adopted survey strategy, thus confirming the feasibility of the main science objectives.
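    For a feel for the 10–100 ppm figure, the atmospheric signal in transmission is roughly the area of one scale-height annulus around the planet relative to the stellar disc. A back-of-the-envelope sketch for a generic hot Jupiter follows; every parameter value is an illustrative assumption, not an ARIEL target:

        import math

        # Back-of-the-envelope transmission-spectroscopy signal estimate.
        # All planet/star parameters below are illustrative assumptions.
        k_B   = 1.380649e-23      # Boltzmann constant, J/K
        R_jup = 7.1492e7          # Jupiter radius, m
        R_sun = 6.957e8           # Sun radius, m

        T   = 1500.0              # equilibrium temperature, K (assumed)
        mu  = 2.3 * 1.6605e-27    # mean molecular mass, H2-dominated, kg
        g   = 15.0                # surface gravity, m/s^2 (assumed)
        R_p = 1.2 * R_jup         # planet radius (assumed)
        R_s = 1.0 * R_sun         # star radius (assumed)

        H = k_B * T / (mu * g)                 # pressure scale height
        depth  = (R_p / R_s) ** 2              # geometric transit depth
        signal = 2 * R_p * H / R_s ** 2        # ~1 scale height of extra depth

        print(f"scale height H ~ {H/1e3:.0f} km")
        print(f"transit depth ~ {depth*1e6:.0f} ppm")
        print(f"per-scale-height atmospheric signal ~ {signal*1e6:.0f} ppm")

    With these assumed values the per-scale-height signal comes out at roughly 130 ppm, the same order as the 10–100 ppm quoted above.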

    The Zwicky Transient Facility: Data Processing, Products, and Archive

    Get PDF
    The Zwicky Transient Facility (ZTF) is a new robotic time-domain survey currently in progress using the Palomar 48-inch Schmidt Telescope. ZTF uses a 47 square degree field with a 600 megapixel camera to scan the entire northern visible sky at rates of ~3760 square degrees/hour to median depths of g ~ 20.8 and r ~ 20.6 mag (AB, 5sigma in 30 sec). We describe the Science Data System that is housed at IPAC, Caltech. This comprises the data-processing pipelines, alert production system, data archive, and user interfaces for accessing and analyzing the products. The realtime pipeline employs a novel image-differencing algorithm, optimized for the detection of point source transient events. These events are vetted for reliability using a machine-learned classifier and combined with contextual information to generate data-rich alert packets. The packets become available for distribution typically within 13 minutes (95th percentile) of observation. Detected events are also linked to generate candidate moving-object tracks using a novel algorithm. Objects that move fast enough to streak in the individual exposures are also extracted and vetted. The reconstructed astrometric accuracy per science image with respect to Gaia is typically 45 to 85 milliarcsec. This is the RMS per axis on the sky for sources extracted with photometric S/N >= 10. The derived photometric precision (repeatability) at bright unsaturated fluxes varies between 8 and 25 millimag. Photometric calibration accuracy with respect to Pan-STARRS1 is generally better than 2%. The products support a broad range of scientific applications: fast and young supernovae, rare flux transients, variable stars, eclipsing binaries, variability from active galactic nuclei, counterparts to gravitational wave sources, a more complete census of Type Ia supernovae, and Solar System objects.
    Comment: 30 pages, 16 figures. Published in PASP Focus Issue on the Zwicky Transient Facility (doi: 10.1088/1538-3873/aae8ac).
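    A toy sketch of the core idea behind image differencing (PSF-match the reference to the science image, subtract, threshold the residual); the production pipeline's algorithm and noise treatment are considerably more sophisticated, and everything below is illustrative:

        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(1)

        # Blur a sharper reference image to approximately match the science
        # image's PSF, subtract, and threshold the residual to find
        # transient candidates. Purely illustrative synthetic data.
        ny, nx = 128, 128
        sky = rng.normal(100.0, 3.0, (ny, nx))          # static background

        reference = gaussian_filter(sky, sigma=1.0)     # good-seeing reference

        science = sky + rng.normal(0.0, 3.0, (ny, nx))  # independent noise
        science[64, 64] += 500.0                        # inject a transient
        science = gaussian_filter(science, sigma=2.0)   # worse seeing tonight

        # Gaussian blurs compose in quadrature: kernel sigma^2 = 2^2 - 1^2
        matched_ref = gaussian_filter(reference,
                                      sigma=np.sqrt(2.0**2 - 1.0**2))

        diff = science - matched_ref
        detections = np.argwhere(diff > 5.0 * diff.std())
        print(f"{len(detections)} pixels above 5 sigma, brightest at",
              np.unravel_index(diff.argmax(), diff.shape))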

    A cluster randomised controlled trial of the clinical and cost-effectiveness of a 'whole systems' model of self-management support for the management of long-term conditions in primary care: trial protocol

    Get PDF
    Background: Patients with long-term conditions are increasingly the focus of quality improvement activities in health services to reduce the impact of these conditions on quality of life and to reduce the burden on care utilisation. There is significant interest in the potential for self-management support to improve health and reduce utilisation in these patient populations, but little consensus concerning the optimal model that would best provide such support. We describe the implementation and evaluation of self-management support through an evidence-based 'whole systems' model involving patient support, training for primary care teams, and service re-organisation, all integrated into routine delivery within primary care.
    Methods: The evaluation involves a large-scale, multi-site study of the implementation, effectiveness, and cost-effectiveness of this model of self-management support using a cluster randomised controlled trial in patients with three long-term conditions: diabetes, chronic obstructive pulmonary disease (COPD), and irritable bowel syndrome (IBS). The outcome measures include healthcare utilisation and quality of life. We describe the methods of the cluster randomised trial.
    Discussion: If the 'whole systems' model proves effective and cost-effective, it will provide decision-makers with a model for the delivery of self-management support for populations with long-term conditions that can be implemented widely to maximise 'reach' across the wider patient population.
    Trial registration number: ISRCTN9094004

    The Zwicky Transient Facility: System Overview, Performance, and First Results

    Get PDF
    The Zwicky Transient Facility (ZTF) is a new optical time-domain survey that uses the Palomar 48 inch Schmidt telescope. A custom-built wide-field camera provides a 47 deg² field of view and 8 s readout time, yielding more than an order of magnitude improvement in survey speed relative to its predecessor survey, the Palomar Transient Factory. We describe the design and implementation of the camera and observing system. The ZTF data system at the Infrared Processing and Analysis Center provides near-real-time reduction to identify moving and varying objects. We outline the analysis pipelines, data products, and associated archive. Finally, we present on-sky performance analysis and first scientific results from commissioning and the early survey. ZTF's public alert stream will serve as a useful precursor for that of the Large Synoptic Survey Telescope.
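    The order-of-magnitude claim can be sanity-checked from the numbers quoted here and in the processing paper above. In the sketch below, the ZTF exposure time of 30 s is taken from the earlier abstract, while the Palomar Transient Factory parameters are rough assumptions for illustration only:

        # Sanity check of ZTF's survey-speed gain over its predecessor.
        # ZTF numbers come from the abstracts above; the PTF numbers are
        # rough assumptions, not quoted values.
        def areal_rate(fov_deg2, exposure_s, overhead_s):
            """Upper-bound areal survey rate in deg^2 per hour."""
            return fov_deg2 * 3600.0 / (exposure_s + overhead_s)

        ztf = areal_rate(fov_deg2=47.0, exposure_s=30.0, overhead_s=8.0)
        ptf = areal_rate(fov_deg2=7.3, exposure_s=60.0, overhead_s=36.0)

        print(f"ZTF ~ {ztf:.0f} deg^2/hr (ignores slew; ~3760 quoted)")
        print(f"PTF ~ {ptf:.0f} deg^2/hr (assumed parameters)")
        print(f"ratio ~ {ztf/ptf:.0f}x")

    With these assumptions the ratio comes out around 15x, consistent with "more than an order of magnitude".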