
    The role of professional development and learning in the early adoption of the New Zealand curriculum by schools.

    This paper is set in the context of Phase One of the Ministry of Education Curriculum Implementation Exploratory Studies (CIES) project. The schools selected for this study were considered early adopters of the revised New Zealand Curriculum (NZC) (Ministry of Education, 2007). The paper provides theoretical insights and research evidence related to the role of professional development and learning in the early stages of implementation of the revised curriculum. A key finding common to most schools was the progressive development of a professional learning culture, led by the principal, that focused on pedagogy and student achievement prior to the introduction of the curriculum. The establishment of this culture involved processes that were task-oriented, reflective, consultative and collaborative. While there are strong parallels between the experiences of primary and secondary schools in the study, some important differences have also been noted.

    The Bull's-Eye Effect as a Probe of Ω

    We compare the statistical properties of structures normal and transverse to the line of sight which appear in theoretical N-body simulations of structure formation, and seem also to be present in observational data from redshift surveys. We present a statistic which can quantify this effect in a conceptually different way from standard analyses of distortions of the power spectrum or correlation function. From tests with N-body experiments, we argue that this statistic represents a new and potentially powerful diagnostic of the cosmological density parameter, Ω_0.
    Comment: Minor revisions; final version accepted for publication in ApJ Letters. LaTeX, 16 pages, including 3 figures. Higher resolution versions of figures, including supplementary figures not included in the manuscript, are available at: ftp://kusmos.phsx.ukans.edu/preprints/melott/omeg
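    The distortion being probed comes from the redshift-space mapping: along the line of sight a galaxy's apparent position is shifted by its peculiar velocity in units of the Hubble parameter, so coherent infall compresses structures while virial motions stretch them. A minimal sketch of that mapping in the plane-parallel approximation (the array names, box size and H0 value are illustrative, not taken from the paper):

        import numpy as np

        H0 = 100.0    # Hubble constant, km/s per Mpc/h
        BOX = 256.0   # periodic box size in Mpc/h (illustrative)

        def redshift_space_positions(pos, vel, los_axis=2):
            """Shift comoving positions along the line of sight by v_pec / H0.

            pos : (N, 3) comoving positions in Mpc/h
            vel : (N, 3) peculiar velocities in km/s
            Coherent large-scale infall compresses structures along the line
            of sight (the Bull's-Eye effect); small-scale virial motions
            stretch them into 'fingers of God'.
            """
            s = pos.copy()
            s[:, los_axis] += vel[:, los_axis] / H0
            s[:, los_axis] %= BOX    # re-wrap into the periodic box
            return s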

    Using stiffness to assess injury risk: comparison of methods for quantifying stiffness and their reliability in triathletes

    Background: A review of the literature indicates that lower body stiffness, defined as the extent to which the lower extremity joints resist deformation upon contact with the ground, may be a useful measure for assessing Achilles injury risk in triathletes. The nature of overuse injuries suggests that a variety of different movement patterns could conceivably contribute to the final injury outcome, any number and combination of which might be observed in a single individual. Measurements which incorporate both the kinetics and kinematics of a movement (such as stiffness) may be better able to identify individuals at risk of injury, with further analysis then providing the exact mechanism of injury for the individual. Stiffness can be measured as vertical, leg or joint stiffness to model how the individual interacts with the environment upon landing. However, several issues with stiffness assessments limit the effectiveness of these measures for monitoring athletes’ performance and/or injury risk. This may reflect the variety of common biomechanical stiffness calculations (dynamic, time, true leg and joint) that have been used to examine these three stiffness levels (vertical, leg and joint) across a variety of human movements (e.g. running or hopping), as well as potential issues with the reliability of these measures, especially joint stiffness. Therefore, the aims of this study were to compare the various methods for measuring stiffness during two forms of human bouncing locomotion (running and hopping), and to assess their measurement reliability, in order to determine the best methods for examining links with injury risk in triathletes.
    Methods: Vertical, leg and joint stiffness were estimated in 12 healthy male competitive triathletes on two occasions, 7 days apart, using both running (5.0 m·s⁻¹) and hopping (2.2 Hz) tasks.
    Results: Inter-day reliability was good for vertical (ICC = 0.85) and leg (ICC = 0.98) stiffness using the time method. Joint stiffness reliability was poor when joints were assessed individually, but improved when stiffness was taken as the sum of the hip, knee and ankle (ICC = 0.86). The knee and ankle combination provided the best correlation with leg stiffness during running (Pearson’s correlation = 0.82).
    Discussion: The dynamic and time methods of calculating leg stiffness had better reliability than the “true” method. The time and dynamic methods also had the best correlations with the different combinations of joint stiffness, which suggests that they should be considered for biomechanical screening of triathletes. The knee and ankle combination had the best correlation with leg stiffness and is therefore proposed to provide the most information regarding lower limb mechanics during gait in triathletes.
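    As an illustration of the underlying quantity, vertical stiffness is conventionally the ratio of peak vertical ground reaction force to peak downward displacement of the centre of mass during contact. The sketch below estimates it from a force-plate trace by double integration; it is a generic illustration with assumed variable names, not the specific “time” or “dynamic” equations evaluated in the study.

        import numpy as np

        def vertical_stiffness(grf, mass, fs, flight_time):
            """Peak vertical GRF divided by peak downward CoM displacement (N/m).

            grf         : vertical ground reaction force during contact (N), 1-D array
            mass        : body mass (kg)
            fs          : sampling rate (Hz)
            flight_time : preceding flight phase duration (s); sets the
                          touchdown velocity as v_td = -g * t_flight / 2
            """
            g, dt = 9.81, 1.0 / fs
            acc = grf / mass - g                      # net CoM acceleration
            vel = -g * flight_time / 2.0 + np.cumsum(acc) * dt
            disp = np.cumsum(vel) * dt                # CoM displacement (m), negative = down
            return grf.max() / abs(disp.min())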

    Optimal Moments for the Analysis of Peculiar Velocity Surveys II: Testing

    Analyses of peculiar velocity surveys face several challenges, including low signal-to-noise in individual velocity measurements and the presence of small-scale, nonlinear flows. This is the second in a series of papers in which we describe a new method of overcoming these problems by using data compression as a filter with which to separate large-scale, linear flows from small-scale noise that can bias results. We demonstrate the effectiveness of our method using realistic catalogs of galaxy velocities drawn from N-body simulations. Our tests show that a likelihood analysis of simulated catalogs that uses all of the information contained in the peculiar velocities results in a bias in the estimation of the power spectrum shape parameter Γ and amplitude β, and that our method of analysis effectively removes this bias. We expect that this new method will cause peculiar velocity surveys to re-emerge as a useful tool to determine cosmological parameters.
    Comment: 28 pages, 9 figures
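    The full optimal-moment compression is beyond an abstract, but the flavour of reducing a noisy radial-velocity survey to a few linear moments can be seen in the simplest case, a maximum-likelihood bulk flow. The sketch below assumes independent Gaussian errors and is only a lowest-order stand-in for the compression scheme described in the paper; the names and noise model are assumptions.

        import numpy as np

        def bulk_flow(rhat, u, sigma):
            """Maximum-likelihood bulk flow from line-of-sight peculiar velocities.

            rhat  : (N, 3) unit vectors toward each galaxy
            u     : (N,)  measured radial peculiar velocities (km/s)
            sigma : (N,)  per-galaxy velocity errors (km/s)
            Model: u_i = rhat_i . B + noise, with independent Gaussian errors.
            """
            w = 1.0 / sigma**2
            A = (rhat * w[:, None]).T @ rhat    # 3x3 weighted normal matrix
            b = rhat.T @ (w * u)                # weighted data vector
            return np.linalg.solve(A, b)        # best-fit bulk flow B (km/s)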

    Disposition of Precipitation: Supply and Demand for Water Use by New Tree Plantations

    As the greatest rainwater users among all vegetative land covers, tree plantations have been employed strategically to mitigate salinity and water-logging problems. However, large-scale commercial tree plantations in high rainfall areas reduce fresh water inflows to river systems supporting downstream communities, agricultural industries and wetland environmental assets. A bio-economic model was used to estimate economic demand for water by future upstream plantations in a sub-catchment (the 2.8 million ha Macquarie valley in NSW) of the Murray-Darling Basin, Australia. Given four tree-product values, impacts were simulated under two settings: without and with the requirement that permanent water entitlements be purchased from downstream entitlement holders before establishing a tree plantation. Without this requirement, gains in economic surplus from expanding tree plantations exceeded economic losses by downstream irrigators and stock and domestic water users, but resulted in reductions of up to 154 GL (gigalitres) in annual flows to wetland environments. With this requirement, smaller gains in upstream economic surplus, added to downstream gains, could total $330 million while preserving environmental flows. Extending downstream water markets to new upstream tree plantations, so as to equilibrate marginal values across water uses, helps ensure water entitlements are not diminished without compensation. Outcomes include improved economic efficiency, social equity and environmental sustainability.
    Keywords: Environmental Economics and Policy; forest; environmental services; catchment; water sources; interception; entitlement; supply; demand; market; economic surplus; evapo-transpiration; urban water; irrigation; wetlands.
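    The market mechanism the abstract invokes, equilibrating marginal values across competing water uses, can be shown with a deliberately toy calculation: two linear marginal-value curves share a fixed volume of water, and trade moves the allocation to the point where marginal values are equal. All figures below are made up for illustration; they are not parameters or results of the study's bio-economic model.

        # Toy water-market allocation: marginal values ($'000 per GL) fall
        # linearly with use, mv(q) = a - b*q, and trade equalises them.
        a_p, b_p = 120.0, 0.8    # upstream plantations (hypothetical curve)
        a_i, b_i = 100.0, 0.5    # downstream irrigators (hypothetical curve)
        W = 150.0                # total water available (GL)

        def surplus(a, b, q):
            """Area under a linear marginal-value curve up to quantity q."""
            return a * q - 0.5 * b * q**2

        # Market clearing: a_p - b_p*q_p = a_i - b_i*(W - q_p)
        q_p = (a_p - a_i + b_i * W) / (b_p + b_i)
        q_i = W - q_p
        total = surplus(a_p, b_p, q_p) + surplus(a_i, b_i, q_i)
        print(f"plantations {q_p:.1f} GL, irrigators {q_i:.1f} GL, "
              f"total surplus ${total:,.0f}k")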

    Stratigraphy and Chronology of Karst Features on Rodriguez Island, Southwestern Indian Ocean

    This publication has been made available with the permission of the National Speleological Society (www.caves.org). The attached file is the published version of the article.

    Quantifying the Bull's Eye Effect

    We have used N-body simulations to develop two independent methods to quantify redshift distortions known as the Bull's Eye effect (large scale infall plus small scale virial motion). This effect depends upon the mass density, Ω_0, so measuring it can in principle give an estimate of this important cosmological parameter. We are able to measure the effect and distinguish between its strength for high and low values of Ω_0. Unlike other techniques which utilize redshift distortions, one of our methods is relatively insensitive to bias. In one approach, we use path lengths between contour crossings of the density field. The other is based upon percolation. We have found both methods to be successful in quantifying the effect and distinguishing between values of Ω_0. However, only the path lengths method exhibits low sensitivity to bias.
    Comment: 21 pages, 5 figures, 3 tables; Replaced version - minor corrections, replaced figure 2; To appear in ApJ, Jan. 20, 200
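    The path-length statistic lends itself to a compact illustration: threshold a gridded density field at a contour level, then record the lengths of contiguous runs above the contour along a chosen axis; comparing the run-length distributions along and transverse to the line of sight probes the distortion. The sketch below is one plausible reading of that idea, with assumed names and a single threshold, not the authors' exact estimator.

        import numpy as np

        def crossing_path_lengths(density, threshold, axis=2, cell_size=1.0):
            """Lengths of runs above a density contour along one grid axis.

            density   : 3-D gridded density field (e.g. from an N-body snapshot)
            threshold : contour level defining 'inside' a structure
            axis      : axis along which path lengths are measured; compare
                        line-of-sight and transverse axes to see the effect
            Returns a 1-D array of path lengths in units of cell_size.
            """
            field = np.moveaxis(density > threshold, axis, -1)
            lengths = []
            for line in field.reshape(-1, field.shape[-1]):
                # pad with False so every run has a marked start and end
                padded = np.concatenate(([False], line, [False])).astype(int)
                edges = np.diff(padded)
                starts = np.where(edges == 1)[0]
                ends = np.where(edges == -1)[0]
                lengths.extend(ends - starts)
            return np.asarray(lengths) * cell_size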
