
    A model of the determinants of expenditure on children's personal social services

    Every year the United Kingdom central government assesses the relative spending needs of English local authorities in respect of the services for which they are responsible. This is done by estimating a Standard Spending Assessment (SSA) for each service, which is intended to indicate the spending requirements of an authority if it were to adopt a standard level of services, given the circumstances in its area. In practice, statistical methods are used to develop SSAs for most services. This report describes the findings of a study designed to review the methods for setting SSAs for a single service: personal social services (PSS) for children, which in 1995/96 accounted for about £1.8 billion of expenditure (4.4% of total local government expenditure). The study was commissioned by the Department of Health and undertaken by a consortium comprising The University of York, MORI and the National Children’s Bureau. The study was guided by a technical advisory group, comprising representatives from the local authority associations and the Department of Health. In seeking to limit the length of the report, the authors have necessarily omitted a great deal of the technical material produced in the course of the study. We understand that the Department of Health is willing to make this material and the data used in the study available to interested parties, subject to certain confidentiality restrictions. The existing methodology for constructing SSAs had been the subject of some criticism, both in general and specifically in respect of children’s PSS. This document reports the results of a study designed to apply a radically new statistical approach to estimating the SSA for children’s PSS. Previous methods were based on statistical analysis of local authority aggregate data. In contrast, this study is based on an analysis of PSS spending in 1,036 small areas (with populations of about 10,000) within 25 local authorities. A relatively new statistical method known as multilevel modelling, originally developed in the educational sector, was used for this purpose.
    Keywords: children, SSA, social services
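The two-level structure described above (small areas nested within local authorities) can be sketched with toy data. This is a hypothetical illustration, not the study's model or data: the coefficient values, group sizes, and the crude moment-based check are all invented, and a real analysis would fit the random-intercept model by restricted maximum likelihood (e.g. statsmodels `MixedLM`).

```python
import random
import statistics

# Toy two-level data: y_ij = beta * x_ij + u_j + e_ij, where u_j is an
# authority-level random effect and x_ij an area-level need indicator.
# All numbers are illustrative, not the study's.
random.seed(0)
beta = 2.0
authorities = {j: random.gauss(0, 5) for j in range(25)}   # u_j effects
rows = []
for j, u in authorities.items():
    for _ in range(40):                                    # small areas per authority
        x = random.uniform(0, 10)                          # area-level need indicator
        y = beta * x + u + random.gauss(0, 1)              # observed spend
        rows.append((j, x, y))

def ols_slope(pairs):
    """Ordinary least-squares slope for a list of (x, y) pairs."""
    xs = [x for x, _ in pairs]
    ys = [y for _, y in pairs]
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in pairs)
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Within-authority slopes recover beta even though each authority has its
# own intercept; averaging over authorities sharpens the estimate.
slopes = [ols_slope([(x, y) for jj, x, y in rows if jj == j])
          for j in authorities]
print(round(statistics.fmean(slopes), 2))   # close to the true beta of 2.0
```

The point of the sketch is why multilevel modelling matters here: pooling all 1,000 areas into one regression would confound authority-level effects with area-level need, whereas the nested structure separates the two levels of variation.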

    A survey of meat merchandisers and managers in Missouri : turkey products in the retail store

    Cover title. "This publication is the result of a joint field study effort of the Department of Agricultural Economics and Poultry Husbandry, University of Missouri--Columbia, in fulfillment of a grant made by the Missouri Turkey Merchandising Council"--P. [2] of cover.

    Lanczos modes for reduced-order control of flexible structures

    Lanczos mode models represent low-frequency forced response better than normal mode models do, and can be developed for both continuous and finite element structural representations. It was recommended that Lanczos mode models be developed for systems with multiple inputs and/or rigid body modes; that the numerical stability of the Lanczos algorithm be assessed; and that control system designs employing Lanczos mode models be attempted.
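The core of the approach above is the Lanczos iteration, which builds an orthonormal basis from a starting vector (here, a load pattern) and reduces a large symmetric matrix to a small tridiagonal one. The sketch below is a generic textbook Lanczos iteration on a random symmetric matrix, not the authors' structural formulation; the matrix, starting vector, and dimensions are all illustrative.

```python
import numpy as np

# Minimal Lanczos iteration: reduce symmetric A to tridiagonal T whose
# eigenvalues (Ritz values) approximate the extreme eigenvalues of A.
# Starting the recurrence from a force-distribution vector is what makes
# Lanczos vectors load-dependent, unlike normal modes.
def lanczos(A, q0, m):
    n = len(q0)
    Q = np.zeros((n, m))
    alpha, beta = np.zeros(m), np.zeros(m - 1)
    q = q0 / np.linalg.norm(q0)
    q_prev = np.zeros(n)
    b = 0.0
    for k in range(m):
        Q[:, k] = q
        w = A @ q - b * q_prev          # three-term recurrence
        alpha[k] = q @ w
        w -= alpha[k] * q
        if k < m - 1:
            b = np.linalg.norm(w)
            beta[k] = b
            q_prev, q = q, w / b
    return alpha, beta, Q

# Toy symmetric "structure" matrix.
rng = np.random.default_rng(1)
M = rng.standard_normal((50, 50))
A = M @ M.T
alpha, beta, Q = lanczos(A, rng.standard_normal(50), 20)
T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)

ritz = np.linalg.eigvalsh(T)
true_max = np.linalg.eigvalsh(A).max()
print(abs(ritz.max() - true_max) / true_max)  # small relative error
```

In practice the numerical-stability concern flagged in the abstract is real: without reorthogonalization (omitted here for brevity) the computed Lanczos vectors gradually lose orthogonality.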

    Reconstructing long-term ecological data from annual census returns: a test for observer bias in counts of bird populations on Skokholm 1928–2002

    Long-term ecological data are essential for conservation and for monitoring and evaluating the effects of environmental change. Bird populations have been routinely assessed on islands off the British coast for many years, and here long-term data for one such island, Skokholm, are evaluated for robustness in the light of some 20 changes in observers (wardens) on the island over nearly eight decades. The dataset was found to be robust when compared with bootstrap data, with no species showing significant changes in abundance in years when wardens changed. It is concluded that the breeding bird populations on Skokholm and other British offshore islands are an important scientific resource, and that protocols should be enacted to ensure the archiving of records, the continuance of data collection using standardised protocols into the future, and the recognition of such long-term data for science in terms of an appropriate conservation designation.
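The bootstrap comparison described above can be sketched in miniature: compare year-on-year count changes in warden-changeover years against a null distribution built by resampling years at random. Everything below is invented for illustration (the counts, the number of changeovers, the test statistic); it shows the shape of such a test, not the paper's actual procedure.

```python
import random
import statistics

# Toy stable population series over the 1928-2002 span, with ~20
# hypothetical warden-changeover years. All values are illustrative.
random.seed(42)
years = list(range(1928, 2003))
counts = [100 + random.gauss(0, 8) for _ in years]
changes = [counts[i + 1] - counts[i] for i in range(len(counts) - 1)]
changeover = random.sample(range(len(changes)), 20)

# Statistic: mean absolute year-on-year change at changeover years.
observed = statistics.fmean(abs(changes[i]) for i in changeover)

# Null distribution: the same statistic for randomly resampled year sets.
null = []
for _ in range(2000):
    idx = random.sample(range(len(changes)), len(changeover))
    null.append(statistics.fmean(abs(changes[i]) for i in idx))

# One-sided p-value: a large p means changeover years look no more
# volatile than any other years, i.e. no evidence of observer bias.
p = sum(n >= observed for n in null) / len(null)
print(p)
```

An observer effect would show up as the observed statistic sitting in the upper tail of the resampled distribution (small p).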

    Economic Evaluation of the Impact of Regenerative Agriculture on Farmer Risk

    The U.S. agriculture industry has seen decades of change through both technological and cultural innovation. Regenerative agriculture has become the topic at the center of today’s changes within the industry. Regenerative agriculture, closely related to the soil health movement, includes a range of practices intended to improve the condition and vigor of the soil. The primary regenerative practices, reduced or no-till and cover cropping, have gained momentum lately, with even the Biden administration focusing on them. While the assertions made by supporters of these movements sound hopeful, an economic analysis specific to regions of Texas was still needed. The first objective of this study was to determine whether regenerative practices increased yields and/or reduced yield risk enough to offset potentially higher production costs. The second objective was to determine whether these impacts differed for farms in different production regions of Texas. Farms in four regions of Texas were modeled, with the focus of the study built into each simulation. Each regenerative practice was run through the models and compared with each farm’s conventional base practices. For two of the representative farms, no-till practices resulted in a higher net present value on average than conventional operations over this five-year analysis. However, for the other two farms, conventional practices produced the highest average net present value. One constant result across all four farms was that the cover-cropping scenario received the lowest mean net present value. The models were created to assess the economics of transitioning to regenerative practices, and may be helpful to Texas producers weighing a transition from conventional practices. Texas farm operations may also benefit from the model, as it is updated, in making future transition decisions.
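The comparison metric in the study above is net present value over a five-year horizon. The sketch below shows only the discounting arithmetic with invented cash flows and an assumed discount rate; it is not the study's simulation model or its data.

```python
# Net present value of a stream of end-of-year net returns.
# Cash flows and the 5% discount rate are illustrative, not the study's.
def npv(cash_flows, rate):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

rate = 0.05
# Hypothetical five-year net returns ($/year) for one representative farm.
conventional = [120_000, 115_000, 125_000, 118_000, 122_000]
no_till = [110_000, 118_000, 126_000, 128_000, 130_000]  # lower early, higher later

# Whichever scenario has the higher NPV is preferred at this discount rate.
print(round(npv(conventional, rate)))
print(round(npv(no_till, rate)))
```

The ranking can flip with the discount rate: a transition whose benefits arrive in later years (as with soil-building practices) is penalized more heavily when the rate is high, which is one reason results differed across the four representative farms.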

    Predicting How Winter Affects Energetics of Age-0 Largemouth Bass: How Do Current Models Fare?

    During the first winter of life, loss of energy reserves as a function of low feeding activity and scarce prey may contribute to high mortality of age-0 largemouth bass Micropterus salmoides. To explore how two current bioenergetics models predict winter energy depletion, we quantified growth and consumption by age-0 largemouth bass from Alabama, Ohio, and Wisconsin fed maintenance rations in 55-L aquaria during three simulated winters mimicking temperatures and photoperiods at low temperate latitudes (Alabama; 33°N), middle latitudes (Ohio; 40°N), and high temperate latitudes (Wisconsin; 46°N). We compared observed growth in aquaria with that predicted by entering observed consumption into both models. During winter 1995–1996, we validated one of the models with a separate pool experiment (5,800 L) in which age-0 largemouth bass were fed either 0.5× or 1.5× maintenance ration. In aquaria, energy density of the largemouth bass declined in the high- and middle-latitude winters but not in the low-latitude winter. Though error was slight in the low- and middle-latitude winters for one of the models, both models underestimated growth in the high-latitude winter. To fit the model to the data, the function that estimates weight-specific resting metabolism had to be reduced by about 16%. In pools, where we predicted consumption from observed growth, the model adequately predicted consumption by largemouth bass fed 1.5× maintenance, but overestimated consumption by individuals fed 0.5× maintenance. Current bioenergetics models perform poorly at the cold temperatures (<6°C), photoperiods, and low prey abundances typical of high-latitude lakes, likely because metabolic costs are overestimated.
    This research was funded by National Science Foundation grant DEB 9407859 to R.A.S. and by Federal Aid in Sport Fish Restoration Project F-69-P, administered jointly by the U.S. Fish and Wildlife Service and the Ohio Division of Wildlife. A University Postdoctoral Fellowship and a Presidential Fellowship from The Ohio State University supported R.A.W. and J.E.G., respectively, during part of this work.
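The model structure at issue above is a daily energy budget: growth is what remains of consumption after respiration and waste losses. The sketch below is a generic budget of that shape with made-up coefficients, not either of the models tested in the study; its only purpose is to show why scaling down resting metabolism (the ~16% adjustment reported above) raises predicted growth at cold temperatures.

```python
import math

# Hypothetical daily energy budget: growth = consumption - respiration - waste.
# Coefficients (30.0, 0.8, 0.07, 0.3) are illustrative placeholders, not
# parameters from either model evaluated in the study.
def daily_growth(weight_g, temp_c, consumption_j, resp_scale=1.0):
    # Weight-specific resting metabolism: allometric in weight, with an
    # exponential temperature dependence; resp_scale lets us shrink it.
    resp_j = resp_scale * 30.0 * weight_g ** 0.8 * math.exp(0.07 * temp_c)
    waste_j = 0.3 * consumption_j            # egestion + excretion losses
    return consumption_j - resp_j - waste_j  # energy available for growth

# Same fish, same cold-water ration, with and without a 16% reduction
# in resting metabolism.
g_default = daily_growth(5.0, 4.0, 400.0)
g_reduced = daily_growth(5.0, 4.0, 400.0, resp_scale=0.84)
print(g_reduced > g_default)  # True: lower metabolic cost leaves more for growth
```

This is the mechanism behind the study's finding: if the respiration term is too high at <6°C, the model under-predicts growth, and deflating that term restores the fit.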

    What's in a pattern? Examining the Type of Signal Multivariate Analysis Uncovers At the Group Level

    Multivoxel pattern analysis (MVPA) has gained enormous popularity in the neuroimaging community over the past few years. At the group level, most MVPA studies adopt an "information based" approach in which the sign of each individual subject's effect is discarded and a non-directional summary statistic is carried over to the second level. This is in contrast to the directional "activation based" approach typical of univariate group-level analysis, in which both signal magnitude and sign are taken into account. The transition from examining effects in one voxel at a time to examining several voxels together (univariate to multivariate) has thus tacitly entailed a transition from a directional to a non-directional signal definition at the group level. While a directional group-level MVPA approach implies that individuals have similar multivariate spatial patterns of activity, in a non-directional approach each individual may have a distinct spatial pattern. Using an experimental dataset, we show that directional and non-directional group-level MVPA approaches uncover distinct brain regions with only partial overlap. We propose a method to quantify the degree of spatial similarity in activation patterns over subjects. Applied to an auditory task, we find higher values in auditory regions compared with control regions.
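The directional/non-directional distinction above comes down to whether the sign of each subject's effect survives to the second level. The toy sketch below (invented effect sizes, a plain mean vs. a mean of absolute values as the two summary statistics) shows how a strong but sign-inconsistent effect vanishes under a directional test yet remains under a non-directional one.

```python
import random
import statistics

# Twenty hypothetical subjects with a strong effect of inconsistent sign
# (magnitude ~2, sign random). All numbers are illustrative.
random.seed(7)
subjects = [random.choice([-1, 1]) * 2.0 + random.gauss(0, 0.3)
            for _ in range(20)]

# Directional ("activation based") summary: signed effects are averaged,
# so opposite signs cancel and the group effect shrinks toward zero.
directional = statistics.fmean(subjects)

# Non-directional ("information based") summary: the sign is discarded
# before the second level, so the effect magnitude is preserved.
non_directional = statistics.fmean(abs(s) for s in subjects)

print(round(directional, 2), round(non_directional, 2))
```

This is why the two group-level approaches can flag different regions: a region where all subjects share the effect's sign passes both tests, while one where subjects carry distinct spatial patterns passes only the non-directional one.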

    Microfabricated Ice-Detection Sensor

    Knowledge of ice conditions on important aircraft lift and control surfaces is critical for safe operation. These conditions can be determined with conventional ice-detection sensors, but such sensors are often expensive, require elaborate installation procedures, and interrupt the airflow. A micromachined, silicon-based, flush-mounted sensor which generates no internal heat has been designed, batch fabricated, packaged, and tested. The sensor is capable of distinguishing between an ice-covered and a clean surface. It employs a bulk-micromachined wafer with a 7-micrometer-thick, boron-doped silicon diaphragm which serves as one plate of a parallel-plate capacitor. This is bonded to a second silicon wafer containing the fixed electrodes: one to drive the diaphragm by application of a voltage, the other to measure the deflection by a change in capacitance. Diaphragm sizes ranged from 1 × 1 mm to 3 × 3 mm, and the gap between the capacitor plates is 2 micrometers. A 200 V d.c. potential applied to the driving electrode caused the capacitance to increase by approximately 0.6 pF, from a nominal capacitance of 0.6 pF, when the surface was ice free. After the sensor was cooled below the freezing point of water, the same voltage was applied to the drive electrode and the capacitance increased by the same amount. A drop of water was then placed over the diaphragm and allowed to freeze, creating an ice layer approximately 2 mm thick. The applied 200 V d.c. now produced no change in capacitance, confirming that the diaphragm was locked to the ice layer. Since the sensor uses capacitive actuation, it uses very little power and is an ideal candidate for inclusion in a wireless sensing system.
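The sensing principle above is ideal parallel-plate capacitance, C = ε₀·A/d. The sketch below evaluates that textbook formula for the stated diaphragm and gap dimensions; note that this idealised full-area estimate does not reproduce the device's reported 0.6 pF nominal value, which depends on the actual electrode geometry and layout.

```python
# Ideal parallel-plate capacitance C = eps0 * A / d, evaluated for the
# smallest diaphragm described above (1 mm x 1 mm plate, 2 micrometer gap).
# This idealisation ignores fringing fields and the real electrode layout,
# so it only bounds the order of magnitude of the device capacitance.
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def plate_capacitance(side_m, gap_m):
    return EPS0 * side_m ** 2 / gap_m

c = plate_capacitance(1e-3, 2e-6)
print(f"{c * 1e12:.1f} pF")  # prints "4.4 pF"
```

The ice-locking test then follows directly: when the diaphragm is free, the drive voltage deflects it and C = ε₀·A/d rises as d shrinks; when ice pins the diaphragm, d cannot change and the capacitance stays constant.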