815 research outputs found
Evaluating the employment impact of a mandatory job search program
This paper exploits area-based piloting and age-related eligibility rules to identify treatment effects of a labor market program – the New Deal for Young People in the U.K. A central focus is on substitution/displacement effects and on equilibrium wage effects. The program includes extensive job assistance and wage subsidies to employers. We find that the program significantly raised transitions to employment by about 5 percentage points. The impact is robust to a wide variety of nonexperimental estimators. However, we present some evidence that this effect may not be as large in the longer run
Evaluating the employment impact of a mandatory job search assistance program
This paper exploits area-based piloting and age-related eligibility rules to identify treatment effects of a labor market program – the New Deal for Young People in the UK. A central focus is on substitution/displacement effects and on equilibrium wage effects. The program includes extensive job assistance and wage subsidies to employers. We find that the program significantly raised transitions to employment by about five percentage points (about 20 percent over the pre-program base). The impact is robust to a wide variety of non-experimental estimators. However, we present some evidence suggesting that this effect may not be as large in the longer run
Evaluating the employment effects of a mandatory job search program
This paper exploits area-based piloting and age-related eligibility rules to identify treatment effects of a labor market program – the New Deal for Young People in the UK. A central focus is on substitution/displacement effects and on equilibrium wage effects. The program includes extensive job assistance and wage subsidies to employers. We find that, initially, the program significantly raised transitions to unsubsidized employment by about five percentage points. The impact is robust to a wide variety of non-experimental estimators. However, we present some evidence that this effect may not be as large in the longer run
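One way to read the identification strategy described in these abstracts (pilot areas combined with an age-based eligibility rule) is as a difference-in-differences comparison. The sketch below illustrates only that basic comparison under the assumption of a simple 0/1 employment-transition outcome; the file name and column names are hypothetical, and it is not the authors' estimator.

```python
import pandas as pd

# Minimal difference-in-differences sketch (illustrative only).
# Assumed columns: 'employed' (0/1 transition to employment),
# 'eligible' (1 = in the age band covered by the program),
# 'pilot' (1 = observation drawn from a pilot area).
df = pd.read_csv("new_deal_pilot.csv")  # hypothetical file

means = df.groupby(["pilot", "eligible"])["employed"].mean()

# Treatment-effect proxy:
#   (eligible vs. ineligible gap in pilot areas)
# - (eligible vs. ineligible gap in non-pilot areas)
did = (means.loc[(1, 1)] - means.loc[(1, 0)]) - (means.loc[(0, 1)] - means.loc[(0, 0)])
print(f"Difference-in-differences estimate: {did:.3f}")
```

The paper itself reports robustness across a range of non-experimental estimators; this two-by-two comparison of group means only conveys the underlying logic.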
Consortia vs reform: creating congruence
The inability of research libraries to offer the collections their users desire has become more pronounced each year. In response, libraries have moved in two directions: the rapidly growing consortial movement and the movement to advocate and support reforms in scholarly communication. However, if libraries do not pay careful attention, these two directions may, at least partially, cancel one another out
Welfare assessment: correlations and integration between a Qualitative Behavioural Assessment and a clinical/health protocol applied in veal calves farms
This study aimed to find correlations and possible integration between Qualitative Behavioural Assessment (QBA) and a specific clinical/health evaluation protocol. Both welfare assessment methods were based on direct animal observation and were applied on 24 Italian veal calf farms at 3 weeks of rearing. Principal component analysis (PCA) summarized 20 QBA descriptors into two main components (PC1 and PC2) with eigenvalues above 4, explaining 29.6% and 20.3% of the variation respectively. PCA on residuals obtained after correcting for housing condition yielded highly similar results, indicating that the rearing environment of the calves was not an important determinant of the observer reliability of QBA. A relationship was found between QBA PC2 and the presence of signs of cross-sucking recorded during the clinical visit (presence PC2=1.11 vs. absence PC2=-1.55,
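The PCA step described above is a standard descriptor summarization: scores for 20 QBA terms are reduced to two principal components. The sketch below is only an illustration of that step with made-up scores (24 farms by 20 descriptors); it is not the study's data or its exact analysis.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical data: 24 farms scored on 20 QBA descriptors.
rng = np.random.default_rng(0)
scores = rng.normal(size=(24, 20))

# Standardize descriptors, then extract the first two components,
# mirroring the PC1/PC2 summary reported in the abstract.
pca = PCA(n_components=2)
components = pca.fit_transform(StandardScaler().fit_transform(scores))

print("Explained variance (%):", np.round(pca.explained_variance_ratio_ * 100, 1))
```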
Calibration and Measurement of Potassium-40 with a Whole-Body Counter
Total potassium content in the human body may be obtained by measuring naturally occurring radioactive potassium-40 in a whole-body counter. The basic problem in calibrating the whole-body counter is discussed, as well as a phantom that is used to solve this problem. The normal potassium levels for men and for women were determined. Two additional parameters, lean body mass and total body water, were also calculated from the potassium content
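The derivation of total potassium, lean body mass and total body water from a K-40 measurement follows standard relationships. The sketch below uses typical literature constants (a natural-potassium specific activity of roughly 31 Bq/g, about 68.1 mmol of potassium per kg of lean mass, and a 73% hydration fraction of lean mass) purely for illustration; none of these numbers, nor the calibration of an actual counter, comes from the abstract.

```python
# Illustrative conversion from whole-body K-40 activity to derived body-composition values.
# All constants are assumed typical literature values, not taken from the paper.

K_SPECIFIC_ACTIVITY_BQ_PER_G = 31.0   # natural potassium (~0.0117% K-40), approximate
K_MMOL_PER_KG_LEAN_MASS = 68.1        # assumed whole-body relationship
HYDRATION_FRACTION = 0.73             # assumed water fraction of lean body mass
K_MOLAR_MASS_G_PER_MOL = 39.1

def body_composition_from_k40(activity_bq: float) -> dict:
    """Convert a measured K-40 activity (Bq) into derived body-composition values."""
    total_k_g = activity_bq / K_SPECIFIC_ACTIVITY_BQ_PER_G
    total_k_mmol = total_k_g / K_MOLAR_MASS_G_PER_MOL * 1000.0
    lean_body_mass_kg = total_k_mmol / K_MMOL_PER_KG_LEAN_MASS
    total_body_water_kg = HYDRATION_FRACTION * lean_body_mass_kg
    return {
        "total_potassium_g": total_k_g,
        "lean_body_mass_kg": lean_body_mass_kg,
        "total_body_water_kg": total_body_water_kg,
    }

# Example: ~4300 Bq corresponds to roughly 140 g of potassium.
print(body_composition_from_k40(4300.0))
```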
Variable selection for binary classification using error rate p-values applied to metabolomics data
BACKGROUND: Metabolomics datasets are often high-dimensional though only a limited number of variables are expected to be informative given a specific research question. The important task of selecting informative variables can therefore become complex. In this paper we look at discriminating between two groups. Two tasks need to be performed: (i) finding variables which differ between the two groups; and (ii) determining how the selected variables can be used to classify new subjects. We introduce an approach using minimum classification error rates as test statistics to find discriminatory and therefore informative variables. The thresholds resulting in the minimum error rates can be used to classify new subjects. This approach transforms error rates into p-values and is referred to as ERp. RESULTS: We show that non-parametric hypothesis testing, based on minimum classification error rates as test statistics, can find statistically significantly shifted variables. The discriminatory ability of variables becomes more apparent when error rates are evaluated based on their corresponding p-values, as relatively high error rates can still be statistically significant. ERp can handle unequal and small group sizes, as well as account for the cost of misclassification. ERp retains (if known) or reveals (if unknown) the shift direction, aiding in biological interpretation. The threshold resulting in the minimum error rate can immediately be used to classify new subjects. We use NMR generated metabolomics data to illustrate how ERp is able to discriminate subjects diagnosed with Mycobacterium tuberculosis infected meningitis from a control group. The list of discriminatory variables produced by ERp contains all biologically relevant variables with appropriate shift directions discussed in the original paper from which this data is taken. CONCLUSIONS: ERp performs variable selection and classification, is non-parametric and aids biological interpretation while handling unequal group sizes and misclassification costs. All this is achieved by a single approach which is easy to perform and interpret. ERp has the potential to address many other characteristics of metabolomics data. Future research aims to extend ERp to account for a large proportion of observations below the detection limit, as well as expand on interactions between variables
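The core of ERp, as described above, is to treat the minimum achievable classification error of a single thresholded variable as a test statistic and convert it into a p-value, with the minimizing threshold reused to classify new subjects. The sketch below illustrates that idea for one variable with a permutation-based p-value; it is a simplified reading of the abstract rather than the authors' implementation (which works with exact null distributions), and the example data are made up.

```python
import numpy as np

def min_error_rate(x, labels):
    """Smallest classification error achievable by thresholding x (in either direction)."""
    best = 1.0
    for t in np.unique(x):
        pred_up = (x >= t).astype(int)       # 'case' label above the threshold
        for pred in (pred_up, 1 - pred_up):  # try both shift directions
            best = min(best, float(np.mean(pred != labels)))
    return best

def erp_style_pvalue(x, labels, n_perm=2000, seed=0):
    """Permutation p-value for the minimum-error-rate test statistic (simplified)."""
    rng = np.random.default_rng(seed)
    observed = min_error_rate(x, labels)
    perm = [min_error_rate(x, rng.permutation(labels)) for _ in range(n_perm)]
    return observed, float(np.mean(np.array(perm) <= observed))

# Hypothetical example: one metabolite measured in controls (0) and cases (1).
rng = np.random.default_rng(1)
labels = np.array([0] * 20 + [1] * 15)
x = np.concatenate([rng.normal(0.0, 1.0, 20), rng.normal(1.5, 1.0, 15)])
err, p = erp_style_pvalue(x, labels)
print(f"minimum error rate = {err:.3f}, permutation p-value = {p:.4f}")
```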
Metabolomics variable selection and classification in the presence of observations below the detection limit using an extension of ERp
BACKGROUND: ERp is a variable selection and classification method for metabolomics data. ERp uses minimized classification error rates, based on data from a control and experimental group, to test the null hypothesis of no difference between the distributions of variables over the two groups. If the associated p-values are significant they indicate discriminatory variables (i.e. informative metabolites). The p-values are calculated assuming a common continuous strictly increasing cumulative distribution under the null hypothesis. This assumption is violated when zero-valued observations can occur with positive probability, a characteristic of GC-MS metabolomics data, disqualifying ERp in this context. This paper extends ERp to address two sources of zero-valued observations: (i) zeros reflecting the complete absence of a metabolite from a sample (true zeros); and (ii) zeros reflecting a measurement below the detection limit. This is achieved by allowing the null cumulative distribution function to take the form of a mixture between a jump at zero and a continuous strictly increasing function. The extended ERp approach is referred to as XERp. RESULTS: XERp is no longer non-parametric, but its null distributions depend only on one parameter, the true proportion of zeros. Under the null hypothesis this parameter can be estimated by the proportion of zeros in the available data. XERp is shown to perform well with regard to bias and power. To demonstrate the utility of XERp, it is applied to GC-MS data from a metabolomics study on tuberculosis meningitis in infants and children. We find that XERp is able to provide an informative shortlist of discriminatory variables, while attaining satisfactory classification accuracy for new subjects in a leave-one-out cross-validation context. CONCLUSION: XERp takes into account the distributional structure of data with a probability mass at zero without requiring any knowledge of the detection limit of the metabolomics platform. XERp is able to identify variables that discriminate between two groups by simultaneously extracting information from the difference in the proportion of zeros and shifts in the distributions of the non-zero observations. XERp uses simple rules to classify new subjects and a weight pair to adjust for unequal sample sizes or sensitivity and specificity requirements
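XERp's key extension is allowing probability mass at zero in the null distribution, so that zeros below the detection limit carry information alongside shifts in the non-zero values. One rough way to picture the resulting classification rule is sketched below: zeros are assigned to the group with the higher observed zero proportion, and non-zero values are thresholded as in ERp. This is an interpretation of the abstract under stated assumptions, not the published XERp algorithm, and the example data are made up.

```python
import numpy as np

def fit_xerp_like_rule(x, labels):
    """Sketch of an XERp-style rule: a zero-handling step plus an ERp-style threshold."""
    zero = (x == 0)
    # Zeros point to the group with the higher proportion of zeros.
    zero_group = int(np.mean(zero[labels == 1]) > np.mean(zero[labels == 0]))

    xs, ys = x[~zero], labels[~zero]
    best_err, best_thr, best_dir = 1.0, None, 1
    for t in np.unique(xs):
        for direction in (1, 0):  # 1: 'case' above threshold, 0: 'case' below
            pred = (xs >= t).astype(int) if direction else (xs < t).astype(int)
            err = float(np.mean(pred != ys))
            if err < best_err:
                best_err, best_thr, best_dir = err, t, direction
    return {"zero_group": zero_group, "threshold": best_thr, "direction": best_dir}

def classify(rule, value):
    """Classify a new subject's value with the fitted sketch rule."""
    if value == 0:
        return rule["zero_group"]
    above = value >= rule["threshold"]
    return int(above) if rule["direction"] else int(not above)

# Made-up GC-MS-style data: many zeros in controls (0), fewer in cases (1).
rng = np.random.default_rng(2)
controls = np.where(rng.random(25) < 0.5, 0.0, rng.lognormal(0.0, 0.5, 25))
cases = np.where(rng.random(20) < 0.1, 0.0, rng.lognormal(1.0, 0.5, 20))
x = np.concatenate([controls, cases])
labels = np.array([0] * 25 + [1] * 20)

rule = fit_xerp_like_rule(x, labels)
print(rule, classify(rule, 0.0), classify(rule, 3.2))
```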
- …