
    Evaluating Student Volunteer and Service-Learning Programs: A Casebook for Practitioners

    Today, evaluation concepts and methods are widely available to those who plan and administer student volunteer programs. Unfortunately, evaluation has all too often been carried out, and written about, in ways that have robbed it of its usefulness to people dealing with the realities of day-to-day program operation. Evaluation has thus acquired a reputation among practitioners of being too complex, too costly, too time-consuming, and even too threatening to be of much practical value.

    Cops, Teachers, and the Art of the Impossible: Explaining the lack of diffusion of impossible job innovations

    In their now classic Impossible Jobs in Public Management, Hargrove and Glidewell (1990) argue that public agencies with limited legitimacy, high conflict, low professional authority, and weak agency myths have essentially impossible jobs. Leaders of such agencies can do little more than cope, which is also a theme of James Q. Wilson (1989), among others. Yet in the years since the publication of Impossible Jobs, one such position, that of police commissioner, has proven possible. Over a sustained 17-year period, the New York City Police Department has achieved dramatic reductions in crime with relatively few political repercussions, as described by Kelling and Sousa (2001). A second impossible job discussed by Wilson and also by Frederick Hess (1999), city school superintendent, has also proven possible, with Houston and Edmonton having considerable academic success educating disadvantaged children. In addition, Atlanta and Pittsburgh enjoyed significant success in elementary schooling, though the gains were short-lived for reasons we will describe. More recently, under Michelle Rhee, Washington, D.C. schools have made the most dramatic gains among city school systems. These successes in urban crime control and public schooling have not been widely copied. Accordingly, we argue that the real conundrum of impossible jobs is why agency leaders fail to copy successful innovations. Building on the work of Teodoro (2009), we will discuss how the relative illegitimacy of clients and the inflexibility of personnel systems combine with the professional norms, job mobility, and progressive ambition of agency leaders to limit the diffusion of innovations in law enforcement and schooling. We will conclude with ideas about how to overcome these barriers.

    DRAGON-Data: A platform and protocol for integrating genomic and phenotypic data across large psychiatric cohorts

    Background: Current psychiatric diagnoses, although heritable, have not been clearly mapped onto distinct underlying pathogenic processes. The same symptoms often occur in multiple disorders, and a substantial proportion of both genetic and environmental risk factors are shared across disorders. However, the relationship between shared symptoms and shared genetic liability is still poorly understood. Aims: Well-characterised, cross-disorder samples are needed to investigate this matter, but few currently exist. Our aim is to develop procedures to purposely curate and aggregate genotypic and phenotypic data in psychiatric research. Method: As part of the Cardiff MRC Mental Health Data Pathfinder initiative, we have curated and harmonised phenotypic and genetic information from 15 studies to create a new data repository, DRAGON-Data. To date, DRAGON-Data includes over 45 000 individuals: adults and children with neurodevelopmental or psychiatric diagnoses, affected probands within collected families, and individuals who carry a known neurodevelopmental risk copy number variant. Results: We have processed the available phenotype information to derive core variables that can be reliably analysed across groups. In addition, all data-sets with genotype information have undergone rigorous quality control, imputation, copy number variant calling and polygenic score generation. Conclusions: DRAGON-Data combines genetic and non-genetic information and is available as a resource for research across traditional psychiatric diagnostic categories. Algorithms and pipelines used for data harmonisation are currently publicly available to the scientific community, and an appropriate data-sharing protocol will be developed as part of ongoing projects (DATAMIND) in partnership with Health Data Research UK.
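    The abstract mentions polygenic score generation as one step in the genotype pipeline. As a rough illustration of that step only (not the DRAGON-Data pipeline itself), a polygenic score is typically an effect-size-weighted sum of allele dosages; the variant IDs, weights, and individuals below are entirely hypothetical.

```python
import numpy as np

# Hypothetical per-allele effect sizes for a handful of variants (invented values).
effect_sizes = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.08}

# Hypothetical imputed allele dosages (0-2) for three individuals at those variants.
dosages = {
    "ID_1": {"rs0001": 2.0, "rs0002": 0.0, "rs0003": 1.0},
    "ID_2": {"rs0001": 1.0, "rs0002": 1.0, "rs0003": 0.0},
    "ID_3": {"rs0001": 0.0, "rs0002": 2.0, "rs0003": 2.0},
}

def polygenic_score(person_dosages, weights):
    """Weighted sum of allele dosages: the basic form of a polygenic score."""
    return sum(weights[v] * person_dosages.get(v, 0.0) for v in weights)

for person, d in dosages.items():
    print(person, round(polygenic_score(d, effect_sizes), 3))
```

    In practice the weights come from an external genome-wide association study and the dosages from the imputed genotype data described in the abstract.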

    A gene frequency model for QTL mapping using Bayesian inference

    Background: Information for mapping of quantitative trait loci (QTL) comes from two sources: linkage disequilibrium (LD; non-random association of allele states) and cosegregation (non-random association of allele origin). Information from LD can be captured by modeling conditional means and variances at the QTL given marker information. Similarly, information from cosegregation can be captured by modeling conditional covariances. Here, we consider a Bayesian model based on gene frequency (BGF) in which both conditional means and variances are modeled as functions of the conditional gene frequencies at the QTL. The parameters in this model include these gene frequencies, the additive effect of the QTL, its location, and the residual variance. Bayesian methodology was used to estimate these parameters. The priors used were: logit-normal for gene frequencies, normal for the additive effect, uniform for location, and inverse chi-square for the residual variance. Computer simulation was used to compare the power to detect and the accuracy to map QTL by this method with those from least squares analysis using a regression model (LSR). Results: To simplify the analysis, data from unrelated individuals in a purebred population were simulated, so that only LD information contributes to mapping the QTL. LD was simulated in a chromosomal segment of 1 cM containing one QTL by random mating in a population of size 500 for 1000 generations and in a population of size 100 for 50 generations. The comparison was studied under a range of conditions, which included SNP densities of 0.1, 0.05 or 0.02 cM, sample sizes of 500 or 1000, and phenotypic variance explained by the QTL of 2 or 5%. Both 1- and 2-SNP models were considered. Power to detect the QTL with BGF ranged from 0.4 to 0.99 and was close or equal to the power of least squares regression (LSR). Precision in mapping the QTL position, quantified by the mean absolute error, ranged from 0.11 to 0.21 cM for BGF and was better than that of LSR, which ranged from 0.12 to 0.25 cM. Conclusions: Given a high SNP density, the gene frequency model can be used to map QTL with considerable accuracy even within a 1 cM region.
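    The LSR comparator described in the abstract, least squares regression of phenotype on single-SNP genotype, is straightforward to illustrate. The sketch below simulates a marker panel in partial LD with a hidden QTL and scans each SNP by simple regression; the population sizes, LD structure, and heritability are invented for illustration, and this is not the authors' BGF model or their simulation settings.

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_snps, qtl_index = 1000, 11, 5   # invented sizes; the QTL sits at the middle marker

# Simulate haplotypes whose marker alleles copy a hidden QTL allele with a
# probability that decays with distance, creating a simple LD gradient.
def haplotype():
    q = rng.random() < 0.5                       # QTL allele on this haplotype
    copy_prob = 1.0 - 0.08 * np.abs(np.arange(n_snps) - qtl_index)
    copied = rng.random(n_snps) < copy_prob
    return np.where(copied, q, rng.random(n_snps) < 0.5).astype(int), int(q)

snp_geno = np.zeros((n, n_snps))
qtl_geno = np.zeros(n)
for i in range(n):
    h1, q1 = haplotype()
    h2, q2 = haplotype()
    snp_geno[i] = h1 + h2
    qtl_geno[i] = q1 + q2

# Phenotype: additive QTL effect explaining roughly 5% of the variance, plus noise.
a, h2_qtl = 0.5, 0.05
noise_sd = np.sqrt(a**2 * np.var(qtl_geno) * (1 / h2_qtl - 1))
y = a * qtl_geno + rng.normal(0, noise_sd, n)

# LSR scan: regress phenotype on each SNP in turn and keep the best-fitting position.
def r2(x, y):
    x = x - x.mean()
    y0 = y - y.mean()
    return (x @ y0) ** 2 / ((x @ x) * (y0 @ y0))

scores = [r2(snp_geno[:, j], y) for j in range(n_snps)]
print("best SNP index:", int(np.argmax(scores)), "| true QTL index:", qtl_index)
```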

    QTL detection by multi-parent linkage mapping in oil palm (Elaeis guineensis Jacq.)

    A quantitative trait locus (QTL) analysis designed for a multi-parent population was carried out and tested in oil palm (Elaeis guineensis Jacq.), a diploid cross-fertilising perennial species. A new extension of the MCQTL package was designed specifically for crosses between heterozygous parents. The algorithm, which is now available for any allogamous species, was used to perform and compare two types of QTL search for small-size families, within-family analysis and across-family analysis, using data from a 2 × 2 complete factorial mating experiment involving four parents from three selected gene pools. A consensus genetic map of the factorial design was produced using 251 microsatellite loci, the locus of the Sh major gene controlling fruit shell presence, and an AFLP marker of that gene. A set of 76 QTLs involved in 24 quantitative phenotypic traits was identified. A comparison of the QTL detection results showed that the across-family analysis was efficient thanks to the interconnected families, although the family-size issue is only partially solved. The identification of QTL markers from small progeny numbers and their use in marker-assisted selection strategies are discussed.
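    The contrast between within-family and across-family analysis can be illustrated with a toy marker regression: estimating a marker effect separately in each small family versus pooling the interconnected families around a shared effect. This is only a conceptual sketch with invented data and family sizes, not the MCQTL algorithm or the oil palm design described above.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy data: four small full-sib families from a 2 x 2 factorial cross (invented sizes).
families = {f: rng.integers(0, 3, size=30) for f in ["AxC", "AxD", "BxC", "BxD"]}
true_effect = 0.6
phenos = {f: true_effect * g + rng.normal(0, 1, g.size) for f, g in families.items()}

def slope(x, y):
    """Least squares slope of y on x (marker effect estimate)."""
    x = x - x.mean()
    return (x @ (y - y.mean())) / (x @ x)

# Within-family analysis: one estimate per family, each based on few individuals (noisy).
for f in families:
    print(f, "within-family effect:", round(slope(families[f], phenos[f]), 2))

# Across-family analysis: centre genotypes and phenotypes within family to absorb
# family means, then pool the interconnected families and estimate one shared effect.
x_all = np.concatenate([g - g.mean() for g in families.values()])
y_all = np.concatenate([p - p.mean() for p in phenos.values()])
print("across-family effect:", round((x_all @ y_all) / (x_all @ x_all), 2))
```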

    Improving management of type 1 diabetes in the UK: the Dose Adjustment For Normal Eating (DAFNE) programme as a research test-bed. A mixed-method analysis of the barriers to and facilitators of successful diabetes self-management, a health economic analysis, a cluster randomised controlled trial of different models of delivery of an educational intervention and the potential of insulin pumps and additional educator input to improve outcomes

    Peer reviewed

    What Electrophysiology Tells Us About Alzheimer’s Disease: A Window into the Synchronization and Connectivity of Brain Neurons

    Electrophysiology provides a real-time readout of neural functions and network capability in different brain states, on temporal (fractions of milliseconds) and spatial (micro, meso, and macro) scales unmet by other methodologies. However, current international guidelines do not endorse the use of electroencephalographic (EEG)/magnetoencephalographic (MEG) biomarkers in clinical trials performed in patients with Alzheimer’s disease (AD), despite a surge in recent validated evidence. This Position Paper of the ISTAART Electrophysiology Professional Interest Area endorses consolidated and translational electrophysiological techniques applied to both experimental animal models of AD and patients, to probe the effects of AD neuropathology (i.e., brain amyloidosis, tauopathy, and neurodegeneration) on neurophysiological mechanisms underpinning neural excitation/inhibition and neurotransmission, as well as brain network dynamics, synchronization, and functional connectivity reflecting thalamocortical and cortico-cortical residual capacity. Converging evidence shows relationships between abnormalities in EEG/MEG markers and cognitive deficits in groups of AD patients at different disease stages. The supporting evidence for the application of electrophysiology in AD clinical research as well as drug discovery pathways warrants an international initiative to include the use of EEG/MEG biomarkers in the main multicentric projects planned in AD patients, to produce conclusive findings challenging the present regulatory requirements and guidelines for AD studies.
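    As a concrete example of the synchronization measures this kind of work relies on, one common EEG/MEG metric is the phase-locking value (PLV) between two channels. The sketch below computes it on synthetic signals; it is a generic illustration under invented parameters, not a method taken from the Position Paper, and in practice the signals would first be band-pass filtered to the frequency band of interest.

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(0)
fs, seconds = 250, 4                      # invented sampling rate and duration
t = np.arange(fs * seconds) / fs

# Two synthetic "channels": a shared 10 Hz alpha rhythm plus independent noise,
# standing in for a pair of EEG/MEG sensors with partially synchronized activity.
alpha = np.sin(2 * np.pi * 10 * t)
ch1 = alpha + 0.8 * rng.normal(size=t.size)
ch2 = alpha + 0.8 * rng.normal(size=t.size)

def phase_locking_value(x, y):
    """PLV: mean resultant length of the instantaneous phase difference (0 to 1)."""
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

print("PLV between the two channels:", round(phase_locking_value(ch1, ch2), 2))
```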

    Creating and curating an archive: Bury St Edmunds and its Anglo-Saxon past

    This contribution explores the mechanisms by which the Benedictine foundation of Bury St Edmunds sought to legitimise and preserve its spurious pre-Conquest privileges and holdings throughout the Middle Ages. The archive is extraordinary in terms of the large number of surviving registers and cartularies which contain copies of Anglo-Saxon charters, many of which are wholly or partly in Old English. The essay charts the changing use to which these ancient documents were put in response to threats to the foundation's continued enjoyment of its liberties. The focus throughout the essay is to demonstrate how pragmatic considerations at every stage affected the development of the archive and the ways in which these linguistically challenging texts were presented, re-presented, and represented during the Abbey’s history.