
    Evaluation Research: Some Possible Contexts of Theory Failure

    What can evaluation research tell us about social science theory? It is the purpose of this paper to examine that question. Much has been written in the current literature about the relationship between theory and practice. Because it is evaluation research (Breedlove, 1972: 71-89; Newbrough, 1966: 39-52; Suchman, 1971: 43-48; Suchman, 1967; Weiss, 1973: 37-45; Fitz-Gibbon and Morris, 1975: 1-4) that attempts to analyze the results of practice, it is the authors' belief that an examination of evaluation research studies for possible contexts of theory failure will contribute to a linkage between theory and practice.

    Universal Impedance Fluctuations in Wave Chaotic Systems

    We experimentally investigate theoretical predictions of universal impedance fluctuations in wave chaotic systems using a microwave analog of a quantum chaotic infinite square well potential. Our approach emphasizes the use of the radiation impedance to remove the non-universal effects of the particular coupling from the outside world to the scatterer. Specific predictions that we test include the probability distribution functions (PDFs) of the real (related to the local density of states in disordered metals) and imaginary parts of the normalized cavity impedance, the equality of the variances of these PDFs, and the dependence of the universal PDFs on a single control parameter characterizing the level of loss. We find excellent agreement between the statistical data and theoretical predictions. Comment: 5 pages, 3 figures, submitted to Phys. Rev. Lett.
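The radiation-impedance normalization described in this abstract can be sketched in a few lines. The relation below (subtract the radiation reactance, divide by the radiation resistance) is the standard normalization used in this line of work; the function name and the numerical values are illustrative, not taken from the paper.

```python
def normalize_impedance(z_cav: complex, z_rad: complex) -> complex:
    """Remove non-universal coupling effects from a measured cavity
    impedance using the radiation impedance of the same port.

    z_cav : complex cavity impedance measured with the cavity present
    z_rad : complex radiation impedance of the port (cavity walls
            effectively removed, so only the coupling remains)
    """
    return (z_cav - 1j * z_rad.imag) / z_rad.real

# Illustrative values: a measured impedance and a port radiation impedance.
z = normalize_impedance(30 + 80j, 50 + 75j)
print(z)  # the normalized impedance, whose statistics are predicted
          # to depend only on the cavity loss parameter
```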

    Modelling and testing the x-ray performance of CCD and CMOS APS detectors using numerical finite element simulations

    Pixellated monolithic silicon detectors operated in a photon-counting regime are useful in spectroscopic imaging applications. Since a high-energy incident photon may produce many excess free carriers upon absorption, both energy and spatial information can be recovered by resolving each interaction event. The performance of these devices in terms of both energy and spatial resolution is in large part determined by the amount of diffusion which occurs during the collection of the charge cloud by the pixels. Past efforts to predict the X-ray performance of imaging sensors have used either analytical solutions to the diffusion equation or simplified Monte Carlo electron transport models. These methods are computationally attractive and highly useful but may be complemented by more physically detailed models based on TCAD simulations of the devices. Here we present initial results from a model which employs a full transient numerical solution of the classical semiconductor equations to model charge collection in device pixels under stimulation from initially Gaussian photogenerated charge clouds, using commercial TCAD software. Realistic device geometries and doping are included. By mapping the pixel response to different initial interaction positions and charge cloud sizes, the charge-splitting behaviour of the model sensor under various illuminations and operating conditions is investigated. Experimental validation of the model is presented from an e2v CCD30-11 device under varying substrate bias, illuminated using an Fe-55 source.
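The charge-splitting behaviour discussed above can be illustrated with a much simpler analytical model than the TCAD approach the paper uses: for a Gaussian charge cloud, the fraction collected by a rectangular pixel is a product of two one-dimensional error-function integrals. Everything here (function name, pixel pitch, cloud size) is an illustrative assumption, not the paper's method.

```python
import math

def pixel_fraction(x0, y0, sigma, x_lo, x_hi, y_lo, y_hi):
    """Fraction of a 2-D Gaussian charge cloud centred at (x0, y0)
    with standard deviation sigma (same units as the coordinates)
    that lands inside the rectangular pixel [x_lo, x_hi] x [y_lo, y_hi]."""
    def cdf(a):
        # 1-D Gaussian cumulative distribution evaluated at offset a
        return 0.5 * (1.0 + math.erf(a / (sigma * math.sqrt(2.0))))
    fx = cdf(x_hi - x0) - cdf(x_lo - x0)
    fy = cdf(y_hi - y0) - cdf(y_lo - y0)
    return fx * fy

# A cloud centred on the corner shared by four 20-um pixels splits
# its charge four ways: each pixel collects ~25% of the total.
f = pixel_fraction(0.0, 0.0, sigma=3.0, x_lo=0.0, x_hi=20.0, y_lo=0.0, y_hi=20.0)
print(round(f, 3))  # ~0.25
```

Mapping this fraction over interaction positions reproduces the familiar single-pixel / split-event pattern that the full TCAD model refines with realistic fields and doping.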

    Universal Statistics of the Scattering Coefficient of Chaotic Microwave Cavities

    We consider the statistics of the scattering coefficient S of a chaotic microwave cavity coupled to a single port. We remove the non-universal effects of the coupling from the experimental S data using the radiation impedance obtained directly from the experiments. We thus obtain the normalized, complex scattering coefficient whose probability density function (PDF) is predicted to be universal in that it depends only on the loss (quality factor) of the cavity. We compare experimental PDFs of the normalized scattering coefficients with those obtained from Random Matrix Theory (RMT), and find excellent agreement. The results apply to scattering measurements on any wave chaotic system. Comment: 10 pages, 8 figures, Fig. 7 in color, submitted to Phys. Rev.

    Ranking hospitals based on preventable hospital death rates: a systematic review with implications for both direct measurement and indirect measurement through standardized mortality rates

    Objectives There is interest in monitoring avoidable or preventable deaths measured directly or indirectly through standardized mortality rates (SMRs). We reviewed studies that use implicit case note reviews to estimate the range of preventable death rates observed, the measurement characteristics of those estimates, and the measurement procedures used to generate them. We comment on the implications for monitoring SMRs and illustrate a way to calculate the number of reviews needed to establish a reliable estimate of the preventability of one death or of the hospital preventable death rate. Design Systematic review of the literature supplemented by re-analysis of the authors' previously published and unpublished data and measurement design calculations. Data source Searches in PubMed, MEDLINE (OvidSP) and Web of Knowledge in June 2012, updated December 2017. Eligibility criteria Studies of hospital-wide admissions from general and acute medical wards where preventable death rates are provided or can be estimated and which can provide inter-observer variations. Results Twenty-four studies were included from 1983-2017. Recent larger studies suggest consistently low rates of preventable deaths (3.0-6.5% since 2012). Reliability of a single review for distinguishing between individual cases with regard to the preventability of death had a Kappa of 0.27-0.50 for deaths and 0.24-0.76 for adverse events. A Kappa of 0.35 would require an average of 8-17 reviews of a single case to be precise enough to have confidence about high-stakes decisions to change care procedures or impose sanctions within a hospital as a result. No study estimated the variation in preventable deaths across hospitals, although we were able to re-analyse one study to obtain an estimate. Based on this estimate, 200-300 total case-note reviews per hospital could be required to reliably distinguish between hospitals.
The studies display considerable heterogeneity: 13/24 studies defined preventable with a threshold of ≥4 on a six-category Likert scale; 11/24 involved a two-stage screening process with nurses at the first stage and physicians at the second. Fifteen studies provided expert clinical review support for reviewer disagreements, advice, or quality control. A 'generalist/internist' was the modal physician specialty for reviewers, and reviewers received 1-3 days of generic orientation to the tools and case-note review practice. Methods did not consider the influence of human or environmental factors. Conclusions The literature provides limited information about the measurement characteristics of preventable deaths, and that information suggests substantial numbers of reviews may be needed to create reliable estimates of preventable deaths at the individual or hospital level. Any operational program would require population-specific estimates of reliability. Preventable death rates are low, which is likely to make it difficult to use SMRs based on all deaths to validly profile hospitals. The literature provides little information to guide improvements in the measurement procedures. Systematic review registration The systematic review was conceived prior to PROSPERO, and so has not been registered.
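The "8-17 reviews" figure quoted in the Results follows from the Spearman-Brown prophecy formula, which gives the number of parallel reviews k needed to raise a single-review reliability r to a target reliability R. The target values 0.80 and 0.90 below are illustrative choices consistent with the quoted range, not figures stated in the abstract.

```python
import math

def reviews_needed(r: float, target: float) -> int:
    """Spearman-Brown prophecy formula solved for the number of
    parallel reviews k needed so that the mean of k reviews, each
    with single-review reliability r, reaches the target reliability:
        target = k*r / (1 + (k-1)*r)
        =>  k = target*(1 - r) / (r * (1 - target))
    """
    k = target * (1.0 - r) / (r * (1.0 - target))
    return math.ceil(k)

# With a single-review Kappa of 0.35:
print(reviews_needed(0.35, 0.80))  # 8 reviews for reliability 0.80
print(reviews_needed(0.35, 0.90))  # 17 reviews for reliability 0.90
```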

    Methane Production in a 100-L Upflow Bioreactor by Anaerobic Digestion of Farm Waste

    Manure waste from dairy farms has been used for methane production for decades; however, problems such as digester failure are routine. The problem has been investigated in small-scale (1-2 L) digesters in the laboratory, but very little scale-up to intermediate scales is available. We report production of methane in a 100-L digester and the results of an investigation into the effect of partial mixing induced by gas upflow/recirculation in the digester. The digester was operated for a period of about 70 d (with a 16-d hydraulic retention time) with and without the mixing induced by gas recirculation through an internal draft tube. The results show a clear effect of mixing on digester operation. Without any mixing, digester performance deteriorated within 30-50 d, whereas with mixing continuous production of methane was observed. This study demonstrates the importance of mixing and its critical role in the design of large-scale anaerobic digesters. Copyright © 2006 by Humana Press Inc. All rights of any nature whatsoever reserved.
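The operating point quoted above implies a daily feed rate through the definition of hydraulic retention time, HRT = V/Q. The calculation below simply restates the abstract's 100 L working volume and 16-d HRT; the function name is ours.

```python
def daily_feed_rate(volume_l: float, hrt_days: float) -> float:
    """Daily feed rate Q (L/day) implied by a working volume V (L)
    and a hydraulic retention time HRT (days), from HRT = V / Q."""
    return volume_l / hrt_days

# 100-L digester operated at a 16-day hydraulic retention time:
print(daily_feed_rate(100.0, 16.0))  # 6.25 L/day
```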

    Rats distinguish between absence of events and lack of evidence in contingency learning.

    The goal of three experiments was to study whether rats are aware of the difference between absence of events and lack of evidence. We used a Pavlovian extinction paradigm in which lights consistently signaling sucrose were suddenly paired with the absence of sucrose. The crucial manipulation involved the absent outcomes in the extinction phase. Whereas in the Cover conditions access to the drinking receptacle was blocked by a metal plate, in the No Cover conditions the drinking receptacle was accessible. The Test phase showed that in the Cover conditions the measured expectancies of sucrose were clearly higher than in the No Cover conditions. We compare two competing theories potentially explaining the findings. A cognitive theory interprets the observed effect as evidence that the rats were able to understand that the cover blocked informational access to the outcome, and that the changed learning input therefore did not necessarily signify a change of the underlying contingency in the world. An alternative associationist account, renewal theory, might instead explain the relative sparing of extinction in the Cover condition as a consequence of context change. We discuss the merits of both theories as accounts of our data and conclude that the cognitive explanation is in this case preferred.

    Recent applications of a single quadrupole mass spectrometer in 11C, 18F and radiometal chemistry

    Mass spectrometry (MS) has longstanding applications in radiochemistry laboratories, stemming from carbon dating. However, research on the development of radiotracers for molecular imaging with either positron emission tomography (PET) or single photon emission computed tomography has yet to take full advantage of MS. This inertia has been attributed to the relatively low concentrations of radiopharmaceutical formulations and to lack of access to the required MS equipment, owing to the high costs of purchasing and maintaining specialized MS systems. To date, single quadrupole (SQ)-MS coupled to liquid chromatography (LC) systems has been the main form of MS used in radiochemistry laboratories. These LC-MS systems are primarily used for assessing the chemical purity of radiolabeling precursor or standard molecules but also have applications in the determination of metabolites. Herein, we highlight personal experiences using a compact SQ-MS in our PET radiochemistry laboratories to monitor the small amounts of carrier observed in most radiotracer preparations, even at high molar activities. We demonstrate the use of SQ-MS to observe the low-mass, non-radioactive species that are formed along with the radiotracer from these trace amounts of carrier. We also describe a pre-concentration system to detect dilute radiopharmaceutical formulations and metabolite analyses by SQ-MS. Selected examples where SQ-MS was critical for optimization of radiochemical reactions and for unequivocal characterization of radiotracers are showcased. We further illustrate examples where SQ-MS can be applied to the identification of radiometal complexes and to the development of a new purification methodology for Pd-catalyzed radiofluorination reactions, shedding light on the identity of the metal complexes present in the labelling solution.