Atovaquone Compared with Dapsone for the Prevention of Pneumocystis carinii Pneumonia in Patients with HIV Infection Who Cannot Tolerate Trimethoprim, Sulfonamides, or Both
BACKGROUND
Although trimethoprim–sulfamethoxazole is the drug of choice for the prevention of Pneumocystis carinii pneumonia, many patients cannot tolerate it and must switch to an alternative agent.
METHODS
We conducted a multicenter, open-label, randomized trial comparing daily atovaquone (1500-mg suspension) with daily dapsone (100 mg) for the prevention of P. carinii pneumonia among patients infected with the human immunodeficiency virus who could not tolerate trimethoprim–sulfamethoxazole. The median follow-up period was 27 months.
RESULTS
Of 1057 patients enrolled, 298 had a history of P. carinii pneumonia. P. carinii pneumonia developed in 122 of 536 patients assigned to atovaquone (15.7 cases per 100 person-years), as compared with 135 of 521 in the dapsone group (18.4 cases per 100 person-years; relative risk for atovaquone vs. dapsone, 0.85; 95 percent confidence interval, 0.67 to 1.09; P=0.20). The relative risk of death was 1.07 (95 percent confidence interval, 0.89 to 1.30; P=0.45), and the relative risk of discontinuation of the assigned medication because of adverse events was 0.94 (95 percent confidence interval, 0.74 to 1.19; P=0.59). Among the 546 patients who were receiving dapsone at baseline, the relative risk of discontinuation because of adverse events was 3.78 for atovaquone as compared with dapsone (95 percent confidence interval, 2.37 to 6.01; P
CONCLUSIONS
Among patients who cannot tolerate trimethoprim–sulfamethoxazole, atovaquone and dapsone are similarly effective for the prevention of P. carinii pneumonia. Our results support the continuation of dapsone prophylaxis among patients who are already receiving it. However, among those not receiving dapsone, atovaquone is better tolerated and may be the preferred choice for prophylaxis against P. carinii pneumonia.
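The headline efficacy comparison is simply the ratio of the two incidence rates quoted above; a one-line check (Python, using only figures from the abstract) recovers the reported relative risk:

```python
atovaquone_rate = 15.7  # P. carinii pneumonia cases per 100 person-years
dapsone_rate = 18.4     # P. carinii pneumonia cases per 100 person-years

print(f"relative risk: {atovaquone_rate / dapsone_rate:.2f}")  # 0.85, as reported
```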
Development of a grass measurement optimisation tool to efficiently measure herbage mass on grazed pastures
Accurate and efficient estimation of herbage mass is essential for optimising grass utilisation and increasing profit on pasture-based farms. There is no definitive sampling protocol for grass measurement on Irish pastures. This paper presents the Grass Measurement Optimisation Tool (GMOT), designed to generate measurement protocols that optimise for time and accuracy. The GMOT was designed as a decision support tool that generates interactive paddock maps guiding the farmer on how to measure their pastures optimally in a random stratified manner based on GPS co-ordinates, resulting in accurate, unbiased estimates of mean herbage mass. Rising plate meter (RPM) measurements and reference herbage cuts were performed on trial plots and grazed paddocks over three years. Measurement routes were optimised using a genetic algorithm applied to a traveling salesman problem. Actual survey error was estimated in terms of relative prediction error using Monte Carlo simulations that combined measurement and calibration error distributions for the RPM. A cost–benefit analysis was conducted to evaluate the feasibility of using the GMOT on Irish grasslands. Actual error for the RPM decreased from 37% to 26% as the measurement rate increased from 1 to 8 measurements ha⁻¹, and further reductions in error were negligible (<1%) as the rate increased from 8 to 32 measurements ha⁻¹. Calibration error was the largest source of error (25.9%), compared with measurement error (8%). Optimal measurement value was achieved at 8 measurements ha⁻¹; further increasing the measurement rate yielded diminishing returns. The GMOT is compatible with a range of pasture measurement technologies.
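The survey-error estimate combines two error sources by Monte Carlo simulation. The abstract does not specify that simulation, so the sketch below (Python) only illustrates the principle, taking the published 8% measurement and 25.9% calibration error figures but an otherwise hypothetical error model: per-measurement noise averages down as the sampling rate rises, while calibration error is shared by every RPM reading in a survey and therefore does not.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative assumptions only: the GMOT's actual error model is not given in
# the abstract. Only the 8% measurement and 25.9% calibration errors are taken
# from the reported results.
true_mean_mass = 1500.0   # kg DM/ha, hypothetical paddock mean herbage mass
measurement_cv = 0.08     # relative error of an individual RPM measurement
calibration_cv = 0.259    # relative error of the RPM calibration equation

def relative_prediction_error(measurements_per_ha, n_sims=20_000):
    """Monte Carlo estimate of survey error for a given sampling rate (1 ha paddock)."""
    # Independent per-measurement noise averages down with more measurements...
    meas = rng.normal(1.0, measurement_cv, size=(n_sims, measurements_per_ha)).mean(axis=1)
    # ...whereas calibration error applies to the whole survey and does not.
    calib = rng.normal(1.0, calibration_cv, size=n_sims)
    estimates = true_mean_mass * meas * calib
    return np.sqrt(np.mean((estimates - true_mean_mass) ** 2)) / true_mean_mass

for rate in (1, 8, 32):
    print(f"{rate:>2} measurements/ha: relative prediction error ~ {relative_prediction_error(rate):.1%}")
```

The reported errors also fold in spatial variability across a paddock, which this toy model ignores; the point is only that the shared calibration error dominates once a handful of measurements per hectare have been taken, which is why returns diminish beyond roughly 8 measurements ha⁻¹.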
Photovoltaic systems on dairy farms: Financial and renewable multi-objective optimization (FARMOO) analysis
The aim of this study was to develop a financial and renewable multi-objective optimization (FARMOO) method for dairy farms. Due to increased global milk production and European Union policies concerning renewable energy contributions, optimizing dairy farms from both financial and renewable standpoints is crucial. The FARMOO method found the optimal combination of dairy farm equipment and management practices, based on a trade-off parameter which quantified the relative importance of maximizing farm net profit and maximizing farm renewable contribution. A PV system model was developed and validated to assess the financial performance and renewable contribution of this technology in a dairy farming context. Seven PV system sizes were investigated, ranging from 2 kWp to 11 kWp. Multi-objective optimization using a genetic algorithm was implemented to find the optimal combination of equipment and management practices based on this trade-off parameter. For a test case of a 195-cow spring-calving dairy farm in Ireland, it was found that when the relative importance of farm net profit was high, a PV system was not included in the optimal farm configuration. When net profit and renewable contribution were of equal importance, the optimal farm configuration included an 11 kWp PV system with a scheduled water-heating load at 10:00. Multi-objective optimization was also carried out for the same test case with the goals of maximizing farm net profit and minimizing farm CO2 emissions; under this scenario, the optimal farm configuration included an 11 kWp PV system when the relative importance of farm net profit was low. A sensitivity analysis investigated the use of a 40% grant aid on PV system capital costs and found that grant aid did not significantly improve the financial feasibility of PV systems on dairy farms. Moreover, load shifting of a farm's water heating enabled the majority of the PV system's electricity output to be consumed, so the use of batteries with small PV systems on dairy farms may not be necessary. The method described in this study will be used to inform policy and provide decision support relating to PV systems on dairy farms.
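The optimisation hinges on a single trade-off parameter that weights net profit against renewable contribution. The abstract does not give the objective function, so the following is only a minimal Python sketch of how such a scalarised fitness could be scored for a candidate configuration; all numbers, field names and the toy profit/renewable models are illustrative assumptions, not FARMOO's.

```python
from dataclasses import dataclass

@dataclass
class FarmConfig:
    # Hypothetical decision variables, for illustration only.
    pv_size_kwp: float         # e.g. 0, 2, ..., 11 kWp
    shift_water_heating: bool  # schedule water heating to coincide with PV output?

def net_profit(cfg: FarmConfig) -> float:
    """Toy annual profit (euro/year): baseline profit, minus annualised PV capital
    cost, plus savings from self-consumed PV electricity. Illustrative numbers."""
    pv_cost = 1400.0 * cfg.pv_size_kwp / 10.0               # capex spread over 10 years
    self_use = 0.6 if cfg.shift_water_heating else 0.3      # fraction of PV output used on-farm
    pv_savings = cfg.pv_size_kwp * 850.0 * self_use * 0.20  # kWh/kWp/yr * euro/kWh avoided
    return 70_000.0 - pv_cost + pv_savings

def renewable_share(cfg: FarmConfig) -> float:
    """Toy renewable contribution: self-consumed PV output / total farm demand."""
    self_use = 0.6 if cfg.shift_water_heating else 0.3
    return min(1.0, cfg.pv_size_kwp * 850.0 * self_use / 40_000.0)

def fitness(cfg: FarmConfig, alpha: float) -> float:
    """Scalarised objective: alpha = 1 optimises profit only, alpha = 0 renewable
    contribution only; both terms are normalised so they can be combined."""
    return alpha * net_profit(cfg) / 75_000.0 + (1.0 - alpha) * renewable_share(cfg)

configs = [FarmConfig(kwp, shift) for kwp in (0.0, 2.0, 11.0) for shift in (False, True)]
print(max(configs, key=lambda c: fitness(c, alpha=0.5)))   # equal weighting
print(max(configs, key=lambda c: fitness(c, alpha=1.0)))   # profit only
```

With these toy numbers the equal-weighting case selects the 11 kWp system with load shifting, while the profit-only case selects no PV, mirroring the qualitative result reported above.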
SKA studies of nearby galaxies: star-formation, accretion processes and molecular gas across all environments
The SKA will be a transformational instrument in the study of our local Universe. In particular, by virtue of its high sensitivity (both to point sources and to diffuse low-surface-brightness emission), angular resolution and the frequency ranges covered, the SKA will undertake a very wide range of astrophysical research in the field of nearby galaxies. By surveying vast numbers of nearby galaxies of all types with Jy sensitivity and sub-arcsecond angular resolution at radio wavelengths, the SKA will provide the cornerstone of our understanding of star formation and accretion activity in the local Universe. In this chapter we outline the key continuum and molecular line science areas where the SKA, both during phase 1 and when it becomes the full SKA, will have a significant scientific impact.
Haplotyping the human leukocyte antigen system from single chromosomes
We describe a method for determining the parental HLA haplotypes of a single individual without recourse to conventional segregation genetics. Blood samples were cultured to identify and sort chromosome 6 by bivariate flow cytometry. Single chromosome 6 amplification products were confirmed with a single nucleotide polymorphism (SNP) array and verified by deep sequencing to enable assignment of both alleles at the HLA loci, defining the two haplotypes. This study exemplifies a rapid and efficient method of haplotyping that can be applied to any chromosome pair, or indeed all chromosome pairs, using a single sorting operation. The method represents a cost-effective approach to complete phasing of SNPs, which will facilitate a deeper understanding of the links between SNPs, gene regulation and protein function.
2d Stringy Black Holes and Varying Constants
Motivated by recent interest in models with varying constants, and in whether black hole physics can constrain such theories, we consider two-dimensional charged stringy black holes. We exploit the role of two-dimensional stringy black holes as toy models for exploring paradoxes that may lead to constraints on a theory. A two-dimensional charged stringy black hole is investigated in two different settings: first, treated as an isolated object, and second, contained in a thermal environment. In both cases, it is shown that the temperature and the entropy of the two-dimensional charged stringy black hole decrease when its electric charge increases in time. Piecing together our results and previous ones, we conclude that, in the context of black hole thermodynamics, one cannot derive any model-independent constraints on the varying constants. It therefore seems that no varying-constant theory is disfavoured by black hole thermodynamics.
Comment: 12 pages, LaTeX, to appear in JHEP
Particle Filtering for Sequential Spacecraft Attitude Estimation
A new spacecraft attitude estimation approach using particle filtering is derived. Based on sequential Monte Carlo simulation, the particle filter approximately represents the probability distribution of the state vector with random samples. The filter formulation is based on star camera measurements, using a gyro-based or attitude dynamics-based model for attitude propagation. Modified Rodrigues parameters are used for attitude parametrization when the sample mean and covariance of the attitude are computed, and the ambiguity problem associated with the modified Rodrigues parameters in the mean and covariance computation is addressed as well. By using the uniform attitude probability distribution as the initial attitude distribution and a gradually decreasing measurement variance in the computation of the importance weights, the particle filter based attitude estimator possesses global convergence properties. Simulation results indicate that the particular particle filter, known as the bootstrap filter, with as many as 2000 particles is able to converge from arbitrary initial attitude errors and initial gyro bias errors as large as 4500 degrees per hour per axis.
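The estimator follows the standard bootstrap (sampling–importance–resampling) particle filter cycle of propagate, weight, resample. The attitude-specific machinery (modified Rodrigues parameters, gyro-bias states, the gradually decreasing measurement variance) is not reproduced here; the Python sketch below runs the same cycle on a scalar toy system purely to illustrate the structure, with all models and constants chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy scalar system standing in for the attitude problem:
#   x_k = 0.95 x_{k-1} + process noise,   y_k = x_k + measurement noise.
# The spacecraft filter instead propagates an attitude (gyro- or dynamics-based
# model) and weights particles against star-camera measurements.
N = 2000                         # particle count, matching the number quoted above
process_std, meas_std = 0.1, 0.5
steps = 50

def bootstrap_filter(measurements):
    particles = rng.uniform(-5.0, 5.0, size=N)       # diffuse initial distribution
    estimates = []
    for y in measurements:
        # 1. propagate each particle through the dynamics model
        particles = 0.95 * particles + rng.normal(0.0, process_std, size=N)
        # 2. weight by the measurement likelihood (Gaussian here)
        log_w = -0.5 * ((y - particles) / meas_std) ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        # 3. resample in proportion to the weights (bootstrap step)
        particles = particles[rng.choice(N, size=N, p=w)]
        estimates.append(particles.mean())            # posterior-mean estimate
    return np.array(estimates)

# Simulate a trajectory, corrupt it with noise, and run the filter.
truth = np.zeros(steps)
for k in range(1, steps):
    truth[k] = 0.95 * truth[k - 1] + rng.normal(0.0, process_std)
measurements = truth + rng.normal(0.0, meas_std, size=steps)
print("mean absolute error:", np.abs(bootstrap_filter(measurements) - truth).mean())
```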
Search For Trapped Antihydrogen
We present the results of an experiment to search for trapped antihydrogen atoms with the ALPHA antihydrogen trap at the CERN Antiproton Decelerator. Sensitive diagnostics of the temperatures, sizes, and densities of the trapped antiproton and positron plasmas have been developed, which in turn permitted development of techniques to precisely and reproducibly control the initial experimental parameters. The use of a position-sensitive annihilation vertex detector, together with the capability of controllably quenching the superconducting magnetic minimum trap, enabled us to carry out a high-sensitivity and low-background search for trapped synthesised antihydrogen atoms. We aim to identify the annihilations of antihydrogen atoms held for at least 130 ms in the trap before being released over ~30 ms. After a three-week experimental run in 2009 involving mixing of 10^7 antiprotons with 1.3 × 10^9 positrons to produce 6 × 10^5 antihydrogen atoms, we have identified six antiproton annihilation events that are consistent with the release of trapped antihydrogen. The cosmic ray background, estimated to contribute 0.14 counts, is incompatible with this observation at a significance of 5.6 sigma. Extensive simulations predict that an alternative source of annihilations, the escape of mirror-trapped antiprotons, is highly unlikely, though this possibility has not yet been ruled out experimentally.
Comment: 12 pages, 7 figures
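The quoted significance follows from the Poisson probability of observing six or more events when only 0.14 background counts are expected; a short check (a sketch, not the collaboration's analysis code) reproduces the 5.6 sigma figure:

```python
from scipy.stats import norm, poisson

background = 0.14   # expected cosmic-ray counts (from the abstract)
observed = 6        # candidate annihilation events

# Probability of observing >= 6 events from background alone...
p_value = poisson.sf(observed - 1, background)
# ...converted to a one-sided Gaussian significance.
significance = norm.isf(p_value)
print(f"p = {p_value:.1e}, significance = {significance:.1f} sigma")  # ~5.6 sigma
```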