    Clinical effectiveness and cost-effectiveness of pegvisomant for the treatment of acromegaly: a systematic review and economic evaluation

    Background: Acromegaly, an orphan disease usually caused by a benign pituitary tumour, is characterised by hypersecretion of growth hormone (GH) and insulin-like growth factor I (IGF-1). It is associated with reduced life expectancy, cardiovascular problems, a variety of insidiously progressing detrimental symptoms and metabolic malfunction. Treatments include surgery, radiotherapy and pharmacotherapy. Pegvisomant (PEG) is a genetically engineered GH analogue licensed as a third- or fourth-line option when other treatments have failed to normalise IGF-1 levels. Methods: Evidence about the effectiveness and cost-effectiveness of PEG was systematically reviewed. Data were extracted from published studies and used for a narrative synthesis of evidence. A decision-analytic economic model was identified and modified to assess the cost-effectiveness of PEG. Results: One RCT and 17 non-randomised studies were reviewed for effectiveness. PEG substantially reduced and rapidly normalised IGF-1 levels in the majority of patients, approximately doubled GH levels, and improved some of the signs and symptoms of the disease. Tumour size was unaffected, at least in the short term. PEG had a generally safe adverse event profile, but a few patients were withdrawn from treatment because of raised liver enzymes. An economic model was identified and adapted to estimate the lower limit for the cost-effectiveness of PEG treatment versus standard care. Over a 20-year time horizon the incremental cost-effectiveness ratio was £81,000/QALY and £212,000/LYG. To reduce this to £30,000/QALY would require a reduction in drug cost of about one third. Conclusion: PEG is highly effective for improving patients' IGF-1 levels. Signs and symptoms of disease improve, but evidence is lacking about long-term effects on signs and symptoms of disease, quality of life, patient compliance and safety. Economic evaluation indicated that, if current UK standards for determining the cost-effectiveness of therapies were applied to PEG, it would be considered not to represent good value for money.
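The cost-per-QALY figures in the abstract come from a standard incremental cost-effectiveness ratio (ICER) calculation: extra cost divided by extra health gain versus the comparator. The sketch below illustrates that arithmetic with hypothetical cost and QALY inputs chosen only so the first ratio matches the £81,000/QALY figure; they are not the model's actual parameters.

```python
# Incremental cost-effectiveness ratio: extra cost per extra QALY gained.
# All numbers below are hypothetical illustrations, not the review's inputs.

def icer(extra_cost_gbp: float, extra_qalys: float) -> float:
    """Cost difference divided by QALY difference (GBP per QALY)."""
    return extra_cost_gbp / extra_qalys

# Suppose PEG added 2.0 QALYs over a 20-year horizon at an extra cost of
# 162,000 GBP versus standard care: the ICER would be 81,000 GBP/QALY.
print(icer(162_000, 2.0))   # 81000.0

# Halving the extra cost only brings the ICER down to 40,500 GBP/QALY,
# which shows why a large drug-price reduction is needed to approach a
# 30,000 GBP/QALY willingness-to-pay threshold.
print(icer(81_000, 2.0))    # 40500.0
```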

    Loop Quantum Gravity à la Aharonov-Bohm

    The state space of Loop Quantum Gravity admits a decomposition into orthogonal subspaces associated with diffeomorphism equivalence classes of spin-network graphs. In this paper I investigate the possibility of obtaining this state space from the quantization of a topological field theory with many degrees of freedom. The starting point is a 3-manifold with a network of defect-lines. A locally flat connection on this manifold can have non-trivial holonomy around non-contractible loops. This is in fact the mathematical origin of the Aharonov-Bohm effect. I quantize this theory using standard field-theoretical methods. The functional integral defining the scalar product is shown to reduce to a finite-dimensional integral over moduli space. A non-trivial measure given by the Faddeev-Popov determinant is derived. I argue that the scalar product obtained coincides with the one used in Loop Quantum Gravity. I provide an explicit derivation in the case of a single defect-line, corresponding to a single loop in Loop Quantum Gravity. Moreover, I discuss the relation with spin-networks as used in the context of spin foam models.
    Comment: 19 pages, 1 figure; v2: corrected typos, section 4 expanded
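The holonomy statement in the abstract is textbook material that can be made explicit. The expressions below are the standard ones (not taken from the paper itself): the holonomy of a connection along a loop, and its abelian Aharonov-Bohm special case.

```latex
% Holonomy of a connection A along a loop \gamma (path-ordered exponential):
\mathrm{hol}_\gamma(A) \;=\; \mathcal{P}\exp\oint_\gamma A .
% For a flat connection (F = dA + A \wedge A = 0) the holonomy depends only
% on the homotopy class of \gamma, so it can be non-trivial around
% non-contractible loops. In the abelian Aharonov--Bohm setup the phase is
\exp\!\Big(\tfrac{iq}{\hbar}\oint_\gamma \mathbf{A}\cdot d\mathbf{l}\Big)
\;=\; \exp\!\Big(\tfrac{iq\,\Phi}{\hbar}\Big),
% where \Phi is the magnetic flux enclosed by the loop \gamma.
```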

    The Hubble Constant

    I review the current state of determinations of the Hubble constant, which gives the length scale of the Universe by relating the expansion velocity of objects to their distance. There are two broad categories of measurements. The first uses individual astrophysical objects which have some property that allows their intrinsic luminosity or size to be determined, or allows the determination of their distance by geometric means. The second category comprises the use of the all-sky cosmic microwave background, or correlations between large samples of galaxies, to determine information about the geometry of the Universe and hence the Hubble constant, typically in combination with other cosmological parameters. Many, but not all, object-based measurements give H_0 values of around 72-74 km/s/Mpc, with typical errors of 2-3 km/s/Mpc. This is in mild discrepancy with CMB-based measurements, in particular those from the Planck satellite, which give values of 67-68 km/s/Mpc and typical errors of 1-2 km/s/Mpc. The size of the remaining systematics indicates that accuracy rather than precision is the remaining problem in a good determination of the Hubble constant. Whether a discrepancy exists, and whether new physics is needed to resolve it, depends on details of the systematics of the object-based methods, and also on the assumptions about other cosmological parameters and which datasets are combined in the case of the all-sky methods.
    Comment: Extensively revised and updated since the 2007 version; accepted by Living Reviews in Relativity as a major (2014) update of LRR 10, 4, 2007
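The "length scale" role of the Hubble constant described above is just the linear Hubble law, v = H_0 d. A minimal sketch, using the two H_0 ranges quoted in the abstract (values and distances here are illustrative, not from any particular dataset):

```python
# Hubble's law: recession velocity is proportional to distance, v = H0 * d.
# The two H0 values mirror the ranges quoted in the abstract
# (object-based ~73 km/s/Mpc, CMB-based ~67.5 km/s/Mpc); the 100 Mpc
# distance is an arbitrary illustration.

def recession_velocity(h0_km_s_mpc: float, distance_mpc: float) -> float:
    """Return recession velocity in km/s for a given H0 and distance."""
    return h0_km_s_mpc * distance_mpc

v_local = recession_velocity(73.0, 100.0)   # 7300.0 km/s
v_cmb = recession_velocity(67.5, 100.0)     # 6750.0 km/s

# The mild discrepancy translates into a ~550 km/s velocity difference
# at 100 Mpc -- small in absolute terms, but well outside the quoted errors.
print(v_local - v_cmb)   # 550.0
```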

    The ecology of outdoor rape: The case of Stockholm, Sweden

    The objective of this article is to report the results of an ecological study into the geography of rape in Stockholm, Sweden, using small-area data. In order to test the importance of factors indicating opportunity, accessibility and anonymity to the understanding of the geography of rape, a two-stage modelling approach is implemented. First, the overall risk factors associated with the occurrence of rape are identified using a standard Poisson regression; then a local analysis using profile regression is performed. Findings from the whole-map analysis show that accessibility, opportunity and anonymity are all, to different degrees, important in explaining the overall geography of rape - examples of these risk factors are the presence of subway stations or whether a basområde (small statistical area) is close to the city centre. The local analysis reveals two groupings of high-risk areas associated with a variety of risk factors: city-centre areas with a concentration of alcohol outlets, high residential population turnover and high counts of robbery; and poor suburban areas with schools and large female residential populations, where subway stations are located and where people express a high fear of crime. The article concludes by reflecting upon the importance of these results for future research as well as indicating the implications of these results for policy.
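The first modelling stage named above, a Poisson regression of area-level counts on risk-factor covariates, can be sketched in a few lines. Everything below is synthetic illustration: the covariate name, coefficients and data are invented, not the study's, and the fit uses a plain Newton-Raphson iteration on the Poisson log-likelihood rather than any particular statistics package.

```python
# Sketch of a Poisson regression of area-level counts on one covariate,
# as in the first stage of the two-stage approach described above.
# Covariate, coefficients and data are all synthetic illustrations.
import numpy as np

rng = np.random.default_rng(0)
n = 500
near_subway = rng.binomial(1, 0.3, n).astype(float)   # hypothetical covariate
X = np.column_stack([np.ones(n), near_subway])        # intercept + covariate
beta_true = np.array([0.2, 0.8])                      # assumed "true" effects
y = rng.poisson(np.exp(X @ beta_true))                # simulated area counts

# Fit by Newton-Raphson on the Poisson log-likelihood (log link)
beta = np.zeros(2)
for _ in range(25):
    mu = np.exp(X @ beta)                  # expected counts
    grad = X.T @ (y - mu)                  # score vector
    hess = X.T @ (X * mu[:, None])         # Fisher information
    beta = beta + np.linalg.solve(hess, grad)

# Estimates should land close to beta_true; exp(beta[1]) is the rate ratio
# for areas near a subway station versus areas that are not.
print(np.round(beta, 2))
```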

    The Minimum Information Required for a Glycomics Experiment (MIRAGE) project: improving the standards for reporting glycan microarray-based data

    MIRAGE (Minimum Information Required for A Glycomics Experiment) is an initiative that was created by experts in the fields of glycobiology, glycoanalytics, and glycoinformatics to produce guidelines for reporting results from the diverse types of experiments and analyses used in structural and functional studies of glycans in the scientific literature. As a sequel to the guidelines for sample preparation (Struwe et al. 2016, Glycobiology, 26, 907-910) and mass spectrometry (MS) data (Kolarich et al. 2013, Mol. Cell Proteomics, 12, 991-995), here we present the first version of guidelines intended to improve the standards for reporting data from glycan microarray analyses. For each of eight areas in the workflow of a glycan microarray experiment, we provide guidelines for the minimal information that should be provided in reporting results. We hope that the MIRAGE glycan microarray guidelines proposed here will gain broad acceptance by the community, and will facilitate interpretation and reproducibility of glycan microarray results, with implications for the comparison of data from different laboratories and the eventual deposition of glycan microarray data in international databases.

    Quantization of Midisuperspace Models

    We give a comprehensive review of the quantization of midisuperspace models. Though the main focus of the paper is on quantum aspects, we also provide an introduction to several classical points related to the definition of these models. We cover some important issues, in particular, the use of the principle of symmetric criticality as a very useful tool to obtain the required Hamiltonian formulations. Two main types of reductions are discussed: those involving metrics with two Killing vector fields and spherically symmetric models. We also review the more general models obtained by coupling matter fields to these systems. Throughout the paper we give separate discussions for standard quantizations using geometrodynamical variables and those relying on loop quantum gravity inspired methods.
    Comment: To appear in Living Reviews in Relativity

    Quantum Gravity in 2+1 Dimensions: The Case of a Closed Universe

    In three spacetime dimensions, general relativity drastically simplifies, becoming a "topological" theory with no propagating local degrees of freedom. Nevertheless, many of the difficult conceptual problems of quantizing gravity are still present. In this review, I summarize the rather large body of work that has gone towards quantizing (2+1)-dimensional vacuum gravity in the setting of a spatially closed universe.
    Comment: 61 pages, draft of review for Living Reviews; comments, criticisms, additions, missing references welcome; v2: minor changes, added reference