    The effect of protons on the performance of swept-charge devices

    This is the pre-print version of the final paper published in Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment. The published article is available from the link below. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. Copyright © 2009 Elsevier B.V. The e2v technologies CCD54, or swept-charge device (SCD), has been extensively radiation tested for use in the Chandrayaan-1 X-ray Spectrometer (C1XS) instrument, to be launched as part of the Indian Space Research Organisation (ISRO) Chandrayaan-1 payload in 2008. The principal use of the SCD is in X-ray fluorescence (XRF) applications: the device provides a relatively large collecting area of 1.1 cm^2 and achieves near Fano-limited spectroscopy at −15 °C, a temperature easily reached with a thermoelectric cooler (TEC). This paper describes the structure and operation of the SCD and details the methodology and results of two proton irradiation studies, carried out in 2006 and 2008, that quantify the effects of proton irradiation on the operational characteristics of the device. The analysis concentrates on the degradation of the measured FWHM of various elemental lines and quantifies the effects of proton fluence on the observed X-ray fluorescence spectra from mineralogical target samples.
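    The "near Fano-limited" resolution mentioned above can be made concrete with the standard expression for Fano-limited energy resolution in a silicon detector. The sketch below is illustrative and not taken from the paper; the Fano factor (0.115) and electron-hole pair-creation energy (3.65 eV) are assumed textbook values for silicon:

```python
import math

def fano_limited_fwhm_ev(energy_ev, fano=0.115, w_ev=3.65, noise_e_rms=0.0):
    """Fano-limited FWHM (in eV) for a silicon X-ray detector.

    fano        : Fano factor for Si (~0.115, assumed)
    w_ev        : mean energy per electron-hole pair in Si (~3.65 eV, assumed)
    noise_e_rms : extra readout noise in rms electrons (0 = ideal detector)
    """
    fano_var = fano * energy_ev / w_ev            # charge variance [electrons^2]
    total_rms_e = math.sqrt(fano_var + noise_e_rms ** 2)
    return 2.355 * w_ev * total_rms_e             # 2.355 converts sigma -> FWHM

# Mn K-alpha (5898 eV), the usual Fe-55 calibration line
print(round(fano_limited_fwhm_ev(5898.0), 1))     # ~117 eV at the Fano limit
```

Radiation damage effectively enlarges the noise term, so the measured FWHM of each elemental line grows with proton fluence.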

    A novel deconvolution beamforming algorithm for virtual phased arrays

    Beamforming techniques using phased microphone arrays are among the most common tools for localizing and quantifying noise sources. However, the use of such devices carries a series of well-known disadvantages, such as their very high cost and transducer mismatch. Virtual Phased Arrays (VPAs) have been proposed as an alternative that avoids these difficulties, provided the sound field is time-stationary. Several frequency-domain beamforming techniques can be adapted to use only the relative phase between a fixed and a moving transducer. Therefore, the results traditionally obtained using large arrays can be emulated by applying beamforming algorithms to data acquired from only two sensors. This paper presents a novel beamforming algorithm which uses a deconvolution approach to strongly reduce the presence of side lobes. A series of synthetic noise sources with negative source strength is introduced in order to maximize the dynamic range of the deconvolved beamforming map. This iterative sidelobe cleaner algorithm (ISCA) does not require the use of the covariance matrix of the array, hence it can also be applied to a VPA. The performance of ISCA is compared in several simulations with conventional deconvolution algorithms such as DAMAS and NNLS. The results support the robustness and accuracy of the proposed approach, which provides clear localization maps in all the conditions evaluated.
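    The abstract does not spell out ISCA itself, but the generic deconvolution setting it is compared against (e.g. NNLS) can be sketched as follows. The 1-D source grid, the sinc^2 point-spread function and all parameter values below are illustrative assumptions, not the paper's:

```python
import numpy as np
from scipy.optimize import nnls

# Toy 1-D source grid: each grid point has a PSF with a main lobe and
# decaying side lobes, mimicking a beamformer's array response.
n = 40
x = np.arange(n)

def psf(center):
    d = np.abs(x - center)
    return np.sinc(d / 3.0) ** 2          # main lobe + side lobes (assumed shape)

A = np.column_stack([psf(c) for c in x])  # PSF matrix: strengths -> dirty map

q_true = np.zeros(n)
q_true[[10, 25]] = [1.0, 0.5]             # two point sources
dirty = A @ q_true                        # conventional beamforming ("dirty") map

q_est, _ = nnls(A, dirty)                 # non-negative deconvolution of the map
print(np.flatnonzero(q_est > 0.1))        # -> [10 25]: sources localized
```

In this noiseless toy case NNLS recovers the two sources exactly; ISCA's contribution, per the abstract, is achieving comparable cleaning without needing the array covariance matrix, which a two-sensor VPA cannot provide.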

    Endovascular repair for acute traumatic transection of the descending thoracic aorta: experience of a single centre with a 12-year follow-up

    Background: Most blunt aortic injuries occur in the proximal descending aorta, causing acute transection of this vessel. Generally, surgical repair of the ruptured segment of the aorta is associated with high rates of morbidity and mortality, and in this light endovascular treatment seems to be a valid and safer alternative. The aim of this article is to review our experience with the endovascular approach for the treatment of acute traumatic rupture of the descending thoracic aorta. Methods: From April 2002 to November 2014, 11 patients (9 males and 2 females) were referred to our Department with a diagnosis of acute transection of the thoracic aorta. Following preoperative Computed Tomography (CT) evaluation, thoracic endovascular aortic repair (TEVAR) with left subclavian artery coverage was performed. Follow-up consisted of clinical and instrumental (CT, Duplex ultrasound) controls at discharge, at 1, 3 and 6 months, and yearly thereafter. Results: At 12-year follow-up, overall survival for the entire patient cohort was 100 %; no major or minor neurological complications and no episodes of left arm claudication occurred. Cardiovascular, respiratory and bleeding complications in the early period were limited to minor, non-fatal events. No stent graft failure, collapse, leak or distal migration was detected at CT scan during the entire follow-up period. Conclusions: In our experience, despite the small patient population, the TEVAR procedure with left subclavian artery coverage, performed in an emergency setting, seems to provide excellent long-term results. Trial registration: The protocol was registered at a public trials registry, www.clinicaltrials.gov (trial identifier NCT02376998).

    Impact of a hospice rapid response service on preferred place of death, and costs

    Background: Many people with a terminal illness would prefer to die at home. A new palliative rapid response service (RRS) provided by a large hospice provider in South East England was evaluated (2010) to provide evidence of its impact on achieving preferred place of death, and of its costs. The RRS was delivered by a team of trained health care assistants and was available 24/7. The purpose of this study was to (i) compare the characteristics of RRS users and non-users, (ii) explore differences in the proportions of users and non-users dying in the place of their choice, (iii) monitor the whole-system service utilisation of users and non-users, and compare costs. Methods: All hospice patients who died with a preferred place of death recorded during an 18-month period were included. Data (demographics, preferences for place of death) were obtained from hospice records. Dying in the preferred place was modelled using stepwise logistic regression analysis. Service use data (for the period between referral to the hospice and death) were obtained from general practitioners, community providers, hospitals, social services and the hospice, and costs were calculated using validated national tariffs. Results: Of 688 patients referred to the hospice when the RRS was operational, 247 (35.9 %) used it. Higher proportions of RRS users than non-users lived in their own homes with a co-resident carer (40.3 % vs. 23.7 %), while fewer users than non-users lived alone or in residential care (58.8 % vs. 76.3 %). The chances of dying in the preferred place were enhanced 2.1 times by being an RRS user, compared to a non-user, and 1.5 times by having a co-resident carer, compared to living at home alone or in a care home. Total service costs did not differ between users and non-users, except for patients referred to the hospice very close to death (users had higher costs). Conclusions: Use of the RRS was associated with an increased likelihood of dying in the preferred place. The RRS is cost neutral.

    The cosmological constant and the relaxed universe

    We study the role of the cosmological constant (CC) as a component of dark energy (DE). It is argued that the cosmological term is in general unavoidable and should not be ignored even when dynamical DE sources are considered. From the theoretical point of view, quantum zero-point energy and phase transitions suggest a CC of large magnitude, in contrast to its tiny observed value. Simply relieving this discord with a counterterm requires extreme fine-tuning, which is referred to as the old CC problem. To avoid it, we discuss some recent approaches for neutralising a large CC dynamically without adding a fine-tuned counterterm. This can be realised by an effective DE component which relaxes the cosmic expansion by counteracting the effect of the large CC. Alternatively, a CC filter is constructed by modifying gravity to make it insensitive to vacuum energy. Comment: 6 pages, no figures, based on a talk presented at PASCOS 201

    Nonlinear analysis of a simple model of temperature evolution in a satellite

    We analyse a simple model of the heat transfer to and from a small satellite orbiting around a solar system planet. Our approach treats the satellite as isothermal, with external heat input from the environment and from internal energy dissipation, and with output to the environment as black-body radiation. The resulting nonlinear ordinary differential equation for the satellite's temperature is analysed by qualitative, perturbation and numerical methods, which show that the temperature approaches a periodic pattern (an attracting limit cycle). This approach can occur in two ways, according to the values of the parameters: (i) a slow decay towards the limit cycle over a time longer than the period, or (ii) a fast decay towards the limit cycle over a time shorter than the period. In the first case, an exactly soluble averaged equation is valid. We discuss the consequences of our model for the thermal stability of satellites. Comment: 13 pages, 4 figures (5 EPS files)
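    A minimal numerical sketch of this kind of energy-balance model (all parameter values are assumed for illustration, not taken from the paper): the isothermal satellite obeys m c dT/dt = Q_in(t) − σ ε A T^4 with a periodic heat input modelling passage in and out of sunlight, and trajectories started from different initial temperatures approach the same limit cycle:

```python
SIGMA = 5.670e-8    # Stefan-Boltzmann constant [W m^-2 K^-4]
m_c   = 5.0e4       # heat capacity m*c [J/K]            (assumed)
eps_A = 0.9 * 1.0   # emissivity * radiating area [m^2]  (assumed)
P     = 6000.0      # orbital period [s]                 (assumed)

def q_in(t):
    # Solar + internal input in sunlight; internal dissipation only in eclipse
    return 1000.0 if (t % P) < 0.6 * P else 100.0

def integrate(T0, t_end, dt=1.0):
    # Forward-Euler integration of m*c*dT/dt = Q_in(t) - sigma*eps*A*T^4
    T, t = T0, 0.0
    while t < t_end:
        T += dt * (q_in(t) - SIGMA * eps_A * T ** 4) / m_c
        t += dt
    return T

# Two very different initial temperatures end up on the same periodic pattern
print(abs(integrate(200.0, 20 * P) - integrate(350.0, 20 * P)) < 1.0)  # True
```

Here the radiative relaxation time is comparable to the assumed period, so the decay towards the limit cycle takes a few orbits, illustrating the attracting-limit-cycle behaviour described in the abstract.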

    Effective growth of matter density fluctuations in the running LCDM and LXCDM models

    We investigate the matter density fluctuations \delta\rho/\rho for two dark energy (DE) models in the literature in which the cosmological term \Lambda is a running parameter. In the first model, the running LCDM model, matter and DE exchange energy, whereas in the second model, the LXCDM model, the total DE and matter components are conserved separately. The LXCDM model was proposed as an interesting solution to the cosmic coincidence problem. It includes an extra dynamical component, the "cosmon" X, which interacts with the running \Lambda, but not with matter. In our analysis we make use of the current value of the linear bias parameter, b^2(0) = P_{GG}/P_{MM}, where P_{MM} ~ (\delta\rho/\rho)^2 is the present matter power spectrum and P_{GG} is the galaxy fluctuation power spectrum. The former can be computed within a given model, and the latter is found from the observed LSS data (at small z) obtained by the 2dF galaxy redshift survey. It is found that b^2(0)=1 to within 10% accuracy for the standard LCDM model. Adopting this limit for any DE model and using a method based on the effective equation of state for the DE, we can set a limit on the growth of matter density perturbations for the running LCDM model, whose solution is known. This provides a good test of the procedure, which we then apply to the LXCDM model in order to determine the region of parameter space compatible with the LSS data. In this region, the LXCDM model is consistent with known observations and at the same time provides a viable solution to the cosmic coincidence problem. Comment: LaTeX, 38 pages, 8 figures. Version accepted in JCA
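    The bias estimator b^2(0) = P_{GG}/P_{MM} used above is a simple ratio of power spectra; it can be illustrated with synthetic spectra (the mock power spectra and bias value below are assumptions for demonstration, not 2dF data):

```python
import numpy as np

# Mock spectra on a common set of wavenumbers (shapes chosen for illustration)
k      = np.logspace(-2, 0, 50)            # wavenumbers [h/Mpc] (assumed grid)
p_mm   = 1.0e4 * k / (1.0 + (k / 0.1) ** 3)  # mock matter power spectrum P_MM
b_true = 1.05                                # assumed linear bias
p_gg   = b_true ** 2 * p_mm                  # mock galaxy power spectrum P_GG

# Scale-independent linear bias: average the ratio over the k-range
b2 = np.mean(p_gg / p_mm)
print(abs(b2 - b_true ** 2) < 1e-12)         # True: b^2(0) recovered
```

In the paper's procedure, P_MM comes from the model's computed growth of \delta\rho/\rho and P_GG from the 2dF survey; the constraint b^2(0) = 1 ± 10% is what bounds the allowed perturbation growth.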

    Cosmologies with a time dependent vacuum

    The idea that the cosmological term, Lambda, should be a time-dependent quantity in cosmology is a most natural one. It is difficult to conceive of an expanding universe with a strictly constant vacuum energy density, namely one that has remained immutable since the origin of time. A smoothly evolving vacuum energy density that inherits its time dependence from cosmological functions, such as the Hubble rate or the scale factor, is not only a qualitatively more plausible and intuitive idea, but is also suggested by fundamental physics, in particular by quantum field theory (QFT) in curved space-time. To implement this notion, it is not strictly necessary to resort to ad hoc scalar fields, as usually done in the literature (e.g. in quintessence formulations and the like). A "running" Lambda term can be expected on very similar grounds as one expects (and observes) the running of couplings and masses with a physical energy scale in QFT. Furthermore, the experimental evidence that the equation of state of the dark energy could be evolving with time/redshift (including the possibility that it might currently behave phantom-like) suggests that a time-variable Lambda term (possibly accompanied by a variable Newton's gravitational coupling G=G(t)) could account in a natural way for all these features. Remarkably enough, a class of these models (the "new cosmon") could even be the clue for solving the old cosmological constant problem, including the coincidence problem. Comment: LaTeX, 15 pages, 4 figure

    Perturbations in the relaxation mechanism for a large cosmological constant

    Recently, a mechanism for relaxing a large cosmological constant (CC) has been proposed [arXiv:0902.2215], which permits solutions with low Hubble rates at late times without fine-tuning. The setup is implemented in the LXCDM framework, and we find a reasonable cosmological background evolution similar to that of the LCDM model with a fine-tuned CC. In this work we analyse the perturbations in this relaxation model analytically, and we show that their evolution is also similar to that in the LCDM model, especially in the matter era. Some tracking properties of the vacuum energy are discussed, too. Comment: 18 pages, LaTeX; discussion improved, accepted by CQ

    Cosmologies with variable parameters and dynamical cosmon: implications on the cosmic coincidence problem

    Dynamical dark energy (DE) has been proposed to explain various aspects of the cosmological constant (CC) problem(s). For example, it is very difficult to accept that a strictly constant Lambda-term constitutes the ultimate explanation for the DE in our Universe. It is also hard to acquiesce in the idea that we accidentally happen to live in an epoch where the CC contributes an energy density value right in the ballpark of the rapidly diluting matter density. It seems more plausible that the vacuum energy is actually a dynamical quantity, as is the Universe itself. More generally, we could even entertain the possibility that the total DE is in fact a mixture of vacuum energy and other dynamical components (e.g. fields, higher-order terms in the effective action, etc.) which can be represented collectively by an effective entity X (dubbed the ``cosmon''). The ``cosmon'', therefore, acts as a dynamical DE component different from the vacuum energy. While it can actually behave phantom-like by itself, the overall DE fluid may effectively appear as standard quintessence, or even mimic at present an almost exact CC behaviour. Thanks to the versatility of such a cosmic fluid, we can show that a composite DE system of this sort (``LXCDM'') may hold a key to resolving the mysterious coincidence problem. Comment: LaTeX, 13 pages, 5 figure