
    OVI, NV and CIV in the Galactic Halo: II. Velocity-Resolved Observations with Hubble and FUSE

    We present a survey of NV and OVI (and, where available, CIV) in the Galactic halo, using data from the Far Ultraviolet Spectroscopic Explorer (FUSE) and the Hubble Space Telescope (HST) along 34 sightlines. These ions are usually produced in nonequilibrium processes such as shocks, evaporative interfaces, or rapidly cooling gas, and thus trace the dynamics of the interstellar medium. Searching for global trends in integrated and velocity-resolved column density ratios, we find large variations in most measures, with some evidence for a systematic trend of higher ionization (lower NV/OVI column density ratio) at larger positive line-of-sight velocities. The slopes of log[N(NV)/N(OVI)] per unit velocity range from -0.015 to +0.005, with a mean of -0.0032+/-0.0022(r)+/-0.0014(sys) dex/(km/s). We compare this dataset with models of velocity-resolved high-ion signatures of several common physical structures. The dispersion of the ratios OVI/NV/CIV supports the growing belief that no single model can account for hot halo gas, and in fact some models predict much stronger trends than are observed. It is important to understand the signatures of different physical structures in order to interpret specific lines of sight and future global surveys. Comment: ApJ, in press; 43 pages, 22 figures
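The slope measurement described above amounts to a linear fit of the log ion ratio against line-of-sight velocity. A minimal sketch, assuming invented column densities (the velocity bins and values below are illustrative, not the survey's data):

```python
import numpy as np

# Hypothetical velocity-resolved apparent column densities (illustrative
# values only): velocities in km/s, column densities in cm^-2 per km/s.
v = np.array([-80.0, -40.0, 0.0, 40.0, 80.0])
N_NV = np.array([2.0e12, 1.8e12, 1.5e12, 1.1e12, 0.8e12])
N_OVI = np.array([8.0e12, 8.5e12, 9.0e12, 9.5e12, 1.0e13])

# Log of the ion ratio in each velocity bin.
log_ratio = np.log10(N_NV / N_OVI)

# Least-squares slope in dex/(km/s); a negative slope corresponds to
# higher ionization (lower NV/OVI) at larger positive velocities.
slope, intercept = np.polyfit(v, log_ratio, 1)
print(f"slope = {slope:.4f} dex/(km/s)")
```

With these made-up numbers the fitted slope comes out negative, i.e. the same qualitative trend the abstract reports.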

    Three photometric methods tested on ground-based data of Q 2237+0305

    The Einstein Cross, Q 2237+0305, was photometrically observed in four bands on two successive nights at the NOT (La Palma, Spain) in October 1995. Three independent algorithms have been used to analyse the data: an automatic image decomposition technique, a CLEAN algorithm, and the new MCS deconvolution code. The photometric and astrometric results obtained with the three methods are presented. No photometric variations were found in the four quasar images. Comparison of the photometry from the three techniques shows that both systematic and random errors affect each method. When the seeing is worse than 1.0", the errors from the automatic image decomposition technique and the CLEAN algorithm tend to be large (0.04-0.1 magnitudes), while the deconvolution code still gives accurate results (1σ error below 0.04) even for frames with seeing as bad as 1.7". Reddening is observed in the quasar images and is found to be compatible with either extinction from the lensing galaxy or colour-dependent microlensing. The photometric accuracy depends on the light distribution used to model the lensing galaxy. In particular, using a numerical galaxy model, as done with the MCS algorithm, makes the method less seeing dependent. Another advantage of using a numerical model is that any inhomogeneous structures in the galaxy can be modelled. Finally, we propose an observational strategy for a future photometric monitoring of the Einstein Cross. Comment: 9 pages, accepted for publication in A&A

    Detection of x-rays from galaxy groups associated with the gravitationally lensed systems PG 1115+080 and B1422+231

    Gravitational lenses that produce multiple images of background quasars can be an invaluable cosmological tool. Deriving cosmological parameters, however, requires modeling the potential of the lens itself. It has been estimated that up to a quarter of lensing galaxies are associated with a group or cluster which perturbs the gravitational potential. Detection of X-ray emission from the group or cluster can be used to better model the lens. We report on the first detection in X-rays of the group associated with the lensing system PG 1115+080 and the first X-ray image of the group associated with the system B1422+231. We find a temperature and rest-frame luminosity of 0.8 +/- 0.1 keV and 7 +/- 2 x 10^{42} ergs/s for PG 1115+080, and 1.0 (+∞/-0.3) keV and 8 +/- 3 x 10^{42} ergs/s for B1422+231. We compare the spatial and spectral characteristics of the X-ray emission to the properties of the group galaxies, to lens models, and to the general properties of groups at lower redshift. Comment: Accepted for publication in ApJ. 17 pages, 5 figures. Minor changes to text

    Variable Selection and Model Averaging in Semiparametric Overdispersed Generalized Linear Models

    We express the mean and variance terms in a double exponential regression model as additive functions of the predictors, and use Bayesian variable selection to determine which predictors enter the model, and whether they enter linearly or flexibly. When the variance term is null we obtain a generalized additive model, which becomes a generalized linear model if the predictors enter the mean linearly. The model is estimated using Markov chain Monte Carlo simulation, and the methodology is illustrated using real and simulated data sets. Comment: 35 pages, 8 graphs
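The "enter linearly or flexibly" choice above can be sketched as a per-predictor indicator that selects between exclusion, a single linear column, and a small nonlinear basis. A minimal sketch, assuming a toy truncated-power (hinge) basis; the function names and basis are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def basis(x, n_knots=4):
    """Toy truncated-power basis for a 'flexible' (nonlinear) effect.

    Returns the linear column plus hinge columns max(x - knot, 0)."""
    knots = np.quantile(x, np.linspace(0.2, 0.8, n_knots))
    return np.column_stack([x] + [np.maximum(x - k, 0.0) for k in knots])

def design_matrix(X, entry):
    """Build the design matrix given per-predictor entry indicators:
    entry[j] = 0 (excluded), 1 (linear), or 2 (flexible)."""
    cols = [np.ones((len(X), 1))]            # intercept
    for j, e in enumerate(entry):
        if e == 1:
            cols.append(X[:, j:j + 1])       # linear term only
        elif e == 2:
            cols.append(basis(X[:, j]))      # linear + hinge terms
    return np.hstack(cols)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
# Drop x0, include x1 linearly, include x2 flexibly:
D = design_matrix(X, entry=[0, 1, 2])
print(D.shape)  # (50, 7): intercept + 1 linear column + 5 basis columns
```

In the Bayesian variable selection itself, the `entry` vector would be sampled within the MCMC run rather than fixed by hand; this sketch only shows how each configuration maps to a design matrix.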

    Experimental and analytical study of the relations between the energy and the developable track length of a charged particle in a plastic visual detector

    We study, in Makrofol, the registration threshold of heavy-ion tracks as well as the developable track lengths. With the physicochemical development conditions held constant, we look for relations between the etching time t and the critical specific energy loss Is. We give empirical analytical expressions for the threshold Is and the developable length L as functions of the specific energy loss I of the incident ion, for different etching times t.

    Kinematic study of binary and ternary events obtained by bombarding U, Th, Bi, Pb and Au targets with 2.1 GeV deuterons

    A kinematic study of binary and ternary fission induced by 2.1 GeV deuterons on U, Th, Pb, Bi and Au targets is carried out from the geometric characteristics of the fragment tracks of events recorded in a solid-state track detector in 4π geometry. Under certain hypotheses, we establish the mass-ratio and energy distributions of the fragments. We were able to show that some of the binary events, whose angle projected onto the plane perpendicular to the incident-particle trajectory differs from 180°, proceed from the same type of disintegration as the ternary events. Likewise, the kinematic analysis of the ternary events shows a certain analogy with the known phenomenon of tripartition of heavy nuclei, such as californium, and could explain the production of some of the neutron-deficient light and heavy isotopes attributed to fragmentation by radiochemists.

    New high-technology products for the treatment of haemophilia

    This review will focus on new technologies in development that promise to lead to further advances in haemophilia therapeutics. There has been continued interest in the bioengineering of recombinant factor VIII (rFVIII) and factor IX (rFIX) with improved function, in order to overcome some of the limitations of current treatment and the high costs of therapy, and to increase availability to a broader world haemophilia population. Bioengineered forms of rFVIII, rFIX or alternative haemostatic molecules may ultimately improve the efficacy of therapeutic strategies for the haemophilias through better biosynthesis and secretion, functional activity, half-life and immunogenicity. Preventing and suppressing inhibitors to factor (F) VIII remain a challenge for both clinicians and scientists. Recent experiments have shown that it is possible to obtain anti-idiotypic antibodies with a number of desirable properties: (i) strong binding avidity to FVIII inhibitors; (ii) neutralization of inhibitory activity both in vitro and in vivo; (iii) cross-reactivity with antibodies from unrelated patients; and (iv) no interference with FVIII function. An alternative, though complementary, approach makes use of peptides derived from filamentous-phage random libraries. Mimotopes of FVIII can be obtained which bind to the paratope of inhibitory antibodies and neutralize their activity both in vitro and in vivo. In this paper, we review advanced genetic strategies for haemophilia therapy. Until recently, the traditional concept for gene transfer in inherited and acquired haematological diseases has focused on how best to obtain stable insertion of a cDNA into a target-cell genome, allowing expression of a therapeutic protein. However, as gene-transfer vector systems continue to improve, the requirement for regulated gene transcription, and hence regulated protein expression, will become more critical.
Inappropriate protein expression levels, or expression of transferred cDNAs in non-intended cell types or tissues, may lead to target-cell toxicity or activation of unwanted host immune responses. Regulated protein expression requires that the gene be transferred with its own regulatory cassette, allowing gene transcription and translation approaching that of the normal gene in its endogenous context. New molecular techniques, in particular the use of RNA molecules, now allow for transcription of corrective genes that mimic the normal state.

    Speeding up Simplification of Polygonal Curves using Nested Approximations

    We develop a multiresolution approach to the problem of polygonal curve approximation. We show theoretically and experimentally that, if the simplification algorithm A used between any two successive levels of resolution satisfies some conditions, the multiresolution algorithm MR will have a complexity lower than the complexity of A. In particular, we show that if A has O(N²/K) complexity (the complexity of a reduced-search dynamic programming solution), where N and K are respectively the initial and the final number of segments, the complexity of MR is O(N). We experimentally compare the outcomes of MR with those of the optimal "full search" dynamic programming solution and of classical merge and split approaches. The experimental evaluations confirm the theoretical derivations and show that the proposed approach, evaluated on 2D coastal maps, either shows a lower complexity or provides polygonal approximations closer to the initial curves. Comment: 12 pages + figures
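The nested-approximation idea above can be sketched in a few lines: simplify a decimated (half-resolution) copy of the curve first, then refine only between the vertices retained at the coarse level. This is a minimal sketch, assuming a Douglas-Peucker-style simplifier as a stand-in for the paper's base algorithm A (which is a dynamic programming solver), unique vertices, and an illustrative doubling of the tolerance at coarser levels:

```python
import math

def perp_dist(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    length = math.hypot(dx, dy)
    if length == 0.0:
        return math.hypot(px - ax, py - ay)
    return abs(dy * (px - ax) - dx * (py - ay)) / length

def douglas_peucker(pts, tol):
    """Base simplifier (illustrative stand-in for the paper's algorithm A)."""
    if len(pts) < 3:
        return list(pts)
    i, d = max(((k, perp_dist(pts[k], pts[0], pts[-1]))
                for k in range(1, len(pts) - 1)), key=lambda t: t[1])
    if d <= tol:
        return [pts[0], pts[-1]]
    return douglas_peucker(pts[:i + 1], tol)[:-1] + douglas_peucker(pts[i:], tol)

def multiresolution_simplify(pts, tol, min_size=8):
    """Nested approximation: simplify a half-resolution copy first, then
    run the base simplifier only between the retained coarse vertices.
    Assumes all vertices are distinct tuples."""
    if len(pts) <= min_size:
        return douglas_peucker(pts, tol)
    coarse = multiresolution_simplify(pts[::2], 2 * tol)
    idx = [pts.index(p) for p in coarse]      # anchor positions in full curve
    out = []
    for a, b in zip(idx, idx[1:]):
        out += douglas_peucker(pts[a:b + 1], tol)[:-1]
    return out + [pts[-1]]

pts = [(float(x), math.sin(x / 5.0)) for x in range(100)]
simplified = multiresolution_simplify(pts, tol=0.05)
print(len(pts), "->", len(simplified))
```

The complexity gain comes from the coarse pass shrinking the search space: the fine-level simplifier only ever runs on short spans between consecutive coarse anchors, rather than on the full curve.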

    Rasmussen aneurysm

    A 28-year-old homeless Romanian man presented to the emergency department with two episodes of hemoptysis (estimated 300 ml) in the previous three days. For about two months he had been suffering from a dry cough with concurrent right-sided chest pain, asthenia, weight loss and a slight fever. Clinical examination revealed crepitations over the right lung fields. He was hemodynamically stable and no sign of respiratory distress was detected. Blood tests revealed a high CRP (72 mg/l)