
    Highly Ionized High-Velocity Clouds toward PKS 2155-304 and Markarian 509

    To gain insight into four highly ionized high-velocity clouds (HVCs) discovered by Sembach et al. (1999), we have analyzed data from the Hubble Space Telescope (HST) and Far Ultraviolet Spectroscopic Explorer (FUSE) for the PKS 2155-304 and Mrk 509 sight lines. We measure strong absorption in OVI and column densities of multiple ionization stages of silicon (SiII/III/IV) and carbon (CII/III/IV). We interpret this ionization pattern as a multiphase medium that contains both collisionally ionized and photoionized gas. Toward PKS 2155-304, for HVCs at -140 and -270 km/s, respectively, we measure log N(OVI) = 13.80+/-0.03 and log N(OVI) = 13.56+/-0.06; from Lyman series absorption, we find log N(HI) = 16.37^(+0.22)_(-0.14) and 15.23^(+0.38)_(-0.22). The presence of high-velocity OVI spread over a broad (100 km/s) profile, together with large amounts of low-ionization species, is difficult to reconcile with the low densities, n = 5x10^(-6) cm^(-3), in the collisional/photoionization models of Nicastro et al. (2002), although the HVCs show a similar relation in N(SiIV)/N(CIV) versus N(CII)/N(CIV) as high-z intergalactic clouds. Our results suggest that the high-velocity OVI in these absorbers does not necessarily trace the WHIM, but instead may trace HVCs with low total hydrogen column density. We propose that the broad high-velocity OVI absorption arises from shock ionization, at bowshock interfaces produced from infalling clumps of gas with velocity shear. The similar ratios of high ions for HVC Complex C and these highly ionized HVCs suggest a common production mechanism in the Galactic halo. Comment: 38 pages, including 10 figures. ApJ, 10 April 2004. Replaced with accepted version.

    VALIDITY OF A TREADMILL-MOUNTED PHOTOELECTRIC SYSTEM FOR MEASURING SPATIOTEMPORAL PARAMETERS OVER A RANGE OF RUNNING SPEEDS

    The purpose of this study was to determine the concurrent validity of a treadmill-mounted photoelectric system (Optojump) for measuring spatiotemporal parameters of runners at a range of running speeds (12-16 km/h). Ten participants ran for 20 s at each of three different speeds (12, 14 and 16 km/h) on a HP Cosmos Pulsar treadmill while spatiotemporal parameters were measured by both the Optojump and a high-speed camera (960 fps). The Optojump was placed on the sides of the treadmill as per the manufacturer's protocols. Large timing errors for contact time (13.1%) and swing time (6.8%) were recorded, while excellent validity was shown for the other parameters of stride time, stride length and stride frequency (errors less than 0.6%). Increases in gait speed resulted in significantly lower error values for both contact time and swing time, but had no effect on the other variables. Early identification of initial contact and delayed identification of toe-off in the Optojump system, due to placement on the slightly elevated sides of the treadmill, are hypothesised to be the cause of the notable errors in contact time and swing time. However, these systematic errors do not negatively affect the other spatiotemporal parameters of stride time, stride length and stride frequency, which are all still accurately measured by the Optojump in this set-up.
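The validity comparison above rests on a simple quantity: the mean absolute percent error of each Optojump parameter against the high-speed camera reference. A minimal sketch of that calculation, with illustrative contact-time values (the numbers are hypothetical, not the study's data):

```python
# Sketch: percent error of Optojump contact times against a high-speed
# camera reference, as in the validation described above.
# The sample values below are illustrative, not taken from the study.

def mean_percent_error(measured, reference):
    """Mean absolute percent error of paired measurements."""
    errors = [abs(m - r) / r * 100.0 for m, r in zip(measured, reference)]
    return sum(errors) / len(errors)

# Hypothetical contact times (s) for a few strides at 12 km/h:
camera = [0.260, 0.255, 0.262, 0.258]    # reference system
optojump = [0.295, 0.288, 0.297, 0.291]  # systematically longer (late toe-off)

print(round(mean_percent_error(optojump, camera), 1))  # ~13.1
```

The systematic overestimation in the toy data mirrors the hypothesised mechanism: a late toe-off detection inflates contact time by a roughly constant offset, which shrinks in relative terms as speed increases and contact times get shorter.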

    A Fluorescent Aerogel for Capture and Identification of Interplanetary and Interstellar Dust

    Contemporary interstellar dust has never been analyzed in the laboratory, despite its obvious astronomical importance and its potential as a probe of stellar nucleosynthesis and galactic chemical evolution. Here we report the discovery of a novel fluorescent aerogel which is capable of capturing hypervelocity dust grains and passively recording their kinetic energies. An array of these "calorimetric" aerogel collectors in low earth orbit would lead to the capture and identification of large numbers of interstellar dust grains. Comment: 13 pages, 4 figures, to appear in The Astrophysical Journal.

    Excavation damage zone fracture modelling for seismic tomography : a comparison of explicit fractures and effective medium approaches

    We model the full wavefield produced by a seismic velocity survey and optimise the representation of the fracture zone to best match field waveforms. The velocity survey was part of a mapping study on fractures in the Excavation Damage Zone (EDZ) of ONKALO underground research facility at Olkiluoto. The EDZ results from excavation of the rock mass, which modifies stress conditions changing the nature and behaviour of pre-existing fractures and generating new fracturing. These fractures act as the main transport pathways for contaminants both in and out of a geological disposal facility (GDF). Our goal is to test different representations of the fracture zone and to determine which models most successfully improve the interpretation of the fracture zone, producing estimates of a key unknown parameter, fracture stiffness, in addition to fracture sizes, fracture geometry, fracture density and crack density. We use modelling techniques previously tested in theoretical and laboratory studies and assess their performance on a real engineering problem. The paper introduces the field experiment and relevant information from the GDF in Finland. It describes the methodologies used for representing the fracture networks in the models — Explicit Fracture models with two approximations called Pixelised Fracture Model (PFM) and Equivalent Discrete Fracture Medium (EDFM), the Effective Medium (EM) model, and two versions of the Localised Effective Medium (LEM) model (LEM fine, LEM thick). These alternative representations were used within models of the field experiment and the calculated waveforms were used in an iterative inversion for fracture stiffness. Results show that the EM model and the EDFM model were unsuccessful in matching recorded waveforms. 
The fine LEM model and the explicit PFM model produced the best results, especially after iterative optimisation of the fracture stiffness, giving confidence that further optimisation will lead to improved characterisation of the fracturing from the full waveform data.
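The effective-medium representations discussed above build on the standard linear-slip idea: each fracture adds excess compliance set by its specific stiffness, lowering the effective seismic velocity of the fractured rock. A minimal sketch under assumed, illustrative parameter values (not those of the ONKALO survey):

```python
import math

# Sketch of the linear-slip effective-medium idea behind the EM/LEM models:
# parallel fractures with specific stiffness Z (Pa/m) and spacing s (m)
# add excess compliance to the background rock, lowering the effective
# P-wave velocity. All numbers below are assumptions for illustration.

def effective_velocity(v_background, rho, Z, spacing):
    """Effective P velocity for wave propagation normal to parallel fractures."""
    M = rho * v_background ** 2                      # background P-wave modulus (Pa)
    M_eff = 1.0 / (1.0 / M + 1.0 / (Z * spacing))    # compliances add in series
    return math.sqrt(M_eff / rho)

v0 = 5800.0   # intact crystalline rock P velocity (m/s), assumed
rho = 2700.0  # density (kg/m^3), assumed
Z = 1e12      # fracture specific stiffness (Pa/m), assumed
s = 0.5       # fracture spacing (m), assumed

print(effective_velocity(v0, rho, Z, s))  # lower than v0
```

The velocity reduction grows as fracture stiffness Z falls, which is why the waveform inversion described above can recover stiffness: it is the parameter controlling how strongly the fracture zone delays and attenuates the transmitted wavefield.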

    Lomas Las Tetas de Cabra fauna

    88 p. : ill. (1 col.), maps ; 26 cm. Includes bibliographical references (p. 64-70). "Fossil mammal and other vertebrate remains from the Lomas Las Tetas de Cabra in Baja California Norte, Mexico, provide an opportunity to examine the utility of continental scale geochronologies based on land mammal faunas. Early reports proposed a late Paleocene to early Eocene age for this fauna. Recent fieldwork and considerations of cumulative fossil discoveries strongly indicate that the Baja fauna represents the Wasatchian Land Mammal Age (early Eocene) and is strikingly similar to faunas of this age from the western interior of the United States. Wasatchian-age taxa represented in the Baja assemblage include Hyracotherium, Hyopsodus, Meniscotherium (also possibly known from Clarkforkian assemblages), Diacodexis, and Prolimnocyon. Also present in the fauna are excellent specimens of Wyolestes and Esteslestes, a new genus of didelphid marsupial, as well as a badly distorted skull of a pantodont. An early Eocene age assignment is supported by analysis of the marine section adjacent to the Tetas de Cabra sequence. The marine organisms are consistent with a middle Ypresian (early Eocene) age assignment. Paleomagnetic analyses of both the terrestrial and marine sections also corroborate this age assignment. These new results substantiate the validity of the Wasatchian as a discrete temporal interval that can be applied at a continental scale. The Wasatchian thus fulfills the expectations for a mammal-based chronology. Similarities, rather than differences, between the Baja assemblage and other Wasatchian-age faunas are the dominant pattern. A choice among dispersal theories for the sources of Wasatchian mammals is not clearly indicated by the faunal evidence"--P. 3

    Constraint methods for determining pathways and free energy of activated processes

    Activated processes, from chemical reactions up to conformational transitions of large biomolecules, are hampered by barriers which are overcome only by the input of some free energy of activation. Hence, the characteristic and rate-determining barrier regions are not sufficiently sampled by usual simulation techniques. Constraints on a reaction coordinate r have turned out to be a suitable means to explore difficult pathways without changing potential function, energy or temperature. For a dense sequence of values of r, the corresponding sequence of simulations provides a pathway for the process. As only one coordinate among thousands is fixed during each simulation, the pathway essentially reflects the system's internal dynamics. From mean forces the free energy profile can be calculated to obtain reaction rates and insight into the reaction mechanism. In the last decade, theoretical tools and computing capacity have been developed to a degree where simulations give impressive qualitative insight into the processes at quantitative agreement with experiments. Here, we give an introduction to reaction pathways and coordinates, and develop the theory of free energy as the potential of mean force. We clarify the connection between mean force and constraint force, which is the central quantity evaluated, and discuss the mass metric tensor correction. Well-behaved coordinates without tensor correction are considered. We discuss the theoretical background and practical implementation using the example of the reaction coordinate of targeted molecular dynamics simulation. Finally, we compare applications of constraint methods and other techniques developed for the same purpose, and discuss the limits of the approach.
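The core numerical step the abstract describes is thermodynamic integration: the free energy profile along r follows from integrating the mean force, dF/dr = -⟨f⟩, over the sequence of constrained simulations. A minimal sketch with a synthetic harmonic toy system standing in for real simulation output (the forces are fabricated for illustration, not simulation data):

```python
# Sketch of free energy from mean constraint forces (thermodynamic
# integration along the reaction coordinate r): dF/dr = -<f>, so the
# profile follows from a trapezoidal integral of the mean forces
# measured in each constrained simulation.

def free_energy_profile(r_values, mean_forces):
    """Integrate dF/dr = -<f> with the trapezoid rule; F(r[0]) = 0."""
    F = [0.0]
    for i in range(1, len(r_values)):
        dr = r_values[i] - r_values[i - 1]
        F.append(F[-1] - 0.5 * (mean_forces[i] + mean_forces[i - 1]) * dr)
    return F

# Toy check: a harmonic well U(r) = 0.5*k*r^2 has mean force <f> = -k*r,
# so the recovered profile should reproduce ~0.5*k*r^2.
k = 2.0
r = [0.1 * i for i in range(11)]   # constrained values of r from 0.0 to 1.0
forces = [-k * x for x in r]       # synthetic mean forces, one per simulation
profile = free_energy_profile(r, forces)
print(round(profile[-1], 3))       # ~0.5 * 2.0 * 1.0**2 = 1.0
```

In a real application each entry of `forces` would be the time-averaged constraint force from one constrained simulation, corrected (where needed) by the mass metric tensor term the abstract mentions; the trapezoidal integration step is unchanged.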

    Publishing and sharing multi-dimensional image data with OMERO

    Imaging data are used in the life and biomedical sciences to measure the molecular and structural composition and dynamics of cells, tissues, and organisms. Datasets range in size from megabytes to terabytes and usually contain a combination of binary pixel data and metadata that describe the acquisition process and any derived results. The OMERO image data management platform allows users to securely share image datasets according to specific permissions levels: data can be held privately, shared with a set of colleagues, or made available via a public URL. Users control access by assigning data to specific Groups with defined membership and access rights. OMERO's permissions system supports simple data sharing in a lab, collaborative data analysis, and even teaching environments. OMERO software is open source and released by the OME Consortium at www.openmicroscopy.org

    Data quality in the human and environmental health sciences: Using statistical confidence scoring to improve QSAR/QSPR modeling

    Increasing amounts of toxicity data are becoming publicly available, allowing for in silico modeling. However, questions often arise as to how to incorporate data quality and how to deal with contradicting data if more than a single datum point is available for the same compound. In this study, two well-known and studied QSAR/QSPR models, for skin permeability and aquatic toxicology, have been investigated in the context of statistical data quality; in particular, the potential benefits of incorporating the statistical Confidence Scoring (CS) approach within modelling and validation were examined. As a result, robust QSAR/QSPR models for the skin permeability coefficient and the toxicity of nonpolar narcotics in the Aliivibrio fischeri assay were created. CS-weighted linear regression for training and CS-weighted root mean square error (RMSE) for validation were statistically superior compared to standard linear regression and standard RMSE. Strategies are proposed as to how to interpret data with high and low CS, as well as how to deal with large datasets containing multiple entries.
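The CS-weighting described above amounts to weighted least squares: each datum enters the fit and the validation error in proportion to its confidence score, so low-confidence (e.g. contradictory) measurements are down-weighted rather than discarded. A minimal sketch of the idea, assuming a simple per-datum weight scheme; the data, scores and variable names are illustrative, not from the study:

```python
import math

# Sketch of confidence-score (CS) weighting: a weighted least-squares fit
# y = a*x + b and a CS-weighted RMSE. Data and scores below are
# illustrative assumptions, not the study's dataset.

def weighted_fit(x, y, w):
    """Weighted least-squares slope and intercept."""
    W = sum(w)
    xm = sum(wi * xi for wi, xi in zip(w, x)) / W
    ym = sum(wi * yi for wi, yi in zip(w, y)) / W
    num = sum(wi * (xi - xm) * (yi - ym) for wi, xi, yi in zip(w, x, y))
    den = sum(wi * (xi - xm) ** 2 for wi, xi in zip(w, x))
    a = num / den
    return a, ym - a * xm

def weighted_rmse(y_true, y_pred, w):
    """CS-weighted root mean square error."""
    W = sum(w)
    return math.sqrt(sum(wi * (t - p) ** 2
                         for wi, t, p in zip(w, y_true, y_pred)) / W)

# Points on y = 2x + 1 plus one contradictory, low-confidence entry:
x = [0.0, 1.0, 2.0, 3.0]
y = [1.0, 3.0, 5.0, 12.0]   # last point is far off the line
w = [1.0, 1.0, 1.0, 0.05]   # low CS assigned to the suspect datum

a, b = weighted_fit(x, y, w)
rmse = weighted_rmse(y, [a * xi + b for xi in x], w)
print(round(a, 2), round(b, 2), round(rmse, 3))
```

With the low weight on the outlier, the fitted slope stays close to the true value of 2, whereas an unweighted fit of the same points is pulled well above it; the CS-weighted RMSE likewise discounts the residual of the suspect datum during validation.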

    New directions in cellular therapy of cancer: a summary of the summit on cellular therapy for cancer

    A summit on cellular therapy for cancer discussed and presented advances related to the use of adoptive cellular therapy for melanoma and other cancers. The summit revealed that this field is advancing rapidly. Conventional cellular therapies, such as tumor infiltrating lymphocytes (TIL), are becoming more effective and more available. Gene therapy is becoming an important tool in adoptive cell therapy. Lymphocytes are being engineered to express high affinity T cell receptors (TCRs), chimeric antibody-T cell receptors (CARs) and cytokines. T cell subsets with more naïve and stem cell-like characteristics have been shown in pre-clinical models to be more effective than unselected populations, and it is now possible to reprogram T cells and to produce T cells with stem cell characteristics. In the future, combinations of adoptive transfer of T cells and specific vaccination against the cognate antigen can be envisaged to further enhance the effectiveness of these therapies.