316 research outputs found

    Autoantibodies Against Oxidized LDLs and Atherosclerosis in Type 2 Diabetes

    OBJECTIVE — The aim of our study was to examine, in type 2 diabetic patients, the relationship between autoantibodies against oxidatively modified LDL (oxLDL Ab) and two indexes of atherosclerosis: intimal-medial thickness of the common carotid artery (CCA-IMT), which reflects early atherosclerosis, and the ankle-brachial index (ABI), which reflects advanced atherosclerosis. RESEARCH DESIGN AND METHODS — Thirty newly diagnosed type 2 diabetic patients, 30 type 2 diabetic patients with long duration of disease, and 56 control subjects were studied. OxLDL Ab were detected with the ImmunoLisa Anti-oxLDL Antibody ELISA. ABI was estimated at rest by strain-gauge plethysmography. Carotid B-mode imaging was performed on a high-resolution imaging system (ATL HDI 5000). RESULTS — In patients with long duration of disease, IgG oxLDL Ab were significantly higher and ABI significantly lower than in the other two groups. IgG oxLDL Ab correlated with CCA-IMT in all diabetic patients, while a significant inverse correlation between IgG oxLDL Ab and ABI was seen only in patients with long duration of disease, demonstrating a close relationship between these autoantibodies and advanced atherosclerosis. CONCLUSIONS — IgG oxLDL Ab may be markers of the advanced phase of the atherosclerotic process and of the response of the immune system to the oxLDL present within atherosclerotic lesions.
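    The correlation analyses reported above can be sketched in a few lines. This is an illustrative computation only: the function name and the data values below are invented for the example and are not the study's data or code.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical illustrative values (NOT study data):
# IgG oxLDL Ab titre (arbitrary units) vs. CCA-IMT (mm).
oxldl_ab = [12.0, 18.5, 25.0, 31.2, 40.1, 47.8]
cca_imt  = [0.62, 0.70, 0.74, 0.81, 0.88, 0.95]

r = pearson_r(oxldl_ab, cca_imt)
print(f"r = {r:.3f}")  # strongly positive for this monotone toy data
```

    A negative r computed the same way between oxLDL Ab and ABI would correspond to the inverse correlation reported for patients with long disease duration.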

    Observational Mass-to-Light Ratio of Galaxy Systems: from Poor Groups to Rich Clusters

    We study the mass-to-light ratio of galaxy systems from poor groups to rich clusters and present, for the first time, a large database for useful comparisons with theoretical predictions. We extend a previous work in which B_j band luminosities and optical virial masses were analyzed for a sample of 89 clusters. Here we also consider a sample of 52 more clusters, 36 poor clusters, 7 rich groups, and two catalogs of about 500 groups each, recently identified in the Nearby Optical Galaxy sample by using two different algorithms. We obtain the blue luminosity and virial mass for all systems considered. We devote a large effort to establishing the homogeneity of the resulting values, as well as to considering comparable physical regions, i.e. those included within the virial radius. By analyzing a fiducial, combined sample of 294 systems we find that the mass increases faster than the luminosity: the linear fit gives M ∝ L_B^{1.34 ± 0.03}, with a tendency for a steeper increase in the low-mass range. Our present results agree with the previous work but are superior owing to the much higher statistical significance and the wider dynamical range covered (about 10^{12}-10^{15} M_solar). We present a comparison between our results and the theoretical predictions on the relation between M/L_B and halo mass, obtained by combining cosmological numerical simulations and semianalytic modeling of galaxy formation. Comment: 25 pages, 12 eps figures, accepted for publication in Ap
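    A power-law fit such as M ∝ L_B^{1.34} is a straight-line fit in log-log space. The sketch below shows the idea on synthetic data; the function name and values are illustrative and do not reproduce the paper's fitting procedure or error analysis.

```python
import math

def power_law_slope(lum, mass):
    """Least-squares slope of log10(M) versus log10(L_B),
    i.e. the exponent alpha in M ∝ L_B^alpha."""
    x = [math.log10(v) for v in lum]
    y = [math.log10(v) for v in mass]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sum((a - mx) ** 2 for a in x)
    return num / den

# Synthetic systems obeying M = L_B^1.34 exactly (solar units, illustrative)
lums = [1e9, 1e10, 1e11, 1e12]
masses = [l ** 1.34 for l in lums]
print(power_law_slope(lums, masses))  # recovers 1.34 up to rounding
```

    Because the exponent exceeds 1, the mass-to-light ratio M/L_B itself grows with mass, which is the trend compared against the semianalytic predictions.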

    A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline

    The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment, which must strictly adhere to the project schedule to be ready for the launch and flight operations. In order to guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software has followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the verification and validation of the software. The purpose of this work is to show an example of procedures, test development and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI-DPC, detailing the methods used and the results obtained. Different approaches have been used to test the scientific and housekeeping data processing. Scientific data processing has been tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and reproduce nominal conditions as closely as possible. For the HK telemetry processing, validation software has been developed to inject known parameter values into a set of real housekeeping packets and compare them with the corresponding timelines generated by the Level 1. With the proposed validation and verification procedure, in which the on-board and ground processing are viewed as a single pipeline, we demonstrated that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements. Comment: 20 pages, 7 figures; this paper is part of the Prelaunch status LFI papers published on JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/jins
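    The core of the HK validation step (inject known values, compare against the decoded timeline) reduces to a sample-by-sample comparison. The sketch below is a minimal stand-in, not the actual DPC validation software; the function name, tolerance parameter, and values are assumptions for illustration.

```python
def validate_hk_timeline(injected, decoded, tol=0.0):
    """Compare injected housekeeping values against the timeline
    decoded by the Level 1 pipeline; return the indices of samples
    that differ by more than `tol`."""
    return [i for i, (a, b) in enumerate(zip(injected, decoded))
            if abs(a - b) > tol]

injected = [20.0, 20.5, 21.0, 21.5]   # values written into the packets
decoded  = [20.0, 20.5, 21.0, 21.5]   # what the pipeline reproduced
assert validate_hk_timeline(injected, decoded) == []  # end-to-end match
```

    An empty mismatch list over the full packet set is the end-to-end pass criterion in this simplified picture.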

    Optimization of Planck/LFI on--board data handling

    To assess stability against 1/f noise, the Low Frequency Instrument (LFI) onboard the Planck mission will acquire data at a rate much higher than the rate allowed by its telemetry bandwidth of 35.5 kbps. The data are processed by an onboard pipeline, followed on ground by a reversing step. This paper illustrates the LFI scientific onboard processing used to fit the allowed data rate. This is a lossy process tuned by a set of 5 parameters Naver, r1, r2, q, O for each of the 44 LFI detectors. The paper quantifies the level of distortion introduced by the onboard processing, EpsilonQ, as a function of these parameters, and describes the method of optimizing the onboard processing chain. The tuning procedure is based on an optimization algorithm applied to unprocessed and uncompressed raw data provided either by simulations, prelaunch tests, or data taken from LFI operating in diagnostic mode. All the needed optimization steps are performed by an automated tool, OCA2, which ends with optimized parameters and produces a set of statistical indicators, among them the compression rate Cr and EpsilonQ. For Planck/LFI the requirements are Cr = 2.4 and EpsilonQ <= 10% of the rms of the instrumental white noise. To speed up the process, an analytical model is developed that is able to extract most of the relevant information on EpsilonQ and Cr as a function of the signal statistics and the processing parameters. This model will be of interest for the instrument data analysis. The method was applied during ground tests when the instrument was operating in conditions representative of flight. Optimized parameters were obtained and the performance was verified: the required data rate of 35.5 kbps was achieved while keeping EpsilonQ at a level of 3.8% of the white-noise rms, well within the requirements. Comment: 51 pages, 13 fig.s, 3 tables, pdflatex, needs JINST.csl, graphicx, txfonts, rotating; Issue 1.0 10 Nov 2009; Sub. to JINST 23 Jun 09, Accepted 10 Nov 09, Pub.: 29 Dec 09; This is a preprint, not the final version
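    The averaging and quantization steps of such a chain, and the measurement of EpsilonQ against the white-noise rms, can be sketched as below. This is a heavily simplified toy, assuming only the Naver, q and O parameters (the r1, r2 sky/load mixing is omitted) and invented signal statistics; it is not the OCA2 tool or the flight code.

```python
import random
import statistics

def process(samples, naver, q, offset):
    """Toy onboard/ground round trip: average groups of `naver`
    samples, quantize with step `q` and offset `offset` on board,
    then reconstruct on ground."""
    averaged = [sum(samples[i:i + naver]) / naver
                for i in range(0, len(samples) - naver + 1, naver)]
    coded = [round((v - offset) / q) for v in averaged]   # on board
    decoded = [c * q + offset for c in coded]             # on ground
    return averaged, decoded

random.seed(0)
noise_rms = 1.0
raw = [random.gauss(0.0, noise_rms) for _ in range(10_000)]  # toy white noise
avg, rec = process(raw, naver=4, q=0.1, offset=0.0)

# Distortion of the reconstructed stream, as a fraction of the noise rms
eps_q = statistics.pstdev(a - b for a, b in zip(avg, rec)) / noise_rms
print(f"EpsilonQ ~ {eps_q:.1%} of white-noise rms")
```

    For a quantization step q well below the noise rms, the error is roughly uniform over [-q/2, q/2], so its rms is about q/sqrt(12); with q = 0.1 this lands near 3% of the noise rms, comfortably inside the 10% requirement quoted above.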

    Off-line radiometric analysis of Planck/LFI data

    The Planck Low Frequency Instrument (LFI) is an array of 22 pseudo-correlation radiometers on board the Planck satellite to measure temperature and polarization anisotropies in the Cosmic Microwave Background (CMB) in three frequency bands (30, 44 and 70 GHz). To calibrate and verify the performance of the LFI, a software suite named LIFE has been developed. Its aim is to provide a common platform for analyzing the results of the tests performed on the single components of the instrument (RCAs, Radiometric Chain Assemblies) and on the integrated Radiometric Array Assembly (RAA). Moreover, its analysis tools are designed to be used during flight as well, to produce periodic reports on the status of the instrument. The LIFE suite has been developed using a multi-layered, cross-platform approach. It implements a number of analysis modules written in RSI IDL, each accessing the data through a portable and heavily optimized library of functions written in C and C++. One of the most important features of LIFE is its ability to run the same data analysis codes on both ground test data and real flight data as input. The LIFE software suite has been successfully used during the RCA/RAA tests and the Planck Integrated System Tests. Moreover, the software has also passed the verification for its in-flight use during the System Operations Verification Tests, held in October 2008. Comment: Planck LFI technical papers published by JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/1748-022

    Planck pre-launch status: Low Frequency Instrument calibration and expected scientific performance

    We give the calibration and scientific performance parameters of the Planck Low Frequency Instrument (LFI) measured during the ground cryogenic test campaign. These parameters characterise the instrument response and constitute our best pre-launch knowledge of the LFI scientific performance. The LFI shows excellent 1/f stability and rejection of instrumental systematic effects; measured noise performance shows that LFI is the most sensitive instrument of its kind. The set of measured calibration parameters will be updated during flight operations through the end of the mission. Comment: Accepted for publication in Astronomy and Astrophysics. Astronomy & Astrophysics, 2010 (acceptance date: 12 Jan 2010)

    Lowering the energy threshold in COSINE-100 dark matter searches

    COSINE-100 is a dark matter detection experiment that has used NaI(Tl) crystal detectors operating at the Yangyang underground laboratory in Korea since September 2016. Its main goal is to test the annual modulation observed by the DAMA/LIBRA experiment with the same target medium. Recently DAMA/LIBRA has released data with an energy threshold lowered to 1 keV, and the persistent annual modulation behavior is still observed at 9.5σ. By lowering the energy threshold for electron recoils to 1 keV, COSINE-100 annual modulation results can be compared to those of DAMA/LIBRA in a model-independent way. Additionally, the event selection methods provide access to few-GeV to sub-GeV dark matter particles using constant-rate studies. In this article, we discuss the COSINE-100 event selection algorithm, its validation, and its efficiencies near the threshold.
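    A selection efficiency near threshold is, in essence, the fraction of events in each energy bin that survive the selection. The sketch below illustrates only that bookkeeping; the function name and the toy event list are assumptions for the example, not COSINE-100 data or its actual algorithm.

```python
def selection_efficiency(events, lo, hi):
    """Fraction of events with energy in [lo, hi) keV that pass the
    selection; `events` is a list of (energy_keV, passed) pairs."""
    in_bin = [passed for e, passed in events if lo <= e < hi]
    return sum(in_bin) / len(in_bin) if in_bin else float("nan")

# Toy events: the efficiency degrades toward the 1 keV threshold,
# mimicking the behavior a near-threshold study must quantify.
toy = [(0.8, False), (1.2, True), (1.4, False), (2.5, True),
       (3.0, True), (3.5, True), (6.0, True)]

print(selection_efficiency(toy, 1.0, 2.0))  # 0.5 in this toy sample
print(selection_efficiency(toy, 2.0, 6.0))  # 1.0 in this toy sample
```

    In practice such efficiencies are measured bin by bin on calibration or simulated events and then used to correct the observed rate in the lowered-threshold analysis.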