
    Front-End electronics configuration system for CMS

    The four LHC experiments at CERN have decided to use a commercial SCADA (Supervisory Control And Data Acquisition) product for the supervision of their DCS (Detector Control System). The selected SCADA, which is therefore used for the CMS DCS, is PVSS II from the company ETM. This SCADA has its own database, which is suitable for storing conventional controls data such as voltages, temperatures and pressures. In addition, calibration data and FE (Front-End) electronics configurations need to be stored. The amount of these data is too large for the SCADA database [1], so an external database will be used to manage them. However, this database should be completely integrated into the SCADA framework: it should be accessible from the SCADA, and it should benefit from SCADA features such as alarming and logging. For prototyping, Oracle 8i was selected as the external database manager. The control system for calibration constants and FE electronics configuration has been developed in close collaboration with the CMS tracker group and JCOP (Joint COntrols Project)(1). (1)The four LHC experiments and the CERN IT/CO group have merged their efforts to build the experiments' control systems, setting up JCOP for this purpose at the end of December 1997. Comment: 3 pages, 4 figures, ICALEPCS'01 conference, PSN WEDT00

    Anisotropic Magneto-Thermopower: the Contribution of Interband Relaxation

    Spin injection in metallic normal/ferromagnetic junctions is investigated taking into account the anisotropic magnetoresistance (AMR) occurring in the ferromagnetic layer. It is shown, on the basis of a generalized two-channel model, that there is an interface resistance contribution due to anisotropic scattering, beyond spin accumulation and giant magnetoresistance (GMR). The corresponding expression for the thermopower is derived and compared with the expression for the thermopower produced by the GMR. First measurements of anisotropic magnetothermopower are presented for electrodeposited Ni nanowires contacted with Ni, Au and Cu. The results of this study show that while the giant magnetoresistance and the corresponding thermopower demonstrate the role of spin-flip scattering, the observed anisotropic magnetothermopower indicates interband s-d relaxation mechanisms. Comment: 20 pages, 4 figures

    FEC-CCS: A common Front-End Controller card for the CMS detector electronics

    The FEC-CCS is a custom-made 9U VME64x card for the CMS off-detector electronics. The FEC-CCS card is responsible for distributing the fast timing signals and the slow control data, through optical links, to the front-end system. Special effort has been invested in the design of the card in order to make it compatible with the operational requirements of multiple CMS sub-detectors, namely the Tracker, ECAL, Preshower, Pixels, RPCs and TOTEM. This paper describes the design architecture of the FEC-CCS card, focusing on the special design features that enable its common use by most of the CMS detectors. Results from the integration tests with the detector electronics subsystems and performance measurements are reported. The design of a custom-made testbench for the production testing of the 150 cards produced is presented, and the attained yield is reported.

    Data acquisition software for the CMS strip tracker

    The CMS silicon strip tracker, providing a sensitive area of approximately 200 m² and comprising 10 million readout channels, has recently been completed at the tracker integration facility at CERN. The strip tracker community is currently working to develop and integrate the online and offline software frameworks, known as XDAQ and CMSSW respectively, for the purposes of data acquisition and detector commissioning and monitoring. Recent developments have seen the integration of many new services and tools within the online data acquisition system, such as event building, online distributed analysis, an online monitoring framework, and data storage management. We review the software components that comprise the strip tracker data acquisition system and the software architectures used for stand-alone and global data-taking modes. Our experiences in commissioning and operating one of the largest silicon micro-strip tracking systems ever built are also reviewed.

    Commissioning and Calibrating the CMS Silicon Strip Tracker

    The data acquisition system for the CMS Silicon Strip Tracker (SST) is based around a custom analogue front-end ASIC, an analogue optical link system and an off-detector VME board that performs digitization, zero-suppression and data formatting. A complex procedure is required to optimally configure, calibrate and synchronize the 10^7 channels of the SST readout system. We present an overview of this procedure, which will be used to commission and calibrate the SST during the integration, start-up and operational phases of the experiment. Recent experiences from the CMS Magnet Test Cosmic Challenge and system tests at the Tracker Integration Facility are also reported.
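The pedestal and noise part of such a calibration can be sketched as follows. This is a minimal illustration only: the channel count, ADC values and the 5-sigma zero-suppression cut are hypothetical stand-ins, not the actual SST commissioning parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy raw data: 1000 pedestal events for 128 channels of one front-end chip.
# (Hypothetical numbers; a real pedestal run uses dedicated calibration triggers.)
raw = rng.normal(loc=120.0, scale=2.5, size=(1000, 128))

pedestal = raw.mean(axis=0)       # per-channel baseline (ADC counts)
noise = raw.std(axis=0, ddof=1)   # per-channel noise (ADC counts)

# Zero-suppression threshold: keep strips more than 5 sigma above pedestal.
threshold = pedestal + 5.0 * noise

def zero_suppress(event):
    """Return (channel, amplitude-above-pedestal) pairs passing the threshold."""
    hits = np.nonzero(event > threshold)[0]
    return [(int(c), float(event[c] - pedestal[c])) for c in hits]

# A pedestal-only event should yield essentially no hits.
print(len(zero_suppress(raw[0])))
```

The same pedestal and noise constants then feed the cluster-finding thresholds downstream; the sketch only shows where they come from.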

    Monitoring the CMS strip tracker readout system

    The CMS Silicon Strip Tracker at the LHC comprises a sensitive area of approximately 200 m² and 10 million readout channels. Its data acquisition system is based around a custom analogue front-end chip. Both the control and the readout of the front-end electronics are performed by off-detector VME boards in the counting room, which digitize the raw event data and perform zero-suppression and formatting. The data acquisition system uses the CMS online software framework to configure, control and monitor the hardware components and to steer the data acquisition. The first data analysis is performed online within the official CMS reconstruction framework, which provides many services, such as distributed analysis, access to geometry and conditions data, and a Data Quality Monitoring tool based on the online physics reconstruction. The data acquisition monitoring of the strip tracker uses both the data acquisition and the reconstruction software frameworks to provide real-time feedback to shifters on the operational state of the detector, to archive data for later analysis, and possibly to trigger automatic recovery actions in case of errors. Here we review the proposed architecture of the monitoring system and describe its software components, which are already in place, the various monitoring streams available, and our experiences of operating and monitoring a large-scale system.

    Spin-transfer in an open ferromagnetic layer: from negative damping to effective temperature

    Spin transfer is a typical spintronics effect that allows a ferromagnetic layer to be switched by spin injection. Most experimental results on spin transfer are described on the basis of the Landau-Lifshitz-Gilbert equation of the magnetization, to which additional current-dependent damping factors, positive or negative, are added. The origin of the damping can be investigated further by performing stochastic experiments, such as one-shot relaxation experiments under spin injection in the activation regime of the magnetization. In this regime, the Néel-Brown activation law is observed, which leads to the introduction of a current-dependent effective temperature. In order to justify the introduction of these counterintuitive parameters (effective temperature and negative damping), a detailed thermokinetic analysis of the different sub-systems involved is performed. We propose a thermokinetic description of the different forms of energy exchanged between the electric and the ferromagnetic sub-systems at a normal/ferromagnetic junction. The corresponding Fokker-Planck equations, including relaxations, are derived. The damping coefficients are studied in terms of Onsager-Casimir transport coefficients, with the help of the reciprocity relations. The effective temperature is deduced in the activation regime. Comment: 65 pages, 10 figures
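The Néel-Brown activation law can be illustrated numerically. The barrier height, attempt time and temperatures below are hypothetical values chosen only to show the point: raising the effective temperature shortens the mean switching time exponentially.

```python
import math

KB = 1.380649e-23  # Boltzmann constant (J/K)

def neel_brown_time(barrier_j, temp_k, tau0=1e-9):
    """Mean switching time from the Neel-Brown activation law:
    tau = tau0 * exp(E / (kB * T)).
    tau0 ~ 1 ns is a typical attempt time (hypothetical value)."""
    return tau0 * math.exp(barrier_j / (KB * temp_k))

barrier = 0.8e-19  # ~0.5 eV energy barrier, hypothetical

t_eq = neel_brown_time(barrier, 300.0)   # bath temperature
t_eff = neel_brown_time(barrier, 600.0)  # current-raised effective temperature

# Doubling the effective temperature cuts the exponent in half,
# reducing the mean switching time by several orders of magnitude.
print(t_eq, t_eff)
```

In this picture a current-dependent effective temperature simply rescales the Arrhenius exponent, which is why it can mimic a negative damping contribution in relaxation experiments.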

    Using XDAQ in Application Scenarios of the CMS Experiment

    XDAQ is a generic data acquisition software environment that emerged from a rich set of use-cases encountered in the CMS experiment. These cover the deployment across multiple sub-detectors, the operation of different processing and networking equipment, and a distributed collaboration of users with different needs. The use of the software in various application scenarios has demonstrated the viability of the approach. We discuss two applications: the tracker local DAQ system for front-end commissioning, and the muon chamber validation system. The description is completed by a brief overview of XDAQ. Comment: Conference CHEP 2003 (Computing in High Energy and Nuclear Physics, La Jolla, CA)

    An investigation of bit corruption in IEEE 802.11p

    Data rate management algorithms aim to select a suitable signal modulation and coding rate to avoid the corruption of data bits. This paper describes a preliminary investigation of the bit corruption pattern in the IEEE 802.11p standard. Measurements were acquired with an experimental test-bed made up of a pair of software radios performing white-box tests. The software radios are stationary and operate on the same channel, without disturbances from concurrent communications. This test-bed represents a static scenario in which vehicles are stationary, such as at a crossroad. The data analysis shows that reducing the data length has as much impact as decreasing the data rate. A deeper analysis of the distribution of corrupted bits highlights that some bit positions are corrupted more often than others, rejecting the independent and identically distributed assumption in some situations. This opens a perspective for designing algorithms that deal with multiple constraints, even if the underlying problems are NP-complete.
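Rejecting the i.i.d. assumption for bit positions can be illustrated with a Pearson chi-square test on per-position corruption counts. The frame size, error probabilities and the "every 8th bit is more fragile" pattern below are hypothetical; the sketch only shows the kind of test such an analysis relies on.

```python
import numpy as np

rng = np.random.default_rng(1)

N_FRAMES, N_BITS = 5000, 64

def corruption_counts(per_bit_error_prob):
    """Simulate frames and count corrupted bits per position."""
    errors = rng.random((N_FRAMES, N_BITS)) < per_bit_error_prob
    return errors.sum(axis=0)

def pearson_chi2(counts):
    """Chi-square statistic against the hypothesis that every bit
    position is equally likely to be corrupted."""
    expected = counts.sum() / len(counts)
    return float(((counts - expected) ** 2 / expected).sum())

uniform = corruption_counts(np.full(N_BITS, 0.05))

skewed_p = np.full(N_BITS, 0.05)
skewed_p[::8] = 0.15  # hypothetical: every 8th position is 3x more fragile
skewed = corruption_counts(skewed_p)

# For 63 degrees of freedom the 1% critical value is about 92: the uniform
# sample stays below it, the skewed sample exceeds it by a wide margin.
print(round(pearson_chi2(uniform), 1), round(pearson_chi2(skewed), 1))
```

A statistic far above the critical value is what would justify dropping the i.i.d. model in favour of position-aware coding or rate-selection strategies.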

    Background Light in Potential Sites for the ANTARES Undersea Neutrino Telescope

    The ANTARES collaboration has performed a series of in situ measurements to study the background light for a planned undersea neutrino telescope. Such background can be caused by ^{40}K decays or by biological activity. We report on measurements at two sites in the Mediterranean Sea, at depths of 2400 m and 2700 m respectively. Three photomultiplier tubes were used to measure single counting rates and coincidence rates for pairs of tubes at various distances. The background rate is seen to consist of three components: a constant rate due to ^{40}K decays, a continuum rate that varies on a time scale of several hours simultaneously over distances of up to at least 40 m, and random bursts a few seconds long that are only correlated in time over distances of the order of a meter. A trigger requiring coincidences between nearby photomultiplier tubes should reduce the trigger rate for a neutrino telescope to a manageable level with only a small loss in efficiency. Comment: 18 pages, 8 figures, accepted for publication in Astroparticle Physics
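The benefit of a two-fold coincidence trigger can be estimated with the standard accidental-coincidence formula R_acc = 2 r1 r2 τ for two PMTs with uncorrelated singles rates r1, r2 and coincidence window τ. The singles rate and window below are hypothetical illustrative numbers, not the ANTARES values.

```python
def accidental_rate(r1_hz, r2_hz, window_s):
    """Accidental two-fold coincidence rate for two uncorrelated
    counters: R_acc = 2 * r1 * r2 * tau."""
    return 2.0 * r1_hz * r2_hz * window_s

# Hypothetical numbers: ~60 kHz 40K-dominated singles rate per PMT,
# 20 ns coincidence window.
singles = 60e3
acc = accidental_rate(singles, singles, 20e-9)

print(acc)  # -> 144.0 Hz, far below the ~60 kHz singles rate
```

This is why requiring coincidences between nearby tubes suppresses the random background by orders of magnitude while genuine light from a passing muon, which illuminates neighbouring PMTs within the window, survives.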