630,863 research outputs found

    Monitoring the CMS strip tracker readout system

    The CMS Silicon Strip Tracker at the LHC comprises a sensitive area of approximately 200 m² and 10 million readout channels. Its data acquisition system is based around a custom analogue front-end chip. Both the control and the readout of the front-end electronics are performed by off-detector VME boards in the counting room, which digitise the raw event data and perform zero-suppression and formatting. The data acquisition system uses the CMS online software framework to configure, control and monitor the hardware components and to steer the data acquisition. The first data analysis is performed online within the official CMS reconstruction framework, which provides many services, such as distributed analysis, access to geometry and conditions data, and a Data Quality Monitoring tool based on the online physics reconstruction. The data acquisition monitoring of the Strip Tracker uses both the data acquisition and the reconstruction software frameworks in order to provide real-time feedback to shifters on the operational state of the detector, to archive data for later analysis, and to trigger automatic recovery actions in case of errors. Here we review the proposed architecture of the monitoring system and describe its software components, which are already in place, the various monitoring streams available, and our experiences of operating and monitoring a large-scale system.

    Real-time adaptive algorithm for resource monitoring

    In large-scale systems, real-time monitoring of hardware and software resources is crucial for any management purpose. In architectures consisting of thousands of servers and hundreds of thousands of component resources, the amount of data monitored at high sampling frequencies imposes an overhead on system performance and communication, while reducing the sampling rate may degrade data quality. We present a real-time adaptive algorithm for scalable data monitoring that adapts the frequency of sampling and data updating with a twofold goal: to minimize computational and communication costs, and to guarantee that the reduced samples do not affect the accuracy of information about resources. Experiments carried out on heterogeneous data traces from synthetic and real environments confirm that the proposed adaptive approach reduces utilization and communication overhead without penalizing data quality with respect to existing monitoring algorithms.
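    The core idea of the abstract — widening the sampling interval when a resource metric is stable and tightening it when the metric fluctuates — can be sketched as below. This is a minimal illustration of adaptive sampling in general, not the paper's actual algorithm; the function name, thresholds and doubling/halving policy are all assumptions for the sketch.

```python
def adaptive_interval(samples, base=1.0, min_i=0.25, max_i=30.0, tol=0.05):
    """Return the next sampling interval (seconds) for a monitored metric.

    If the recent samples vary by less than a relative tolerance `tol`,
    the interval is doubled (sample less often, saving CPU/network);
    otherwise it is halved (sample more often to track the change).
    """
    if len(samples) < 2:
        return base  # not enough history: keep the base rate
    lo, hi = min(samples), max(samples)
    spread = (hi - lo) / (abs(hi) + 1e-9)  # relative variability of the window
    if spread < tol:
        return min(base * 2, max_i)  # stable metric: back off
    return max(base / 2, min_i)      # volatile metric: tighten
```

    A real monitor would feed this a sliding window of recent samples per resource and clamp the result between the configured minimum and maximum rates, as the `min_i`/`max_i` bounds do here.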

    The ALICE-LHC Online Data Quality Monitoring Framework: Present and Future

    ALICE is one of the experiments under installation at the CERN Large Hadron Collider, dedicated to the study of heavy-ion collisions. The final ALICE data acquisition system has been installed and is being used for the testing and commissioning of detectors. Online data quality monitoring is an important part of the DAQ software framework (DATE). We review the implementation and usage experience of the interactive tool MOOD, used during the ALICE commissioning period, and present the architecture of the automatic data quality monitoring framework, a distributed application aimed at producing, collecting, analyzing, visualizing and storing monitoring data on a large, experiment-wide scale.

    Improving the Quality of a Social and Online Media Monitoring Application Using the WebQEM Method

    With the rapid growth of technology, applications have become an essential part of everyday life, ranging from small-scale applications that entertain users to large-scale web applications that support one or more businesses. This research examines the social and online media monitoring application used at PT. XYZ in Indonesia. Software quality testing is needed to establish that the software is fit for use, not merely as testing for its own sake, but also to optimize the software into a well-maintained application in accordance with the software development life cycle. In this paper, WebQEM is used to optimize the web application. There are two kinds of evaluations: basic evaluation and global evaluation. The first global evaluation of the monitoring application gives a score of 70.44%. Improvements are then applied to the application according to the criteria identified in that evaluation, and a second global evaluation gives a score of 77.41%. This shows that the WebQEM method can improve the quality of the social and online media monitoring application.
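    WebQEM-style global evaluation aggregates per-criterion scores into a single quality percentage via weighted averaging. A minimal sketch of that aggregation step follows; the criterion names, weights and scores are illustrative assumptions, not values from the paper.

```python
def global_score(criteria):
    """Aggregate per-criterion scores (0-100) into a WebQEM-style
    global preference score using weights that sum to 1."""
    total_w = sum(w for _, w in criteria.values())
    assert abs(total_w - 1.0) < 1e-9, "criterion weights must sum to 1"
    return sum(score * w for score, w in criteria.values())

# Illustrative (score, weight) pairs only -- not the paper's data:
scores = {
    "usability":     (78.0, 0.25),
    "functionality": (70.0, 0.35),
    "reliability":   (65.0, 0.20),
    "efficiency":    (68.0, 0.20),
}
```

    Re-running the same aggregation after each improvement cycle is what lets the two global evaluations (70.44% and 77.41% in the abstract) be compared directly.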

    Hydrological Monitoring with Hybrid Sensor Networks

    Existing hydrological monitoring systems suffer from shortcomings in accuracy, resolution, and scalability. Their fragility, high power consumption, and lack of autonomy necessitate frequent site visits. Cabling requirements and large size limit their scalability and make them prohibitively expensive. The research described in this paper proposes to alleviate these problems by pairing high-resolution in situ measurement with remote data collection and software maintenance. A hybrid sensor network composed of wired and wireless connections autonomously measures various attributes of the soil, including moisture, temperature, and resistivity. The measurements are communicated to a processing server over the existing GSM cellular infrastructure. This system enables the collection of data at a scale and resolution that is orders of magnitude greater than any existing method, while dramatically reducing the cost of monitoring. The quality and sheer volume of data collected as a result will enable previously infeasible research in hydrology.

    Decentralized Coordination of Dynamic Software Updates in the Internet of Things

    Large-scale IoT service deployments run on a large number of distributed, interconnected computing nodes comprising sensors, actuators, gateways and cloud infrastructure. Since IoT is a fast-growing, dynamic domain, the implementations of software components are subject to frequent changes addressing bug fixes, quality assurance or changed requirements. To ensure continuous monitoring and control of processes, software updates have to be conducted while the nodes are operating, without losing any sensed data or actuator instructions. Current IoT solutions usually support centralized management and automated deployment of updates but are restricted to broadcasting the updates and running local update processes on all nodes. In this paper we propose an update mechanism for IoT deployments that considers dependencies between services across the multiple nodes involved in a common service and supports a coordinated update of component instances on distributed nodes. We rely on LyRT on all IoT nodes as the runtime supporting local, disruption-minimal software updates. Our proposed middleware layer coordinates updates on a set of distributed nodes. We evaluated our approach using a demand-response scenario from the smart grid domain.
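    Coordinating updates across nodes that share a service dependency graph reduces, at its core, to ordering the updates so every component is updated before the components that depend on it. A minimal sketch of that ordering step, using Python's standard `graphlib`, is below; the service names and dependency graph are hypothetical, and this is a generic topological-sort illustration rather than the paper's LyRT-based coordination protocol.

```python
from graphlib import TopologicalSorter

# Hypothetical dependency graph: each key depends on the services it maps to.
deps = {
    "cloud-aggregator": {"gateway"},
    "gateway": {"sensor-driver"},
    "sensor-driver": set(),
}

def update_order(deps):
    """Return an order in which component instances can be updated so
    that every service is updated before the services that depend on it."""
    return list(TopologicalSorter(deps).static_order())
```

    A coordinating middleware would walk this order, quiescing and updating each component's instances (possibly on different nodes) before releasing its dependents, which is what prevents a dependent service from observing a half-updated interface.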

    Monitoring and Data Quality assessments for the ATLAS Liquid Argon Calorimeter at the LHC

    The ATLAS detector at the Large Hadron Collider is expected to collect an unprecedented wealth of data at a completely new energy scale. In particular, its Liquid Argon (LAr) electromagnetic and hadronic calorimeters will play an essential role in measuring final states with electrons and photons and in contributing to the measurement of jets and missing transverse energy. The ATLAS LAr calorimeter is a system of three sampling calorimeters (electromagnetic barrel, hadronic endcaps and forward calorimeters) with LAr as the sensitive medium. It comprises 182,468 readout channels and covers a pseudo-rapidity region up to 4.9. Efficient monitoring will be crucial from the earliest data taking onward and at multiple levels of the electronic readout and triggering systems. Detection of serious data integrity issues along the read-out chain during data taking will be essential so that quick actions can be taken. Moreover, by providing essential information about the performance of each sub-detector, the quality of the data collected (hot or dead channels, alignment and calibration problems, timing problems...) and their impact on physics observables, the monitoring will be critical in guaranteeing that data is ready for physics analysis in due time. Software tools and criteria for monitoring the LAr data during the cosmic muon runs, which have been taking place since October 2006, are discussed. The further extension of the strategy to monitoring the collision data expected for the end of 2009 is also described.

    Fine-scale change detection using unmanned aircraft systems (UAS) to inform reproductive biology in nesting waterbirds

    Aerial photographic surveys from manned aircraft are commonly used to estimate the size of bird breeding colonies but are rarely used to evaluate reproductive success. Recent technological advances have spurred interest in the use of unmanned aircraft systems (UAS) for monitoring wildlife. The ability to repeatedly sample and collect imagery at fine-scale spatial and temporal resolutions while minimizing disturbance and safety risks makes UAS particularly appealing for monitoring colonial nesting waterbirds. In addition, advances in photogrammetric and GIS software have allowed for more streamlined data processing and analysis. Using UAS imagery collected at Anaho Island National Wildlife Refuge during the peak of the nesting bird season, I evaluated the utility of UAS for monitoring and informing the reproductive biology of breeding American white pelicans (Pelecanus erythrorhynchos). By using a multitemporal nearest neighbor analysis for fine-scale change detection, I developed a novel, automated method to differentiate nesting from non-nesting individuals. All UAS images collected were of sufficient pixel resolution to differentiate adult pelicans from chicks, surrounding landscape features, and other species nesting on the island. No visual signs of disturbance due to the UAS were recorded. Pelican counts derived from UAS imagery were significantly higher than counts made from the ground at observation stations on the island. Analysis of multitemporal images provided more accurate classifications of nesting birds than did monotemporal images, on the condition that the multitemporal images aligned with less than 0.5 m error. Nest classifications using multitemporal imagery were not significantly different when conducted across a 24-hour period compared to a 2-hour period. This technology shows promise for greatly enhancing the quality of colony monitoring data for large colonies and a species that is highly sensitive to disturbance.
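    The multitemporal nearest-neighbor idea — a bird whose detected position barely moves between two image passes is likely sitting on a nest — can be sketched as follows. This is a simplified illustration under assumed inputs (lists of detected bird coordinates in metres per pass), not the thesis's actual pipeline; the 0.5 m threshold is borrowed from the alignment-error figure in the abstract.

```python
import math

def nearest_neighbour_dist(p, points):
    """Distance from point p to its nearest neighbour in `points`."""
    return min(math.dist(p, q) for q in points)

def classify_nesting(birds_t0, birds_t1, max_move_m=0.5):
    """Label each bird detected at time t0 as 'nesting' if some detection
    at time t1 lies within max_move_m of it (i.e. it has not moved
    between the two image passes), else 'non-nesting'."""
    return [
        "nesting" if nearest_neighbour_dist(p, birds_t1) <= max_move_m
        else "non-nesting"
        for p in birds_t0
    ]
```

    In practice the two images must first be co-registered to within the movement threshold (the abstract's < 0.5 m alignment condition), otherwise registration error is indistinguishable from bird movement.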

    Air pollution assessment over Po valley (Italy) using satellite data and ground station measurements

    Due to their effect on human health, the study of atmospheric pollutants is an important concern in the Po Valley (northern Italy), one of the country's main industrialized and populated areas. Our work focuses on the applicability of satellite Aerosol Optical Depth (AOD) retrievals in support of air quality monitoring and assessment in urban environments within the Po Valley. This has been accomplished by using the implementation of the International MODIS/AIRS Processing Package (IMAPP) Air Quality Applications software, IDEA-I (Infusing satellite Data into Environmental Applications-International), over the Po Valley study area. IDEA-I is a globally configurable software package that uses either Terra or Aqua MODerate resolution Imaging Spectro-radiometer (MODIS) AOD product retrievals to identify local domains of high aerosol values. For our analyses, IDEA-I has been used over a large European domain centred on the Po Valley. One year (2012) of MODIS AOD product retrievals from NASA's Terra (MOD04) and Aqua (MYD04) satellites has been considered using IDEA-I in a retrospective study. These retrieved data have also been compared with Particulate Matter (PM10) measurements from the ground-based network stations of the Italian Agency for Environmental Protection (ARPA). The acceptable correlation obtained between PM10 and AOD suggests that satellite AOD is a good proxy for monitoring air quality over the Po Valley domain. Yet the 10 km resolution of the MODIS AOD product is considered too coarse for air quality studies at the urban scale. Recently, a new Multi-Angle Implementation of Atmospheric Correction (MAIAC) algorithm has been developed for MODIS, which provides AOD data at 1 km spatial resolution. We have evaluated the ability of the MODIS MOD04 product and the MAIAC products to characterize the spatial distribution of aerosols in the urban area through comparison with surface PM10 measurements.
    Using MAIAC data at 1 km, we have examined the relationship between PM10 concentrations, AOD, and AOD normalized by Planetary Boundary Layer (PBL) depths obtained from the NOAA National Center for Environmental Prediction (NCEP) Global Data Assimilation System (GDAS), for the same period of analysis. Results show that the MAIAC retrieval provides a high-resolution depiction of the AOD within the Po Valley and performs nearly as well in a statistical sense as the standard MODIS retrieval during the time period considered. Results also highlight that normalization by the analyzed PBL depth, to obtain an estimate of the mean boundary layer extinction, is needed to capture the seasonal cycle of the observed PM10 over the Po Valley.
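    The PBL normalization step described above amounts to dividing the columnar AOD by the boundary-layer depth to approximate a mean extinction within the layer where the surface PM10 resides. A minimal sketch under that assumption follows; the function name and units (AOD dimensionless, PBL depth in metres) are choices made for the illustration.

```python
def pbl_normalized_aod(aod, pbl_depth_m):
    """Approximate mean boundary-layer extinction (per km) by dividing
    columnar AOD by PBL depth in km, assuming the aerosol column is
    confined to the boundary layer."""
    return [a / (h / 1000.0) for a, h in zip(aod, pbl_depth_m)]
```

    The same AOD can then correspond to very different near-surface concentrations depending on how deep the mixing layer is, which is why the normalized quantity, rather than raw AOD, tracks the seasonal cycle of PM10.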