    Salt-gradient Solar Ponds: Summary of US Department of Energy Sponsored Research

    The solar pond research program conducted by the United States Department of Energy was discontinued after 1983. This document summarizes the results of the program, reviews the state of the art, and identifies the remaining outstanding issues. Solar pond is a generic term, but in the context of this report it refers specifically to the salt-gradient solar pond. Several small research solar ponds have been built and successfully tested. Procedures for filling the pond, maintaining the gradient, adjusting the zone boundaries, and extracting heat were developed. Theories and models were developed and verified. The major remaining unknowns involve the physical behavior of large ponds: wind mixing of the surface, the lateral range or reach of horizontally injected fluids, ground thermal losses, and erosion of the gradient-zone boundaries caused by pumping fluid for heat extraction. These issues cannot be studied at small scale and must therefore be investigated in a large outdoor solar pond.
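
    The physics behind the gradient-maintenance and boundary-erosion issues is the pond's stability criterion. As a hedged aside not taken from this summary: with depth z increasing downward, the salinity gradient in the non-convective zone must be strong enough that its stabilizing effect on density outweighs the destabilizing temperature gradient. In the simplest static form (alpha the thermal-expansion and beta the saline-contraction coefficient), in LaTeX notation:

        % Simplified static stability of the gradient zone; double-diffusive
        % effects tighten this bound in practice.
        \beta \frac{dS}{dz} \;>\; \alpha \frac{dT}{dz},
        \qquad
        \alpha = -\frac{1}{\rho}\frac{\partial\rho}{\partial T},
        \quad
        \beta = \frac{1}{\rho}\frac{\partial\rho}{\partial S}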

    Anomaly detection in quasi-periodic energy consumption data series: a comparison of algorithms

    The diffusion of domotics solutions and of smart appliances and meters enables the monitoring of energy consumption at a very fine level and the development of forecasting and diagnostic applications. Anomaly detection (AD) in energy consumption data streams helps identify data points or intervals in which the behavior of an appliance deviates from normality, and may prevent energy losses and breakdowns. Many statistical and learning approaches have been applied to the task, but the need remains to compare their performance on data sets with different characteristics. This paper focuses on anomaly detection in quasi-periodic energy consumption data series and contrasts 12 statistical and machine learning algorithms tested in 144 different configurations on 3 data sets containing the power consumption signals of fridges. The assessment also evaluates the impact of the length of the series used for training and of the size of the sliding window employed to detect the anomalies. The generalization ability of the top five methods is also evaluated by applying them to an appliance different from that used for training. The results show that classical machine learning methods (Isolation Forest, One-Class SVM, and Local Outlier Factor) outperform the best neural methods (GRU/LSTM autoencoders and multi-step methods) and generalize better when applied to detect the anomalies of an appliance different from the one used for training.
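
    As an illustrative sketch only (not the paper's pipeline; the signal, window size, and contamination rate below are invented for the example), the sliding-window framing with one of the winning classical methods, Isolation Forest, can be set up roughly as follows in Python:

        import numpy as np
        from sklearn.ensemble import IsolationForest

        def sliding_windows(signal, width, step):
            """Slice a 1-D consumption signal into overlapping windows."""
            starts = np.arange(0, len(signal) - width + 1, step)
            return np.stack([signal[s:s + width] for s in starts]), starts

        # Synthetic stand-in for a fridge power signal (1 sample/s):
        # a quasi-periodic compressor cycle plus noise, with an injected fault.
        rng = np.random.default_rng(0)
        t = np.arange(86_400)
        signal = 60.0 * (np.sin(2 * np.pi * t / 3600) > 0.5) + rng.normal(0, 2, t.size)
        signal[40_000:40_400] += 80.0   # anomaly: compressor stuck on

        width, step = 600, 60           # window size is a key tuning knob
        X_train, _ = sliding_windows(signal[:30_000], width, step)
        X_test, starts = sliding_windows(signal[30_000:], width, step)

        clf = IsolationForest(contamination=0.01, random_state=0).fit(X_train)
        flags = clf.predict(X_test)     # -1 marks an anomalous window
        for s in starts[flags == -1][:3]:
            print(f"anomalous window starting at t = {s + 30_000} s")

    Swapping IsolationForest for OneClassSVM or LocalOutlierFactor(novelty=True) keeps the same framing, which is what makes a like-for-like comparison of the methods possible.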

    Two-colour generation in a chirped seeded Free-Electron Laser

    We present the experimental demonstration of a method for generating two spectrally and temporally separated pulses with an externally seeded, single-pass free-electron laser operating in the extreme-ultraviolet spectral range. Our results, collected at the FERMI@Elettra facility and confirmed by numerical simulations, demonstrate the possibility of controlling both the spectral and temporal features of the generated pulses. A free-electron laser operated in this mode becomes a suitable light source for jitter-free, two-colour pump-probe experiments.
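
    A hedged sketch of the relation at work (the abstract does not spell it out, and the harmonic-upconversion form below is my assumption about the scheme): in a seeded FEL the output is produced at a harmonic n of the seed, so two temporally separated slices of a chirped seed with instantaneous wavelengths lambda_seed(t1) and lambda_seed(t2) yield two pulses whose colours and separation follow, in LaTeX notation:

        % Assumed mechanism: harmonic upconversion of two slices of a chirped seed.
        \lambda_{1,2} \;=\; \frac{\lambda_{\mathrm{seed}}(t_{1,2})}{n},
        \qquad
        \Delta\lambda \;\approx\; \frac{\Delta t}{n}\,
        \left.\frac{d\lambda_{\mathrm{seed}}}{dt}\right|_{t_1}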

    Measurement of salinity distributions in salt‐stratified, double‐diffusive systems by optical deflectometry

    Reliable salinity measurements in double-diffusive thermohaline solutions are necessary to understand relevant system behavior. An optical technique, which has previously been used to investigate solute diffusion in isothermal systems, is employed to measure the salinity distribution in a double-diffusive thermohaline system. The technique is verified by comparison with independent salinity measurements, and its use in a double-diffusive system reveals detailed salinity distribution information. When used with the shadowgraph method of flow visualization, the salinity measurement technique permits a more quantitative interpretation of the shadowgraphic results.
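
    In its simplest ray-optics form (a sketch under small-angle assumptions, not necessarily the paper's exact formulation), a ray crossing a tank of width W through fluid whose refractive index n varies with height z is deflected by an angle set by the local index gradient; since n is approximately linear in salinity S, the measured deflection profile maps onto the salinity gradient, in LaTeX notation:

        % Small-angle deflectometry sketch; integrate the gradient for S(z).
        \theta(z) \;\approx\; \frac{W}{n}\,\frac{dn}{dz}
        \;=\; \frac{W}{n}\,\frac{\partial n}{\partial S}\,\frac{dS}{dz},
        \qquad
        S(z) \;=\; S(z_0) + \int_{z_0}^{z} \frac{dS}{dz'}\,dz'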

    Using XDAQ in Application Scenarios of the CMS Experiment

    XDAQ is a generic data acquisition software environment that emerged from a rich set of use cases encountered in the CMS experiment. These cover not only the deployment for multiple sub-detectors and the operation of different processing and networking equipment, but also a distributed collaboration of users with different needs. The use of the software in various application scenarios has demonstrated the viability of the approach. We discuss two applications: the tracker local DAQ system for front-end commissioning and the muon chamber validation system. The description is completed by a brief overview of XDAQ. Comment: Conference CHEP 2003 (Computing in High Energy and Nuclear Physics, La Jolla, CA).

    Two consecutive immunophenotypic switches in a child with MLL-rearranged acute lymphoblastic leukemia.

    An 18-month-old girl was diagnosed with pre-pre-B ALL/t(4;11) leukemia, which, during the treatment and after matched bone marrow transplantation (BMT), underwent two consecutive switches from lymphoid to myeloid lineage and vice versa. The high expression of the HOXA9 and FLT3 genes in a leukemia that remained genotypically stable throughout the phenotypic switches suggests that this leukemia may have originated in a common B/myeloid progenitor.

    The CMS Event Builder

    The data acquisition system of the CMS experiment at the Large Hadron Collider will employ an event builder which will combine data from about 500 data sources into full events at an aggregate throughput of 100 GByte/s. Several architectures and switch technologies have been evaluated for the DAQ Technical Design Report by measurements with test benches and by simulation. This paper describes studies of an EVB test bench based on 64 PCs acting as data sources and data consumers and employing both Gigabit Ethernet and Myrinet technologies as the interconnect. In the case of Ethernet, protocols based on Layer-2 frames and on TCP/IP are evaluated. Results from ongoing studies, including measurements on throughput and scaling, are presented. The architecture of the baseline CMS event builder will be outlined. The event builder is organised into two stages with intelligent buffers in between. The first stage contains 64 switches performing a first level of data concentration by building super-fragments from the fragments of 8 data sources. The second stage combines the 64 super-fragments into full events. This architecture allows installation of the second stage of the event builder in steps, with the overall throughput scaling linearly with the number of switches in the second stage. Possible implementations of the components of the event builder are discussed and the expected performance of the full event builder is outlined. Comment: Conference CHEP0
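
    As a toy illustration of the two-stage structure only (the counts come from the abstract; every name and data layout below is invented, and the real system is a distributed switching fabric, not a single process):

        from typing import List

        # Toy sketch, not the CMS implementation.
        N_SOURCES = 512               # "about 500" sources, rounded to 64 * 8
        GROUP = 8                     # fragments concentrated per first-stage switch
        N_SUPER = N_SOURCES // GROUP  # 64 super-fragments per event

        def build_super_fragments(fragments: List[bytes]) -> List[bytes]:
            """First stage: concentrate each group of 8 fragments."""
            assert len(fragments) == N_SOURCES
            return [b"".join(fragments[i:i + GROUP])
                    for i in range(0, N_SOURCES, GROUP)]

        def build_event(super_fragments: List[bytes]) -> bytes:
            """Second stage: combine 64 super-fragments into a full event."""
            assert len(super_fragments) == N_SUPER
            return b"".join(super_fragments)

        # One simulated event: a 2 kB fragment from each source.
        fragments = [bytes(2048) for _ in range(N_SOURCES)]
        event = build_event(build_super_fragments(fragments))
        print(len(event))             # 512 * 2048 bytes per event

    The linear scaling claim then reads naturally in this picture: each added second-stage switch is another independent event-assembly pipeline consuming its own share of super-fragments.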

    Calculation of reverse-voltage suppression in a pulse circuit

    Grid and e-science infrastructure interoperability is an increasing demand for Grid applications, but interoperability based on common open standards adopted by Grid middlewares is only starting to emerge on Grid infrastructures and is not broadly provided today. In earlier work we have shown how open standards can be improved by lessons learned from cross-Grid applications that require access to both High Throughput Computing (HTC) resources and High Performance Computing (HPC) resources. This paper provides more insight into several concepts, with a particular focus on effectively describing Grid jobs in order to satisfy the demands of e-scientists and their cross-Grid applications. Based on lessons learned over years of interoperability setups between production Grids such as EGEE, DEISA, and NorduGrid, we illustrate how common open Grid standards (i.e. JSDL and GLUE2) can take cross-Grid application experience into account.
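
    A hedged sketch of what a minimal JSDL job description looks like in practice (element names and namespace URIs follow the JSDL 1.0 schema as commonly published; verify them against the spec before relying on this):

        import xml.etree.ElementTree as ET

        # Namespace URIs assumed from the JSDL 1.0 specification.
        JSDL = "http://schemas.ggf.org/jsdl/2005/11/jsdl"
        POSIX = "http://schemas.ggf.org/jsdl/2005/11/jsdl-posix"
        ET.register_namespace("jsdl", JSDL)
        ET.register_namespace("jsdl-posix", POSIX)

        # Minimal job: run /bin/date on the target Grid resource.
        job = ET.Element(f"{{{JSDL}}}JobDefinition")
        desc = ET.SubElement(job, f"{{{JSDL}}}JobDescription")
        ident = ET.SubElement(desc, f"{{{JSDL}}}JobIdentification")
        ET.SubElement(ident, f"{{{JSDL}}}JobName").text = "interop-demo"
        app = ET.SubElement(desc, f"{{{JSDL}}}Application")
        posix = ET.SubElement(app, f"{{{POSIX}}}POSIXApplication")
        ET.SubElement(posix, f"{{{POSIX}}}Executable").text = "/bin/date"

        print(ET.tostring(job, encoding="unicode"))

    GLUE2, by contrast, describes the resources rather than the job; the interoperability work discussed above is about making both kinds of description carry enough information for HTC and HPC targets alike.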