LISACode : A scientific simulator of LISA
A new LISA simulator (LISACode) is presented. Its ambition is to achieve a
new degree of sophistication, allowing it to map, as closely as possible, the
impact of the different sub-systems on the measurements. LISACode is not a
detailed simulator at the engineering level but rather a tool whose purpose is
to bridge the gap between the basic principles of LISA and a future,
sophisticated end-to-end simulator. This is achieved by introducing, in a
realistic manner, most of the ingredients that will influence LISA's
sensitivity, as well as the application of TDI combinations. Many user-defined
parameters allow the code to study different configurations of LISA, thus
helping to finalize the definition of the detector. Another important use of
LISACode is in generating time series for data analysis developments.
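The time-series-generation role can be illustrated with a toy sketch. All names, amplitudes, and noise levels below are illustrative assumptions, not LISACode's actual model: a monochromatic gravitational-wave signal buried in white instrumental noise, sampled at a fixed cadence.

```python
import numpy as np

def simulated_measurement(n, dt, f_gw=1e-3, amp=1e-21, noise_rms=1e-20, seed=0):
    """Toy LISA-like data stream: a monochromatic GW signal buried in
    white instrumental noise (all values illustrative, not LISACode's)."""
    rng = np.random.default_rng(seed)
    t = np.arange(n) * dt
    signal = amp * np.sin(2 * np.pi * f_gw * t)   # 1 mHz sinusoid
    noise = rng.normal(0.0, noise_rms, n)         # white instrumental noise
    return t, signal + noise

# One hour-scale stream at 15 s cadence, suitable as toy data-analysis input.
t, y = simulated_measurement(n=4096, dt=15.0)
```

A real simulator would replace the white-noise term with per-subsystem noise models and apply TDI combinations to the resulting phase measurements.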
Expert-based development of a standard in CO2 sequestration monitoring technology
Bureau of Economic Geology
Pyramid: Enhancing Selectivity in Big Data Protection with Count Featurization
Protecting vast quantities of data poses a daunting challenge for the growing
number of organizations that collect, stockpile, and monetize it. The ability
to distinguish data that is actually needed from data collected "just in case"
would help these organizations to limit the latter's exposure to attack. A
natural approach might be to monitor data use and retain only the working-set
of in-use data in accessible storage; unused data can be evicted to a highly
protected store. However, many of today's big data applications rely on machine
learning (ML) workloads that are periodically retrained by accessing, and thus
exposing to attack, the entire data store. Training set minimization methods,
such as count featurization, are often used to limit the data needed to train
ML workloads to improve performance or scalability. We present Pyramid, a
limited-exposure data management system that builds upon count featurization to
enhance data protection. As such, Pyramid uniquely introduces both the idea and
proof-of-concept for leveraging training set minimization methods to instill
rigor and selectivity into big data management. We integrated Pyramid into
Spark Velox, a framework for ML-based targeting and personalization. We
evaluated it on three applications, showing that Pyramid approaches the
accuracy of state-of-the-art models while training on less than 1% of the raw data.
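Count featurization itself is simple to sketch. The helper below is a minimal, hypothetical illustration (not Pyramid's code): each categorical value is summarized by its per-label counts, so downstream models can train on the compact count table rather than on the raw examples.

```python
from collections import defaultdict

def count_featurize(values, labels):
    """Replace each categorical value by its per-label counts, so a model
    trains on compact count statistics instead of raw examples.
    A simplified sketch of count featurization, not Pyramid's actual code."""
    counts = defaultdict(lambda: [0, 0])  # value -> [count(label=0), count(label=1)]
    for value, label in zip(values, labels):
        counts[value][label] += 1
    # Featurized representation: each example becomes its value's count vector.
    return [counts[value] for value in values]

feats = count_featurize(["ad1", "ad2", "ad1", "ad1"], [1, 0, 0, 1])
# "ad1" appears with labels 1, 0, 1 -> [1, 2]; "ad2" with label 0 -> [1, 0]
```

Only the count table (which can additionally be noised for privacy) needs to stay in accessible storage; the raw examples can be evicted to the protected store.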
Instrumental and Analytic Methods for Bolometric Polarimetry
We discuss instrumental and analytic methods that have been developed for the
first generation of bolometric cosmic microwave background (CMB) polarimeters.
The design, characterization, and analysis of data obtained using Polarization
Sensitive Bolometers (PSBs) are described in detail. This is followed by a
brief study of the effect of various polarization modulation techniques on the
recovery of sky polarization from scanning polarimeter data. Having been
successfully implemented on the sub-orbital Boomerang experiment, PSBs are
currently operational in two terrestrial CMB polarization experiments (QUaD and
the Robinson Telescope). We investigate two approaches to the analysis of data
from these experiments, using realistic simulations of time ordered data to
illustrate the impact of instrumental effects on the fidelity of the recovered
polarization signal. We find that the analysis of difference time streams takes
full advantage of the high degree of common mode rejection afforded by the PSB
design. In addition to the observational efforts currently underway, this
discussion is directly applicable to the PSBs that constitute the polarized
capability of the Planck HFI instrument. Comment: 23 pages, 11 figures; for submission to A&
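The common-mode rejection exploited by the difference-time-stream analysis can be sketched numerically. The signal and noise levels below are toy assumptions, not QUaD or Boomerang data: differencing the two detectors of a PSB pair cancels the shared atmospheric drift while preserving the polarization signal.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
common = rng.normal(0.0, 10.0, n)              # large common-mode drift (toy)
pol = 0.5 * np.sin(np.linspace(0.0, 20.0, n))  # sky polarization signal (toy)

# A PSB pair sees the polarization with opposite sign on the two detectors,
# plus small uncorrelated detector noise.
d1 = common + pol + rng.normal(0.0, 0.1, n)
d2 = common - pol + rng.normal(0.0, 0.1, n)

diff = 0.5 * (d1 - d2)  # common mode cancels; polarization survives
```

The residual variance of `diff` is set by the uncorrelated detector noise alone, orders of magnitude below the raw single-detector streams.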
Reset control for DC-DC converters: an experimental application
Power converters in grid-connected systems are required to have a fast response to ensure the stability of the system. The standard PI controllers used in most power converters are capable of a fast response, but with significant overshoot. In this paper a hybrid control technique for power converters using a reset PI + CI controller is proposed. The PI + CI controller can overcome the limitations of its linear counterpart (PI) and ensure a fast, flat response for the power converter. The design, stability, and cost-of-feedback analysis for a DC-DC boost converter employing a PI + CI controller are explored in this work. Simulation and experimental results confirming the fast, flat response are presented and discussed.
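A minimal sketch of the PI + CI idea follows. The gains, the zero-crossing reset law, and the `p_reset` blending parameter are illustrative assumptions, not the paper's design: a Clegg integrator (CI), which resets its state when the error changes sign, is blended with an ordinary linear integrator to reduce overshoot.

```python
class PIplusCI:
    """Toy PI + CI controller: a PI term plus a Clegg integrator whose
    state resets to zero whenever the error changes sign.
    Gains and reset law are illustrative, not the paper's actual design."""

    def __init__(self, kp, ki, p_reset=0.5):
        self.kp, self.ki = kp, ki
        self.p_reset = p_reset  # fraction of integral action given to the CI
        self.i_lin = 0.0        # linear integrator state
        self.i_clegg = 0.0      # Clegg integrator state
        self.prev_e = 0.0

    def step(self, e, dt):
        if e * self.prev_e < 0.0:   # error zero-crossing: reset the CI
            self.i_clegg = 0.0
        self.i_lin += e * dt
        self.i_clegg += e * dt
        self.prev_e = e
        integral = (1.0 - self.p_reset) * self.i_lin + self.p_reset * self.i_clegg
        return self.kp * e + self.ki * integral

ctrl = PIplusCI(kp=1.0, ki=2.0)
u1 = ctrl.step(1.0, dt=0.01)    # error positive: both integrators charge
u2 = ctrl.step(-0.5, dt=0.01)   # sign change: CI state resets before integrating
```

Resetting part of the integral action discharges the "wound-up" state at each error sign change, which is what allows a fast response without the overshoot of a pure PI loop.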
Fast, non-Monte-Carlo estimation of transient performance variation due to device mismatch
This paper describes an efficient way of simulating the effects of random device mismatch on circuit transient characteristics, such as variations in delay or in frequency. The proposed method models DC random offsets as equivalent AC pseudo-noises and leverages the fast, linear periodically time-varying (LPTV) noise analysis available in RF circuit simulators. The method can therefore be considered an extension of DC match analysis, and it offers a large speed-up compared to traditional Monte-Carlo analysis. Although the assumed linear perturbation model is valid only for small variations, it enables easy ways to estimate correlations among variations and to identify the design parameters most sensitive to mismatch, all at no additional simulation cost. Three benchmarks, measuring the variations in the input offset voltage of a clocked comparator, the delay of a logic path, and the frequency of an oscillator, demonstrate a speed improvement of about 100-1000x compared to a 1000-point Monte-Carlo method.
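The contrast between a linear perturbation estimate and brute-force Monte-Carlo can be sketched with a toy delay function. Everything here is illustrative: a real flow would evaluate transient simulations, not the closed-form `delay` stand-in below.

```python
import numpy as np

def delay(offsets):
    """Toy circuit delay as a mildly nonlinear function of per-device
    offsets (stand-in for a transient simulation; purely illustrative)."""
    nominal = 1.0
    sens = np.array([0.8, -0.3, 0.5])  # per-device delay sensitivities
    return nominal + sens @ offsets + 0.1 * offsets[0] ** 2

sigma = np.array([0.01, 0.02, 0.01])   # device offset standard deviations

# Linear perturbation estimate: one finite-difference gradient, then
# propagate the offset variances through the linearized model.
eps = 1e-6
grad = np.array([(delay(eps * np.eye(3)[k]) - delay(np.zeros(3))) / eps
                 for k in range(3)])
std_linear = np.sqrt(np.sum((grad * sigma) ** 2))

# Reference: a 1000-point Monte-Carlo sweep over the same offsets.
rng = np.random.default_rng(0)
samples = rng.normal(0.0, sigma, size=(1000, 3))
std_mc = np.std([delay(s) for s in samples])
```

For small offsets the two estimates agree closely, while the linear route needs only a handful of evaluations instead of a thousand, and the per-parameter terms `grad * sigma` directly rank which device dominates the variation.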
Deterministic-statistical model coupling in a DSS for river-basin management
This paper presents a method for the appropriate coupling of deterministic and statistical models. In the decision-support system for the Elbe river, a conceptual rainfall-runoff model is used to obtain the discharge statistics and the corresponding average number of flood days, which is a key input variable for a rule-based model for floodplain vegetation. The required quality of the discharge time series cannot be determined by a sensitivity analysis, because a deterministic model is linked to a statistical model. To solve the problem, artificial discharge time series are generated that mimic the hypothetical output of rainfall-runoff models of different accuracy. The results indicate that a feasible calibration of the rainfall-runoff model is sufficient to obtain consistency with the vegetation model, in view of its sensitivity to changes in the number of flood days in the floodplains.
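The artificial-time-series idea can be sketched as follows. Multiplicative lognormal noise is an illustrative assumption standing in for "rainfall-runoff models of different accuracy": perturb a reference discharge series and compare the resulting flood-day counts fed to the vegetation model.

```python
import numpy as np

def degraded_series(reference, error_std, seed=0):
    """Mimic the output of a rainfall-runoff model of a given accuracy by
    perturbing a reference discharge series with multiplicative lognormal
    noise (a toy assumption, not the paper's generator)."""
    rng = np.random.default_rng(seed)
    return reference * rng.lognormal(0.0, error_std, reference.size)

def flood_days(discharge, threshold):
    """Key vegetation-model input: number of days above the flood threshold."""
    return int(np.sum(discharge > threshold))

# Toy daily discharge series (m^3/s) and flood threshold.
ref = np.abs(np.sin(np.linspace(0.0, 12.0, 365))) * 800.0 + 200.0
fd_ref = flood_days(ref, 900.0)
fd_deg = flood_days(degraded_series(ref, error_std=0.05), 900.0)
```

Sweeping `error_std` then shows how much simulation error the flood-day statistic (and hence the vegetation model) can tolerate.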
Millimeter wave satellite concepts, volume 1
The identification of technologies necessary for the development of millimeter-spectrum communication satellites was examined from a system point of view. The goals of the program were to develop a methodology, based on the technical requirements of potential services that might be assigned to millimeter wave bands, for identifying the viable and appropriate technologies for future NASA millimeter research and development programs, and to test this methodology with selected user applications and services. The entire communications network, both ground and space subsystems, was studied. Cost, weight, and performance models for the subsystems, conceptual designs for point-to-point and broadcast communications satellites, and analytic relationships between subsystem parameters and overall link performance are discussed, along with baseline conceptual systems, sensitivity studies, model adjustment analyses, identification of critical technologies and their risks, and brief research and development program scenarios for the technologies judged to carry moderate or extensive risk. Identification of technologies for millimeter satellite communication systems, and assessment of the relative risks of these technologies, was accomplished through subsystem modeling and link optimization for both point-to-point and broadcast applications.
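The analytic relationship between subsystem parameters and overall link performance is typically a link-budget equation. The sketch below uses a standard Eb/N0 budget with illustrative parameter values, not the report's actual models: margin = EIRP - path loss - rain attenuation + G/T - k - 10·log10(R) - required Eb/N0.

```python
import math

def link_margin_db(eirp_dbw, path_loss_db, rain_atten_db, g_over_t_dbk,
                   bitrate_bps, required_ebn0_db):
    """Standard satellite link budget in dB terms (illustrative sketch):
    Eb/N0 = EIRP - losses + G/T - 10*log10(k) - 10*log10(R)."""
    boltzmann_db = -228.6  # 10*log10(Boltzmann constant), dBW/K/Hz
    ebn0 = (eirp_dbw - path_loss_db - rain_atten_db + g_over_t_dbk
            - boltzmann_db - 10.0 * math.log10(bitrate_bps))
    return ebn0 - required_ebn0_db

# Illustrative millimeter-wave numbers: 10 Mb/s link with heavy rain fade.
margin = link_margin_db(eirp_dbw=60.0, path_loss_db=213.0, rain_atten_db=6.0,
                        g_over_t_dbk=25.0, bitrate_bps=1e7, required_ebn0_db=10.0)
```

Sweeping one parameter at a time in such a budget (rain attenuation, G/T, EIRP) is exactly the kind of sensitivity study the report describes for trading subsystem cost and weight against link performance.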