Inferential Modeling and Independent Component Analysis for Redundant Sensor Validation
The calibration of redundant safety-critical sensors in nuclear power plants is a manual task that consumes valuable time and resources. Automated, data-driven techniques to monitor the calibration of redundant sensors have been developed over the last two decades but have not been fully implemented. Parity space methods, such as the Instrumentation and Calibration Monitoring Program (ICMP) method developed by the Electric Power Research Institute, and other empirically based inferential modeling techniques have been developed but have not become viable options.
Existing solutions to the redundant sensor validation problem have several major flaws that restrict their application. Parity space methods, such as ICMP, are not robust under low-redundancy conditions, and their operation becomes invalid when only two redundant sensors remain. Empirically based inferential modeling is valid only when the intrinsic correlations between predictor and response variables remain static during the model training and testing phases. Such models also commonly produce high-variance results and are not an optimal solution to the problem.
This dissertation develops and implements independent component analysis (ICA) for redundant sensor validation. The ICA algorithm produces parameter estimates with sufficiently low residual variance compared to simple averaging, ICMP, and principal component regression (PCR) techniques. For stationary signals, it can detect and isolate sensor drifts with as few as two redundant sensors. It is fast and can be embedded in a real-time system, as demonstrated on a water level control system.
Additionally, ICA has been merged with inferential modeling techniques such as PCR to reduce prediction error and spillover effects from data anomalies. ICA is easy to use, with only the window size needing specification.
The effectiveness and robustness of the ICA technique is shown through the use of actual nuclear power plant data. A bootstrap technique is used to estimate the prediction uncertainties and validate its usefulness. Bootstrap uncertainty estimates incorporate uncertainties from both data and the model. Thus, the uncertainty estimation is robust and varies from data set to data set.
The ICA-based system is shown to be accurate and robust; however, classical ICA algorithms commonly fail when distributions are multi-modal, which most likely occurs during highly non-stationary transients. This research therefore also developed a unity check technique that indicates such failures and applies other, more robust techniques during transients: for linearly trending signals, a rotation transform is found useful, while standard averaging techniques are used during general transients.
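As a rough illustration of the windowed-ICA idea, the sketch below applies scikit-learn's FastICA (a generic stand-in for the dissertation's ICA implementation) to two simulated redundant sensors, one of which drifts. The signal model, window length, and the mixing-matrix attribution step are all illustrative assumptions, not the dissertation's method.

```python
# Illustrative sketch only: ICA on two redundant sensors, one drifting.
# scikit-learn's FastICA stands in for the dissertation's ICA implementation;
# the signal model and parameters are invented for the demo.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n = 2000
t = np.arange(n)
process = np.sin(2 * np.pi * t / 200)        # shared stationary process signal
drift = np.linspace(0.0, 1.0, n)             # slow calibration drift on sensor 1
noise = 0.02 * rng.standard_normal((n, 2))

x = np.column_stack([process, process + drift]) + noise  # two redundant readings

ica = FastICA(n_components=2, max_iter=1000, random_state=0)
sources = ica.fit_transform(x)               # separated components

# The drift component is the source most correlated with time; the mixing
# matrix then reveals which sensor carries it.
corr_time = [abs(np.corrcoef(sources[:, k], t)[0, 1]) for k in range(2)]
drift_src = int(np.argmax(corr_time))
drifting_sensor = int(np.argmax(np.abs(ica.mixing_[:, drift_src])))
print("suspected drifting sensor:", drifting_sensor)
```

With only two channels, ICA can still separate the shared process from the drift because the drift mixes into only one sensor, which is what makes validation with a redundancy of two possible.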
Calibrating CHIME, A New Radio Interferometer to Probe Dark Energy
The Canadian Hydrogen Intensity Mapping Experiment (CHIME) is a transit
interferometer currently being built at the Dominion Radio Astrophysical
Observatory (DRAO) in Penticton, BC, Canada. We will use CHIME to map neutral
hydrogen in the frequency range 400-800 MHz over half of the sky, producing
a measurement of baryon acoustic oscillations (BAO) at redshifts between 0.8
and 2.5 to probe dark energy. We have deployed a pathfinder version of CHIME that
will yield constraints on the BAO power spectrum and provide a test-bed for our
calibration scheme. I will discuss the CHIME calibration requirements and
describe instrumentation we are developing to meet these requirements.
Redundancy Calibration of Phased Array Stations
Our aim is to assess the benefits and limitations of using the redundant
visibility information in regular phased array systems for improving the
calibration.
Regular arrays offer the possibility of using redundant visibility information
to constrain the calibration of the array independently of a sky model and
beam models of the station elements. This requires a regular arrangement of
the array elements and identical beam patterns.
We revised a calibration method for phased array stations using the redundant
visibility information in the system and applied it successfully to a LOFAR
station. The performance and limitations of the method were demonstrated by
comparing its use on real and simulated data. The main limitation is the mutual
coupling between the station elements, which leads to non-identical beams and
stronger baseline-dependent noise. Comparing the variance of the estimated
complex gains with the Cramer-Rao bound (CRB) indicates that redundancy is a
stable and optimal method for calibrating the complex gains of the system.
Our study shows that the use of the redundant visibility does improve the
quality of the calibration in phased array systems. In addition it provides a
powerful tool for system diagnostics. Our results demonstrate that designing
redundancy in both the station layout and the array configuration of future
aperture arrays is strongly recommended, in particular for the Square
Kilometre Array, whose dynamic range requirement surpasses that of any
existing array by an order of magnitude.
Comment: 16 pages, 15 figures, accepted for publication in A&A (Section 13), acceptance date: 1 May 2012. NOTE: Please contact the first author for high resolution figures.
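To make the redundancy idea concrete, the toy sketch below performs a classic logarithmic redundant-amplitude calibration for a regular 1-D array in NumPy: every baseline of the same length measures the same true visibility, so the log-amplitudes form a linear system in the per-element gains. The array size, gain model, and the mean-gain constraint are illustrative choices, not those of the paper.

```python
# Toy logarithmic redundancy calibration (amplitudes only) for a regular
# 1-D array of 5 elements; all quantities are simulated for illustration.
import numpy as np

rng = np.random.default_rng(1)
n_ant = 5
g_true = np.exp(0.1 * rng.standard_normal(n_ant))          # element gain amplitudes
v_true = {d: np.exp(rng.standard_normal()) for d in range(1, n_ant)}  # one visibility per baseline type

rows, data = [], []
for i in range(n_ant):
    for j in range(i + 1, n_ant):
        d = j - i                                           # baseline type = separation
        meas = g_true[i] * g_true[j] * v_true[d]            # noiseless measurement
        row = np.zeros(2 * n_ant - 1)
        row[i] = row[j] = 1.0                               # log g_i + log g_j
        row[n_ant + d - 1] = 1.0                            # + log V_d
        rows.append(row)
        data.append(np.log(meas))

# Fix the overall gain-scale degeneracy by constraining the mean log-gain to 0.
constraint = np.zeros(2 * n_ant - 1)
constraint[:n_ant] = 1.0
A = np.vstack(rows + [constraint])
b = np.array(data + [0.0])
sol, *_ = np.linalg.lstsq(A, b, rcond=None)

log_g = sol[:n_ant]
print(np.allclose(log_g, np.log(g_true) - np.log(g_true).mean()))
```

The single constraint row reflects the fact that a redundant array cannot determine the absolute flux scale from redundancy alone; that degeneracy must be fixed externally (here, arbitrarily).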
Fundamental Imaging Limits of Radio Telescope Arrays
The fidelity of radio astronomical images is generally assessed by practical
experience, i.e. using rules of thumb, although some aspects and cases have
been treated rigorously. In this paper we present a mathematical framework
capable of describing the fundamental limits of radio astronomical imaging
problems. Although the data model assumes a single snapshot observation, i.e.
variations in time and frequency are not considered, this framework is
sufficiently general to allow extension to synthesis observations. Using tools
from statistical signal processing and linear algebra, we discuss the
tractability of the imaging and deconvolution problem, the redistribution of
noise in the map by the imaging and deconvolution process, the covariance of
the image values due to propagation of calibration errors and thermal noise,
and the upper limit on the number of sources tractable by self-calibration.
The combination of the covariance of the image values and the number of
tractable sources determines the effective noise floor achievable in the imaging process.
The effective noise provides a better figure of merit than dynamic range since
it includes the spatial variations of the noise. Our results provide handles
for improving the imaging performance by design of the array.
Comment: 12 pages, 8 figures.
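The propagation of visibility noise into image-pixel covariance described above can be sketched for a toy 1-D array (the array geometry, pixel grid, and noise model are invented for illustration):

```python
# Propagate uncorrelated visibility noise into the dirty-image covariance
# for a toy 1-D array; all dimensions and coordinates are illustrative.
import numpy as np

rng = np.random.default_rng(4)
M, P = 30, 16                                  # baselines, image pixels
u = rng.uniform(-20, 20, M)                    # 1-D baseline coordinates (wavelengths)
l = np.linspace(-0.4, 0.4, P)                  # direction cosines

F = np.exp(2j * np.pi * np.outer(l, u)) / M    # dirty-imaging operator (P x M)
C_vis = np.eye(M)                              # unit-variance, uncorrelated visibility noise

C_img = F @ C_vis @ F.conj().T                 # propagated image covariance
off_diag = C_img - np.diag(np.diag(C_img))
print("max off-diagonal covariance:", np.max(np.abs(off_diag)))
```

Even though the visibility noise is uncorrelated, the off-diagonal terms of `C_img` are nonzero: imaging redistributes noise across pixels, which is why a spatially varying effective noise is a better figure of merit than a single dynamic-range number.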
Non-linear Redundancy Calibration
For radio interferometric arrays with a sufficient number of redundant
spacings the multiplicity of measurements of the same sky visibility can be
used to determine both the antenna gains as well as the true visibilities. Many
of the earlier approaches to this problem focused on linearized versions of the
relation between the measured and the true visibilities. Here we propose to use
a standard non-linear minimization algorithm to solve for both the antenna
gains as well as the true visibilities. We show through simulations done in the
context of the ongoing upgrade to the Ooty Radio Telescope that the non-linear
minimization algorithm is fast compared to the earlier approaches. Further,
unlike the most straightforward linearized approach, which works with the
logarithms of the visibilities and the gains, the non-linear minimization
algorithm leads to unbiased solutions. Finally we present error estimates for
the estimated gains and visibilities. Monte-Carlo simulations establish that
the estimator is indeed statistically efficient, achieving the Cramer-Rao
bound.
Comment: 9 pages, 5 figures. Accepted for publication in MNRAS. The definitive version will be available at http://mnras.oxfordjournals.or
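A joint non-linear fit of gains and true visibilities of the kind described above can be sketched with SciPy's generic `least_squares` solver (a stand-in for the paper's minimization algorithm; the array size, gain model, and initial guess are illustrative):

```python
# Sketch: joint non-linear fit of antenna gains and true visibilities for a
# redundant 1-D array. scipy's generic least_squares stands in for the
# paper's solver; all simulated quantities are illustrative.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(2)
n_ant = 5
g_true = (1 + 0.1 * rng.standard_normal(n_ant)) * np.exp(1j * rng.uniform(-0.5, 0.5, n_ant))
V_true = rng.standard_normal(n_ant - 1) + 1j * rng.standard_normal(n_ant - 1)

pairs = [(i, j) for i in range(n_ant) for j in range(i + 1, n_ant)]
meas = np.array([g_true[i] * np.conj(g_true[j]) * V_true[j - i - 1] for i, j in pairs])

def unpack(p):
    g = p[:n_ant] + 1j * p[n_ant:2 * n_ant]
    V = p[2 * n_ant:3 * n_ant - 1] + 1j * p[3 * n_ant - 1:]
    return g, V

def residuals(p):
    g, V = unpack(p)
    model = np.array([g[i] * np.conj(g[j]) * V[j - i - 1] for i, j in pairs])
    r = meas - model
    return np.concatenate([r.real, r.imag])    # real residual vector for the solver

# Initialize with unit gains and per-type averages of the measured visibilities.
V0 = np.array([np.mean([m for (i, j), m in zip(pairs, meas) if j - i == d])
               for d in range(1, n_ant)])
p0 = np.concatenate([np.ones(n_ant), np.zeros(n_ant), V0.real, V0.imag])

fit = least_squares(residuals, p0)
print("final cost:", fit.cost)                 # near zero for noiseless data
```

Because the fit works on the visibilities themselves rather than their logarithms, noise enters the residuals linearly, which is the intuition behind the unbiasedness claim in the abstract.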
Hydrogen Epoch of Reionization Array (HERA)
The Hydrogen Epoch of Reionization Array (HERA) is a staged experiment to
measure 21 cm emission from the primordial intergalactic medium (IGM)
throughout cosmic reionization, and to explore earlier epochs of our
Cosmic Dawn. During these epochs, early stars and black holes
heated and ionized the IGM, introducing fluctuations in 21 cm emission. HERA is
designed to characterize the evolution of the 21 cm power spectrum to constrain
the timing and morphology of reionization, the properties of the first
galaxies, the evolution of large-scale structure, and the early sources of
heating. The full HERA instrument will be a 350-element interferometer in South
Africa consisting of 14-m parabolic dishes observing from 50 to 250 MHz.
Currently, 19 dishes have been deployed on site and the next 18 are under
construction. HERA has been designated as an SKA Precursor instrument.
In this paper, we summarize HERA's scientific context and provide forecasts
for its key science results. After reviewing the current state of the art in
foreground mitigation, we use the delay-spectrum technique to motivate
high-level performance requirements for the HERA instrument. Next, we present
the HERA instrument design, along with the subsystem specifications that ensure
that HERA meets its performance requirements. Finally, we summarize the
schedule and status of the project. We conclude by suggesting that, given the
realities of foreground contamination, current-generation 21 cm instruments are
approaching their sensitivity limits. HERA is designed to bring both the
sensitivity and the precision to deliver its primary science on the basis of
proven foreground filtering techniques, while developing new subtraction
techniques to unlock new capabilities. The result will be a major step toward
realizing the widely recognized scientific potential of 21 cm cosmology.
Comment: 26 pages, 24 figures, 2 tables.
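The delay-spectrum technique mentioned above can be illustrated with a toy visibility spectrum (the band, foreground model, and ripple delay are invented): Fourier-transforming over frequency confines spectrally smooth foregrounds to low delays, leaving higher delays for the cosmological signal.

```python
# Toy delay-spectrum demo: a smooth power-law "foreground" vs. a small
# spectral ripple at 0.5 microseconds; all numbers are illustrative.
import numpy as np

freqs = np.linspace(100e6, 200e6, 256)         # Hz, illustrative band
df = freqs[1] - freqs[0]

foreground = (freqs / 150e6) ** -2.5           # spectrally smooth component
signal = 1e-3 * np.cos(2 * np.pi * freqs * 0.5e-6)  # 0.5 us delay ripple

window = np.blackman(freqs.size)               # taper to control sidelobes
delays = np.fft.fftshift(np.fft.fftfreq(freqs.size, d=df))   # seconds
spec = np.fft.fftshift(np.fft.fft(window * (foreground + signal)))
power = np.abs(spec) ** 2

peak_delay = abs(delays[np.argmax(power)])
print("delay of peak power (s):", peak_delay)  # foregrounds dominate at low delay
```

The bright, smooth foreground power piles up near zero delay while the ripple produces features near 0.5 microseconds, which is the separation that foreground-avoidance analyses exploit.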
Mapping our Universe in 3D with MITEoR
Mapping our universe in 3D by imaging the redshifted 21 cm line from neutral
hydrogen has the potential to overtake the cosmic microwave background as our
most powerful cosmological probe, because it can map a much larger volume of
our Universe, shedding new light on the epoch of reionization, inflation, dark
matter, dark energy, and neutrino masses. We report on MITEoR, a pathfinder
low-frequency radio interferometer whose goal is to test technologies that
greatly reduce the cost of such 3D mapping for a given sensitivity. MITEoR
accomplishes this by using massive baseline redundancy both to enable automated
precision calibration and to cut the correlator cost scaling from N^2 to N log N,
where N is the number of antennas. The success of MITEoR with its 64
dual-polarization elements bodes well for the more ambitious HERA project,
which would incorporate many identical or similar technologies using an order
of magnitude more antennas, each with dramatically larger collecting area.
Comment: To be published in the proceedings of the 2013 IEEE International Symposium on Phased Array Systems & Technology.
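The N^2 to N log N claim can be illustrated in one dimension (a simplified stand-in for MITEoR's actual FFT correlator): for antennas on a regular grid, the correlations summed within each redundant baseline type form a lag autocorrelation, which the correlation theorem lets us compute with FFTs.

```python
# Demo: per-baseline-type summed correlations for a regular 1-D array,
# computed directly in O(N^2) and via FFT in O(N log N); one time sample.
import numpy as np

rng = np.random.default_rng(3)
N = 64
x = rng.standard_normal(N) + 1j * rng.standard_normal(N)   # antenna voltages

# Direct O(N^2): sum visibilities within each redundant baseline type d.
direct = np.array([np.sum(x[d:] * np.conj(x[:N - d])) for d in range(N)])

# FFT O(N log N): zero-pad and apply the correlation theorem.
L = 2 * N
X = np.fft.fft(x, L)
corr = np.fft.ifft(np.abs(X) ** 2)
print(np.allclose(direct, corr[:N]))
```

The redundancy is essential here: only because every baseline of a given length measures the same sky visibility can the sum over a baseline type replace the individual pair products, which is what collapses the correlator cost.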