Fourier Analysis of Gapped Time Series: Improved Estimates of Solar and Stellar Oscillation Parameters
Quantitative helio- and asteroseismology require very precise measurements of
the frequencies, amplitudes, and lifetimes of the global modes of stellar
oscillation. It is common knowledge that the precision of these measurements
depends on the total length (T), quality, and completeness of the observations.
Except in a few simple cases, the effect of gaps in the data on measurement
precision is poorly understood, in particular in Fourier space where the
convolution of the observable with the observation window introduces
correlations between different frequencies. Here we describe and implement a
rather general method to retrieve maximum likelihood estimates of the
oscillation parameters, taking into account the proper statistics of the
observations. Our fitting method applies in complex Fourier space and exploits
the phase information. We consider both solar-like stochastic oscillations and
long-lived harmonic oscillations, plus random noise. Using numerical
simulations, we demonstrate the existence of cases for which our improved
fitting method is less biased and has a greater precision than when the
frequency correlations are ignored. This is especially true of low
signal-to-noise solar-like oscillations. For example, we discuss a case where
the precision on the mode frequency estimate is increased by a factor of five,
for a duty cycle of 15%. In the case of long-lived sinusoidal oscillations, a
proper treatment of the frequency correlations does not provide any significant
improvement; nevertheless we confirm that the mode frequency can be measured
from gapped data with much better precision than the 1/T Rayleigh resolution.
Comment: Accepted for publication in Solar Physics Topical Issue
"Helioseismology, Asteroseismology, and MHD Connections".
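The central difficulty the abstract describes, that the observation window convolves with the signal in Fourier space and correlates nearby frequencies, can be illustrated with a short numerical sketch (an assumed toy example, not the paper's code or data):

```python
import numpy as np

# Toy illustration (not the paper's code): the Fourier transform of a gapped
# time series equals the true spectrum convolved with the transform of the
# observation window, which correlates nearby frequencies.
n = 1024
t = np.arange(n)
window = np.ones(n)
window[300:500] = 0.0          # a long gap: ~20% of samples missing
window[700:760] = 0.0          # a shorter gap

# Spectral window: normalized power spectrum of the window function itself.
W = np.fft.rfft(window)
spectral_window = np.abs(W) ** 2 / np.abs(W[0]) ** 2

# A pure sinusoid observed through the gaps leaks power into sidelobes.
signal = np.sin(2 * np.pi * 0.05 * t)
S = np.abs(np.fft.rfft(signal * window)) ** 2

# The sidelobe level of the spectral window quantifies the induced
# frequency correlations that a maximum likelihood fit must account for.
sidelobe = spectral_window[1:50].max()
print(f"peak sidelobe relative to central peak: {sidelobe:.3f}")
```

With no gaps the sidelobes of `spectral_window` would follow the usual sinc-squared pattern; the gaps raise them, spreading each mode's power across neighboring bins.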
Past, Present, and Future of Simultaneous Localization And Mapping: Towards the Robust-Perception Age
Simultaneous Localization and Mapping (SLAM) consists of the concurrent
construction of a model of the environment (the map), and the estimation of the
state of the robot moving within it. The SLAM community has made astonishing
progress over the last 30 years, enabling large-scale real-world applications,
and witnessing a steady transition of this technology to industry. We survey
the current state of SLAM. We start by presenting what is now the de-facto
standard formulation for SLAM. We then review related work, covering a broad
set of topics including robustness and scalability in long-term mapping, metric
and semantic representations for mapping, theoretical performance guarantees,
active SLAM and exploration, and other new frontiers. This paper simultaneously
serves as a position paper and tutorial to those who are users of SLAM. By
looking at the published research with a critical eye, we delineate open
challenges and new research issues that still deserve careful scientific
investigation. The paper also contains the authors' take on two questions that
often animate discussions at robotics conferences: "Do robots need SLAM?" and
"Is SLAM solved?"
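The "de-facto standard formulation" the survey refers to is maximum a posteriori estimation over a factor graph of poses and constraints. A minimal 1-D sketch (hypothetical numbers, deliberately stripped of rotations and landmarks) shows the core idea of solving odometry and loop-closure constraints jointly by least squares:

```python
import numpy as np

# Hypothetical 1-D pose-graph sketch of the MAP formulation: poses on a line,
# estimated jointly from relative measurements z_ij ~ x_j - x_i.
measurements = [   # (i, j, z_ij)
    (0, 1, 1.1),   # odometry steps
    (1, 2, 1.0),
    (2, 3, 0.9),
    (0, 3, 3.05),  # loop closure: revisiting tightens all estimates
]
n_poses = 4

# Build the (linear) least-squares system A x = b, plus a prior x0 = 0
# that anchors the graph and removes the gauge freedom.
A = np.zeros((len(measurements) + 1, n_poses))
b = np.zeros(len(measurements) + 1)
for k, (i, j, z) in enumerate(measurements):
    A[k, i], A[k, j], b[k] = -1.0, 1.0, z
A[-1, 0], b[-1] = 1.0, 0.0

x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(x, 3))
```

Real SLAM systems solve the same structure nonlinearly (poses live on SE(2)/SE(3)), typically by iterated Gauss-Newton over a sparse factor graph.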
Review of the mathematical foundations of data fusion techniques in surface metrology
The recent proliferation of engineered surfaces, including freeform and structured surfaces, is challenging current metrology techniques. Measurement using multiple sensors has been proposed to achieve enhanced benefits, mainly in terms of spatial frequency bandwidth, which a single sensor cannot provide. When using data from different sensors, a process of data fusion is required, and there is much active research in this area. In this paper, current data fusion methods and applications are reviewed, with a focus on the mathematical foundations of the subject. Common research questions in the fusion of surface metrology data are raised and potential fusion algorithms are discussed.
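One of the simplest statistically grounded fusion rules such a review covers is inverse-variance weighting, the maximum-likelihood combination of two sensors under independent Gaussian noise. A brief sketch (illustrative values, not drawn from the review):

```python
# Illustrative sketch (not from the review): fuse two sensors' readings of
# the same surface point by inverse-variance weighting, the ML estimate
# under independent Gaussian noise.
def fuse(z1, var1, z2, var2):
    """Return the fused value and its variance."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    z = (w1 * z1 + w2 * z2) / (w1 + w2)
    var = 1.0 / (w1 + w2)   # fused variance is smaller than either input
    return z, var

# E.g. a wide-bandwidth but noisy sensor and a precise but slow one:
z, var = fuse(z1=10.2, var1=0.04, z2=10.05, var2=0.01)
print(round(z, 3), round(var, 3))
```

The fused estimate is pulled toward the more precise sensor, and its variance (here 0.008) is below both inputs; multi-sensor fusion in metrology generalizes this idea to full surface maps with correlated, bandwidth-limited errors.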
On the Measurement of Privacy as an Attacker's Estimation Error
A wide variety of privacy metrics have been proposed in the literature to
evaluate the level of protection offered by privacy-enhancing technologies.
Most of these metrics are specific to concrete systems and adversarial models,
and are difficult to generalize or translate to other contexts. Furthermore, a
better understanding of the relationships between the different privacy metrics
is needed to enable a more grounded and systematic approach to measuring privacy,
as well as to assist systems designers in selecting the most appropriate metric
for a given application.
In this work we propose a theoretical framework for privacy-preserving
systems, endowed with a general definition of privacy in terms of the
estimation error incurred by an attacker who aims to disclose the private
information that the system is designed to conceal. We show that our framework
permits interpreting and comparing a number of well-known metrics under a
common perspective. The arguments behind these interpretations are based on
fundamental results from information theory, probability theory, and Bayesian
decision theory.
Comment: This paper has 18 pages and 17 figures.
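The paper's core notion, privacy as the estimation error of an attacker, can be sketched numerically for a toy Gaussian release mechanism (hypothetical parameters, not the paper's framework in full generality):

```python
import numpy as np

# Toy sketch of "privacy as attacker's estimation error": a private value
# x ~ N(0, s2) is released as y = x + noise, noise ~ N(0, n2). The optimal
# (MMSE) Bayesian attacker uses the posterior mean; the privacy level is
# the residual error s2*n2/(s2+n2) -- more noise means more privacy.
rng = np.random.default_rng(0)
s2, n2 = 1.0, 0.5
x = rng.normal(0.0, np.sqrt(s2), 100_000)
y = x + rng.normal(0.0, np.sqrt(n2), x.size)

x_hat = (s2 / (s2 + n2)) * y              # attacker's posterior-mean estimate
empirical_mse = np.mean((x - x_hat) ** 2)
theoretical_mse = s2 * n2 / (s2 + n2)     # = 1/3 for these parameters
print(empirical_mse, theoretical_mse)
```

Under this view, different privacy metrics correspond to different choices of attacker, loss function, and prior, which is what makes them comparable within one framework.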
An Assessment of PIER Electric Grid Research 2003-2014 White Paper
This white paper describes the circumstances in California around the turn of the 21st century that led the California Energy Commission (CEC) to direct additional Public Interest Energy Research funds to address critical electric grid issues, especially those arising from integrating high penetrations of variable renewable generation with the electric grid. It contains an assessment of the beneficial science and technology advances of the resultant portfolio of electric grid research projects administered under the direction of the CEC by a competitively selected contractor, the University of California's California Institute for Energy and the Environment, from 2003 to 2014.
The Jeffreys-Lindley Paradox and Discovery Criteria in High Energy Physics
The Jeffreys-Lindley paradox displays how the use of a p-value (or number of
standard deviations z) in a frequentist hypothesis test can lead to an
inference that is radically different from that of a Bayesian hypothesis test
in the form advocated by Harold Jeffreys in the 1930s and common today. The
setting is the test of a well-specified null hypothesis (such as the Standard
Model of elementary particle physics, possibly with "nuisance parameters")
versus a composite alternative (such as the Standard Model plus a new force of
nature of unknown strength). The p-value, as well as the ratio of the
likelihood under the null hypothesis to the maximized likelihood under the
alternative, can strongly disfavor the null hypothesis, while the Bayesian
posterior probability for the null hypothesis can be arbitrarily large. The
academic statistics literature contains many impassioned comments on this
paradox, yet there is no consensus either on its relevance to scientific
communication or on its correct resolution. The paradox is quite relevant to
frontier research in high energy physics. This paper is an attempt to explain
the situation to both physicists and statisticians, in the hope that further
progress can be made.
Comment: v4: Continued editing for clarity. Figure added. v5: Minor fixes to
biblio. Same as published version except for minor copy-edits, Synthese
(2014). v6: fix typos, and restore garbled sentence at beginning of Sec 4 to
v
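The paradox the abstract describes can be reproduced with a few lines of arithmetic (the sample size and unit-normal prior below are illustrative choices, not taken from the paper):

```python
import math

# Illustrative numbers (not from the paper): at z = 5 sigma with a very large
# sample, the p-value is tiny while the Bayes factor favors the null --
# the Jeffreys-Lindley paradox.
z, sigma, tau, n = 5.0, 1.0, 1.0, 10**12
se = sigma / math.sqrt(n)     # standard error of the sample mean
xbar = z * se                 # observed mean, exactly 5 sigma from zero

def normal_pdf(x, sd):
    return math.exp(-0.5 * (x / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

# Two-sided p-value for the frequentist test of H0: mu = 0.
p_value = math.erfc(z / math.sqrt(2.0))

# Bayes factor BF01 for H0: mu = 0 versus H1: mu ~ N(0, tau^2):
# ratio of the marginal likelihoods of the observed mean.
bf01 = normal_pdf(xbar, se) / normal_pdf(xbar, math.sqrt(tau**2 + se**2))

print(f"p-value = {p_value:.2e}, BF01 = {bf01:.2f}")
```

Here the p-value is below the 5-sigma discovery threshold, yet BF01 > 1: as n grows with z held fixed, the effect size shrinks and the diffuse alternative is penalized, so the same data can "reject" the null for the frequentist while supporting it for the Bayesian.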
Long-term monitoring of geodynamic surface deformation using SAR interferometry
Thesis (Ph.D.), University of Alaska Fairbanks, 2014.
Synthetic Aperture Radar Interferometry (InSAR) is a powerful tool to measure surface deformation and is well suited for surveying active volcanoes using historical and existing satellites. However, the value and applicability of InSAR for geodynamic monitoring is limited by temporal decorrelation and by electromagnetic path delay variations in the atmosphere, both of which reduce the sensitivity and accuracy of the technique. The aim of this PhD thesis research is to optimize the quantity and quality of deformation signals extracted from InSAR stacks that contain only a small number of images, in order to facilitate volcano monitoring and the study of their geophysical signatures. In particular, the focus is on methods of mitigating atmospheric artifacts in interferograms by combining time-series InSAR techniques with external atmospheric delay maps derived from Numerical Weather Prediction (NWP) models. In the first chapter of the thesis, the potential of the NWP Weather Research & Forecasting (WRF) model for InSAR data correction is studied extensively. Forecasted atmospheric delays derived from operational High Resolution Rapid Refresh for the Alaska region (HRRR-AK) products are compared to radiosonde measurements. The results suggest that the HRRR-AK operational products are a good data source for correcting atmospheric delays in spaceborne geodetic radar observations, provided the geophysical signal to be observed is larger than 20 mm. In the second chapter, an advanced method for integrating NWP products into the time-series InSAR workflow is developed. The efficiency of the algorithm is tested in simulated data experiments, which demonstrate that the method outperforms more conventional approaches.
In Chapter 3, a geophysical case study applies the developed algorithm to the active volcanoes of Unimak Island, Alaska (Westdahl, Fisher, and Shishaldin) for long-term volcano deformation monitoring. The volcano source at Westdahl is determined to be approx. 7 km below sea level and approx. 3.5 km north of the Westdahl peak. This study demonstrates that Fisher caldera has subsided continuously over more than 10 years and that there is no evident deformation signal around Shishaldin peak.
Chapter 1. Performance of the High Resolution Atmospheric Model HRRR-AK for Correcting Geodetic Observations from Spaceborne Radars -- Chapter 2. Robust atmospheric filtering of InSAR data based on numerical weather prediction models -- Chapter 3. Subtle motion long-term monitoring of Unimak Island from 2003 to 2010 by advanced time-series SAR interferometry -- Chapter 4. Conclusion and future work
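The correction step at the heart of the thesis, removing a model-predicted atmospheric phase screen from an interferogram, can be sketched schematically (all arrays and numbers below are hypothetical, not the thesis code or data):

```python
import numpy as np

# Schematic sketch (hypothetical numbers, not the thesis code): subtract an
# NWP-predicted atmospheric phase screen from an interferogram so the
# residual better reflects ground deformation.
wavelength = 0.0562                 # metres, C-band (e.g. Envisat ASAR)
k = 4.0 * np.pi / wavelength        # two-way delay-to-phase conversion
incidence = np.deg2rad(23.0)        # maps zenith delay to slant delay

rng = np.random.default_rng(1)
shape = (100, 100)
true_atmo = rng.normal(0.0, 0.01, shape)                 # zenith delay diff, m
model_atmo = true_atmo + rng.normal(0.0, 0.002, shape)   # imperfect NWP estimate

# Interferometric phase with no deformation: purely atmospheric here.
phase = k * true_atmo / np.cos(incidence)
corrected = phase - k * model_atmo / np.cos(incidence)

print(np.std(phase), np.std(corrected))   # residual scatter drops markedly
```

In the toy case the residual scatter falls by roughly the ratio of model error to true delay variability (about 5x here), which is the sense in which an accurate NWP product lowers the detection threshold for subtle deformation.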