Pilot interaction with automated airborne decision making systems
The interaction between a human pilot and automated on-board decision-making systems was investigated. Research was initiated on pilot problem solving in automated and semi-automated flight management systems, and a model of human decision making in a multi-task situation was developed. The allocation of responsibility between human and computer was studied, and pilot performance parameters under varying degrees of automation were discussed. Optimal allocation of responsibility between human and computer was considered, and relevant theoretical results from the literature were presented. The pilot's role as a problem solver was also discussed. Finally, the design of displays, controls, procedures, and computer aids for problem-solving tasks in automated and semi-automated systems was considered.
Aeronautical Engineering: A continuing bibliography, supplement 120
This bibliography contains abstracts for 297 reports, articles, and other documents introduced into the NASA scientific and technical information system in February 1980.
The Emergence of Gravitational Wave Science: 100 Years of Development of Mathematical Theory, Detectors, Numerical Algorithms, and Data Analysis Tools
On September 14, 2015, the newly upgraded Laser Interferometer
Gravitational-wave Observatory (LIGO) recorded a loud gravitational-wave (GW)
signal, emitted a billion light-years away by a coalescing binary of two
stellar-mass black holes. The detection was announced in February 2016, in time
for the hundredth anniversary of Einstein's prediction of GWs within the theory
of general relativity (GR). The signal represents the first direct detection of
GWs, the first observation of a black-hole binary, and the first test of GR in
its strong-field, high-velocity, nonlinear regime. In the remainder of its
first observing run, LIGO observed two more signals from black-hole binaries,
one moderately loud, another at the boundary of statistical significance. The
detections mark the end of a decades-long quest, and the beginning of GW
astronomy: finally, we are able to probe the unseen, electromagnetically dark
Universe by listening to it. In this article, we present a short historical
overview of GW science: this young discipline combines GR, arguably the
crowning achievement of classical physics, with record-setting, ultra-low-noise
laser interferometry, and with some of the most powerful developments in the
theory of differential geometry, partial differential equations,
high-performance computation, numerical analysis, signal processing,
statistical inference, and data science. Our emphasis is on the synergy between
these disciplines, and how mathematics, broadly understood, has historically
played, and continues to play, a crucial role in the development of GW science.
We focus on black holes, which are very pure mathematical solutions of
Einstein's gravitational-field equations that are nevertheless realized in
Nature, and that provided the first observed signals.
Comment: 41 pages, 5 figures. To appear in Bulletin of the American Mathematical Society.
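For context on the "very pure mathematical solutions" the abstract refers to: black holes are solutions of Einstein's field equations, whose standard textbook form (not taken from this article) is, together with the Schwarzschild solution for a non-spinning black hole of mass $M$:

```latex
% Einstein's gravitational-field equations:
R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}

% Schwarzschild solution: the spherically symmetric vacuum
% (T_{\mu\nu} = 0) solution describing a non-spinning black hole:
ds^2 = -\left(1 - \frac{2GM}{c^2 r}\right) c^2\, dt^2
       + \left(1 - \frac{2GM}{c^2 r}\right)^{-1} dr^2
       + r^2 \left(d\theta^2 + \sin^2\theta\, d\varphi^2\right)
```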
On Quantum Statistical Inference, I
Recent developments in the mathematical foundations of quantum mechanics have
brought the theory closer to that of classical probability and statistics. On
the other hand, the unique character of quantum physics sets many of the
questions addressed apart from those met classically in stochastics.
Furthermore, concurrent advances in experimental techniques and in the theory
of quantum computation have led to a strong interest in questions of quantum
information, in particular in the sense of the amount of information about
unknown parameters in given observational data or accessible through various
possible types of measurements. This scenery is outlined (with an audience of
statisticians and probabilists in mind).
Comment: A shorter version containing some different material will appear (2003), with discussion, in J. Roy. Statist. Soc. B, and is archived as quant-ph/030719
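The "amount of information about unknown parameters" accessible through measurement is quantified in this literature by the quantum Fisher information; the standard quantum Cramér-Rao bound (a textbook result, not specific to this paper) reads:

```latex
% Symmetric logarithmic derivative (SLD) L_\theta, defined implicitly by:
\partial_\theta \rho_\theta
  = \tfrac{1}{2}\left(L_\theta \rho_\theta + \rho_\theta L_\theta\right)

% Quantum Fisher information, and the quantum Cramer-Rao bound for any
% unbiased estimator \hat\theta based on n copies of the state \rho_\theta:
F_Q(\theta) = \operatorname{Tr}\!\left(\rho_\theta L_\theta^2\right),
\qquad
\operatorname{Var}(\hat\theta) \ge \frac{1}{n\, F_Q(\theta)}
```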
Realtime market microstructure analysis: online Transaction Cost Analysis
Motivated by the practical challenge in monitoring the performance of a large
number of algorithmic trading orders, this paper provides a methodology that
leads to automatic discovery of the causes that lie behind a poor trading
performance. It also gives theoretical foundations to a generic framework for
real-time trading analysis. Academic literature provides different ways to
formalize these algorithms and show how optimal they can be from a
mean-variance, a stochastic control, an impulse control or a statistical
learning viewpoint. This paper is agnostic about the way the algorithm has been
built and provides a theoretical formalism to identify in real-time the market
conditions that influenced its efficiency or inefficiency. For a given set of
characteristics describing the market context, selected by a practitioner, we
first show how a set of additional derived explanatory factors, called anomaly
detectors, can be created for each market order. We then present an online
methodology to quantify how this extended set of factors, at any given time,
predicts which of the orders are underperforming while calculating the
predictive power of this explanatory factor set. Armed with this information,
which we call influence analysis, we intend to empower the order monitoring
user to take appropriate action on any affected orders by re-calibrating the
trading algorithms working the order through new parameters, pausing their
execution or taking over more direct trading control. We also intend for this method to be used in the post-trade analysis of algorithms, so that their trading actions can be adjusted automatically.
Comment: 33 pages, 12 figures
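The paper does not publish its detector code; as an illustrative sketch only, one simple way to build an online "anomaly detector" for a single market characteristic (the class name OnlineZScore and the sample spread values below are invented here, not taken from the paper) is a running z-score computed with Welford's algorithm, which needs only O(1) state per factor and so scales to monitoring many orders in real time:

```python
import math

class OnlineZScore:
    """Running z-score for one market factor (e.g. bid-ask spread)."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # sum of squared deviations from the running mean

    def update(self, x):
        # Welford's one-pass update of mean and variance.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def score(self, x):
        # Z-score of a new observation against the history seen so far.
        if self.n < 2:
            return 0.0
        std = math.sqrt(self.m2 / (self.n - 1))
        return (x - self.mean) / std if std > 0 else 0.0

# Feed a stream of observations of one factor, then score a new value.
detector = OnlineZScore()
for spread in [1.0, 1.2, 0.9, 1.1, 1.0, 1.05]:
    detector.update(spread)

# A large positive score flags the current market context as anomalous,
# a candidate explanation for an underperforming order.
print(detector.score(2.0))
```

In the framework described above, one such detector per selected market characteristic yields the extended explanatory factor set whose predictive power over underperforming orders is then evaluated online.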