6,989 research outputs found
NOSS Altimeter Detailed Algorithm specifications
The details of the algorithms and data sets required for satellite radar altimeter data processing are documented in a form suitable for (1) development of the benchmark software and (2) coding the operational software. The algorithms reported in detail are those established for altimeter processing; algorithms that required additional development before being documented for production were only scoped. The algorithms are divided into two levels of processing. The first level converts the data to engineering units and applies corrections for instrument variations. The second level provides geophysical measurements derived from altimeter parameters for oceanographic users.
Dynamic Linkages in Credit Risk: Modeling the Time-Varying Correlation between the Money and Derivatives Markets over the Crisis Period
This paper examines the dynamic linkages in credit risk between the money market and the derivatives market during 2004–9. We use the T-bill–Eurodollar (TED) spread to measure credit risk in the money market and the credit default swap (CDS) index spread for the derivatives market. The linkages are measured by a dynamic conditional correlation–Glosten–Jagannathan–Runkle–generalized autoregressive conditional heteroscedasticity model. The results show that the correlation between the TED spread and the CDS index spread fluctuated around zero before the crisis, trended upward in the run-up to it, and moved notably higher during the crisis itself. Finally, the correlation fell in early 2009 but persisted at a level between 0.05 and 0.1, higher than in the precrisis period.
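The DCC-GJR-GARCH estimation the paper uses is involved; as a rough illustration of what a time-varying correlation estimate looks like, a RiskMetrics-style exponentially weighted (EWMA) correlation can be sketched. This is a simplified stand-in, not the paper's model, and the synthetic regime-shift data and smoothing constant below are assumptions for illustration only:

```python
import numpy as np

def ewma_correlation(x, y, lam=0.94):
    """Exponentially weighted correlation between two (roughly mean-zero)
    series -- a simple stand-in for richer DCC-type models."""
    sxx, syy = np.var(x), np.var(y)       # initialise with sample moments
    sxy = np.cov(x, y)[0, 1]
    rho = np.empty(len(x))
    for t in range(len(x)):
        sxx = lam * sxx + (1 - lam) * x[t] ** 2
        syy = lam * syy + (1 - lam) * y[t] ** 2
        sxy = lam * sxy + (1 - lam) * x[t] * y[t]
        rho[t] = sxy / np.sqrt(sxx * syy)
    return rho

# Synthetic spread changes: independent in the first half, strongly
# correlated in the second, mimicking a pre-crisis/crisis regime shift.
rng = np.random.default_rng(42)
z = rng.standard_normal((1000, 2))
x = z[:, 0]
y = np.concatenate([z[:500, 1],
                    0.8 * z[500:, 0] + np.sqrt(1 - 0.64) * z[500:, 1]])
rho = ewma_correlation(x, y)   # estimate drifts up after the break at t=500
```

Because the weights decay geometrically, the estimate adapts to the correlation break within roughly 1/(1-lam) observations, which is the qualitative behaviour the DCC model captures in a fully parametric way.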
The Dependence Structure in Credit Risk between Money and Derivatives Markets: A Time-Varying Conditional Copula Approach
Purpose
The purpose of this paper is to examine the dynamic dependence structure in credit risk between the money market and the derivatives market during 2004-2009. The authors use the TED spread to measure credit risk in the money market and CDS index spread for the derivatives market.
Design/methodology/approach
The dependence structure is measured by a time-varying Gaussian copula. A copula is a function that joins one-dimensional distribution functions together to form multivariate distribution functions. The copula contains all the information on the dependence structure of the random variables while also removing the linear correlation restriction. It therefore provides a straightforward way of modelling non-linear and non-normal joint distributions.
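The copula construction described above can be made concrete with a small sketch: sample correlated normals, push them through the normal CDF to get uniforms that carry only the dependence structure, then apply arbitrary inverse CDFs as marginals. This is an illustration of the Gaussian copula mechanism, not the authors' estimation code; the Student-t and exponential marginals and the correlation value are made-up stand-ins:

```python
import numpy as np
from scipy import stats

def gaussian_copula_sample(rho, n, ppfs, seed=0):
    """Draw n samples whose dependence is a bivariate Gaussian copula with
    parameter rho and whose marginals are given by inverse CDFs (ppfs)."""
    rng = np.random.default_rng(seed)
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    u = stats.norm.cdf(z)   # marginally Uniform(0,1); dependence preserved
    return np.column_stack([ppf(u[:, i]) for i, ppf in enumerate(ppfs)])

# Join a heavy-tailed Student-t marginal with an exponential marginal
# (hypothetical stand-ins for the two spread distributions).
xy = gaussian_copula_sample(0.6, 50_000, [stats.t(df=4).ppf, stats.expon.ppf])
```

Note how the marginals can be anything with an inverse CDF: this is exactly the "removing the linear correlation restriction" point, since rank dependence survives the marginal transforms even though Pearson correlation does not.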
Findings
The results show that the correlation between the two markets, while fluctuating with a general upward trend prior to 2007, was noticeably higher after 2007. This points to evidence of credit contagion during the crisis. Three distinct phases are identified within the crisis period, which sheds light on the nature of contagion mechanisms in financial markets. The correlation of the two spreads fell in early 2009, although it remained higher than the pre-crisis level. This is partly due to policy intervention that lowered the TED spread, while the CDS spread remained elevated owing to the Eurozone sovereign debt crisis.
Originality/value
The paper examines the relationship between the TED and CDS spreads which measure credit risk in an economy. This paper contributes to the literature on dynamic co-movement, contagion effects and risk linkages
Models for the Effects of G-seat Cuing on Roll-axis Tracking Performance
Including whole-body motion in a flight simulator improves performance for a variety of tasks requiring a pilot to compensate for the effects of unexpected disturbances. A possible mechanism for this improvement is that whole-body motion provides higher-derivative vehicle state information which allows the pilot to generate more lead in responding to the external disturbances. During development of motion-simulating algorithms for an advanced g-cuing system, it was discovered that an algorithm based on aircraft roll acceleration produced little or no performance improvement. On the other hand, algorithms based on roll position or roll velocity produced performance equivalent to whole-body motion. The analysis and modeling conducted at both the sensory-system and manual-control-performance levels to explain these results are described.
Experimental demonstration of a measurement-based realisation of a quantum channel
We introduce and experimentally demonstrate a method for realising a quantum channel using the measurement-based model. Using a photonic setup and modifying the bases of single-qubit measurements on a four-qubit entangled cluster state, representative channels are realised for the case of a single qubit in the form of amplitude and phase damping channels. The experimental results match the theoretical model well, demonstrating the successful performance of the channels. We also show how other types of quantum channels can be realised using our approach. This work highlights the potential of the measurement-based model for realising quantum channels which may serve as building blocks for simulations of realistic open quantum systems.
Comment: 8 pages, 4 figures
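The amplitude- and phase-damping channels mentioned in the abstract have a standard Kraus-operator form. The following is a minimal numerical sketch of those textbook channels acting on a single-qubit density matrix, not the paper's measurement-based photonic implementation:

```python
import numpy as np

def apply_channel(rho, kraus):
    """Apply a channel given by Kraus operators: rho -> sum_k K rho K^dag."""
    return sum(K @ rho @ K.conj().T for K in kraus)

def amplitude_damping(gamma):
    """Kraus operators for amplitude damping with decay probability gamma."""
    return [np.array([[1, 0], [0, np.sqrt(1 - gamma)]]),
            np.array([[0, np.sqrt(gamma)], [0, 0]])]

def phase_damping(lam):
    """Kraus operators for phase damping (dephasing) with probability lam."""
    return [np.sqrt(1 - lam) * np.eye(2),
            np.sqrt(lam) * np.diag([1.0, 0.0]),
            np.sqrt(lam) * np.diag([0.0, 1.0])]

excited = np.array([[0, 0], [0, 1]], dtype=complex)   # |1><1|
plus = 0.5 * np.ones((2, 2), dtype=complex)           # |+><+|

# Amplitude damping moves population from |1> to |0>;
# phase damping shrinks the off-diagonal coherences.
rho_ad = apply_channel(excited, amplitude_damping(0.3))
rho_pd = apply_channel(plus, phase_damping(0.5))
```

Both sets of Kraus operators satisfy the completeness relation sum_k K^dag K = I, so the channels are trace-preserving; in the measurement-based realisation, the measurement-basis angles play the role of the damping parameters gamma and lam.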
The Voice of Opportunity From China and Japan
Postcard: Spiker and McMillan Music Store, Columbus, Kansas
This monochrome photographic postcard features a music shop in Columbus, Kansas. The black and white photo is blue in hue. The shop contains pianos lined along the right wall. Display cases line the left wall. Violins hang on the right wall. Printed text is on the bottom of the card and handwriting is on the back of the card.
Splitting Proofs for Interpolation
We study interpolant extraction from local first-order refutations. We present a new theoretical perspective on interpolation based on clearly separating the condition on the logical strength of the formula from the requirement on the common signature. This allows us to highlight the space of all interpolants that can be extracted from a refutation as a space of simple choices on how to split the refutation into two parts. We use this new insight to develop an algorithm for extracting interpolants which are linear in the size of the input refutation and can be further optimized using metrics such as the number of non-logical symbols or quantifiers. We implemented the new algorithm in the first-order theorem prover VAMPIRE and evaluated it on a large number of examples coming from the first-order proving community. Our experiments give practical evidence that our work improves the state of the art in first-order interpolation.
Comment: 26th Conference on Automated Deduction, 201
NOSS altimeter algorithm specifications
A description of all algorithms required for altimeter processing is given. Each description includes title, description, inputs/outputs, general algebraic sequences, and data volume. All required input/output data files are described, and the computer resources required for the entire altimeter processing system are estimated. The majority of the data-processing requirements for any radar altimeter of the Seasat-1 type are scoped; additions and deletions could be made for the specific altimeter products required by other projects.