Recovering the Probability Density Function of Asset Prices Using GARCH as Diffusion Approximations
This paper uses GARCH models to estimate the objective and risk-neutral density functions of financial asset prices and, by comparing their shapes, recover detailed information on economic agents' attitudes toward risk. It differs from recent papers investigating analogous issues because it uses Nelson's (1990) result that GARCH schemes are approximations of the kind of differential equations typically employed in finance to describe the evolution of asset prices. This feature of GARCH schemes has usually been overshadowed by their well-known role as simple econometric tools providing reliable estimates of unobserved conditional variances. We show instead that the diffusion approximation property of GARCH gives good results and can be extended to situations with i) non-standard distributions for the innovations of a conditional mean equation of asset price changes and ii) volatility concepts different from the variance. The objective PDF of the asset price is recovered from the estimation of a nonlinear GARCH model fitted to the historical path of the asset price. The risk-neutral PDF is extracted from cross-sections of bond option prices, after introducing a volatility risk premium function. The direct comparison of the shapes of the two PDFs reveals the price attached by economic agents to the different states of nature. Applications are carried out for the futures written on the Italian 10-year bond.
Keywords: option pricing, stochastic volatility, ARCH, volatility risk premium
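Nelson's diffusion-approximation idea can be sketched in a few lines: a GARCH(1,1) path (with made-up illustrative parameters, not the paper's estimates for the bond futures) is simulated, and a plain kernel estimate recovers the objective density of returns.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical GARCH(1,1) parameters, chosen for illustration only.
omega, alpha, beta = 1e-6, 0.08, 0.90
n = 20_000

r = np.empty(n)                       # simulated asset returns
h = np.empty(n)                       # conditional variances
h[0] = omega / (1 - alpha - beta)     # start at the unconditional variance
for t in range(n):
    r[t] = np.sqrt(h[t]) * rng.standard_normal()
    if t + 1 < n:
        h[t + 1] = omega + alpha * r[t] ** 2 + beta * h[t]

# Recover an "objective" return density with a plain Gaussian kernel estimate.
def kde(x, grid, bw):
    z = (grid[:, None] - x[None, :]) / bw
    return np.exp(-0.5 * z ** 2).sum(axis=1) / (len(x) * bw * np.sqrt(2 * np.pi))

grid = np.linspace(r.min(), r.max(), 201)
pdf = kde(r, grid, bw=r.std() * len(r) ** (-0.2))

# The estimated density integrates to ~1 over the grid.
print(round(pdf.sum() * (grid[1] - grid[0]), 2))
```

In Nelson's limit the conditional-variance path h plays the role of the discretised volatility diffusion, which is what the paper exploits to recover the objective PDF.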
A Simple Approach to the Estimation of Continuous Time CEV Stochastic Volatility Models of the Short-Term Rate
The aim of this article is to judge the empirical performance of 'ARCH models as diffusion approximations' of models of the short-term rate with stochastic volatility. Our estimation strategy is based both on the moment conditions needed to guarantee the convergence of the discrete-time models and on the quasi indirect inference principle. Unlike previous literature, in which standard ARCH models approximate only specific diffusion models (those in which the variance of volatility is proportional to the square of volatility), our estimation strategy relies on ARCH models that approximate any CEV diffusion model for volatility. A Monte Carlo study reveals that the filtering performance of these models is remarkably good, even in the presence of important misspecification. Finally, based on a natural substitute for a global specification test for just-identified problems designed within indirect inference methods, we provide strong empirical evidence that approximating diffusions with our models gives rise to a disaggregation bias that is not significant.
Keywords: stochastic volatility, CEV-ARCH, indirect inference, yield curve
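As a rough illustration of the continuous-time target (not the paper's estimator), one can Euler-discretise a short-rate model with CEV stochastic volatility; all parameter values below are made up for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative (made-up) parameters for a short-rate model with CEV stochastic
# volatility:  dr     = kappa * (mu - r) dt + sigma dW1
#              dsigma = (omega - theta * sigma) dt + xi * sigma**delta dW2
kappa, mu = 0.5, 0.04
omega, theta, xi, delta = 0.02, 0.8, 0.3, 1.5

dt, n = 1 / 252, 252 * 40        # daily steps over 40 years
r, s = 0.04, 0.025
rs = np.empty(n)
for t in range(n):
    dW1, dW2 = rng.standard_normal(2) * np.sqrt(dt)
    r += kappa * (mu - r) * dt + s * dW1
    s += (omega - theta * s) * dt + xi * s ** delta * dW2
    s = max(s, 1e-8)             # keep volatility positive under Euler stepping
    rs[t] = r

print(round(rs.mean(), 3))       # the path mean-reverts around mu
```

The CEV exponent delta is the quantity whose restriction (delta fixed by the ARCH family) the paper's more general approximating models remove.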
Effective Quantum Extended Spacetime of Polymer Schwarzschild Black Hole
The physical interpretation and eventual fate of gravitational singularities
in a theory surpassing classical general relativity are puzzling questions that
have generated a great deal of interest among various quantum gravity
approaches. In the context of loop quantum gravity (LQG), one of the major
candidates for a non-perturbative background-independent quantisation of
general relativity, considerable effort has been devoted to constructing effective
models in which these questions can be studied. In these models, classical
singularities are replaced by a "bounce" induced by quantum geometry
corrections. Undesirable features may arise however depending on the details of
the model. In this paper, we focus on Schwarzschild black holes and propose a
new effective quantum theory based on polymerisation of new canonical phase
space variables inspired by those successful in loop quantum cosmology. The
quantum corrected spacetime resulting from the solutions of the effective
dynamics is characterised by infinitely many pairs of trapped and anti-trapped
regions connected via a space-like transition surface replacing the central
singularity. Quantum effects become relevant at a unique mass independent
curvature scale, while they become negligible in the low curvature region near
the horizon. The effective quantum metric describes also the exterior regions
and asymptotically classical Schwarzschild geometry is recovered. We however
find that physically acceptable solutions require us to select a certain subset
of initial conditions, corresponding to a specific mass (de-)amplification
after the bounce. We also sketch the corresponding quantum theory and
explicitly compute the kernel of the Hamiltonian constraint operator.
Comment: 50 pages, 10 figures; v2: journal version, minor comment and references added; v3: minor corrections in section 5.3 to match journal version
A note on the Hamiltonian as a polymerisation parameter
In effective models of loop quantum gravity, the onset of quantum effects is
controlled by a so-called polymerisation scale. It is sometimes necessary to
make this scale phase space dependent in order to obtain sensible physics. A
particularly interesting choice recently used to study quantum corrected black
hole spacetimes takes the generator of time translations itself to set the
scale. We review this idea, point out errors in recent treatments, and show how
to fix them in principle.
Comment: 7 pages, 2 figures; v2: journal version, minor clarification
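The polymerisation these effective models rely on replaces a phase-space variable p by sin(lam*p)/lam, with the polymerisation scale lam controlling when quantum effects switch on; a phase-space-dependent choice then promotes lam to a function on phase space (here, of the Hamiltonian itself). A minimal numeric check of the classical limit, for illustration only:

```python
import numpy as np

# Polymerisation replaces a momentum p by sin(lam * p) / lam: for lam * p << 1
# the classical value is recovered, while |sin(lam * p) / lam| <= 1 / lam
# bounds the effective momentum, which is what triggers quantum corrections.
def polymerise(p, lam):
    return np.sin(lam * p) / lam

p = 0.3
for lam in (1.0, 0.1, 0.01):
    print(lam, polymerise(p, lam))   # approaches p = 0.3 as lam -> 0
```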
Fisher Metric, Geometric Entanglement and Spin Networks
Starting from recent results on the geometric formulation of quantum
mechanics, we propose a new information geometric characterization of
entanglement for spin network states in the context of quantum gravity. For the
simple case of a single-link fixed graph (Wilson line), we detail the
construction of a Riemannian Fisher metric tensor and a symplectic structure on
the graph Hilbert space, showing how these encode the full information about
separability and entanglement. In particular, the Fisher metric defines an
entanglement monotone which provides a notion of distance among states in the
Hilbert space. In the maximally entangled gauge-invariant case, the
entanglement monotone is proportional to a power of the area of the surface
dual to the link, thus supporting a connection between entanglement and the
(simplicial) geometric properties of spin network states. We further extend
such analysis to the study of non-local correlations between two non-adjacent
regions of a generic spin network graph characterized by the bipartite
unfolding of an intertwiner state. Our analysis confirms the interpretation of spin network bonds as carriers of entanglement and suggests regarding the spin network graph itself as an information graph, whose connectivity encodes, at both the local and the non-local level, the quantum correlations among its parts. This provides a further connection between entanglement and geometry.
Comment: 29 pages, 3 figures, revised version accepted for publication
PYFLOW_2.0: a computer program for calculating flow properties and impact parameters of past dilute pyroclastic density currents based on field data
This paper presents PYFLOW_2.0, a hazard tool for the calculation of the impact parameters of dilute pyroclastic density currents (DPDCs). DPDCs represent the dilute turbulent type of gravity flows that occur during explosive volcanic eruptions; their hazard results from their mobility and their capability to laterally impact buildings and infrastructure and to transport variable amounts of volcanic ash along their path. Starting from data coming from the analysis of deposits formed by DPDCs, PYFLOW_2.0 calculates the flow properties (e.g., velocity, bulk density, thickness) and impact parameters (dynamic pressure, deposition time) at the location of the sampled outcrop. Given the inherent uncertainties related to sampling, laboratory analyses, and modeling assumptions, the program provides ranges of variation and probability density functions of the impact parameters rather than single specific values; from these functions, the user can interrogate the program to obtain the value of the computed impact parameter at any specified exceedance probability. In this paper, the sedimentological models implemented in PYFLOW_2.0 are presented, program functionalities are briefly introduced, and two application examples are discussed to show the capabilities of the software in quantifying the impact of the analyzed DPDCs in terms of dynamic pressure, volcanic ash concentration, and residence time in the atmosphere. The software and user's manual are made available as a downloadable electronic supplement.
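The exceedance-probability query such a tool offers can be sketched as a quantile look-up on a sampled distribution. The lognormal sample below is hypothetical; PYFLOW_2.0 builds its actual distributions from field and laboratory data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical Monte Carlo sample of a dynamic-pressure impact parameter (kPa).
dyn_pressure = rng.lognormal(mean=0.5, sigma=0.4, size=100_000)

def value_at_exceedance(samples, p_exceed):
    """Value exceeded with probability p_exceed, i.e. the (1 - p) quantile."""
    return np.quantile(samples, 1.0 - p_exceed)

print(round(value_at_exceedance(dyn_pressure, 0.05), 2))  # 5% exceedance value
print(round(value_at_exceedance(dyn_pressure, 0.50), 2))  # median
```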
Mapping the spatial variation of soil moisture at the large scale using GPR for pavement applications
The characterization of shallow soil moisture spatial variability at the large scale is a crucial issue in many research studies and fields of application, ranging from agriculture and geology to civil and environmental engineering. In this framework, this work contributes to research in the area of pavement engineering for preventing damage and planning effective management. High spatial variations of subsurface water content can lead to unexpected damage of the load-bearing layers; accordingly, both the safety and the operability of roads are reduced, leading to an increase in expected accidents.
A pulsed ground-penetrating radar system with ground-coupled antennas, with 600-MHz and 1600-MHz center frequencies, was used to collect data in a 16 m × 16 m study site in the Po Valley area in northern Italy. Two ground-penetrating radar techniques were employed to non-destructively retrieve the subsurface moisture spatial profile. The first technique is based on the evaluation of the dielectric permittivity from the attenuation of signal amplitudes; the dielectric values were then converted into moisture values using soil-specific coefficients from Topp's relationship. Ground-penetrating-radar-derived values of soil moisture were then compared with measurements from eight capacitance probes. The second technique is based on the Rayleigh scattering of the signal within the Fresnel theory, wherein the shifts of the peaks of the frequency spectra are taken as comprehensive indicators for characterizing the spatial variability of moisture. Both ground-penetrating radar methods have shown great promise for mapping the spatial variability of soil moisture at the large scale.
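The permittivity-to-moisture conversion follows Topp's relationship; the paper uses soil-specific coefficients, so the classic "universal" coefficients of Topp et al. (1980) below are only illustrative.

```python
# Topp's empirical relationship (Topp et al., 1980) mapping relative dielectric
# permittivity eps_r to volumetric water content (m^3/m^3).
def topp_moisture(eps_r):
    return (-5.3e-2 + 2.92e-2 * eps_r
            - 5.5e-4 * eps_r ** 2 + 4.3e-6 * eps_r ** 3)

for eps in (4.0, 10.0, 20.0):
    print(eps, round(topp_moisture(eps), 3))  # dry to moist soils
```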
Effect of short–time variations of wind velocity on mass transfer rate between street canyons and the atmospheric boundary layer
2D URANS CFD simulations were conducted to study the effect of short-time variations of wind velocity on the mass transfer rate between street canyons and the atmospheric boundary layer (ABL). A street canyon with a height-to-width ratio (aspect ratio) of three was considered as a case study. The study is of practical interest since it illustrates a skimming flow regime, the regime where pollutants are less effectively exchanged between the canyon and the above atmosphere, typically found in many urban areas in Mediterranean countries. Short-time variations of wind velocity magnitude were simulated first assuming a sinusoidal function with an average magnitude of 4 m s⁻¹, an amplitude of ±2 m s⁻¹, and a period from 1 to 40 s, and subsequently using short-time averaged (0.1 s, 1 s, and 10 s) real-world data measured with an ultrasonic anemometer (50 Hz). The mass transfer rate between the canyon and the ABL was evaluated as the rate of reduction of the spatially averaged concentration of a passive pollutant, carbon monoxide (CO), in the street canyon. Results show that the mass transfer rate increases with the frequency of short-time variations. In CFD studies pertaining to pollutant dispersion in street canyons, the hourly average wind velocity is usually assumed as a reference value to simulate real-world cases. Our results show that this input must be complemented with additional information about the extent and frequency of the variations in wind intensity within the hour.
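The imposed sinusoidal inlet signal described above can be reproduced directly (mean 4 m/s, amplitude 2 m/s, period T as varied in the study):

```python
import numpy as np

# Sinusoidal inlet-wind signal as described in the study: mean 4 m/s,
# amplitude 2 m/s, period T seconds (T was varied between 1 and 40 s).
def wind_velocity(t, T, mean=4.0, amp=2.0):
    return mean + amp * np.sin(2 * np.pi * t / T)

t = np.linspace(0.0, 40.0, 4001)       # 40 s sampled every 0.01 s
u = wind_velocity(t, T=10.0)
print(round(u.mean(), 2), round(u.max(), 1), round(u.min(), 1))  # 4.0 6.0 2.0
```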
On the Role of Fiducial Structures in Minisuperspace Reduction and Quantum Fluctuations in LQC
We study the homogeneous minisuperspace reduction within the canonical framework for a scalar field theory and gravity. Symmetry reduction is implemented via second-class constraints for the field modes over a partitioning of the non-compact spatial slice into disjoint cells. The canonical structure of the resulting homogeneous theories is obtained via the associated Dirac bracket, which can only be defined on a finite number of cells homogeneously patched together and agrees with the full-theory Poisson bracket for the averaged fields. This identifies a finite region, the fiducial cell, whose size sets the physical scale over which homogeneity is imposed, namely a wavelength cutoff. The reduced theory results from 1) selecting a subset of cell-averaged observables of the full theory; 2) neglecting the inhomogeneous modes; 3) neglecting boundary terms encoding interactions between neighbouring cells. The error made is controlled by the size of these neglected contributions. As a result, the off-shell structures of the reduced theory depend on the size of the fiducial cell, and different cell sizes identify canonically inequivalent theories whose dynamics, however, is cell-independent. Their quantisation then leads to a family of cell-labeled quantum representations, and the quantum version of an active rescaling of the fiducial cell is implemented via a suitable dynamics-preserving isomorphism between the different theories. We discuss the consequences for statistical moments, fluctuations, and semiclassical states in both a standard and a polymer quantisation. For a massive scalar field, we also sketch the quantum reduction and identify a subsector of the QFT where the results of the "first reduced, then quantised" theories can be reproduced to good approximation for sufficiently small mass. Finally, a strategy to include inhomogeneities in cosmology is outlined.
Comment: 71 + 13 pages, 4 figures
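The dependence of statistical moments on the averaging region has a simple classical analogue, sketched below as a toy numeric example (not the paper's construction): fluctuations of a cell-averaged observable shrink as the cell grows.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy classical analogue: averaging an uncorrelated random "field" over cells
# of growing size N makes the fluctuations of the averaged observable shrink
# like 1/sqrt(N), mirroring the cell-size dependence of statistical moments.
field = rng.standard_normal(2 ** 16)

for n_cell in (16, 256, 4096):
    cells = field[: (field.size // n_cell) * n_cell].reshape(-1, n_cell)
    print(n_cell, round(cells.mean(axis=1).std(), 4))
```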