Exact spectral function of a Tonks-Girardeau gas in a lattice
The single-particle spectral function of a strongly correlated system is an
essential ingredient to describe its dynamics and transport properties. We
develop a general method to calculate the exact spectral function of a strongly
interacting one-dimensional Bose gas in the Tonks-Girardeau regime, valid for
any type of confining potential, and apply it to bosons on a lattice to obtain
the full spectral function, at all energy and momentum scales. We find that it
displays three main singularity lines. The first two can be identified as the
analogs of Lieb-I and Lieb-II modes of a uniform fluid; the third one, instead,
is specifically due to the presence of the lattice. We show that the spectral
function displays a power-law behaviour close to the Lieb-I and Lieb-II
singularities, as predicted by the non-linear Luttinger liquid description, and
obtain the exact exponents. In particular, the Lieb-II mode shows a divergence
in the spectral function, differently from what happens in the dynamical
structure factor, thus providing a route to probe it in experiments with
ultracold atoms. (Comment: 10 pages, 3 figures)
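As a schematic illustration of the power-law behaviour mentioned above (the standard nonlinear Luttinger liquid form in our own notation, not an equation quoted from the abstract):

```latex
% Edge behaviour of the spectral function near a singularity line
% \epsilon(k) (Lieb-I, Lieb-II, or the lattice-induced mode); \mu(k) is
% the momentum-dependent exponent that the exact calculation determines.
A(k,\omega) \;\sim\; \bigl| \omega - \epsilon(k) \bigr|^{-\mu(k)},
\qquad \omega \to \epsilon(k) .
```

A positive exponent μ(k) produces a divergence, which is what distinguishes the Lieb-II line of the spectral function from its counterpart in the dynamical structure factor.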
On the Exponential Decay of the n-point Correlation Functions and the Analyticity of the Pressure
The goal of this paper is to provide estimates leading to a direct proof of
the exponential decay of the n-point correlation functions for certain
unbounded models of Kac type. The methods are based on estimating higher order
derivatives of the solution of the Witten Laplacian equation on one-forms
associated with the Hamiltonian of the system. We also provide a formula for
the Taylor coefficients of the pressure that is suitable for a direct proof of
the analyticity.
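For context, a minimal sketch of the identity behind this strategy (the standard Helffer–Sjöstrand formula, in our own notation; the paper's precise setting for Kac-type models may differ): for a Gibbs measure dμ = Z⁻¹ e^(−H(x)) dx, two-point correlations are governed by the inverse of the Witten Laplacian on one-forms.

```latex
% Witten Laplacian on one-forms associated with the Hamiltonian H:
W_H^{(1)} \;=\; -\Delta \;+\; \nabla H \cdot \nabla \;+\; \operatorname{Hess} H ,
% Helffer-Sjostrand covariance identity on L^2(d\mu):
\operatorname{cov}_\mu(f,g) \;=\; \int \bigl\langle \bigl(W_H^{(1)}\bigr)^{-1} \nabla f ,\; \nabla g \bigr\rangle \, d\mu .
```

Exponential decay of correlations then follows from decay estimates on the kernel of (W_H^{(1)})⁻¹, and bounds on higher-order derivatives of its solutions control the n-point functions.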
Biomass and reproduction of Pacific sardine (Sardinops sagax) off the Pacific northwestern United States, 2003–2005
The Pacific sardine (Sardinops sagax) is distributed along the west coast of North America from Baja California to British Columbia. This article presents estimates of biomass, spawning biomass, and related biological parameters based on four trawl-ichthyoplankton surveys conducted during July 2003–March 2005 off Oregon and Washington. The trawl-based biomass estimates, serving as indices of relative abundance, were 198,600 t (coefficient of variation [CV] = 0.51) in July 2003, 20,100 t (0.8) in March 2004, 77,900 t (0.34) in July 2004, and 30,100 t (0.72) in March 2005, over an area close to 200,000 km². The biomass estimates, high in July and low in March, are a strong indication of migration into and out of this area. Sardine spawned in July off the Pacific Northwest (PNW) coast, and none of the sampled fish had spawned in March. The estimated spawning biomass for July 2003 and July 2004 was 39,184 t (0.57) and 84,120 t (0.93), respectively. The average active female sardine in the PNW spawned every 20–40 days, compared to every 6–8 days off California. The spawning habitat was located in the southeastern area off the PNW coast, a shift from the northwestern area off the PNW coast in the 1990s. Egg production off the PNW for 2003–04 was lower than that off California and lower than that in the 1990s. Because the biomass of Pacific sardine off the PNW appears to be supported heavily by migratory fish from California, the sustainability of the local PNW population relies on the stability of the population off California and on local oceanographic conditions for local residence.
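As a rough illustration of how a trawl-based relative-abundance estimate and its CV can be computed (a generic swept-area calculation with made-up numbers, not the survey's actual estimator):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-haul sardine densities (t per km^2 swept); a real survey
# would derive these from catch weight divided by area swept per tow.
densities = rng.lognormal(mean=-1.0, sigma=1.2, size=40)

survey_area_km2 = 200_000  # area covered by the survey grid

# Swept-area (design-based) biomass estimate: mean density x survey area.
biomass_t = densities.mean() * survey_area_km2

# Bootstrap CV of the estimate, resampling hauls with replacement.
boot = np.array([
    rng.choice(densities, size=densities.size, replace=True).mean()
    for _ in range(2000)
]) * survey_area_km2
cv = boot.std(ddof=1) / biomass_t

print(f"biomass ~ {biomass_t:,.0f} t, CV ~ {cv:.2f}")
```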
Can Insurers Pay for the "Big One"? Measuring the Capacity of an Insurance Market to Respond to Catastrophic Losses
This paper presents a theoretical and empirical analysis of the capacity of the U.S. property-liability insurance industry to finance major catastrophic property losses. The topic is important because catastrophic events such as the Northridge earthquake and Hurricane Andrew have raised questions about the ability of the insurance industry to respond to the "Big One," usually defined as a hurricane or earthquake in the $100 billion range. At first glance, the industry, with equity capital of more than $300 billion, should be able to sustain a loss of this magnitude. However, the reality could be different, depending on the distribution of damage and the spread of coverage, as well as the correlations between insurer losses and industry losses. Thus, the prospect of a mega-catastrophe brings the real threat of widespread insurance failures and unpaid insurance claims.

Our theoretical analysis takes as its starting point the well-known article by Borch (1962), which shows that the Pareto-optimal result in a market characterized by risk-averse insurers is for each insurer to hold a proportion of the "market portfolio" of insurance contracts. Each insurer pays a proportion of total industry losses, and the industry behaves as a single firm, paying 100 percent of losses up to the point where industry net premiums and equity are exhausted. Borch's theorem gives rise to a natural definition of industry capacity as the amount of industry resources that are deliverable conditional on an industry loss of a given size. In our theoretical analysis, we show that the necessary condition for industry capacity to be maximized is that all insurers hold a proportionate share of the industry underwriting portfolio. The sufficient condition for capacity maximization, given a level of total resources in the industry, is for all insurers to hold a net-of-reinsurance underwriting portfolio that is perfectly correlated with aggregate industry losses. Based on these theoretical results, we derive an option-like model of insurer responses to catastrophes, leading to an insurer response function in which the total payout, conditional on total industry losses, is a function of the industry and company expected losses, the industry and company standard deviations of losses, company net worth, and the correlation between industry and company losses. The industry response function is obtained by summing the company response functions, giving the capacity of the industry to respond to losses of various magnitudes.

We utilize 1997 insurer financial statement data to estimate the capacity of the industry to respond to catastrophic losses. Two samples of insurers are used: a national sample, to measure the capacity of the industry as a whole to respond to a national event, and a Florida sample, to measure the capacity of the industry to respond to a Florida hurricane. The empirical analysis estimates the capacity of the industry to bear losses ranging from the expected value of loss up to a loss equal to total company resources. We develop a measure of industry efficiency equal to the difference between the loss that would be paid if the industry acts as a single firm and the actual estimated payment based on our option model. The results indicate that national industry efficiency ranges from about 78 to 85 percent, based on catastrophe losses ranging from zero to $200 billion; for example, the industry could pay at least 98.6 percent of a $20 billion catastrophe.
For a catastrophe of $100 billion, the industry could pay at least 92.8 percent of the loss. We also compare 1997 capacity with capacity in 1991, the year before Hurricane Andrew: in 1991, the industry held $0.88 in equity capital per dollar of incurred losses, whereas in 1997 this ratio had increased to $1.56. For a catastrophe of $100 billion, our lower-bound estimate of industry capacity in 1991 is only 79.6 percent, based on the national sample, compared to 92.8 percent in 1997. For the Florida sample, we estimate that insurers could have paid at least 72.2 percent of a $100 billion catastrophe in 1991 and 89.7 percent in 1997. Thus, the industry is clearly much better capitalized now than it was prior to Andrew. The results suggest that the gaps in catastrophic risk financing are presently not sufficient to justify Federal government intervention in private insurance markets in the form of Federally sponsored catastrophe reinsurance. However, even though the industry could adequately fund the "Big One," doing so would disrupt the functioning of insurance markets and cause price increases for all types of property-liability insurance. Thus, it appears that there is still a gap in capacity that provides a role for privately and publicly traded catastrophic loss derivative contracts.
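A minimal Monte Carlo sketch of the option-like response function described above (illustrative distributions and parameters of our own choosing, not the authors' calibration): each insurer pays its own losses up to its resources, and capacity is the expected share of industry losses actually paid.

```python
import numpy as np

rng = np.random.default_rng(1)
n_companies, n_sims = 50, 20_000

# Hypothetical company shares of the industry portfolio and resources.
shares = rng.dirichlet(np.ones(n_companies))            # sums to 1
resources = shares * 400e9 * rng.uniform(0.8, 1.2, n_companies)

# Company losses: a proportional share of an industry-wide event of roughly
# $100 billion, times idiosyncratic noise, so company and industry losses
# are imperfectly correlated (the key driver of the capacity shortfall).
industry_loss = 100e9
noise = rng.lognormal(0.0, 0.3, (n_sims, n_companies))
company_losses = shares * industry_loss * noise

# Option-like payout: each insurer pays min(loss, resources).
payouts = np.minimum(company_losses, resources).sum(axis=1)
capacity = payouts.mean() / company_losses.sum(axis=1).mean()
print(f"expected share of losses paid: {capacity:.1%}")
```

With perfectly correlated portfolios (noise identically 1), every insurer's payout tracks the industry loss and capacity approaches 100 percent, which is the sufficient condition stated in the abstract.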
Decoherence in a fermion environment: Non-Markovianity and Orthogonality Catastrophe
We analyze the non-Markovian character of the dynamics of an open two-level
atom interacting with a gas of ultracold fermions. In particular, we discuss
the connection between the phenomena of the orthogonality catastrophe and the
Fermi-edge singularity occurring in this kind of environment and the
memory-keeping effects displayed in the time evolution of the open system.
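As a schematic illustration of how these phenomena are usually linked (standard relations in our own notation, not equations quoted from the paper): the atom's decoherence function is a Loschmidt echo of the Fermi gas, and the orthogonality catastrophe is the vanishing overlap between the unperturbed and perturbed ground states.

```latex
% Decoherence function of the two-level atom as a Loschmidt echo:
% H_0 is the unperturbed Fermi gas, H_1 the gas perturbed by the atom.
\nu(t) \;=\; \bigl| \langle \Psi_0 |\, e^{\,i H_0 t} e^{-i H_1 t} \,| \Psi_0 \rangle \bigr|^2 ,
% Anderson orthogonality catastrophe: the ground-state overlap decays
% with the particle number N as a power law,
|\langle \Psi_0 | \Psi_1 \rangle| \;\sim\; N^{-\alpha/2} .
```

Revivals of ν(t) correspond to information flowing back from the gas to the atom, which is the signature of non-Markovian dynamics.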
Stark shift and field ionization of arsenic donors in Si-SOI structures
We develop an efficient back gate for silicon-on-insulator (SOI) devices
operating at cryogenic temperatures, and measure the quadratic hyperfine Stark
shift parameter of arsenic donors in isotopically purified Si-SOI layers
using such structures. The back gate is implemented using MeV ion implantation
through the SOI layer forming a metallic electrode in the handle wafer,
enabling large and uniform electric fields of up to 2 V/μm to be
applied across the SOI layer. Utilizing this structure, we measure the Stark
shift parameters of arsenic donors embedded in the Si-SOI layer, including the
quadratic contact hyperfine Stark parameter. We also demonstrate
electric-field-driven dopant ionization in the SOI device layer, measured by
electron spin resonance. (Comment: 5 pages, 3 figures)
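For reference, the quantity being measured is conventionally defined through the quadratic Stark shift of the donor hyperfine coupling (standard donor-spin notation, not quoted from the abstract):

```latex
% Quadratic Stark effect on the contact hyperfine coupling A of the donor:
% \eta_a is the hyperfine Stark shift parameter, conventionally quoted in
% units of \mu m^2 / V^2 for shallow donors in silicon.
\frac{\Delta A}{A} \;=\; \eta_a\, E^{2} .
```

The resulting shift of the ESR transition frequencies under a known applied field E is what allows η_a to be extracted from the measurement.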
WARNING: Physics Envy May Be Hazardous To Your Wealth!
The quantitative aspirations of economists and financial analysts have for
many years been based on the belief that it should be possible to build models
of economic systems - and financial markets in particular - that are as
predictive as those in physics. While this perspective has led to a number of
important breakthroughs in economics, "physics envy" has also created a false
sense of mathematical precision in some cases. We speculate on the origins of
physics envy, and then describe an alternate perspective of economic behavior
based on a new taxonomy of uncertainty. We illustrate the relevance of this
taxonomy with two concrete examples: the classical harmonic oscillator with
some new twists that make physics look more like economics, and a quantitative
equity market-neutral strategy. We conclude by offering a new interpretation of
tail events, proposing an "uncertainty checklist" with which our taxonomy can
be implemented, and considering the role that quants played in the current
financial crisis. (Comment: v3 adds 2 references)
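A toy version of the oscillator example in the spirit of the abstract (our own construction, not the authors' model): the same deterministic oscillator becomes progressively harder to predict as we move from fully known parameters, to parameters with a known distribution, to parameters whose distribution is itself unknown.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 500)

# Level 1: complete certainty -- known frequency, deterministic trajectory.
x_certain = np.cos(2.0 * t)

# Level 2: risk -- frequency drawn from a *known* distribution; the
# ensemble-average trajectory damps out even though each path is periodic.
omegas = rng.normal(2.0, 0.1, size=1000)
x_risk = np.cos(np.outer(omegas, t)).mean(axis=0)

# Level 3: uncertainty -- the distribution itself is unknown; here we
# crudely average over two candidate distributions with an unknown weight.
omegas_alt = rng.uniform(1.5, 2.5, size=1000)
x_uncertain = 0.5 * x_risk + 0.5 * np.cos(np.outer(omegas_alt, t)).mean(axis=0)

print(x_certain[-1], x_risk[-1], x_uncertain[-1])
```

Parameter uncertainty alone is enough to make a perfectly "physical" system look statistically murky, which is the sense in which physics-style precision can mislead in economics.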
Template-based Gravitational-Wave Echoes Search Using Bayesian Model Selection
The ringdown of the gravitational-wave signal from a merger of two black
holes has been suggested as a probe of the structure of the remnant compact
object, which may be more exotic than a black hole. It has been pointed out
that there will be a train of echoes in the late-time ringdown stage for
different types of exotic compact objects. In this paper, we present a
template-based search methodology using Bayesian statistics to search for
echoes of gravitational waves. Evidence for the presence or absence of echoes
in gravitational-wave events can be established by performing Bayesian model
selection. The Occam factor in Bayesian model selection will automatically
penalize the more complicated model that echoes are present in
gravitational-wave strain data because of its higher degree of freedom to fit
the data. We find that the search methodology is able to identify
gravitational-wave echoes with Abedi et al.'s echoes waveform model about 82.3%
of the time in simulated Gaussian noise in the Advanced LIGO and Virgo network,
and about 61.1% of the time in real noise in the first observing run of
Advanced LIGO, with ≥ 5σ significance. Analyses using this method are
performed on the data of Advanced LIGO's first observing run, and we find no
statistically significant evidence for the detection of gravitational-wave
echoes. In particular, we find that the combined evidence for echoes in the
three events of Advanced LIGO's first observing run is insignificant. The
analysis technique developed in
this paper is independent of the waveform model used, and can be used with
different parametrized echoes waveform models to provide more realistic
evidence of the existence of echoes from exotic compact objects. (Comment: 16 pages, 6 figures)
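For reference, the model-selection quantities involved (standard Bayesian definitions in our own notation; the paper's concrete priors and waveform parametrization are not reproduced here):

```latex
% Evidence for model H_i with parameters \theta_i, given strain data d:
Z_i \;=\; \int \mathcal{L}(d \mid \theta_i, H_i)\, \pi(\theta_i \mid H_i)\, \mathrm{d}\theta_i ,
% Bayes factor between "echoes present" (H_1) and "no echoes" (H_0):
\mathcal{B} \;=\; Z_1 / Z_0 .
% The Occam factor is automatic: the extra echo parameters in H_1 spread
% the prior \pi over a larger volume, lowering Z_1 unless the data
% genuinely favor the echo template.
```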