Structural Change in (Economic) Time Series
Methods for detecting structural changes, or change points, in time series
data are widely used in many fields of science and engineering. This chapter
sketches some basic methods for the analysis of structural changes in time
series data. The exposition is confined to retrospective methods for univariate
time series. Several recent methods for dating structural changes are compared
using a time series of oil prices spanning more than 60 years. The methods
broadly agree for the first part of the series up to the mid-1980s, for which
changes are associated with major historical events, but provide somewhat
different solutions thereafter, reflecting a gradual increase in oil prices
that is not well described by a step function. As a further illustration, 1990s
data on the volatility of the Hang Seng stock market index are reanalyzed.
Comment: 12 pages, 6 figures
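The retrospective dating methods discussed here typically minimize a segmentation cost over candidate break dates. As a hedged illustration (not the chapter's own code; the synthetic series and its break location are assumptions), a single mean-shift change point can be located by an exhaustive least-squares search:

```python
import numpy as np

def best_break(y):
    """Locate a single change point in the mean by least squares:
    pick the split minimizing the total within-segment SSE."""
    best_k, best_sse = None, np.inf
    for k in range(1, len(y)):
        left, right = y[:k], y[k:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_k, best_sse = k, sse
    return best_k

# Synthetic series with a clear mean shift at index 50 (illustrative only).
rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0.0, 0.1, 50), rng.normal(3.0, 0.1, 50)])
print(best_break(y))  # → 50
```

Applying this search recursively to the resulting segments (binary segmentation) extends the idea to multiple break dates, which is the setting in which the methods compared on the oil-price series operate.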
Post-Lie Algebras, Factorization Theorems and Isospectral Flows
In these notes we review and further explore the Lie enveloping algebra of a
post-Lie algebra. From a Hopf algebra point of view, one of the central
results, which will be recalled in detail, is the existence of a second Hopf
algebra structure. By comparing group-like elements in suitable completions of
these two Hopf algebras, we derive a particular map which we dub post-Lie
Magnus expansion. These results are then considered in the case of
Semenov-Tian-Shansky's double Lie algebra, where a post-Lie algebra is defined
in terms of solutions of modified classical Yang-Baxter equation. In this
context, we prove a factorization theorem for group-like elements. An explicit
exponential solution of the corresponding Lie bracket flow is presented, which
is based on the aforementioned post-Lie Magnus expansion.
Comment: 49 pages, no figures, review article
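For orientation, it may help to recall the classical Magnus expansion, which the post-Lie Magnus expansion discussed in these notes generalizes. For the initial value problem $Y'(t) = A(t)Y(t)$, $Y(0) = I$, one writes $Y(t) = \exp(\Omega(t))$ with, in a standard convention (this display is background, not taken from the notes),

```latex
\Omega(t) = \int_0^t A(t_1)\,dt_1
  + \frac{1}{2}\int_0^t\!\!\int_0^{t_1} [A(t_1), A(t_2)]\,dt_2\,dt_1
  + \cdots
```

where the higher-order terms involve iterated integrals of nested commutators.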
The Wasteland of Random Supergravities
We show that in a general \cal{N} = 1 supergravity with N \gg 1 scalar
fields, an exponentially small fraction of the de Sitter critical points are
metastable vacua. Taking the superpotential and Kahler potential to be random
functions, we construct a random matrix model for the Hessian matrix, which is
well-approximated by the sum of a Wigner matrix and two Wishart matrices. We
compute the eigenvalue spectrum analytically from the free convolution of the
constituent spectra and find that in typical configurations, a significant
fraction of the eigenvalues are negative. Building on the Tracy-Widom law
governing fluctuations of extreme eigenvalues, we determine the probability P
of a large fluctuation in which all the eigenvalues become positive. Strong
eigenvalue repulsion makes this extremely unlikely: we find P \propto exp(-c
N^p), with c, p being constants. For generic critical points we find p \approx
1.5, while for approximately-supersymmetric critical points, p \approx 1.3. Our
results have significant implications for the counting of de Sitter vacua in
string theory, but the number of vacua remains vast.
Comment: 39 pages, 9 figures; v2: fixed typos, added refs and clarification
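The Hessian model described above is straightforward to sample numerically. A hedged sketch follows (the size N and the relative normalizations of the three matrices are illustrative assumptions, not the paper's conventions):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200  # number of scalar fields (illustrative)

# Wigner matrix: symmetric with Gaussian entries, semicircle spectrum.
A = rng.normal(size=(N, N))
wigner = (A + A.T) / np.sqrt(2 * N)

def wishart():
    """Wishart matrix B B^T / N, Marchenko-Pastur spectrum."""
    B = rng.normal(size=(N, N))
    return B @ B.T / N

# Schematic Hessian: sum of one Wigner and two Wishart matrices.
H = wigner + wishart() + wishart()
eigenvalues = np.linalg.eigvalsh(H)
frac_negative = (eigenvalues < 0).mean()
print(f"fraction of negative eigenvalues: {frac_negative:.2f}")
```

The empirical spectrum obtained this way can be compared against the free convolution of the semicircle and Marchenko-Pastur laws computed analytically in the paper; in the paper's normalization a significant fraction of eigenvalues is negative in typical configurations.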
The Oslo definitions for coeliac disease and related terms.
Objective: The literature suggests a lack of consensus on the use of terms related to coeliac disease (CD) and gluten. Design: A multidisciplinary task force of 16 physicians from seven countries used the electronic database PubMed to review the literature for CD-related terms up to January 2011. Teams of physicians then suggested a definition for each term, followed by feedback on these definitions through a web survey, discussions during a meeting in Oslo and phone conferences. In addition to 'CD', the following descriptors of CD were evaluated (in alphabetical order): asymptomatic, atypical, classical, latent, non-classical, overt, paediatric classical, potential, refractory, silent, subclinical, symptomatic, typical, CD serology, CD autoimmunity, genetically at risk of CD, dermatitis herpetiformis, gluten, gluten ataxia, gluten intolerance, gluten sensitivity and gliadin-specific antibodies. Results: CD was defined as 'a chronic small intestinal immune-mediated enteropathy precipitated by exposure to dietary gluten in genetically predisposed individuals'. Classical CD was defined as 'CD presenting with signs and symptoms of malabsorption. Diarrhoea, steatorrhoea, weight loss or growth failure is required.' 'Gluten-related disorders' is the suggested umbrella term for all diseases triggered by gluten, and the term 'gluten intolerance' should not be used. Other definitions are presented in the paper. Conclusion: This paper presents the Oslo definitions for CD-related terms.
Nonuniform Cardiac Denervation Observed by 11C-meta-Hydroxyephedrine PET in 6-OHDA-Treated Monkeys
Parkinson's disease presents nonmotor complications, such as autonomic dysfunction, that do not respond to traditional anti-parkinsonian therapies. The lack of established preclinical monkey models of Parkinson's disease with cardiac dysfunction hampers the development and testing of new treatments to alleviate or prevent this feature. This study aimed to assess the feasibility of developing a model of cardiac dysautonomia in nonhuman primates, together with preclinical evaluation tools. Five rhesus monkeys received intravenous injections of 6-hydroxydopamine (total dose: 50 mg/kg). The animals were evaluated before and after treatment with a battery of tests, including positron emission tomography with the norepinephrine analog 11C-meta-hydroxyephedrine. Imaging 1 week after neurotoxin treatment revealed nearly complete loss of specific radioligand uptake. Partial progressive recovery of cardiac uptake was found between 1 and 10 weeks and remained stable between 10 and 14 weeks. In all five animals, examination of the pattern of uptake (using Logan plot analysis to create distribution volume maps) revealed a persistent, significant region-specific loss in the inferior wall of the left ventricle at 10 (P<0.001) and 14 weeks (P<0.01) relative to the anterior wall. Blood levels of dopamine, norepinephrine (P<0.05), epinephrine, and 3,4-dihydroxyphenylacetic acid (P<0.01) were notably decreased after 6-hydroxydopamine at all time points. These results demonstrate that systemic injection of 6-hydroxydopamine in nonhuman primates creates a nonuniform but reproducible pattern of cardiac denervation, as well as a persistent loss of circulating catecholamines, supporting the use of this method to further develop a monkey model of cardiac dysautonomia.
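Logan graphical analysis, used in this study to create distribution-volume maps, rests on the fact that for reversible tracers the integrated tissue activity becomes linear in the integrated plasma input at late times, with slope equal to the distribution volume V_T. A hedged sketch on synthetic one-tissue-compartment data (all rate constants and curves below are hypothetical, not the study's data):

```python
import numpy as np

# Hypothetical one-tissue-compartment kinetics: dC_T/dt = K1*Cp - k2*C_T,
# so the true distribution volume is VT = K1/k2 = 2.0 (assumed constants).
K1, k2 = 0.1, 0.05                 # rate constants, 1/min (hypothetical)
t = np.linspace(0.1, 90, 400)      # scan times, minutes
Cp = np.exp(-0.05 * t)             # assumed plasma input function

dt = t[1] - t[0]
CT = np.zeros_like(t)
for i in range(1, len(t)):         # forward-Euler integration of the ODE
    CT[i] = CT[i - 1] + dt * (K1 * Cp[i - 1] - k2 * CT[i - 1])

# Logan plot: at late times, (int CT)/CT is linear in (int Cp)/CT,
# and the slope of that line estimates VT.
int_Cp = np.cumsum(Cp) * dt
int_CT = np.cumsum(CT) * dt
x = int_Cp[200:] / CT[200:]        # late-time points only
y = int_CT[200:] / CT[200:]
VT_est = np.polyfit(x, y, 1)[0]
print(f"estimated VT: {VT_est:.2f}")
```

Applied voxel-by-voxel, the fitted slope yields the distribution-volume maps in which the region-specific uptake loss was quantified.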
Effective Dark Matter Model: Relic density, CDMS II, Fermi LAT and LHC
The Cryogenic Dark Matter Search recently announced the observation of two
signal events with a 77% confidence level. Although statistically inconclusive,
it is nevertheless suggestive. In this work we present a model-independent
analysis on the implication of a positive signal in dark matter scattering off
nuclei. Assuming the interaction between (scalar, fermion or vector) dark
matter and the standard model is induced by unknown new physics at the scale Λ, we examine various dimension-6 tree-level induced operators and constrain them using current experimental data, e.g. the WMAP measurement of the relic abundance, the CDMS II direct detection of spin-independent scattering, and indirect detection data (Fermi LAT cosmic gamma rays). Finally, the LHC reach is also explored.
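Because the dimension-6 operators are suppressed by Λ², the scattering cross sections they induce scale as 1/Λ⁴, so direct-detection limits translate directly into lower bounds on Λ. A minimal illustration of this scaling (schematic, not the paper's operator normalization):

```python
def sigma_suppression(lam_old_tev, lam_new_tev):
    """A dimension-6 contact operator gives sigma ∝ 1/Lambda^4, so raising
    the new-physics scale suppresses the rate by (lam_new/lam_old)^4."""
    return (lam_new_tev / lam_old_tev) ** 4

# Doubling Lambda from 1 TeV to 2 TeV weakens the scattering rate 16-fold.
print(sigma_suppression(1.0, 2.0))  # → 16.0
```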
Beam Energy and Centrality Dependence of Direct-Photon Emission from Ultrarelativistic Heavy-Ion Collisions.
The PHENIX collaboration presents first measurements of the low-momentum (0.4<p_{T}<1 GeV/c) direct-photon yield dN_{γ}^{dir}/dη. The yield is a smooth function of dN_{ch}/dη and can be well described as proportional to (dN_{ch}/dη)^{α} with α≈1.25. This scaling behavior holds for a wide range of beam energies at the Relativistic Heavy Ion Collider and the Large Hadron Collider, for centrality-selected samples, as well as for different A+A collision systems. At a given beam energy, the scaling also holds for high p_{T} (>5 GeV/c), but when results from different collision energies are compared, an additional √(s_{NN})-dependent multiplicative factor is needed to describe the integrated direct-photon yield.
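The quoted scaling dN_{γ}^{dir}/dη ∝ (dN_{ch}/dη)^{α} is the kind of power law that can be extracted by a straight-line fit in log-log space. A hedged sketch on synthetic points (the multiplicities and the normalization 0.02 are invented for illustration, not PHENIX data):

```python
import numpy as np

alpha_true = 1.25
dn_ch = np.array([10.0, 30.0, 100.0, 300.0, 1000.0])  # hypothetical dN_ch/deta
dn_gamma = 0.02 * dn_ch ** alpha_true                 # assumed normalization

# The slope of log(dN_gamma) vs log(dN_ch) estimates the exponent alpha.
alpha_fit = np.polyfit(np.log(dn_ch), np.log(dn_gamma), 1)[0]
print(round(alpha_fit, 2))  # → 1.25
```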
Projected WIMP sensitivity of the LUX-ZEPLIN dark matter experiment
LUX-ZEPLIN (LZ) is a next-generation dark matter direct detection experiment that will operate 4850 feet underground at the Sanford Underground Research Facility (SURF) in Lead, South Dakota, USA. Using a two-phase xenon detector with an active mass of 7 tonnes, LZ will search primarily for low-energy interactions with weakly interacting massive particles (WIMPs), which are hypothesized to make up the dark matter in our galactic halo. In this paper, the projected WIMP sensitivity of LZ is presented, based on the latest background estimates and simulations of the detector. For a 1000-live-day run using a 5.6-tonne fiducial mass, LZ is projected to exclude, at 90% confidence level, spin-independent WIMP-nucleon cross sections above 1.4×10^{-48} cm^{2} for a 40 GeV/c^{2} mass WIMP. Additionally, a 5σ discovery potential is projected, reaching cross sections below the exclusion limits of recent experiments. For spin-dependent WIMP-neutron (WIMP-proton) scattering, a sensitivity of 2.3×10^{-43} cm^{2} (7.1×10^{-42} cm^{2}) for a 40 GeV/c^{2} mass WIMP is expected. With underground installation well underway, LZ is on track for commissioning at SURF in 2020.
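Projected exclusion limits of this kind can be related to simple Poisson counting: with zero observed events and negligible background, the 90% CL upper limit on the mean signal count μ solves e^{-μ} = 0.10, and the cross-section limit scales inversely with exposure. A hedged back-of-envelope sketch (not the LZ analysis itself, which uses a profile-likelihood method with detailed background modeling):

```python
import math

# 90% CL Poisson upper limit for 0 observed events and no background:
# P(0; mu) = exp(-mu) = 0.10  =>  mu = ln(10) ≈ 2.30 expected events.
mu_limit = -math.log(0.10)
print(round(mu_limit, 2))  # → 2.3

# The cross-section limit then scales as mu_limit / (exposure × efficiency),
# so doubling the fiducial exposure (mass × live time) halves the limit.
```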