Updating constraint preconditioners for KKT systems in quadratic programming via low-rank corrections
This work focuses on the iterative solution of sequences of KKT linear
systems arising in interior point methods applied to large convex quadratic
programming problems. This task is the computational core of the interior point
procedure, and a suitable preconditioning strategy is crucial to the
efficiency of the overall method. Constraint preconditioners are very effective
in this context; nevertheless, their computation may be very expensive for
large-scale problems, and resorting to approximations of them may be
convenient. Here we propose a procedure for building inexact constraint
preconditioners by updating a "seed" constraint preconditioner computed for a
KKT matrix at a previous interior point iteration. These updates are obtained
through low-rank corrections of the Schur complement of the (1,1) block of the
seed preconditioner. The updated preconditioners are analyzed both
theoretically and computationally. The results obtained show that our updating
procedure, coupled with an adaptive strategy for determining whether to
reinitialize or update the preconditioner, can enhance the performance of
interior point methods on large problems.
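As background on how a low-rank correction of a Schur complement can be exploited cheaply, the sketch below applies the inverse of a low-rank-corrected Schur complement via the Sherman-Morrison-Woodbury identity, reusing the factorization of a "seed" Schur complement. This is generic machinery under the assumption that both the seed matrix and the correction factor C are symmetric positive definite; it is not the authors' specific updating algorithm.

    import numpy as np
    from scipy.linalg import cho_factor, cho_solve

    def make_updated_schur_solver(S_seed, U, C):
        # Apply (S_seed + U C U^T)^{-1} via Sherman-Morrison-Woodbury,
        # reusing the Cholesky factorization of the seed Schur complement
        # so each application only needs an extra k x k solve (k << m).
        seed = cho_factor(S_seed)                            # factor once, reuse
        SinvU = cho_solve(seed, U)                           # S_seed^{-1} U, m x k
        small = cho_factor(np.linalg.inv(C) + U.T @ SinvU)   # k x k capacitance matrix
        def solve(r):
            y = cho_solve(seed, r)
            return y - SinvU @ cho_solve(small, U.T @ y)
        return solve

A preconditioned Krylov solver would call such a routine inside each application of the inexact constraint preconditioner, so the seed factorization is amortized over many interior point iterations.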
Quantum theory of intersubband polarons
We present a microscopic quantum theory of intersubband polarons,
quasiparticles originating from the coupling between intersubband transitions
and longitudinal optical phonons. To this end, we develop a second-quantized
theory taking into account both the Fröhlich interaction between phonons and
intersubband transitions and the Coulomb interaction between the intersubband
transitions themselves. Our results show that the coupling between the phonons
and the intersubband transitions is extremely intense, thanks both to the
collective nature of the intersubband excitations and to the natural tight
confinement of optical phonons. Not only is the coupling strong enough to
spectroscopically resolve the resonant splitting between the modes (strong
coupling regime), but it can also become comparable to the bare frequency of the
excitations (ultrastrong coupling regime). We thus predict that intersubband
polarons can be exploited both in applied optoelectronic research, where precise
control of the phonon resonances is needed, and to observe fundamental quantum
vacuum physics typical of the ultrastrong coupling regime.
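As a rough orientation (a schematic form only, not the Hamiltonian derived in the paper), theories of this kind typically reduce to two coupled bosonic modes,

    H = \hbar\omega_{12}\, b^\dagger b + \hbar\omega_{\mathrm{LO}}\, a^\dagger a + \hbar\Omega\,(a + a^\dagger)(b + b^\dagger),

where b annihilates a bright collective intersubband excitation, a annihilates an LO phonon, and the effective Fröhlich coupling \Omega is collectively enhanced by the electron density; the ultrastrong regime corresponds to \Omega becoming comparable to \omega_{12} and \omega_{\mathrm{LO}}.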
Phase diagrams of charged colloidal rods: can a uniaxial charge distribution break chiral symmetry?
We construct phase diagrams for charged rodlike colloids within the
second-virial approximation as a function of rod concentration, salt
concentration, and colloidal charge. Besides the expected isotropic-nematic
transition, we also find parameter regimes with a coexistence between a nematic
and a second, more highly aligned nematic phase, including an
isotropic-nematic-nematic triple point and a nematic-nematic critical point,
which can all be explained in terms of the twisting effect. We compute the
Frank elastic constants to see if the twist elastic constant can become
negative, which would indicate the possibility of a cholesteric phase
spontaneously forming. Although the twisting effect reduces the twist elastic
constant, we find that it always remains positive. In addition, we find that
for finite aspect-ratio rods the twist elastic constant is also always
positive, such that there is no evidence of chiral symmetry breaking due to a
uniaxial charge distribution.
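For orientation, the second-virial (Onsager-type) description underlying such phase diagrams minimizes a free energy per particle of the schematic form (generic notation, not the paper's):

    \frac{\beta F}{N} \simeq \ln\rho - 1 + \int d\hat\omega\, \psi(\hat\omega)\ln[4\pi\psi(\hat\omega)] + \frac{\rho}{2}\iint d\hat\omega\, d\hat\omega'\, \psi(\hat\omega)\,\psi(\hat\omega')\, E(\hat\omega,\hat\omega'),

where \psi is the orientation distribution and E(\hat\omega,\hat\omega') is the orientation-dependent second-virial kernel, which for charged rods depends on salt concentration and colloidal charge through the screened electrostatic repulsion; the twisting effect enters through the angular dependence of E.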
Financial crisis and international supervision: New evidence on the discretionary use of loan loss provisions at Euro Area commercial banks
We examine the discretionary use of loan loss provisions during the recent financial crisis, when Euro Area banks experienced not only a negative effect on the quality of their loans and a reduction in their profitability, but were also subject to a new form of stricter supervision, namely the EBA 2010 and 2011 stress test exercises. Overall, we find support only for the income smoothing hypothesis, and we do not observe any difference in listed banks' behavior compared to unlisted banks. Banks subject to EBA stress tests had higher incentives to smooth income only for the 2011 EBA exercise, when a larger and more detailed set of information was released. This may suggest an unintended side effect that accounting setters and banking regulators and supervisors should take into account.
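Income-smoothing tests of this kind are commonly operationalized as a panel regression of loan loss provisions on pre-provision earnings and controls. The sketch below is only a hypothetical illustration of such a specification; the data file, variable names, and controls are assumptions, not the authors' model.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical bank-year panel: LLP = loan loss provisions / total assets,
    # EBTP = earnings before taxes and provisions (income-smoothing proxy),
    # NPL = non-performing loans, LISTED and STRESS_TESTED are indicators.
    df = pd.read_csv("bank_panel.csv")
    model = smf.ols(
        "LLP ~ EBTP + EBTP:LISTED + EBTP:STRESS_TESTED + NPL + SIZE + C(year)",
        data=df,
    ).fit(cov_type="cluster", cov_kwds={"groups": df["bank_id"]})
    print(model.summary())  # a positive EBTP coefficient indicates smoothing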
Theory of continuum percolation II. Mean field theory
I use a previously introduced mapping between the continuum percolation model
and the Potts fluid to derive a mean field theory of continuum percolation
systems. This is done by introducing a new variational principle, the basis of
which has to be taken, for now, as heuristic. The critical exponents obtained
are $\beta = 1$, $\gamma = 1$ and $\nu = 1/2$, which are identical with the mean
field exponents of lattice percolation. The critical density in this
approximation is $\rho_c = 1/V_e$, where
$V_e = \int d\mathbf{x}\, p(\mathbf{x}) \left\{ \exp[-v(\mathbf{x})/kT] - 1 \right\}$.
Here $p(\mathbf{x})$ is the binding probability of two particles separated by
$\mathbf{x}$ and $v(\mathbf{x})$ is their interaction potential.
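The critical-density formula is straightforward to evaluate numerically once a binding probability and pair potential are chosen. Below is a minimal Python sketch for a hypothetical hard-core plus attractive square-well model with a finite bonding range; the model and all parameter values are assumptions for illustration, not taken from the paper.

    import numpy as np
    from scipy import integrate

    # Illustration of rho_c = 1/V_e with
    # V_e = \int d^3x p(x) { exp[-v(x)/kT] - 1 } for a spherically symmetric
    # model: hard core of diameter sigma, attractive well of depth eps out to
    # the bonding range d, and binding probability p(r) = 1 for r < d.
    sigma, d, eps, kT = 1.0, 1.5, 1.0, 1.0

    def p(r):
        return 1.0 if r < d else 0.0

    def v(r):
        if r < sigma:
            return np.inf               # hard core
        return -eps if r < d else 0.0   # attractive well, then free

    def integrand(r):
        return 4.0 * np.pi * r**2 * p(r) * (np.exp(-v(r) / kT) - 1.0)

    V_e, _ = integrate.quad(integrand, 0.0, d, points=[sigma])
    print(f"V_e = {V_e:.3f}, rho_c = {1.0 / V_e:.3f}")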
Telescopes don't make catalogues!
Astronomical instruments make intensity measurements; any precise
astronomical experiment ought to involve modeling those measurements. People
make catalogues, but because a catalogue requires hard decisions about
calibration and detection, no catalogue can contain all of the information in
the raw pixels relevant to most scientific investigations. Here we advocate
making catalogue-like data outputs that permit investigators to test hypotheses
with almost the power of the original image pixels. The key is to provide users
with approximations to likelihood tests against the raw image pixels. We
advocate three options, in order of increasing difficulty: The first is to
define catalogue entries and associated uncertainties such that the catalogue
contains the parameters of an approximate description of the image-level
likelihood function. The second is to produce a K-catalogue sampling in
"catalogue space", i.e. a set of catalogues drawn from a posterior probability
distribution of catalogues given the data. The third is to expose a web service or equivalent
that can re-compute on demand the full image-level likelihood for any
user-supplied catalogue.
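A minimal sketch of the first option, under the assumption that each catalogue entry carries a best-fit parameter vector and an inverse covariance that together define a Gaussian approximation to the image-level likelihood; the field names and numbers below are hypothetical.

    import numpy as np

    def approx_delta_chi2(theta, theta_hat, F):
        # Approximate -2 ln [ L(theta) / L(theta_hat) ] without the pixels,
        # using the catalogued Gaussian approximation to the likelihood.
        d = np.asarray(theta, dtype=float) - np.asarray(theta_hat, dtype=float)
        return float(d @ F @ d)

    # Hypothetical catalogue entry (x, y, flux) and a user-supplied hypothesis:
    entry = {"theta_hat": np.array([120.3, 45.7, 1800.0]),
             "F": np.diag([4.0, 4.0, 1e-4])}     # inverse covariance
    print(approx_delta_chi2([120.5, 45.6, 1750.0], **entry))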
Predicting the cosmological constant with the scale-factor cutoff measure
It is well known that anthropic selection from a landscape with a flat prior
distribution of cosmological constant Lambda gives a reasonable fit to
observation. However, a realistic model of the multiverse has a physical volume
that diverges with time, and the predicted distribution of Lambda depends on
how the spacetime volume is regulated. We study a simple model of the
multiverse with probabilities regulated by a scale-factor cutoff, and calculate
the resulting distribution, considering both positive and negative values of
Lambda. The results are in good agreement with observation. In particular, the
scale-factor cutoff strongly suppresses the probability for values of Lambda
that are more than about ten times the observed value. We also discuss several
qualitative features of the scale-factor cutoff, including aspects of the
distributions of the curvature parameter Omega and the primordial density
contrast Q.
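Schematically (the generic anthropic-weighting structure, not the paper's detailed calculation), such predictions take the form

    P_{\mathrm{obs}}(\Lambda)\, d\Lambda \propto P_{\mathrm{prior}}(\Lambda)\, n_{\mathrm{obs}}(\Lambda)\, d\Lambda, \qquad P_{\mathrm{prior}}(\Lambda) \approx \mathrm{const}\ \text{near}\ \Lambda = 0,

where n_{\mathrm{obs}}(\Lambda) counts observers per unit comoving volume (for example via the fraction of matter that collapses into large galaxies) and the cutoff prescription, here the scale-factor cutoff, fixes how the diverging spacetime volume is regulated when this weighting is carried out across the multiverse.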
Stochastic modeling of high-speed data links with nonlinear dynamic terminations
This paper addresses the statistical modeling and simulation of high-speed interconnects with uncertain physical properties and nonlinear dynamical terminations. The proposed approach is based on the expansion of voltage and current variables in terms of orthogonal polynomials of random variables. It extends the available literature results on the generation of an augmented deterministic SPICE equivalent of the stochastic link to the case in which the terminations are nonlinear and dynamical, like those modeling IC buffers. A single, standard SPICE simulation of this equivalent circuit efficiently computes the expansion coefficients that provide the statistical information on the interconnect response. The feasibility and strength of the approach are demonstrated by means of a coupled microstrip interconnect with drivers and receivers.
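For readers unfamiliar with polynomial-chaos expansions, the sketch below estimates the statistics of a simple uncertain circuit response by projecting it onto probabilists' Hermite polynomials of a standard normal variable. This is a non-intrusive illustration of the expansion idea only, not the paper's intrusive augmented-SPICE formulation, and all numerical values are illustrative.

    import numpy as np
    from math import factorial
    from numpy.polynomial.hermite_e import hermegauss, hermeval

    # Response of a resistive divider whose series resistance is uncertain:
    # R = R0 * (1 + 0.1 * xi), with xi ~ N(0, 1).  Values are illustrative.
    R0, RL = 50.0, 100.0
    def response(xi):
        return RL / (R0 * (1.0 + 0.1 * xi) + RL)

    order = 4
    nodes, weights = hermegauss(20)        # Gauss-Hermite_e quadrature rule
    norm = np.sqrt(2.0 * np.pi)            # normalization of the Gaussian weight
    coeffs = []
    for k in range(order + 1):
        He_k = hermeval(nodes, [0.0] * k + [1.0])                 # He_k at nodes
        proj = np.sum(weights * response(nodes) * He_k) / norm    # <y, He_k>
        coeffs.append(proj / factorial(k))                        # <He_k, He_k> = k!

    mean = coeffs[0]
    var = sum(factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))
    print(f"mean = {mean:.4f}, std = {np.sqrt(var):.4f}")

The mean and variance follow directly from the expansion coefficients, which is the same kind of statistical post-processing the augmented deterministic equivalent enables after a single simulation.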
