Tropical Fourier-Motzkin elimination, with an application to real-time verification
We introduce a generalization of tropical polyhedra able to express both
strict and non-strict inequalities. Such inequalities are handled by means of a
semiring of germs (encoding infinitesimal perturbations). We develop a tropical
analogue of Fourier-Motzkin elimination from which we derive geometrical
properties of these polyhedra. In particular, we show that they coincide with
the tropically convex union of (not necessarily closed) cells that are convex
both classically and tropically. We also prove that the redundant inequalities
produced when performing successive elimination steps can be dynamically
deleted by reduction to mean payoff game problems. As a complement, we provide
a coarser (polynomial time) deletion procedure which is enough to arrive at a
simply exponential bound for the total execution time. These algorithms are
illustrated by an application to real-time systems (reachability analysis of
timed automata).
Comment: 29 pages, 8 figures
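To fix the underlying arithmetic, a tropical affine inequality compares two max-plus combinations of the variables. The sketch below shows this max-plus encoding and a membership check for a tropical polyhedron given by such inequalities; it illustrates only the basic arithmetic, not the paper's elimination algorithm, and all names are ours.

```python
# Max-plus (tropical) arithmetic: "addition" is max, "multiplication" is +,
# with -inf as the zero element. A tropical affine inequality has the form
#   max_i (a_i + x_i) <= max_j (b_j + x_j),
# and a tropical polyhedron is the set of points satisfying finitely many
# such inequalities. (Illustrative sketch; not the paper's algorithm.)
NEG_INF = float("-inf")

def trop_dot(coeffs, x):
    """Tropical scalar product: max_i (coeffs[i] + x[i])."""
    return max((c + xi for c, xi in zip(coeffs, x) if c != NEG_INF),
               default=NEG_INF)

def satisfies(ineqs, x):
    """Check x against inequalities given as (lhs_coeffs, rhs_coeffs) pairs."""
    return all(trop_dot(a, x) <= trop_dot(b, x) for a, b in ineqs)

# Tropical halfspace max(0 + x0) <= max(1 + x1), i.e. x0 <= x1 + 1.
ineqs = [([0.0, NEG_INF], [NEG_INF, 1.0])]
print(satisfies(ineqs, [0.0, 0.0]))  # True:  0 <= 1
print(satisfies(ineqs, [3.0, 0.0]))  # False: 3 <= 1 fails
```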
Cyclic projectors and separation theorems in idempotent convex geometry
Semimodules over idempotent semirings like the max-plus or tropical semiring
have much in common with convex cones. This analogy is particularly apparent in
the case of subsemimodules of the n-fold Cartesian product of the max-plus
semiring: it is known that one can separate a vector from a closed subsemimodule
that does not contain it. We establish here a more general separation theorem,
which applies to any finite collection of closed semimodules with a trivial
intersection. In order to prove this theorem, we investigate the spectral
properties of certain nonlinear operators called here idempotent cyclic
projectors. These are idempotent analogues of the cyclic nearest-point
projections known in convex analysis. The spectrum of idempotent cyclic
projectors is characterized in terms of a suitable extension of Hilbert's
projective metric. We deduce as a corollary of our main results the idempotent
analogue of Helly's theorem.
Comment: 20 pages, 1 figure
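The building block of a cyclic projector, the nearest-point projection onto a max-plus column span, has a closed form via residuation: with (A \ x)_j = min_i (x_i - A[i][j]), the projection is P_A(x) = A ⊗ (A \ x), the greatest point of span(A) below x. A minimal sketch using these standard formulas (function and variable names are ours):

```python
# Projection onto the max-plus column span of a matrix A via residuation:
#   (A \ x)_j = min_i (x_i - A[i][j]),   P_A(x) = A ⊗ (A \ x),
# where (A ⊗ y)_i = max_j (A[i][j] + y[j]). Composing such projections for
# several semimodules yields a cyclic projector.
def residuate(A, x):
    n, m = len(A), len(A[0])
    return [min(x[i] - A[i][j] for i in range(n)) for j in range(m)]

def trop_matvec(A, y):
    return [max(A[i][j] + y[j] for j in range(len(y))) for i in range(len(A))]

def project(A, x):
    """Greatest point of the max-plus span of A's columns below x."""
    return trop_matvec(A, residuate(A, x))

A = [[0.0, 2.0],
     [1.0, 0.0]]
x = [5.0, 0.0]
p = project(A, x)
print(p)
assert all(pi <= xi for pi, xi in zip(p, x))  # p lies below x
assert project(A, p) == p                     # projecting is idempotent
```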
Assimilation of IASI partial tropospheric columns with an Ensemble Kalman Filter over Europe
Partial lower tropospheric ozone columns provided by the IASI (Infrared Atmospheric Sounding Interferometer) instrument have been assimilated into a chemistry-transport model at continental scale (CHIMERE) using an Ensemble Square Root Kalman Filter (EnSRF). Analyses are made for the month of July 2007 over the European domain. Launched in 2006 aboard the MetOp-A satellite, IASI shows high sensitivity to ozone in the free troposphere and low sensitivity at the ground; it is therefore important to evaluate whether assimilation of these observations can improve free tropospheric ozone, and possibly surface ozone. The analyses are validated against independent ozone observations from sondes, MOZAIC¹ aircraft and ground-based stations (AIRBASE, the European Air quality dataBase) and compared with the free run of CHIMERE. These comparisons show a decrease in error of 6 parts per billion (ppb) in the free troposphere over the Frankfurt area, and also a reduction of the root mean square error (respectively, bias) at the surface of 19% (33%) for more than 90% of the existing ground stations. This provides evidence of the potential of assimilating tropospheric IASI columns to better describe the tropospheric ozone distribution, including surface ozone, despite the lower sensitivity near the ground.

The changes in concentration resulting from the observational constraints were quantified, and several geophysical explanations for the findings of this study were drawn. The corrections were most pronounced over Italy and the Mediterranean region, where we noted an average reduction of 8–9 ppb in the free troposphere with respect to the free run, and still a reduction of 5.5 ppb at the ground, likely due to a longer residence time of air masses in this region associated with the general circulation pattern (i.e. dominant western circulation) and with persistent anticyclonic conditions over the Mediterranean basin. This is an important geophysical result, since the ozone burden is large over this area, with impact on the radiative balance and air quality.

¹ Measurements of OZone, water vapour, carbon monoxide and nitrogen oxides by in-service AIrbus airCraft (http://mozaic.aero.obs-mip.fr/web/)
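The analysis step behind ensemble assimilation can be sketched in a few lines. The toy below implements a plain stochastic EnKF update with a single scalar observation, for illustration only; the study uses the square-root variant (EnSRF), which avoids perturbing observations, and all numbers here are made up.

```python
# Minimal ensemble Kalman analysis step (stochastic EnKF, scalar observation).
# Toy sketch; the paper's EnSRF differs in how it updates perturbations.
import numpy as np

rng = np.random.default_rng(0)
n_state, n_ens = 3, 50
X = rng.normal(10.0, 2.0, size=(n_state, n_ens))  # forecast ensemble
H = np.array([[1.0, 0.0, 0.0]])                   # observe first component
y_obs, r_obs = 12.0, 1.0                          # observation and its variance

Xm = X.mean(axis=1, keepdims=True)
Xp = X - Xm                                       # ensemble perturbations
HXp = H @ Xp
P_hh = (HXp @ HXp.T) / (n_ens - 1) + r_obs        # innovation covariance
P_xh = (Xp @ HXp.T) / (n_ens - 1)                 # state-observation covariance
K = P_xh / P_hh                                   # Kalman gain (scalar obs)

# Stochastic EnKF: each member assimilates a perturbed observation.
y_pert = y_obs + rng.normal(0.0, np.sqrt(r_obs), size=n_ens)
Xa = X + K @ (y_pert[None, :] - H @ X)            # analysis ensemble

# The analysis mean of the observed component moves toward the observation.
print(X[0].mean(), "->", Xa[0].mean())
```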
Tropical polyhedra are equivalent to mean payoff games
We show that several decision problems originating from max-plus or tropical
convexity are equivalent to zero-sum two player game problems. In particular,
we set up an equivalence between the external representation of tropical convex
sets and zero-sum stochastic games, in which tropical polyhedra correspond to
deterministic games with finite action spaces. Then, we show that the winning
initial positions can be determined from the associated tropical polyhedron. We
obtain as a corollary a game theoretical proof of the fact that the tropical
rank of a matrix, defined as the maximal size of a submatrix for which the
optimal assignment problem has a unique solution, coincides with the maximal
number of rows (or columns) of the matrix which are linearly independent in the
tropical sense. Our proofs rely on techniques from non-linear Perron-Frobenius
theory.
Comment: 28 pages, 5 figures; v2: updated references, added background
materials and illustrations; v3: minor improvements, references updated
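The "unique optimal assignment" notion of tropical rank can be checked directly for a tiny square matrix: the matrix is tropically nonsingular iff the maximum in its tropical permanent, max over permutations σ of Σ_i A[i][σ(i)], is attained by exactly one permutation. A brute-force sketch (our own, fine only for small matrices):

```python
# Tropical nonsingularity test: is the optimal assignment unique?
from itertools import permutations

def tropically_nonsingular(A):
    n = len(A)
    weights = [sum(A[i][s[i]] for i in range(n))
               for s in permutations(range(n))]
    best = max(weights)
    return weights.count(best) == 1  # unique optimal assignment?

print(tropically_nonsingular([[0, 1], [1, 3]]))  # True:  0+3 > 1+1
print(tropically_nonsingular([[0, 1], [1, 2]]))  # False: 0+2 == 1+1
```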
Containment and equivalence of weighted automata: Probabilistic and max-plus cases
This paper surveys some results regarding decision problems for probabilistic and max-plus automata, such as containment and equivalence. Probabilistic and max-plus automata belong to the general family of weighted automata, whose semantics are maps from words to real values. Given two weighted automata, the equivalence problem asks whether their semantics are the same, and the containment problem whether one is pointwise smaller than the other. These problems have been studied intensively; this paper reviews some techniques used to show (un)decidability and states a list of questions that still remain open.
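The semantics in question is easy to compute for a fixed word: for a max-plus automaton, the weight of a word is the maximum over accepting runs of the summed transition weights, i.e. a max-plus product of the letters' transition matrices between initial and final vectors. A toy two-state automaton of our own (-inf encodes "no transition"); equivalence asks whether two such maps coincide on every word, containment whether one is pointwise below the other.

```python
# Max-plus automaton semantics: weight(word) = max over runs of
# init[q0] + (sum of transition weights) + final[qn].
NEG = float("-inf")

def trop_matmul(A, B):
    return [[max(A[i][k] + B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def weight(word, M, init, final):
    vec = [init]                      # 1 x n row vector over max-plus
    for letter in word:
        vec = trop_matmul(vec, M[letter])
    return max(v + f for v, f in zip(vec[0], final))

M = {"a": [[1.0, NEG], [NEG, 2.0]],  # 'a' costs 1 in state 0, 2 in state 1
     "b": [[NEG, 0.0], [NEG, NEG]]}  # 'b' moves state 0 -> state 1
init, final = [0.0, NEG], [0.0, 0.0]

print(weight("aab", M, init, final))  # 1 + 1 + 0 = 2.0
print(weight("aba", M, init, final))  # 1 + 0 + 2 = 3.0
```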
Using mass-flow controllers for obtaining extremely stable ECR ion source beams
Original publication available at http://www.jacow.org
Beam stability and reproducibility are of paramount importance in applications requiring precise control of the implanted radiation dose, as in the case of hadrontherapy. The beam intensity should be kept constant over several weeks or months. Moreover, the time needed for changing the nature of the beam and, as a consequence, for retuning the source should be minimized. The standard valves usually used in conjunction with ECR ion sources have the disadvantage of controlling the conductance, which can vary significantly with external conditions such as ambient temperature and the inlet pressure of the gas. The use of flow controllers is the natural way to avoid these external constraints. In this contribution we present the results obtained using a new model of mass-flow controller on the Supernanogan source, for the production of C4+ and H3+ beams. Extremely stable beams (± 2.5%) could be obtained over several weeks without retuning the source. The reproducibility of the source tuning parameters could also be demonstrated.
Ion source developments for stable and radioactive ion beams at GANIL
For many years now, the GANIL ion source team has been in charge of developing ion sources with three main purposes. The first concerns radioactive ion production, which requires high-efficiency ion sources since the amount of created exotic atoms is very low (between 10 and 10^8 particles per second). The second deals with high intensities of stable metallic ion beams for the injectors of the accelerator, while the last aims to increase the intensities of very high charge state ion beams for atomic physics. Concerning radioactive ion production, the recent results obtained with the 1+/n+ method, in collaboration with the ISN Grenoble group, led us to develop a new concept of ECR ion source for monocharged ion production. The results of the first tests of this source will be given. This new idea for the construction of ECR ion sources can also be applied to multicharged ion production. Concerning high charge state ion beam production, a new source called SUPERSHYPIE has been built, which doubles the length of the plasma with respect to an ECR4M source. This new concept has just been started and has produced around 50 nAe of Ar17+. The first results of this new source will be presented. Concerning the developments of metallic ion beams, a separate poster will be presented at this workshop.
Set optimization - a rather short introduction
Recent developments in set optimization are surveyed and extended including
various set relations as well as fundamental constructions of a convex analysis
for set- and vector-valued functions, and duality for set optimization
problems. Extensive sections with bibliographical comments summarize the state
of the art. Applications to vector optimization and financial risk measures are
discussed along with algorithmic approaches to set optimization problems.
On the complexity of strongly connected components in directed hypergraphs
We study the complexity of some algorithmic problems on directed hypergraphs
and their strongly connected components (SCCs). The main contribution is an
almost linear time algorithm computing the terminal strongly connected
components (i.e. SCCs which do not reach any components but themselves).
"Almost linear" here means that the complexity of the algorithm is linear in
the size of the hypergraph up to a factor alpha(n), where alpha is the inverse
of the Ackermann function and n is the number of vertices. Our motivation to study
this problem arises from a recent application of directed hypergraphs to
computational tropical geometry.
We also discuss the problem of computing all SCCs. We establish a superlinear
lower bound on the size of the transitive reduction of the reachability
relation in directed hypergraphs, showing that it is combinatorially more
complex than in directed graphs. Besides, we prove a linear time reduction from
the well-studied problem of finding all minimal sets among a given family to
the problem of computing the SCCs. Only subquadratic time algorithms are known
for the former problem. These results strongly suggest that the problem of
computing the SCCs is harder in directed hypergraphs than in directed graphs.
Comment: v1: 32 pages, 7 figures; v2: revised version, 34 pages, 7 figures
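The reachability primitive underlying such SCC computations differs from the graph case: a hyperarc (tails, head) fires only once all of its tail vertices have been reached. A linear-time sketch using a counter per hyperarc, in a minimal encoding of our own (not the paper's algorithm):

```python
# Forward reachability in a directed hypergraph with hyperarcs (tails, head):
# the head becomes reachable once every tail vertex is reachable.
from collections import deque

def reachable(n_vertices, hyperarcs, sources):
    """hyperarcs: list of (frozenset_of_tails, head). Returns reached set."""
    missing = [len(tails) for tails, _ in hyperarcs]   # unreached tails left
    watch = [[] for _ in range(n_vertices)]            # vertex -> arc indices
    for idx, (tails, _) in enumerate(hyperarcs):
        for v in tails:
            watch[v].append(idx)
    reached = set(sources)
    queue = deque(sources)
    while queue:
        v = queue.popleft()
        for idx in watch[v]:
            missing[idx] -= 1
            if missing[idx] == 0:                      # all tails reached
                head = hyperarcs[idx][1]
                if head not in reached:
                    reached.add(head)
                    queue.append(head)
    return reached

# Vertex 3 needs both 0 and 1; vertex 4 needs 3 and 2, but 2 is never reached.
arcs = [(frozenset({0, 1}), 3), (frozenset({3, 2}), 4)]
print(sorted(reachable(5, arcs, {0, 1})))  # [0, 1, 3] -- 4 stays unreachable
```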
