Improved Approximation Algorithms for Computing k Disjoint Paths Subject to Two Constraints
For a given graph G with positive integral cost and delay on edges,
distinct vertices s and t, cost bound C and delay bound D, the bi-constraint
path (BCP) problem is to compute k disjoint st-paths subject to C and D. This
problem is known to be NP-hard, even when k = 1 \cite{garey1979computers}.
This paper first gives a simple approximation algorithm with factor (2, 2),
i.e. the algorithm computes a solution with delay and cost bounded by 2D and
2C respectively. Later, a novel improved approximation algorithm with ratio
(1 + β, max{2, 1 + ln(1/β)}) is developed by constructing
interesting auxiliary graphs and employing the cycle cancellation method. As a
consequence, we can obtain a factor-(1.369, 2) approximation algorithm by
setting 1 + ln(1/β) = 2 and a factor-(1.567, 1.567) algorithm by
setting 1 + β = 1 + ln(1/β). Besides, by setting β = 0, an
approximation algorithm with ratio (1, O(ln n)), i.e. an algorithm with
only a single factor ratio on cost, can be immediately obtained. To
the best of our knowledge, this is the first non-trivial approximation
algorithm for the BCP problem that strictly obeys the delay constraint.
Comment: 12 pages
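The paper's own construction is not reproduced here, but the flavor of a bicriteria (2, 2) guarantee for the single-path case can be illustrated with a standard weight-scaling argument: run Dijkstra on the normalized edge weight cost/C + delay/D. Any feasible path has combined weight at most 2, so the shortest combined-weight path also has combined weight at most 2, which forces cost ≤ 2C and delay ≤ 2D. A minimal sketch under these assumptions (function and variable names are ours, not the paper's):

```python
import heapq

def combined_shortest_path(n, edges, s, t, C, D):
    """Dijkstra on the normalized weight cost/C + delay/D.

    If some s-t path has cost <= C and delay <= D, its combined weight
    is <= 2, so the path found here has cost <= 2*C and delay <= 2*D.
    Returns the (cost, delay) of that path, or None if t is unreachable.
    """
    adj = [[] for _ in range(n)]
    for u, v, cost, delay in edges:        # undirected graph
        adj[u].append((v, cost, delay))
        adj[v].append((u, cost, delay))
    dist = [float("inf")] * n              # combined weight from s
    best = [None] * n                      # (cost, delay) along the tree path
    dist[s], best[s] = 0.0, (0, 0)
    pq = [(0.0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue                       # stale queue entry
        for v, cost, delay in adj[u]:
            w = cost / C + delay / D       # normalized combined weight
            if d + w < dist[v]:
                dist[v] = d + w
                best[v] = (best[u][0] + cost, best[u][1] + delay)
                heapq.heappush(pq, (dist[v], v))
    return best[t]
```

On a triangle with a cheap-but-slow two-hop route and an expensive-but-fast direct edge, e.g. `combined_shortest_path(3, [(0, 1, 1, 3), (1, 2, 1, 3), (0, 2, 4, 1)], 0, 2, 4, 4)`, the returned (cost, delay) pair respects both relaxed bounds.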
Distribution of graph-distances in Boltzmann ensembles of RNA secondary structures
Large RNA molecules often carry multiple functional domains whose spatial
arrangement is an important determinant of their function. Pre-mRNA splicing,
furthermore, relies on the spatial proximity of the splice junctions that can
be separated by very long introns. Similar effects appear in the processing of
RNA virus genomes. Albeit a crude measure, the distribution of spatial
distances in thermodynamic equilibrium therefore provides useful information on
the overall shape of the molecule and can provide insights into the interplay of
its functional domains. Spatial distance can be approximated by the
graph-distance in RNA secondary structure. We show here that the equilibrium
distribution of graph-distances between arbitrary nucleotides can be computed
in polynomial time by means of dynamic programming. A naive implementation
would yield recursions with a very high time complexity of O(n^11). Although we
were able to reduce this to O(n^6), for many practical applications a further
reduction seems difficult. We conclude, therefore, that sampling approaches,
which are much easier to implement, are also theoretically favorable for most
real-life applications, in particular since these primarily concern long-range
interactions in very large RNA molecules.
Comment: Peer-reviewed and presented as part of the 13th Workshop on Algorithms in Bioinformatics (WABI 2013)
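The graph-distance used above is simply a shortest-path distance in the graph whose vertices are nucleotides and whose edges are the backbone bonds (k, k+1) together with the base pairs. The equilibrium distribution over the Boltzmann ensemble needs the dynamic programming discussed in the abstract, but the distance within one fixed structure is a plain breadth-first search; a minimal sketch, assuming dot-bracket input (names are ours):

```python
from collections import deque

def graph_distance(structure, i, j):
    """Graph-distance between nucleotides i and j (0-based) in a fixed
    RNA secondary structure given in dot-bracket notation.

    Edges: backbone bonds (k, k+1) and base pairs from matching brackets.
    """
    n = len(structure)
    adj = [set() for _ in range(n)]
    for k in range(n - 1):                 # backbone edges
        adj[k].add(k + 1)
        adj[k + 1].add(k)
    stack = []
    for k, c in enumerate(structure):      # base-pair edges
        if c == "(":
            stack.append(k)
        elif c == ")":
            p = stack.pop()
            adj[p].add(k)
            adj[k].add(p)
    dist = {i: 0}                          # breadth-first search from i
    q = deque([i])
    while q:
        u = q.popleft()
        if u == j:
            return dist[u]
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return None
```

For the hairpin `"((...))"` the closing pair makes the two ends adjacent (`graph_distance("((...))", 0, 6)` is 1), while unpaired loop positions must be reached along the backbone or through pairs.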
Microscopic Study of Slablike and Rodlike Nuclei: Quantum Molecular Dynamics Approach
Structure of cold dense matter at subnuclear densities is investigated by
quantum molecular dynamics (QMD) simulations. We succeeded in showing that the
phases with slab-like and rod-like nuclei, among others, can be formed
dynamically from hot uniform nuclear matter without any assumptions on nuclear
shape. We also observe intermediate phases, which have complicated nuclear
shapes. Geometrical structures of matter are analyzed with Minkowski
functionals, and it is found that intermediate phases can be characterized as
ones with negative Euler
characteristic. Our result suggests the existence of these kinds of phases in
addition to the simple ``pasta'' phases in neutron star crusts.
Comment: 6 pages, 4 figures, RevTeX 4; to be published in Phys. Rev. C Rapid Communication (accepted version)
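The Euler characteristic used to flag the intermediate phases can be computed for a digitized configuration as V - E + F over the cell complex of occupied lattice cells: isolated clusters contribute +1 each, and every hole or tunnel lowers the value, so connected structures riddled with holes come out non-positive. A minimal two-dimensional illustration (the simulations themselves are three-dimensional, and this helper is ours, not the authors' analysis code):

```python
def euler_characteristic(cells):
    """Euler characteristic of a set of occupied cells on a 2D lattice,
    computed as V - E + F over the complex of unit squares.

    `cells` is a set of (x, y) integer cell coordinates; each occupied
    cell contributes one face, four corner vertices, and four boundary
    edges, with shared vertices and edges counted once.
    """
    vertices, edges = set(), set()
    for (x, y) in cells:
        corners = [(x, y), (x + 1, y), (x, y + 1), (x + 1, y + 1)]
        vertices.update(corners)
        edges.update({
            frozenset({(x, y), (x + 1, y)}),          # bottom
            frozenset({(x, y), (x, y + 1)}),          # left
            frozenset({(x + 1, y), (x + 1, y + 1)}),  # right
            frozenset({(x, y + 1), (x + 1, y + 1)}),  # top
        })
    return len(vertices) - len(edges) + len(cells)
```

A single cell gives 1, and a 3x3 block with its center removed (one cluster, one hole) gives 0; adding more holes drives the value negative, which is the 2D analogue of the signature described above.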
Primordial fluctuations and non-Gaussianities from multifield DBI Galileon inflation
We study a cosmological scenario in which the DBI action governing the motion
of a D3-brane in a higher-dimensional spacetime is supplemented with an induced
gravity term. The latter reduces to the quartic Galileon Lagrangian when the
motion of the brane is non-relativistic and we show that it tends to violate
the null energy condition and to render cosmological fluctuations ghosts. There
nonetheless exists an interesting parameter space in which a stable phase of
quasi-exponential expansion can be achieved while the induced gravity leaves
non-trivial imprints. We derive the exact second-order action governing the
dynamics of linear perturbations and we show that it can be simply understood
through a bimetric perspective. In the relativistic regime, we also calculate
the dominant contribution to the primordial bispectrum and demonstrate that
large non-Gaussianities of orthogonal shape can be generated, for the first
time in a concrete model. More generally, we find that the sign and the shape
of the bispectrum offer powerful diagnostics of the precise strength of the
induced gravity.
Comment: 34 pages including 9 figures, plus appendices and bibliography. Wordings changed and references added; matches version published in JCAP
Global Systems Science and Policy
The vision of Global Systems Science (GSS) is to provide scientific evidence and means to engage in a reflective dialogue to support policy-making and public action, and to enable civil society to collectively engage in societal action in response to global challenges like climate change, urbanisation, or social inclusion. GSS has four elements: policy and its implementation, the science of complex systems, policy informatics, and citizen engagement. It aims to give policy makers and citizens a better understanding of the possible behaviours of complex social systems. Policy informatics helps generate and evaluate policy options with computer-based tools and the abundance of data available today. The results they generate are made accessible to everybody, policy makers and citizens alike, through intuitive user interfaces, animations, visual analytics, gaming, social media, and so on. Examples of global systems include epidemics, finance, cities, the Internet, trade systems and more. GSS addresses the question of policies having desirable outcomes, not necessarily optimal outcomes. The underpinning idea of GSS is not to predict precisely but to establish possible and desirable futures and their likelihood. Solving policy problems is a process, often needing the requirements, constraints, and lines of action to be revisited and modified until the problem is "satisficed", i.e. until an acceptable compromise is found between competing objectives and constraints. Thus policy problems and their solutions coevolve, much as in a design process. Policy and societal action is as much about attempts to understand objective facts as it is about the narratives that guide our actions. GSS tries to reconcile these apparently contradictory modes of operation. GSS thus provides policy makers and society guidance on their course of action rather than proposing (illusory) optimal solutions.
Search for direct production of charginos and neutralinos in events with three leptons and missing transverse momentum in √s = 7 TeV pp collisions with the ATLAS detector
A search for the direct production of charginos and neutralinos in final states with three electrons or muons and missing transverse momentum is presented. The analysis is based on 4.7 fb⁻¹ of proton-proton collision data delivered by the Large Hadron Collider and recorded with the ATLAS detector. Observations are consistent with Standard Model expectations in three signal regions that are either depleted or enriched in Z-boson decays. Upper limits at 95% confidence level are set in R-parity conserving phenomenological minimal supersymmetric models and in simplified models, significantly extending previous results.
Jet size dependence of single jet suppression in lead-lead collisions at sqrt(s(NN)) = 2.76 TeV with the ATLAS detector at the LHC
Measurements of inclusive jet suppression in heavy ion collisions at the LHC
provide direct sensitivity to the physics of jet quenching. In a sample of
lead-lead collisions at sqrt(s_NN) = 2.76 TeV corresponding to an integrated
luminosity of approximately 7 inverse microbarns, ATLAS has measured jets with
a calorimeter over the pseudorapidity interval |eta| < 2.1 and over the
transverse momentum range 38 < pT < 210 GeV. Jets were reconstructed using the
anti-kt algorithm with values for the distance parameter that determines the
nominal jet radius of R = 0.2, 0.3, 0.4 and 0.5. The centrality dependence of
the jet yield is characterized by the jet "central-to-peripheral ratio," Rcp.
Jet production is found to be suppressed by approximately a factor of two in
the 10% most central collisions relative to peripheral collisions. Rcp varies
smoothly with centrality as characterized by the number of participating
nucleons. The observed suppression is only weakly dependent on jet radius and
transverse momentum. These results provide the first direct measurement of
inclusive jet suppression in heavy ion collisions and complement previous
measurements of dijet transverse energy imbalance at the LHC.
Comment: 15 pages plus author list (30 pages total), 8 figures, 2 tables, submitted to Physics Letters B. All figures including auxiliary figures are available at http://atlas.web.cern.ch/Atlas/GROUPS/PHYSICS/PAPERS/HION-2011-02
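The central-to-peripheral ratio quoted above is, in the usual convention, the per-collision jet yield in central events divided by that in peripheral events (schematic form only; the precise centrality and pT binning follows the paper):

```latex
R_{CP} =
\frac{\left( N_{\mathrm{jet}} / \langle N_{\mathrm{coll}} \rangle \right)_{\mathrm{central}}}
     {\left( N_{\mathrm{jet}} / \langle N_{\mathrm{coll}} \rangle \right)_{\mathrm{peripheral}}}
```

Here ⟨N_coll⟩ is the mean number of binary nucleon-nucleon collisions in the centrality class, so R_CP ≈ 0.5 corresponds to the factor-of-two suppression reported for the 10% most central collisions.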
A Combinatorial Framework for Designing (Pseudoknotted) RNA Algorithms
We extend a hypergraph representation, introduced by Finkelstein and
Roytberg, to unify dynamic programming algorithms in the context of RNA folding
with pseudoknots. Classic applications of RNA dynamic programming (energy
minimization, partition function, base-pair probabilities...) are reformulated
within this framework, giving rise to very simple algorithms. This
reformulation allows one to conceptually detach the conformation space/energy
model -- captured by the hypergraph model -- from the specific application,
assuming unambiguity of the decomposition. To ensure the latter property, we
propose a new combinatorial methodology based on generating functions. We
extend the set of generic applications by proposing an exact algorithm for
extracting generalized moments in weighted distributions, generalizing a prior
contribution by Miklos et al. Finally, we illustrate our full-fledged
programme on three exemplary conformation spaces (secondary structures,
Akutsu's simple type pseudoknots and kissing hairpins). This readily gives sets
of algorithms that are either novel or have complexity comparable to classic
implementations for minimization and Boltzmann ensemble applications of dynamic
programming.
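For concreteness, one of the simplest members of the family of dynamic programs such frameworks unify is Nussinov-style base-pair maximization over pseudoknot-free secondary structures; the hypergraph formalism above is strictly more general. A minimal sketch (ours, not the authors' code; a minimum hairpin-loop size stands in for an energy model):

```python
def nussinov(seq, min_loop=3):
    """Maximum number of complementary base pairs in an RNA sequence
    (Nussinov dynamic program over pseudoknot-free structures).

    dp[i][j] = best count on subsequence i..j; position i is either
    unpaired or paired with some k > i + min_loop, splitting the
    problem into two independent subproblems.
    """
    pairs = {("A", "U"), ("U", "A"), ("G", "C"),
             ("C", "G"), ("G", "U"), ("U", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):        # shorter spans first
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                # i left unpaired
            for k in range(i + min_loop + 1, j + 1):
                if (seq[i], seq[k]) in pairs:  # i pairs with k
                    right = dp[k + 1][j] if k + 1 <= j else 0
                    best = max(best, 1 + dp[i + 1][k - 1] + right)
            dp[i][j] = best
    return dp[0][n - 1]
```

The partition-function and base-pair-probability applications mentioned in the abstract replace this max/sum-of-counts algebra with sums of Boltzmann weights over the same decomposition, which is exactly the change of scoring scheme the hypergraph view makes explicit.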