Grid-Obstacle Representations with Connections to Staircase Guarding
In this paper, we study grid-obstacle representations of graphs where we
assign grid-points to vertices and define obstacles such that an edge exists if
and only if an xy-monotone grid path connects the two endpoints without
hitting an obstacle or another vertex. It was previously argued that all planar
graphs have a grid-obstacle representation in 2D, and all graphs have a
grid-obstacle representation in 3D. In this paper, we show that such
constructions are possible with significantly smaller grid-size than previously
achieved. Then we study the variant where vertices are not blocking, and show
that then grid-obstacle representations exist for bipartite graphs. The latter
has applications in so-called staircase guarding of orthogonal polygons; using
our grid-obstacle representations, we show that staircase guarding is
\textsc{NP}-hard in 2D.
Comment: To appear in the proceedings of the 25th International Symposium on Graph Drawing and Network Visualization (GD 2017).
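The defining condition (an edge exists exactly when a grid path that is monotone in both coordinates reaches the other endpoint while avoiding all blocked cells) is easy to check directly for small instances. A minimal sketch in Python; the function names are mine, not the paper's, and obstacles and other vertices are simply treated as blocked cells:

```python
from functools import lru_cache

def has_monotone_path(start, goal, blocked, dx, dy):
    """Check whether a grid path from start to goal exists that moves
    only in directions (dx, 0) and (0, dy), avoiding blocked cells.
    Simple memoized DP over the rectangle spanned by the endpoints."""
    (sx, sy), (gx, gy) = start, goal
    # The goal must lie in the chosen monotone quadrant.
    if (gx - sx) * dx < 0 or (gy - sy) * dy < 0:
        return False

    @lru_cache(maxsize=None)
    def reach(x, y):
        if (x, y) in blocked:
            return False
        if (x, y) == start:
            return True
        ok = False
        if (x - sx) * dx > 0:      # try stepping back along x
            ok = reach(x - dx, y)
        if not ok and (y - sy) * dy > 0:  # try stepping back along y
            ok = reach(x, y - dy)
        return ok

    return reach(gx, gy)

def connected(start, goal, blocked):
    """An edge exists if some monotone path, in any of the four
    monotone direction combinations, avoids all blocked cells."""
    return any(has_monotone_path(start, goal, blocked, dx, dy)
               for dx in (1, -1) for dy in (1, -1))
```

For example, with a single obstacle at (1, 1), the points (0, 0) and (2, 2) are still connected by a staircase path around it, while blocking both (1, 0) and (0, 1) severs every monotone route out of the origin.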
Inapproximability of maximal strip recovery
In comparative genomics, the first step of sequence analysis is usually to
decompose two or more genomes into syntenic blocks that are segments of
homologous chromosomes. For the reliable recovery of syntenic blocks, noise and
ambiguities in the genomic maps need to be removed first. Maximal Strip
Recovery (MSR) is an optimization problem proposed by Zheng, Zhu, and Sankoff
for reliably recovering syntenic blocks from genomic maps in the midst of noise
and ambiguities. Given $d$ genomic maps as sequences of gene markers, the
objective of \msr{d} is to find $d$ subsequences, one subsequence of each
genomic map, such that the total length of syntenic blocks in these
subsequences is maximized. For any constant $d$, a polynomial-time
2d-approximation for \msr{d} was previously known. In this paper, we show that
for any $d \ge 2$, \msr{d} is APX-hard, even for the most basic version of the
problem in which all gene markers are distinct and appear in positive
orientation in each genomic map. Moreover, we provide the first explicit lower
bounds on approximating \msr{d} for all $d \ge 2$. In particular, we show that
\msr{d} is NP-hard to approximate within a factor that grows with $d$. From the
other direction, we show that the previous 2d-approximation for \msr{d} can be
optimized into a polynomial-time algorithm even if $d$ is not a constant but is
part of the input. We then extend our inapproximability results to several
related problems including \cmsr{d}, \gapmsr{\delta}{d}, and
\gapcmsr{\delta}{d}.
Comment: A preliminary version of this paper appeared in two parts in the Proceedings of the 20th International Symposium on Algorithms and Computation (ISAAC 2009) and the Proceedings of the 4th International Frontiers of Algorithmics Workshop (FAW 2010).
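For intuition, the \msr{d} objective with d = 2 can be stated executably. The sketch below is my own naive formulation (exponential time, illustration only): a pair of chosen subsequences scores its length only if the pair decomposes into common strips of length at least 2, matching the basic version with distinct, positively oriented markers.

```python
def is_substring(block, seq):
    """True if `block` occurs contiguously in `seq`."""
    k = len(block)
    return any(seq[t:t + k] == block for t in range(len(seq) - k + 1))

def decomposes_into_strips(s1, s2):
    """Can s1 be partitioned into blocks of length >= 2, each occurring
    contiguously in s2?  With distinct markers and equal marker sets,
    such disjoint blocks automatically tile s2 as well."""
    if sorted(s1) != sorted(s2):
        return False
    n = len(s1)
    ok = [True] + [False] * n   # ok[i]: prefix s1[:i] partitions into strips
    for i in range(2, n + 1):
        ok[i] = any(ok[j] and is_substring(s1[j:i], s2)
                    for j in range(i - 1))
    return ok[n]

def msr2_brute_force(map1, map2):
    """Naive MSR for d = 2: maximize total strip length over all
    subsequence pairs.  Exponential; for tiny illustrative inputs only."""
    best = 0
    n1, n2 = len(map1), len(map2)
    for m1 in range(1 << n1):
        sub1 = [map1[i] for i in range(n1) if m1 >> i & 1]
        for m2 in range(1 << n2):
            sub2 = [map2[i] for i in range(n2) if m2 >> i & 1]
            if len(sub1) >= 2 and decomposes_into_strips(sub1, sub2):
                best = max(best, len(sub1))
    return best
```

For instance, on the maps [1, 2, 3, 4, 5] and [1, 2, 4, 5, 3], deleting marker 3 from both leaves the strips [1, 2] and [4, 5], for a total strip length of 4, and no pair of subsequences does better.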
The Nylon Scintillator Containment Vessels for the Borexino Solar Neutrino Experiment
Borexino is a solar neutrino experiment designed to observe the 0.86 MeV Be-7
neutrinos emitted in the pp cycle of the sun. Neutrinos will be detected by
their elastic scattering on electrons in 100 tons of liquid scintillator. The
neutrino event rate in the scintillator is expected to be low (~0.35 events per
day per ton), and the signals will be at energies below 1.5 MeV, where
background from natural radioactivity is prominent. Scintillation light
produced by the recoil electrons is observed by an array of 2240
photomultiplier tubes. Because of the intrinsic radioactive contaminants in
these PMTs, the liquid scintillator is shielded from them by a thick barrier of
buffer fluid. A spherical vessel made of thin nylon film contains the
scintillator, separating it from the surrounding buffer. The buffer region
itself is divided into two concentric shells by a second nylon vessel in order
to prevent inward diffusion of radon atoms. The radioactive background
requirements for Borexino are challenging to meet, especially for the
scintillator and these nylon vessels. Besides meeting requirements for low
radioactivity, the nylon vessels must also satisfy requirements for mechanical,
optical, and chemical properties. The present paper describes the research and
development, construction, and installation of the nylon vessels for the
Borexino experiment.
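As a quick sanity check on the quoted numbers (my arithmetic, not from the paper): at roughly 0.35 events per day per ton in a 100 ton target, the detector expects on the order of 35 neutrino events per day.

```python
# Back-of-the-envelope event rate, using the figures quoted above.
rate_per_ton_day = 0.35   # expected neutrino events per day per ton
target_mass_tons = 100    # liquid scintillator target mass
events_per_day = rate_per_ton_day * target_mass_tons
print(round(events_per_day, 1))  # about 35 events per day
```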
Constant-degree graph expansions that preserve the treewidth
Many hard algorithmic problems dealing with graphs, circuits, formulas and
constraints admit polynomial-time upper bounds if the underlying graph has
small treewidth. For such problems it is often desirable to reduce the maximum
degree of the vertices, whether to simplify theoretical arguments or to address
practical concerns.
Such degree reduction can be performed through a sequence of splittings of
vertices, resulting in an _expansion_ of the original graph. We observe that
the treewidth of a graph may increase dramatically if the splittings are not
performed carefully. In this context we address the following natural question:
is it possible to reduce the maximum degree to a constant without substantially
increasing the treewidth?
Our work answers the above question affirmatively. We prove that any simple
undirected graph G=(V, E) admits an expansion G'=(V', E') with the maximum
degree <= 3 and treewidth(G') <= treewidth(G)+1. Furthermore, such an expansion
will have no more than 2|E|+|V| vertices and 3|E| edges; it can be computed
efficiently from a tree-decomposition of G. We also construct a family of
examples for which the increase by 1 in treewidth cannot be avoided.
Comment: 12 pages, 6 figures; the main result is used by quant-ph/051107
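The splitting operation itself is easy to state in code. The naive version below is my own sketch: it replaces each vertex by a path of copies, attaching at most one original neighbor per copy, which bounds the degree by 3 and stays within 2|E|+|V| vertices. Unlike the careful construction in the paper, it makes no attempt to control the treewidth of the result.

```python
def expand_to_max_degree_3(adj):
    """Naive vertex splitting: each vertex v becomes a path of
    max(1, deg(v)) copies; copy i of v takes v's i-th neighbor (in
    sorted order) plus at most two path edges, so every copy has
    degree at most 3.  `adj` maps each vertex to a set of neighbors;
    vertex names must be sortable (e.g. all strings)."""
    new_adj = {}
    name = {}  # (v, i) -> name of the i-th copy of v
    for v, nbrs in adj.items():
        k = max(1, len(nbrs))
        copies = [f"{v}#{i}" for i in range(k)]
        for i, c in enumerate(copies):
            name[(v, i)] = c
            new_adj[c] = set()
        for i in range(k - 1):  # path edges between consecutive copies
            new_adj[copies[i]].add(copies[i + 1])
            new_adj[copies[i + 1]].add(copies[i])
    # Reconnect original edges: the i-th neighbor of v attaches to copy i.
    for v, nbrs in adj.items():
        for i, u in enumerate(sorted(nbrs)):
            j = sorted(adj[u]).index(v)
            a, b = name[(v, i)], name[(u, j)]
            new_adj[a].add(b)
            new_adj[b].add(a)
    return new_adj
```

On the star with center 'c' and four leaves, the center becomes a path of four copies, each carrying one leaf, giving 8 vertices of degree at most 3.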
On the possibility of detecting Solar pp-neutrino with a large volume liquid organic scintillator detector
It is shown that a large volume liquid organic scintillator detector with an
energy resolution of 10 keV (1 sigma) at 200 keV will be sensitive to solar
pp-neutrinos, if operated at the target radiopurity levels of the Borexino
detector or of the solar neutrino project of KamLAND.
Comment: 18 pages, 2 figures, 4 tables. Contributed paper to the International Conference on Non-Accelerator New Physics (NANP-2003), Dubna. To be published in Phys. At. Nucl. (2004).
Measurement of Ultra-Low Potassium Contaminations with Accelerator Mass Spectrometry
Levels of trace radiopurity in active detector materials are a subject of
major concern in low-background experiments. Among the radio-isotopes, \k40
is one of the most abundant, and yet its signatures are difficult to reject.
Procedures were devised to measure trace potassium concentrations in the
inorganic salt CsI as well as in organic liquid scintillator (LS) with
Accelerator Mass Spectrometry (AMS), giving the respective
\k40-contamination levels in g/g for the two materials.
Measurement flexibility and sensitivity are improved over conventional
methods. Projected limiting sensitivities, in g/g for the CsI and LS
respectively, were also derived for the case in which no excess of potassium
signals had been observed over background. Studies of the LS samples
indicate that the radioactive contaminations come mainly in the dye solutes,
while the base solvents are orders of magnitude cleaner. The work demonstrates
the possibility of measuring naturally-occurring isotopes with AMS
techniques.
Comment: 18 pages, 4 figures, 3 tables.
Parameterized Complexity of the k-anonymity Problem
The problem of publishing personal data without giving up privacy is becoming
increasingly important. An interesting formalization that has been recently
proposed is k-anonymity. This approach requires that the rows of a table
are partitioned into clusters of size at least k and that all the rows in a
cluster become the same tuple, after the suppression of some entries. The
natural optimization problem, where the goal is to minimize the number of
suppressed entries, is known to be APX-hard even when the record values are
over a binary alphabet and when the records have length at most 8, for small
constant values of k. In this paper we study how the complexity of the problem
is influenced by different parameters, first showing that the problem is
W[1]-hard when parameterized by the size of the solution (and the value of k).
Then we exhibit a fixed-parameter
algorithm, when the problem is parameterized by the size of the alphabet and
the number of columns. Finally, we investigate the computational (and
approximation) complexity of the k-anonymity problem when restricting the
instance to records having length bounded by 3. We show that even this
restriction is APX-hard.
Comment: 22 pages, 2 figures.
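The cost model behind the optimization problem can be made concrete. A small sketch with helper names of my own choosing, computing the number of suppressed entries for a given clustering of rows:

```python
def suppression_cost(rows, clusters, k):
    """Cost of a clustering for k-anonymity: within each cluster, every
    column where the rows disagree must be suppressed in all rows of
    that cluster, so that the rows become the same tuple.  `rows` is a
    list of equal-length tuples; `clusters` is a partition given as
    lists of row indices.  Returns the total number of suppressed
    entries, or None if some cluster is smaller than k."""
    total = 0
    for cluster in clusters:
        if len(cluster) < k:
            return None
        cols = len(rows[cluster[0]])
        disagreeing = sum(
            1 for c in range(cols)
            if len({rows[i][c] for i in cluster}) > 1
        )
        total += disagreeing * len(cluster)
    return total
```

For example, clustering the rows (0, 1), (0, 0), (1, 0), (1, 0) into the first two and the last two suppresses only the second column of the first cluster, for a cost of 2 entries.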
Neutral B Flavor Tagging for the Measurement of Mixing-induced CP Violation at Belle
We describe a flavor tagging algorithm used in measurements of the CP
violation parameter sin2phi_1 at the Belle experiment. Efficiencies and wrong
tag fractions are evaluated using flavor-specific B meson decays into hadronic
and semileptonic modes. We achieve a total effective efficiency of
(28.8 +- 0.6)%.
Comment: 28 pages, 9 figures.
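An effective tagging efficiency combines the raw efficiency eps with the wrong-tag fraction w through the standard dilution formula eps_eff = sum_i eps_i * (1 - 2*w_i)^2, summed over bins of tag quality. A sketch with made-up numbers, not Belle's measured values:

```python
def effective_efficiency(bins):
    """eps_eff = sum_i eps_i * (1 - 2*w_i)**2 over tag-quality bins,
    where eps_i is the raw tagging efficiency and w_i the wrong-tag
    fraction.  The dilution factor (1 - 2*w_i) reflects how mistags
    wash out the observed CP asymmetry."""
    return sum(eps * (1.0 - 2.0 * w) ** 2 for eps, w in bins)

# Hypothetical numbers for two quality bins: (efficiency, wrong-tag fraction).
example_bins = [(0.2, 0.05), (0.5, 0.35)]
print(round(effective_efficiency(example_bins), 3))  # 0.207
```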
Constraining Non-Standard Interactions of the Neutrino with Borexino
We use the Borexino 153.6 ton.year data to place constraints on non-standard
neutrino-electron interactions, taking into account the uncertainty in the 7Be
solar neutrino flux, and backgrounds due to 85Kr and 210Bi beta-decay. We find
that the bounds are comparable to existing bounds from all other experiments.
Further improvement can be expected in Phase II of Borexino due to the
reduction in the 85Kr background.
Comment: 21 pages, 16 pdf figures, 2 tables. Analysis updated including the uncertainty in sin^2\theta_{23}. Accepted in JHEP.
New limits on nucleon decays into invisible channels with the BOREXINO Counting Test Facility
The results of background measurements with the second version of the
BOREXINO Counting Test Facility (CTF-II), installed in the Gran Sasso
Underground Laboratory, were used to obtain limits on the instability of
nucleons, bound in nuclei, against decays into invisible channels:
disappearance, decays to neutrinos, etc. The approach consisted of a search for
decays of unstable nuclides resulting from nucleon decays of parent carbon and
oxygen nuclei in the liquid scintillator and the water shield of the CTF. Due
to the extremely low background and the large mass (4.2 ton) of the CTF
detector, the most stringent (or competitive) experimental bounds to date have
been established on the corresponding lifetimes, all at 90% C.L.
Comment: 22 pages, 3 figures. Submitted to Phys. Lett.