Improved Combinatorial Group Testing Algorithms for Real-World Problem Sizes
We study practically efficient methods for performing combinatorial group
testing. We present efficient non-adaptive and two-stage combinatorial group
testing algorithms that identify the at most d defective items out of a given set
of n items, using fewer tests than previous methods for all practical set sizes. For
example, our two-stage algorithm matches the information-theoretic lower bound
for the number of tests in a combinatorial group testing regimen.
Comment: 18 pages; an abbreviated version of this paper is to appear at the 9th Workshop on Algorithms and Data Structures
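To make the non-adaptive setting concrete, here is a minimal sketch (not the authors' construction) of pooled testing with the simple COMP decoder, using a random pooling design; all parameter values are illustrative:

```python
import random

def run_tests(pools, defectives):
    # A pooled test is positive iff the pool contains at least one defective item.
    return [bool(set(pool) & defectives) for pool in pools]

def comp_decode(n, pools, results):
    # COMP decoding: any item that appears in a negative pool is certainly good;
    # every remaining item is declared (possibly) defective.
    cleared = set()
    for pool, positive in zip(pools, results):
        if not positive:
            cleared.update(pool)
    return set(range(n)) - cleared

random.seed(0)
n, d = 100, 2
defectives = set(random.sample(range(n), d))
# Random design: each of 40 tests includes each item with probability 1/(d+1).
pools = [[i for i in range(n) if random.random() < 1 / (d + 1)] for _ in range(40)]
decoded = comp_decode(n, pools, run_tests(pools, defectives))
```

COMP never produces false negatives (a true defective appears only in positive pools, so it is never cleared); the number of tests needed to also avoid false positives is exactly where constructions like the paper's improve on random designs.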
Motion of condensates in non-Markovian zero-range dynamics
Condensation transition in a non-Markovian zero-range process is studied in
one and higher dimensions. In the mean-field approximation, corresponding to
infinite range hopping, the model exhibits condensation with a stationary
condensate, as in the Markovian case, but with a modified phase diagram. In the
case of nearest-neighbor hopping, the condensate is found to drift by a
"slinky" motion from one site to the next. The mechanism of the drift is
explored numerically in detail. A modified model with nearest-neighbor hopping
which allows exact calculation of the steady state is introduced. The steady
state of this model is found to be a product measure, and the condensate is
stationary.
Comment: 31 pages, 9 figures
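For orientation, the Markovian baseline that the paper generalizes is easy to simulate: a zero-range process with hop rate u(n) = 1 + b/n condenses at high density when b > 2 (standard ZRP theory; the lattice size, particle number, and b below are illustrative, not taken from the paper):

```python
import random

def zrp_step(occupancies, rate):
    # One Monte-Carlo move of a 1D zero-range process with nearest-neighbour
    # hopping: pick a site uniformly; if occupied, move one particle to the
    # right neighbour with probability min(u(n), 1).
    L = len(occupancies)
    i = random.randrange(L)
    n = occupancies[i]
    if n > 0 and random.random() < rate(n):
        occupancies[i] -= 1
        occupancies[(i + 1) % L] += 1

random.seed(1)
L, N = 20, 200                 # 20 sites on a ring, 200 particles
occ = [N // L] * L
u = lambda n: 1 + 3 / n        # u(n) = 1 + b/n with b = 3 > 2: condensing regime
for _ in range(200_000):
    zrp_step(occ, u)
```

In the condensed phase a macroscopic fraction of the particles accumulates on a single site; the paper's point is that once the waiting times are non-Markovian, this condensate can drift rather than stay put.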
Laser application to measure vertical sea temperature and turbidity, design phase
An experiment to test a new method was designed, using backscattered radiation from a laser beam to measure oceanographic parameters in a fraction of a second. Tyndall, Rayleigh, Brillouin, and Raman scattering are all utilized to evaluate the parameters. A beam from a continuous argon-ion laser is used together with an interferometer and interference filters to gather the information. The results are checked against direct measurements. Future shipboard and airborne experiments are described.
Examination of restenosis following vascular interventions in clinical and experimental studies
Restenosis following endovascular interventions is the main limitation of their long-term success. The incidence of restenosis varies according to the method (stenting, endarterectomy) and the treated vascular region, but the pathomechanism and risk factors are similar. The current article reviews the author's previous studies in this field. In clinical studies, we compared the restenosis rate after carotid artery stenting and carotid endarterectomy. We also analyzed the complement activation profile after these interventions. In another study, we investigated the role of two polymorphisms of the estrogen receptor alpha in the occurrence of carotid restenosis after either carotid artery stenting or carotid endarterectomy. In an animal model of carotid endarterectomy, we studied the role of nitric oxide-cyclic guanosine monophosphate signaling and the effect of phosphodiesterase-5 inhibitor therapy on neointimal hyperplasia. Our results suggest that the higher incidence of restenosis following carotid endarterectomy may be correlated with the stronger complement activation after this type of carotid intervention. Polymorphisms in the estrogen receptor alpha gene could contribute to restenosis formation, especially in women. Neointimal hyperplasia can be attenuated by increased cyclic guanosine monophosphate signaling.
Comb and Branch‐on‐Branch Model Polystyrenes with Exceptionally High Strain Hardening Factor SHF > 1000 and Their Impact on Physical Foaming
The influence of topology on the strain hardening in uniaxial elongation is investigated using monodisperse comb and dendrigraft model polystyrenes (PS) synthesized via living anionic polymerization. A backbone with a molecular weight of M = 310 kg mol⁻¹ is used for all materials, while 100 short (SCB, M = 15 kg mol⁻¹) or long chain branches (LCB, M = 40 kg mol⁻¹) are grafted onto the backbone. The synthesized LCB comb serves as precursor for the dendrigraft-type branch-on-branch (bob) structures to add a second generation of branches (SCB, M ≈ 14 kg mol⁻¹) that is varied in number from 120 to 460. The SCB and LCB combs achieve remarkable strain hardening factors (SHF) of around 200 at strain rates greater than 0.1 s⁻¹. In contrast, the bob PS reach exceptionally high SHF of 1750 at very low strain rates of 0.005 s⁻¹ using a tilted sample placement to extend the maximum Hencky strain from 4 to 6. To the best of the authors' knowledge, SHF values this high have never been reported for polymer melts. Furthermore, batch foaming with CO₂ is investigated and the volume expansions of the resulting polymer foams are correlated to the uniaxial elongational properties.
An Efficient Data Structure for Dynamic Two-Dimensional Reconfiguration
In the presence of dynamic insertions and deletions into a partially
reconfigurable FPGA, fragmentation is unavoidable. This poses the challenge of
developing efficient approaches to dynamic defragmentation and reallocation.
One key aspect is to develop efficient algorithms and data structures that
exploit the two-dimensional geometry of a chip, instead of just one. We propose
a new method for this task, based on the fractal structure of a quadtree, which
allows dynamic segmentation of the chip area, along with dynamically adjusting
the necessary communication infrastructure. We describe a number of algorithmic
aspects, and present different solutions. We also provide a number of basic
simulations that indicate that the theoretical worst-case bound may be
pessimistic.
Comment: 11 pages, 12 figures; full version of extended abstract that appeared in ARCS 201
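As a rough illustration of how a quadtree can hand out square sub-regions of a chip (an illustrative sketch only, not the paper's data structure, which also manages the communication infrastructure):

```python
class QuadTree:
    """Minimal quadtree allocator over a 2^k x 2^k area: recursively splits
    the area into four quadrants and grants the smallest free quadrant that
    fits a requested square size."""
    def __init__(self, x, y, size):
        self.x, self.y, self.size = x, y, size
        self.children = None   # the four sub-quadrants, once this node splits
        self.used = False

    def allocate(self, need):
        if self.used or need > self.size:
            return None
        if self.children is None:
            if need > self.size // 2:      # fits here but not in any child
                self.used = True
                return (self.x, self.y, self.size)
            half = self.size // 2          # split lazily into four quadrants
            self.children = [QuadTree(self.x + dx, self.y + dy, half)
                             for dx in (0, half) for dy in (0, half)]
        for child in self.children:        # try each quadrant in order
            spot = child.allocate(need)
            if spot:
                return spot
        return None

root = QuadTree(0, 0, 64)
a = root.allocate(16)   # first free 16x16 quadrant
b = root.allocate(16)   # next one, adjacent in the same 32x32 parent
```

The lazy splitting is what makes the structure attractive for defragmentation: freeing and re-merging quadrants only touches the subtree involved, not the whole chip area.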
From the zero-field metal-insulator transition in two dimensions to the quantum Hall transition: a percolation-effective-medium theory
Effective-medium theory is applied to the percolation description of the
metal-insulator transition in two dimensions with emphasis on the continuous
connection between the zero-magnetic-field transition and the quantum Hall
transition. In this model the system consists of puddles connected via saddle
points, and there is loss of quantum coherence inside the puddles. The
effective conductance of the network is calculated using appropriate
integration over the distribution of conductances, leading to a determination
of the magnetic field dependence of the critical density. Excellent
quantitative agreement is obtained with the experimental data, which allows an
estimate of the physical parameters of the puddles.
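The effective-medium step can be sketched for the simplest case of a binary conductance distribution on a 2D network (coordination number z = 4); the paper itself integrates over a full saddle-point conductance distribution, so this is an assumption-laden toy version:

```python
def ema_conductance(p, g1, g2, tol=1e-12):
    """Effective conductance g_m of a 2D random resistor network in which a
    fraction p of bonds has conductance g1 and the rest g2, from the Bruggeman
    self-consistency condition (z = 4):
        p*(g1 - g_m)/(g1 + g_m) + (1 - p)*(g2 - g_m)/(g2 + g_m) = 0
    solved by bisection (the left-hand side is monotone decreasing in g_m)."""
    f = lambda gm: p * (g1 - gm) / (g1 + gm) + (1 - p) * (g2 - gm) / (g2 + gm)
    lo, hi = min(g1, g2), max(g1, g2)     # the root is bracketed by g1 and g2
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            lo = mid                      # g_m still too small
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Sanity check against the exact 2D duality result: at p = 1/2 the effective
# conductance of a binary mixture is sqrt(g1*g2), which EMA reproduces.
g_eff = ema_conductance(0.5, 1.0, 4.0)   # -> 2.0
```

Sweeping p through the percolation threshold (p_c = 1/2 in this symmetric 2D case) is what produces the metal-insulator transition in such network models.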
Autonomous decision-making against induced seismicity in deep fluid injections
The rise in the frequency of anthropogenic earthquakes due to deep fluid
injections is posing serious economic, societal, and legal challenges to
geo-energy and waste-disposal projects. We propose an actuarial approach to
mitigate this risk, first by defining an autonomous decision-making process
based on an adaptive traffic light system (ATLS) to stop risky injections, and
second by quantifying a "cost of public safety" based on the probability of an
injection-well being abandoned. The statistical model underlying the ATLS is first
confirmed to be representative of injection-induced seismicity, with examples
taken from past reservoir stimulation experiments (mostly from Enhanced
Geothermal Systems, EGS). Then the decision strategy is formalized: Being
integrable, the model yields a closed-form ATLS solution that maps a risk-based
safety standard or norm to an earthquake magnitude not to exceed during
stimulation. Finally, the EGS levelized cost of electricity (LCOE) is
reformulated in terms of null expectation, with the cost of an abandoned
injection well included. We find that the price increase needed to mitigate the
increased seismic risk in populated areas can counterbalance the heat credit.
However, this "public safety cost" disappears if buildings follow
earthquake-resistant designs or if a more relaxed risk safety standard or norm
is chosen.
Comment: 8 pages, 4 figures, conference (International Symposium on Energy Geotechnics, 26-28 September 2018, Lausanne, Switzerland)
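A traffic-light magnitude threshold of this kind can be sketched with the standard seismogenic-index model of injection-induced seismicity; the closed form below and all parameter values are illustrative assumptions, not the paper's calibration:

```python
import math

def atls_threshold(V, sigma, b, risk_norm):
    """Magnitude 'not to exceed' during stimulation, under the seismogenic-
    index model: the expected number of events of magnitude >= m after
    injecting a volume V (m^3) is N(>=m) = V * 10**(sigma - b*m), where sigma
    is the seismogenic index and b the Gutenberg-Richter slope. Treating
    event counts as Poisson, P(at least one event >= m) = 1 - exp(-N(>=m));
    we invert this for the largest m whose exceedance probability stays
    below the chosen safety norm."""
    n_max = -math.log(1.0 - risk_norm)        # allowed expected event count
    return (sigma + math.log10(V / n_max)) / b

# Hypothetical numbers: 10,000 m^3 injected, sigma = -1, b = 1.2,
# and a 1% tolerated probability of exceedance.
m_stop = atls_threshold(V=10_000, sigma=-1.0, b=1.2, risk_norm=0.01)
```

Because the model is integrable in closed form, the safety norm maps directly to a stop magnitude, which is what lets the decision process run autonomously during injection.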
Novel picornavirus in turkey poults with hepatitis, California, USA
To identify a candidate etiologic agent for turkey viral hepatitis, we analyzed samples from diseased turkey poults from 8 commercial flocks in California, USA, that were collected during 2008–2010. High-throughput pyrosequencing of RNA from livers of poults with turkey viral hepatitis (TVH) revealed picornavirus sequences. Subsequent cloning of the ≈9-kb genome showed an organization similar to that of picornaviruses with conservation of motifs within the P1, P2, and P3 genome regions, but also unique features, including a 1.2-kb sequence of unknown function at the junction of P1 and P2 regions. Real-time PCR confirmed viral RNA in liver, bile, intestine, serum, and cloacal swab specimens from diseased poults. Analysis of liver by in situ hybridization with viral probes and immunohistochemical testing of serum demonstrated viral nucleic acid and protein in livers of diseased poults. Molecular, anatomic, and immunologic evidence suggests that TVH is caused by a novel picornavirus, tentatively named turkey hepatitis virus