1,144 research outputs found
Estimating the number of change-points in a two-dimensional segmentation model without penalization
In computational biology, numerous recent studies have been dedicated to the
analysis of the chromatin structure within the cell by two-dimensional
segmentation methods. Motivated by this application, we consider the problem of
retrieving the diagonal blocks in a matrix of observations. The theoretical
properties of the least-squares estimators of both the boundaries and the
number of blocks proposed by Lévy-Leduc et al. [2014] are investigated. More
precisely, the contribution of the paper is to establish the consistency of
these estimators. A surprising consequence of our results is that, contrary to
the one-dimensional case, a penalty is not needed for retrieving the true number
of diagonal blocks. Finally, the results are illustrated on synthetic data.
Comment: 30 pages, 8 figures
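The least-squares estimation of block boundaries is easiest to see in the simpler one-dimensional change-point setting. The following is a minimal dynamic-programming sketch of least-squares segmentation into a given number of segments; it illustrates the estimator family, not the paper's two-dimensional diagonal-block procedure, and the function name is illustrative:

```python
import numpy as np

def lstsq_segmentation(y, K):
    """Exact least-squares segmentation of a 1-D signal into K segments
    via dynamic programming. Illustrative 1-D analogue of the block
    estimator; the paper's setting is 2-D diagonal blocks."""
    n = len(y)
    y = np.asarray(y, float)
    # prefix sums give O(1) segment cost: RSS = sum y^2 - (sum y)^2 / len
    s1 = np.concatenate([[0.0], np.cumsum(y)])
    s2 = np.concatenate([[0.0], np.cumsum(y ** 2)])

    def cost(i, j):  # residual sum of squares of y[i:j] around its mean
        m = j - i
        return s2[j] - s2[i] - (s1[j] - s1[i]) ** 2 / m

    # dp[k][j]: best cost of splitting y[:j] into k segments
    dp = np.full((K + 1, n + 1), np.inf)
    back = np.zeros((K + 1, n + 1), dtype=int)
    dp[0][0] = 0.0
    for k in range(1, K + 1):
        for j in range(k, n + 1):
            for i in range(k - 1, j):
                c = dp[k - 1][i] + cost(i, j)
                if c < dp[k][j]:
                    dp[k][j], back[k][j] = c, i
    # backtrack to recover segment boundaries
    bounds, j = [], n
    for k in range(K, 0, -1):
        j = back[k][j]
        bounds.append(j)
    return sorted(bounds)[1:]  # interior change-points only
```

On a piecewise-constant signal this recovers the true boundaries exactly; the paper's point is that, unlike in this 1-D setting with unknown K, no penalty term is needed to select the number of diagonal blocks.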
Answering Conjunctive Queries under Updates
We consider the task of enumerating and counting answers to k-ary
conjunctive queries against relational databases that may be updated by
inserting or deleting tuples. We exhibit a new notion of q-hierarchical
conjunctive queries and show that these can be maintained efficiently in the
following sense. During a linear time preprocessing phase, we can build a data
structure that enables constant delay enumeration of the query results; and
when the database is updated, we can update the data structure and restart the
enumeration phase within constant time. For the special case of self-join free
conjunctive queries we obtain a dichotomy: if a query is not q-hierarchical,
then query enumeration with sublinear delay and sublinear update time
(and arbitrary preprocessing time) is impossible.
For answering Boolean conjunctive queries and for the more general problem of
counting the number of solutions of k-ary queries we obtain complete
dichotomies: if the query's homomorphic core is q-hierarchical, then the size
of the query result can be computed in linear time and maintained with
constant update time. Otherwise, the size of the query result cannot be
maintained with sublinear update time. All our lower bounds rely on the
OMv-conjecture, a conjecture on the hardness of online matrix-vector
multiplication that has recently emerged in the field of fine-grained
complexity to characterise the hardness of dynamic problems. The lower bound
for the counting problem additionally relies on the orthogonal vectors
conjecture, which in turn is implied by the strong exponential time hypothesis.
By sublinear we mean O(n^(1-ε)) for some ε > 0, where n is the size of the
active domain of the current database.
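The maintenance idea can be made concrete on a single q-hierarchical query Q(x, y) ← R(x), S(x, y): counting answers reduces to maintaining per-value degrees, giving O(1) work per inserted or deleted tuple. This is a minimal sketch for this one query, not the paper's general algorithm, and the class name is illustrative:

```python
from collections import defaultdict

class CountRJoinS:
    """Maintain |{(x, y) : R(x) and S(x, y)}| under single-tuple updates.
    Sketch for the q-hierarchical query Q(x, y) <- R(x), S(x, y);
    each update costs O(1) with hash-based sets and counters."""

    def __init__(self):
        self.R = set()
        self.S = set()
        self.deg = defaultdict(int)  # deg[x] = #{y : S(x, y)}
        self.count = 0               # current number of query answers

    def insert_R(self, x):
        if x not in self.R:
            self.R.add(x)
            self.count += self.deg[x]  # all S-partners of x become answers

    def delete_R(self, x):
        if x in self.R:
            self.R.remove(x)
            self.count -= self.deg[x]

    def insert_S(self, x, y):
        if (x, y) not in self.S:
            self.S.add((x, y))
            self.deg[x] += 1
            if x in self.R:
                self.count += 1

    def delete_S(self, x, y):
        if (x, y) in self.S:
            self.S.remove((x, y))
            self.deg[x] -= 1
            if x in self.R:
                self.count -= 1
```

A non-hierarchical query such as R(x), S(x, y), T(y) has no such per-value decomposition, which is where the OMv-based lower bounds take over.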
Beyond Worst-Case Analysis for Joins with Minesweeper
We describe a new algorithm, Minesweeper, that is able to satisfy stronger
runtime guarantees than previous join algorithms (colloquially, `beyond
worst-case guarantees') for data in indexed search trees. Our first
contribution is developing a framework to measure this stronger notion of
complexity, which we call "certificate complexity" and which extends notions of
Barbay et al. and Demaine et al.; a certificate is a set of propositional
formulae that certifies that the output is correct. This notion captures a
natural class of join algorithms. In addition, the certificate allows us to
define a strictly stronger notion of runtime complexity than traditional
worst-case guarantees. Our second contribution is to develop a dichotomy
theorem for the certificate-based notion of complexity. Roughly, we show that
Minesweeper evaluates β-acyclic queries in time linear in the certificate
plus the output size, while for any β-cyclic query there is some instance
that takes superlinear time in the certificate (and for which the output is no
larger than the certificate size). We also extend our certificate-complexity
analysis to queries with bounded treewidth and the triangle query.
Comment: This is the full version of our PODS'2014 paper.
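Certificate-sensitive running times are easiest to see on sorted-set intersection, the simplest join: galloping (doubling) search touches a number of positions proportional to the number of "gaps" a certificate must describe, rather than to the input size. The sketch below illustrates that style of adaptivity in the spirit of Barbay et al.; it is not the Minesweeper algorithm itself, which works over indexed search trees:

```python
from bisect import bisect_left

def gallop_intersect(a, b):
    """Intersect two sorted lists by repeated galloping search.
    Runtime adapts to the structure of the instance: long runs of
    non-matching elements are skipped in logarithmic time."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] == b[j]:
            out.append(a[i])
            i += 1
            j += 1
        elif a[i] < b[j]:
            # gallop forward in a for the first element >= b[j]
            step = 1
            while i + step < len(a) and a[i + step] < b[j]:
                step *= 2
            i = bisect_left(a, b[j], i, min(i + step, len(a)))
        else:
            # gallop forward in b for the first element >= a[i]
            step = 1
            while j + step < len(b) and b[j + step] < a[i]:
                step *= 2
            j = bisect_left(b, a[i], j, min(j + step, len(b)))
    return out
```

On inputs like [1..1000000] ∩ [1000000..2000000] this performs only logarithmically many comparisons, which is roughly the size of the certificate that the two sets share a single boundary element.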
A continuous non-linear shadowing model of columnar growth
We propose the first continuous model with long-range screening (shadowing)
that describes columnar growth in one space dimension, as observed in plasma
sputter deposition. It is based on a new continuous partial differential
equation with non-linear diffusion, in which the shadowing effects apply to
all the different processes.
Comment: Fast Track Communication
Animal Models of Zika Virus Sexual Transmission
ZIKV was first identified in the 1940s as a mosquito-borne virus; however, sexual transmission, which is uncommon for arboviruses, was demonstrated more than 60 years later. Tissue culture and animal models have allowed scientists to study how this transmission is possible. Immunocompromised mice infected with ZIKV had high viral loads in their testes, and infection of immunocompetent female mice was achieved following intravaginal inoculation or inoculation via mating with an infected male. These mouse studies led researchers to investigate the individual components of the male reproductive system. In cell culture and mouse models, ZIKV can persist in Sertoli and germ cells of the testes and epithelial cells in the epididymis, which may lead to sexual transmission even after ZIKV has been cleared from other tissues. ZIKV has also been studied in nonhuman primates (NHPs), in which infection appears to mimic the limited human epidemiological data, with low rates of symptomatic individuals and similar clinical signs. Although refinement is needed, these animal models have proven to be key in ZIKV research and continue to help uncover the mechanisms of sexual transmission. This review will focus on the animal models used to elucidate the mechanisms of sexual transmission and persistence of flaviviruses.
Modeling Chromosomes in Mouse to Explore the Function of Genes, Genomic Disorders, and Chromosomal Organization
One of the challenges of genomic research after the completion of the human genome project is to assign a function to all the genes and to understand their interactions and organizations. Among the various techniques, the emergence of chromosome engineering tools with the aim to manipulate large genomic regions in the mouse model offers a powerful way to accelerate the discovery of gene functions and provides more mouse models to study normal and pathological developmental processes associated with aneuploidy. The combination of gene targeting in ES cells, recombinase technology, and other techniques makes it possible to generate new chromosomes carrying specific and defined deletions, duplications, inversions, and translocations that are accelerating functional analysis. This review presents the current status of chromosome engineering techniques and discusses the different applications as well as the implications of these new techniques in future research to better understand the function of chromosomal organization and structures.
Ringing effects reduction by improved deconvolution algorithm: Application to A370 CFHT image of gravitational arcs
We develop a self-consistent automatic procedure to restore information from
astronomical observations. It relies on both a new deconvolution algorithm
called LBCA (Lower Bound Constraint Algorithm) and the use of the Wiener
filter. In order to explore its scientific potential for strong and weak
gravitational lensing, we process a CFHT image of the galaxy cluster Abell
370 which exhibits spectacular strong gravitational lensing effects. A high
quality restoration is here of particular interest to map the dark matter
within the cluster. We show that the LBCA turns out to be especially efficient
at reducing ringing effects introduced by classical deconvolution algorithms in
images with a high background. The method allows us to make a blind detection
of the radial arc and to recover morphological properties similar to
those observed from HST data. We also show that the Wiener filter is suitable to
stop the iterative process before noise amplification, using only the
unrestored data.
Comment: A&A, in press; 9 pages, 9 figures
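The lower-bound idea behind such constraint-based restoration can be sketched generically as projected-gradient deconvolution: take a gradient step on the data-fit term, then clip the iterate at the lower bound. This is an illustrative stand-in under that generic assumption, not the authors' exact LBCA, and the function name is hypothetical:

```python
import numpy as np

def deconv_lower_bound(data, psf, lower=0.0, n_iter=200, tau=1.0):
    """Generic iterative deconvolution with a lower-bound constraint:
    gradient descent on ||psf * x - data||^2 followed by the projection
    x = max(x, lower). Sketch of the lower-bound-constraint idea; the
    paper's LBCA may differ in detail."""
    data = np.asarray(data, float)
    psf = np.asarray(psf, float)
    x = np.maximum(data, lower)        # feasible starting point
    psf_flip = psf[::-1]               # adjoint of the blur operator
    # step size below the Lipschitz bound (sum(psf)^2) keeps iterations stable
    tau = tau / (psf.sum() ** 2)
    for _ in range(n_iter):
        resid = np.convolve(x, psf, mode="same") - data
        grad = np.convolve(resid, psf_flip, mode="same")
        x = np.maximum(x - tau * grad, lower)  # project onto x >= lower
    return x
```

Clipping at the lower bound is what suppresses the oscillating negative lobes (ringing) that an unconstrained iteration produces around bright sources on a high background.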
Argon I lines produced in a hollow cathode source, 332 nm to 5865 nm
We report precision measurements by Fourier transform spectroscopy of the
vacuum wavenumber, line width, and relative signal strength of 928 lines in
the Ar I spectrum. Wavelength in air and classification of the transition are
supplied for each line. A comparison of our results with other precision
measurements illustrates the sensitivity of Ar I wavelengths to conditions in
the light source.
Solar Carbon Monoxide, Thermal Profiling, and the Abundances of C, O, and their Isotopes
A solar photospheric "thermal profiling" analysis is presented, exploiting
the infrared rovibrational bands of carbon monoxide (CO) as observed with the
McMath-Pierce Fourier transform spectrometer (FTS) at Kitt Peak, and from above
the Earth's atmosphere by the Shuttle-borne ATMOS experiment. Visible continuum
intensities and center-limb behavior constrained the temperature profile of the
deep photosphere, while CO center-limb behavior defined the thermal structure
at higher altitudes. The oxygen abundance was self-consistently determined from
weak CO absorptions. Our analysis was meant to complement recent studies based
on 3-D convection models which, among other things, have revised the historical
solar oxygen (and carbon) abundance downward by a factor of nearly two;
although in fact our conclusions do not support such a revision. Based on
various considerations, an oxygen abundance of 700+/-100 ppm (parts per million
relative to hydrogen) is recommended; the large uncertainty reflects the model
sensitivity of CO. New solar isotopic ratios also are reported for 13C, 17O,
and 18O.
Comment: 90 pages, 19 figures (some with parts "a", "b", etc.); to be
published in the Astrophysical Journal Supplement
Real-world Experience With Sunitinib Treatment in Patients With Metastatic Renal Cell Carcinoma: Clinical Outcome According to Risk Score.
BACKGROUND: ADONIS is an ongoing observational study in 9 European countries, designed to evaluate treatment patterns/outcomes in patients with metastatic renal cell carcinoma (mRCC) treated with first-line sunitinib and/or second-line axitinib post sunitinib. We present an evaluation of sunitinib efficacy by risk group, in the real-world setting examined in ADONIS. PATIENTS AND METHODS: Patients were enrolled at the start of first-line sunitinib treatment or second-line axitinib post sunitinib treatment. Sunitinib efficacy was assessed by International Metastatic Renal Cell Carcinoma Database Consortium (IMDC) and Memorial Sloan Kettering Cancer Center risk criteria. RESULTS: For all patients in this analysis (N = 467), the median progression-free survival was 23.8 months (95% confidence interval [CI], 16.5-28.5 months), 11.8 months (95% CI, 8.1-17.4 months), and 4.6 months (95% CI, 2.5-7.7 months) for IMDC favorable-, intermediate-, and poor-risk groups, respectively. The median overall survival was 97.1 months (95% CI, 46.3 months-not evaluable [NE]), 33.5 months (95% CI, 20.5-46.6 months), and 10.0 months (95% CI, 4.5-19.8 months) for the respective risk groups. Data on individual risk factors were available for a subgroup of patients, allowing analysis of intermediate risk by 1 versus 2 risk factors. When including this subgroup (n = 120), the median overall survival for IMDC favorable-, intermediate-1, and intermediate-2 risk groups was 21.6 months (95% CI, 16.3 months-NE), 20.5 months (15.5 months-NE), and 15.1 months (4.1 months-NE), respectively. CONCLUSIONS: For patients overall and by risk-group stratification, survival estimates were aligned with previously published data. In patients with intermediate-1 risk, overall survival was very similar to patients with favorable risk. However, further exploration of outcome data from different sources is needed to confirm these observations.