Optimal Complexity and Certification of Bregman First-Order Methods
We provide a lower bound showing that the convergence rate of the
NoLips method (a.k.a. Bregman Gradient) is optimal for the class of functions
satisfying the relative smoothness assumption. This assumption appeared in the
recent developments around the Bregman Gradient method, where acceleration
remained an open issue. Along the way, we show how to constructively obtain the
corresponding worst-case functions by extending the computer-assisted
performance estimation framework of Drori and Teboulle (Mathematical
Programming, 2014) to Bregman first-order methods and to the classes of
differentiable and strictly convex functions.

Comment: To appear in Mathematical Programming
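For readers unfamiliar with the setting, the relative smoothness assumption and the NoLips/Bregman Gradient step can be written out as follows (standard notation from the relative-smoothness literature; h denotes the reference function generating the Bregman divergence):

```latex
% Relative smoothness: f is L-smooth relative to h when Lh - f is convex,
% equivalently, for all admissible x, y:
\[
  f(y) \;\le\; f(x) + \langle \nabla f(x),\, y - x \rangle + L\, D_h(y, x),
  \qquad
  D_h(y, x) = h(y) - h(x) - \langle \nabla h(x),\, y - x \rangle .
\]
% NoLips (Bregman Gradient) iteration with step size \lambda \le 1/L:
\[
  x_{k+1} \;=\; \operatorname*{arg\,min}_{u}\;
  \big\{ \langle \nabla f(x_k),\, u \rangle
         + \tfrac{1}{\lambda}\, D_h(u, x_k) \big\}.
\]
% Choosing h = (1/2)\|\cdot\|^2 gives D_h(y,x) = (1/2)\|y - x\|^2 and
% recovers the classical gradient step.
```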
GIS-Based Forage Species Adaptation Mapping
Selecting forage crops adapted to the climatic and edaphic conditions of specific locations is essential for economic sustainability and environmental protection. Yet, currently, proper selection is difficult due to the absence of advanced selection tools. Significant improvements are being made in the process through Geographic Information System (GIS)-based mapping. Climate and soil GIS layers are being matched with forage characteristics through rules describing species tolerances. Better matching will reduce the economic risks and environmental hazards associated with sub-optimal crop selection and subsequent performance. Once developed, these forage crop selection strategies and tools can be adapted for use with other crops. A matrix of species characteristics is being assembled for 6 major forage crops. GIS-based climate and soils maps are being developed and reviewed. Base-layer climate and soils maps and the species adaptation maps will be placed on a CD-ROM to help educators, consultants, farmers, and ranchers match their conditions to suitable forage crop species. A WWW segment is being developed to provide a source of current information and links to original data and supplementary materials.
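The rule-based matching of site conditions to species tolerances described above can be sketched in a few lines. The species names, tolerance thresholds, and site attributes below are illustrative placeholders, not values from the actual species matrix:

```python
# Minimal sketch of rule-based forage species matching. The tolerance
# values here are hypothetical; the real matrix covers 6 major forage
# crops with rules derived from GIS climate and soil layers.
TOLERANCES = {
    "alfalfa":     {"min_annual_rain_mm": 350, "min_soil_ph": 6.2,
                    "min_frost_free_days": 100},
    "tall_fescue": {"min_annual_rain_mm": 450, "min_soil_ph": 5.5,
                    "min_frost_free_days": 120},
}

def adapted_species(site):
    """Return species whose tolerance rules are all satisfied at a site."""
    matches = []
    for species, rules in TOLERANCES.items():
        # Each rule key "min_X" is checked against the site attribute "X".
        if all(site[key.removeprefix("min_")] >= threshold
               for key, threshold in rules.items()):
            matches.append(species)
    return sorted(matches)

site = {"annual_rain_mm": 500, "soil_ph": 6.5, "frost_free_days": 130}
print(adapted_species(site))
```

A real implementation would evaluate such rules per map cell over the GIS layers to produce the species adaptation maps; the principle is the same.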
Mass segregation of different populations inside the cluster NGC6101
We have used ESO telescopes at La Silla and the Hubble Space Telescope (HST)
in order to obtain accurate B, V, I CCD photometry for the stars located within
200" (~= 2 half-mass radii, r_h = 1.71') from the center of the cluster NGC
6101. Color-Magnitude Diagrams extending from the red-giant tip to about 5
magnitudes below the main-sequence turnoff (MSTO; V = 20.05 ± 0.05) have been
constructed.
The following results have been obtained from the analysis of the CMDs: a)
The overall morphology of the main branches confirms previous results from the
literature, in particular the existence of a sizeable population of 73 "blue
stragglers", of which 27 had already been detected. They are considerably
more concentrated than either the subgiant branch or the main sequence stars,
and have the same spatial distribution as the horizontal branch stars (84%
probability from a K-S test). A hypothesis on the possible BSS progeny is also
presented. b) The HB is narrow and the bulk of its stars is blue, as expected
for a typical metal-poor globular cluster. c) The derived magnitudes for the HB
and the MSTO, V(ZAHB) = 16.59 ± 0.10 and V(TO) = 20.05 ± 0.05, coupled with the
values E(B-V) = 0.1, [Fe/H] = -1.80, and Y = 0.23, yield a distance modulus
(m-M)_V = 16.23 and an age similar to other "old" metal-poor globular clusters.
In particular, from the comparison with theoretical isochrones, we derive for
this cluster an age of 13 Gyr. d) By using the large statistical sample of Red
Giant Branch (RGB) stars, we detected with high accuracy the position of the
bump in the RGB luminosity function. This observational feature has been
compared with theoretical prescriptions, yielding good agreement within the
current theoretical and observational uncertainties.

Comment: 13 pages, 17 figures, uses documentclass 'aa' v5.01 with package
'graphicx'. Accepted for publication in Astronomy & Astrophysics
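As a quick consistency check, the quoted apparent distance modulus can be converted into a physical distance. The extinction coefficient R_V = 3.1 is a standard Galactic assumption, not a value stated in the abstract:

```python
# Worked example: distance to NGC 6101 from the quoted distance modulus.
m_M_V = 16.23        # apparent distance modulus (m-M)_V from the abstract
E_BV = 0.1           # reddening E(B-V) from the abstract
A_V = 3.1 * E_BV     # visual extinction, assuming the standard R_V = 3.1
m_M_0 = m_M_V - A_V  # true (dereddened) distance modulus

# Distance in parsecs from the definition (m-M)_0 = 5 log10(d) - 5.
d_pc = 10 ** ((m_M_0 + 5) / 5)
print(f"(m-M)_0 = {m_M_0:.2f}, d = {d_pc / 1000:.1f} kpc")
```

This places the cluster at roughly 15 kpc, consistent with it being a distant halo globular cluster.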
A role for core planar polarity proteins in cell contact-mediated orientation of planar cell division across the mammalian embryonic skin
Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/). © The Author(s) 2017. Supplementary information accompanies this paper at doi:10.1038/s41598-017-01971-2.

The question of how cell division orientation is determined is fundamentally important for understanding tissue and organ shape in both healthy and disease conditions. Here we provide evidence for cell contact-dependent orientation of planar cell division in the mammalian embryonic skin. We propose a model in which the core planar polarity proteins Celsr1 and Frizzled-6 (Fz6) communicate the long-axis orientation of interphase basal cells to neighbouring basal mitoses, so that they align their horizontal division plane along the same axis. The underlying mechanism requires a direct, cell-surface, planar-polarised cue, which we posit depends upon variant post-translational forms of Celsr1 protein coupled to Fz6. Our hypothesis has parallels with contact-mediated division orientation in early C. elegans embryos, suggesting functional conservation between the adhesion-GPCRs Celsr1 and Latrophilin-1.
We propose that linking the planar cell division plane with interphase neighbour long-axis geometry reinforces axial bias in skin spreading around the mouse embryo body.

Peer reviewed
Determining Supersymmetric Parameters With Dark Matter Experiments
In this article, we explore the ability of direct and indirect dark matter
experiments to not only detect neutralino dark matter, but to constrain and
measure the parameters of supersymmetry. In particular, we explore the
relationship between the phenomenological quantities relevant to dark matter
experiments, such as the neutralino annihilation and elastic scattering cross
sections, and the underlying characteristics of the supersymmetric model, such
as the values of mu (and the composition of the lightest neutralino), m_A and
tan beta. We explore a broad range of supersymmetric models and then focus on a
smaller set of benchmark models. We find that by combining astrophysical
observations with collider measurements, mu can often be constrained far more
tightly than it can be from LHC data alone. In models in the A-funnel region of
parameter space, we find that dark matter experiments can potentially determine
m_A to roughly +/-100 GeV, even when heavy neutral MSSM Higgs bosons (A, H_1)
cannot be observed at the LHC. The information provided by astrophysical
experiments is often highly complementary to the information most easily
ascertained at colliders.

Comment: 46 pages, 76 figures
Low Complexity Regularization of Linear Inverse Problems
Inverse problems and regularization theory are a central theme in contemporary
signal processing, where the goal is to reconstruct an unknown signal from
partial, indirect, and possibly noisy measurements of it. A now standard method
for recovering the unknown signal is to solve a convex optimization problem
that enforces some prior knowledge about its structure. This has proved
efficient in many problems routinely encountered in imaging sciences,
statistics and machine learning. This chapter delivers a review of recent
advances in the field where the regularization prior promotes solutions
conforming to some notion of simplicity/low-complexity. These priors encompass
as popular examples sparsity and group sparsity (to capture the compressibility
of natural signals and images), total variation and analysis sparsity (to
promote piecewise regularity), and low-rank (as natural extension of sparsity
to matrix-valued data). Our aim is to provide a unified treatment of all these
regularizations under a single umbrella, namely the theory of partial
smoothness. This framework is very general and accommodates all low-complexity
regularizers just mentioned, as well as many others. Partial smoothness turns
out to be the canonical way to encode low-dimensional models that can be linear
spaces or more general smooth manifolds. This review is intended to serve as a
one stop shop toward the understanding of the theoretical properties of the
so-regularized solutions. It covers a large spectrum including: (i) recovery
guarantees and stability to noise, both in terms of ℓ2-stability and
model (manifold) identification; (ii) sensitivity analysis to perturbations of
the parameters involved (in particular the observations), with applications to
unbiased risk estimation; (iii) convergence properties of the forward-backward
proximal splitting scheme, which is particularly well suited to solving the
corresponding large-scale regularized optimization problem.
Getting the whole picture: High content screening using three-dimensional cellular model systems and whole animal assays
Phenotypic or High Content Screening (HCS) is becoming more widely used for primary screening campaigns in drug discovery. Currently the vast majority of HCS campaigns use cell lines grown in well-established monolayer cultures (2D tissue culture). There is widespread recognition that the more biologically relevant 3D tissue culture technologies, such as spheroids and organoids, and even whole animal assays will eventually be run as primary HCS. Upgrading the IT infrastructure to cope with the increase in data volumes requires investment in hardware (and software), but this should be manageable. However, the main bottleneck for the effective adoption and use of 3D tissue culture and whole animal assays in HCS is anticipated to be the development of software for the analysis of 3D images. In this review we summarize the current state of the available software and how it may be applied to analyzing 3D images obtained from an HCS campaign.
The WOMAN Trial (World Maternal Antifibrinolytic Trial): tranexamic acid for the treatment of postpartum haemorrhage: an international randomised, double blind placebo controlled trial
Background: Each year, about 530,000 women worldwide die from causes related to pregnancy and childbirth; 99% of these deaths are in low- and middle-income countries. Obstetric haemorrhage is the leading cause of maternal mortality, most of it occurring in the postpartum period. Systemic antifibrinolytic agents are widely used in surgery to prevent clot breakdown (fibrinolysis) in order to reduce surgical blood loss. At present there is little reliable evidence from randomised trials on the effectiveness of tranexamic acid in the treatment of postpartum haemorrhage.

Methods: The trial aims to determine the effect of early administration of tranexamic acid on mortality, hysterectomy and other morbidities (surgical interventions, blood transfusion, risk of non-fatal vascular events) in women with clinically diagnosed postpartum haemorrhage. The use of health services and safety, especially thromboembolic effects on breastfed babies, will also be assessed. The trial will be a large, pragmatic, randomised, double blind, placebo controlled trial among 15,000 women with a clinical diagnosis of postpartum haemorrhage. All legally adult women with clinically diagnosed postpartum haemorrhage following vaginal delivery of a baby or caesarean section will potentially be eligible. The fundamental eligibility criterion is the responsible clinician's 'uncertainty' as to whether or not to use an antifibrinolytic agent in a particular woman with postpartum haemorrhage. Treatment will be a dose of tranexamic acid (1 gram by intravenous injection) or placebo (sodium chloride 0.9%), given as soon as possible after randomisation. A second dose may be given if bleeding continues after 30 minutes, or if it stops and restarts within 24 hours of the first dose.

The main analyses will be on an 'intention to treat' basis, irrespective of whether the allocated treatment was received or not. Subgroup analyses for the primary outcome will be based on type of delivery; administration or not of prophylactic uterotonics; and on whether the clinical decision to consider trial entry was based primarily on estimated blood loss alone or on haemodynamic instability. A study with 15,000 women will have over 90% power to detect a 25% reduction, from 4% to 3%, in the primary endpoint of mortality or hysterectomy.

Trial registration: Current Controlled Trials ISRCTN76912190; ClinicalTrials.gov NCT00872469
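The quoted power figure can be sanity-checked with a standard normal approximation for comparing two proportions (two-sided α = 0.05, equal arms; the trial's actual calculation may use a different method):

```python
from math import sqrt
from statistics import NormalDist

def power_two_proportions(p1, p2, n_per_arm, alpha=0.05):
    """Approximate power of a two-sided two-proportion z-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    p_bar = (p1 + p2) / 2
    # Standard errors of the difference under H0 (pooled) and H1.
    se0 = sqrt(2 * p_bar * (1 - p_bar) / n_per_arm)
    se1 = sqrt(p1 * (1 - p1) / n_per_arm + p2 * (1 - p2) / n_per_arm)
    z = (abs(p1 - p2) - z_alpha * se0) / se1
    return NormalDist().cdf(z)

# 15,000 women randomised 1:1, endpoint rate 4% vs 3% (a 25% relative reduction).
power = power_two_proportions(0.04, 0.03, n_per_arm=7500)
print(f"approximate power: {power:.1%}")
```

The approximation gives a power of roughly 91-92%, consistent with the "over 90%" stated in the protocol.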