
    Border forces and friction control epithelial closure dynamics

    Epithelization, the process whereby an epithelium covers a cell-free surface, is not only central to wound healing but also pivotal in embryonic morphogenesis, regeneration, and cancer. In the context of wound healing, the epithelization mechanisms differ depending on the sizes and geometries of the wounds as well as on the cell type, while a unified theoretical description is still lacking. Here, we used a barrier-based protocol that allows for making large arrays of well-controlled circular model wounds within an epithelium at confluence, without injuring any cells. We propose a physical model that takes into account border forces, friction with the substrate, and tissue rheology. Despite the presence of a contractile actomyosin cable at the periphery of the wound, epithelization was mostly driven by border protrusive activity. Closure dynamics was quantified by an epithelization coefficient D = σ_p/ξ, defined as the ratio of the border protrusive stress σ_p to the friction coefficient ξ between epithelium and substrate. The same assay and model showed a high sensitivity to the RasV12 mutation in human epithelial cells, demonstrating the general applicability of the approach and its potential to quantitatively characterize metastatic transformations. Comment: 44 pages, 17 figures
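    The epithelization coefficient defined in the abstract is a simple ratio, D = σ_p/ξ. A minimal sketch of that definition, with purely illustrative placeholder values (not measurements from the paper):

    ```python
    # Sketch of the epithelization coefficient D = sigma_p / xi from the
    # abstract above. Numerical values are assumed placeholders, not data.

    def epithelization_coefficient(sigma_p: float, xi: float) -> float:
        """Return D = sigma_p / xi, the ratio of border protrusive stress
        to the epithelium-substrate friction coefficient."""
        if xi <= 0:
            raise ValueError("friction coefficient xi must be positive")
        return sigma_p / xi

    sigma_p = 200.0  # border protrusive stress (illustrative value)
    xi = 50.0        # friction coefficient (illustrative value)
    D = epithelization_coefficient(sigma_p, xi)
    print(D)  # 4.0
    ```

    A larger D (stronger protrusive stress or weaker friction) corresponds to faster closure in the model.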

    Cosmic Acceleration from Causal Backreaction with Recursive Nonlinearities

    We revisit the causal backreaction paradigm, in which the need for Dark Energy is eliminated via the generation of an apparent cosmic acceleration from the causal flow of inhomogeneity information coming in towards each observer from distant structure-forming regions. This second-generation formalism incorporates "recursive nonlinearities": the process by which already-established metric perturbations will then act to slow down all future flows of inhomogeneity information. Here, the long-range effects of causal backreaction are now damped, weakening its impact for models that were previously best-fit cosmologies. Nevertheless, we find that causal backreaction can be recovered as a replacement for Dark Energy via the adoption of larger values for the dimensionless 'strength' of the clustering evolution functions being modeled -- a change justified by the hierarchical nature of clustering and virialization in the universe, occurring on multiple cosmic length scales simultaneously. With this, and with one new model parameter representing the slowdown of clustering due to astrophysical feedback processes, an alternative cosmic concordance can once again be achieved for a matter-only universe in which the apparent acceleration is generated entirely by causal backreaction effects. One drawback is a new degeneracy which broadens our predicted range for the observed jerk parameter j_0^Obs, thus removing what had appeared to be a clear signature for distinguishing causal backreaction from Cosmological Constant ΛCDM. As for the long-term fate of the universe, incorporating recursive nonlinearities appears to make the possibility of an 'eternal' acceleration due to causal backreaction far less likely; though this does not take into account gravitational nonlinearities or the large-scale breakdown of cosmological isotropy, effects not easily modeled within this formalism. Comment: 53 pages, 7 figures, 3 tables.
This paper is an advancement of previous research on Causal Backreaction; the earlier work is available at arXiv:1109.4686 and arXiv:1109.515

    Evidence for Unresolved Gamma-Ray Point Sources in the Inner Galaxy

    We present a new method to characterize unresolved point sources (PSs), generalizing traditional template fits to account for non-Poissonian photon statistics. We apply this method to Fermi Large Area Telescope gamma-ray data to characterize PS populations at high latitudes and in the Inner Galaxy. We find that PSs (resolved and unresolved) account for ~50% of the total extragalactic gamma-ray background in the energy range ~1.9 to 11.9 GeV. Within 10° of the Galactic Center with |b| ≥ 2°, we find that ~5-10% of the flux can be accounted for by a population of unresolved PSs, distributed consistently with the observed ~GeV gamma-ray excess in this region. The excess is fully absorbed by such a population, in preference to dark-matter annihilation. The inferred source population is dominated by near-threshold sources, which may be detectable in future searches. Comment: 7+22 pages, 4+18 figures; v2, minor changes, new Pass 8 data analyzed (conclusions unchanged); v3, PRL version, substantive improvements and additional checks (conclusions unchanged)

    Best Practices and Recommendations for Crowdsourced QoE - Lessons learned from the Qualinet Task Force Crowdsourcing

    Crowdsourcing is a popular approach that outsources tasks via the Internet to a large number of users. Commercial crowdsourcing platforms provide a global pool of users employed for performing short and simple online tasks. For quality assessment of multimedia services and applications, crowdsourcing enables new possibilities by moving the subjective test into the crowd, resulting in larger diversity of the test subjects, faster turnover of test campaigns, and reduced costs due to low reimbursement costs for the participants. Further, crowdsourcing makes it easy to address additional factors such as real-life usage environments. This white paper summarizes the recommendations and best practices for crowdsourced quality assessment of multimedia applications from the Qualinet Task Force on "Crowdsourcing". The European Network on Quality of Experience in Multimedia Systems and Services Qualinet (COST Action IC 1003, see www.qualinet.eu) established this task force in 2012; it now has more than 30 members. The recommendation paper resulted from the experience in designing, implementing, and conducting crowdsourcing experiments as well as the analysis of the crowdsourced user ratings and context data.

    A Proposal for a Three Detector Short-Baseline Neutrino Oscillation Program in the Fermilab Booster Neutrino Beam

    A Short-Baseline Neutrino (SBN) physics program of three LAr-TPC detectors located along the Booster Neutrino Beam (BNB) at Fermilab is presented. This new SBN Program will deliver a rich and compelling physics opportunity, including the ability to resolve a class of experimental anomalies in neutrino physics and to perform the most sensitive search to date for sterile neutrinos at the eV mass scale through both appearance and disappearance oscillation channels. Using data sets of 6.6e20 protons on target (P.O.T.) in the LAr1-ND and ICARUS T600 detectors plus 13.2e20 P.O.T. in the MicroBooNE detector, we estimate that a search for muon neutrino to electron neutrino appearance can be performed with ~5 sigma sensitivity for the LSND allowed (99% C.L.) parameter region. In this proposal for the SBN Program, we describe the physics analysis, the conceptual design of the LAr1-ND detector, the design and refurbishment of the T600 detector, the necessary infrastructure required to execute the program, and a possible reconfiguration of the BNB target and horn system to improve its performance for oscillation searches. Comment: 209 pages, 129 figures

    BCR’s CDP Digital Imaging Best Practices, Version 2.0

    This is the published version. These Best Practices, also referred to as the CDP Best Practices, have been created through the collaboration of working groups drawn from library, museum, and archive practitioners. Version 1 was created through funding from the Institute for Museum and Library Services through a grant to the University of Denver and the Colorado Digitization Program in 2003. Version 2 of the guidelines was published by BCR in 2008 and represents a significant update of practices under the leadership of its CDP Digital Imaging Best Practices Working Group. The intent has been to help standardize and share protocols governing the implementation of digital projects. The result of these collaborations is a set of best practice documents covering issues such as digital imaging, Dublin Core metadata, and digital audio. These best practice documents are intended to help with the design and implementation of digitization projects. Because they were collaboratively designed by experts in the field, and have been field tested and proven in practice, they provide reliable guidance. These best practice documents are an ongoing collaborative project, and LYRASIS will add information and new documents as they are developed.