    Inductively guided circuits for ultracold dressed atoms

    Recent progress in optics, atomic physics and materials science has paved the way to studying quantum effects in ultracold atomic alkali gases confined to non-trivial geometries. Multiply connected traps for cold atoms can be prepared by combining inhomogeneous distributions of DC and radio-frequency electromagnetic fields with optical fields that require complex systems for frequency control and stabilization. Here we propose a flexible and robust scheme that creates closed quasi-one-dimensional guides for ultracold atoms through the ‘dressing’ of hyperfine sublevels of the atomic ground state, where the dressing field is spatially modulated by inductive effects over a micro-engineered conducting loop. Remarkably, for commonly used atomic species (for example, 7Li and 87Rb), the guide operation relies entirely on controlling static and low-frequency fields in the radio-frequency and microwave regimes. This novel trapping scheme can be implemented with current technology for micro-fabrication and electronic control.

    Anatomical Network Comparison of Human Upper and Lower, Newborn and Adult, and Normal and Abnormal Limbs, with Notes on Development, Pathology and Limb Serial Homology vs. Homoplasy

    How do the various anatomical parts (modules) of the animal body evolve into very different integrated forms (integration) yet still function properly without decreasing the individual's survival? This long-standing question remains unanswered for multiple reasons, including lack of consensus about conceptual definitions and approaches, as well as a pronounced bias toward the study of hard tissues over soft tissues. A major difficulty concerns the non-trivial technical hurdles of addressing this problem, specifically the lack of quantitative tools for measuring and comparing variation across multiple disparate anatomical parts and tissue types. In this paper we apply for the first time a powerful new quantitative tool, Anatomical Network Analysis (AnNA), to examine and compare in detail the musculoskeletal modularity and integration of normal and abnormal human upper and lower limbs. In contrast to other morphological methods, the strength of AnNA is that it allows efficient and direct empirical comparisons among body parts with even vastly different architectures (e.g. upper and lower limbs) and diverse or complex tissue composition (e.g. bones, cartilages and muscles), by quantifying the spatial organization of these parts (their topological patterns relative to each other) using tools borrowed from network theory. Our results reveal similarities between the skeletal networks of the normal newborn/adult upper limb vs. lower limb, with the exception of the shoulder vs. pelvis. However, when muscles are included, the overall musculoskeletal network organization of the upper limb is strikingly different from that of the lower limb, particularly that of the more proximal structures of each limb.
Importantly, the obtained data provide further evidence, to be added to the vast amount of paleontological, gross anatomical, developmental, molecular and embryological data recently obtained, that contradicts the long-standing dogma that the upper and lower limbs are serial homologues. In addition, the AnNA of the limbs of a trisomy 18 human fetus strongly supports Pere Alberch's ill-named "logic of monsters" hypothesis, and contradicts the commonly accepted idea that birth defects often lead to lower integration (i.e. more parcellation) of anatomical structures.
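The core idea of AnNA can be illustrated with a toy network in which nodes are musculoskeletal elements and edges are physical contacts or attachments. The structures and contacts below are invented for illustration (this is not the paper's dataset, and AnNA itself involves richer community-detection analyses); the sketch only shows the kind of topological quantities, such as connectivity density and node degree, that the method compares across limbs.

```python
# Hypothetical upper-limb fragment: nodes are bones/muscles, edges are
# physical contacts or attachments (undirected, stored as neighbor sets).
contacts = {
    "humerus": {"scapula", "ulna", "radius", "biceps"},
    "scapula": {"humerus", "biceps"},
    "ulna": {"humerus", "radius"},
    "radius": {"humerus", "ulna"},
    "biceps": {"scapula", "humerus"},
}

def density(adj):
    """Fraction of all possible undirected edges that are present."""
    n = len(adj)
    edges = sum(len(nbrs) for nbrs in adj.values()) // 2
    return edges / (n * (n - 1) / 2)

def degrees(adj):
    """Number of contacts per anatomical element."""
    return {node: len(nbrs) for node, nbrs in adj.items()}

print(density(contacts))            # → 0.6 (6 of 10 possible contacts)
print(degrees(contacts)["humerus"])  # → 4 (most-connected element here)
```

Comparing such topological summaries between two networks (e.g. upper vs. lower limb) does not require the parts themselves to be homologous, which is what makes the approach applicable across vastly different architectures.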

    Preparation of Large Monodisperse Vesicles

    Preparation of monodisperse vesicles is important both for research purposes and for practical applications. While the extrusion of vesicles through small pores (∼100 nm in diameter) results in relatively uniform populations of vesicles, extrusion to larger sizes results in very heterogeneous populations of vesicles. Here we report a simple method for preparing large monodisperse multilamellar vesicles through a combination of extrusion and large-pore dialysis. For example, extrusion of polydisperse vesicles through 5-µm-diameter pores eliminates vesicles larger than 5 µm in diameter. Dialysis of extruded vesicles against 3-µm-pore-size polycarbonate membranes eliminates vesicles smaller than 3 µm in diameter, leaving behind a population of monodisperse vesicles with a mean diameter of ∼4 µm. The simplicity of this method makes it an effective tool for laboratory vesicle preparation, with potential applications in preparing large monodisperse liposomes for drug delivery.

    A Fisher-Rao Metric for curves using the information in edges

    Two curves which are close together in an image are indistinguishable given a measurement, in that there is no compelling reason to associate the measurement with one curve rather than the other. This observation is made quantitative using the parametric version of the Fisher-Rao metric. A probability density function for a measurement conditional on a curve is constructed. The distance between two curves is then defined to be the Fisher-Rao distance between the two conditional pdfs. A tractable approximation to the Fisher-Rao metric is obtained for the case in which the measurements are compound, in that they consist of a point x and an angle α which specifies the direction of an edge at x. If the curves are circles or straight lines, then the approximating metric is generalized to take account of inlying and outlying measurements. An estimate is made of the number of measurements required for the accurate location of a circle in the presence of outliers. A Bayesian algorithm for circle detection is defined. The prior density for the algorithm is obtained from the Fisher-Rao metric. The algorithm is tested on images from the CASIA Iris Interval database.
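The Fisher-Rao distance between two conditional pdfs has a well-known closed form in one simple case: the family of univariate Gaussians, whose parameter space carries a hyperbolic (Poincaré half-plane) geometry. The sketch below illustrates only that general construction, not the paper's compound point-plus-edge-angle measurement model.

```python
import math

def fisher_rao_gaussian(mu1, sigma1, mu2, sigma2):
    """Fisher-Rao distance between N(mu1, sigma1^2) and N(mu2, sigma2^2).

    Closed form: in coordinates (mu / sqrt(2), sigma) the Fisher metric of the
    Gaussian family is (twice) the Poincare half-plane metric, giving
    d = sqrt(2) * arccosh(1 + ((mu1-mu2)^2 + 2(sigma1-sigma2)^2) / (4 sigma1 sigma2)).
    """
    num = (mu1 - mu2) ** 2 + 2 * (sigma1 - sigma2) ** 2
    return math.sqrt(2) * math.acosh(1 + num / (4 * sigma1 * sigma2))

# Identical densities are at distance zero; the distance is symmetric.
print(fisher_rao_gaussian(0.0, 1.0, 0.0, 1.0))  # → 0.0
print(fisher_rao_gaussian(0.0, 1.0, 1.0, 2.0))
```

Replacing the Gaussian family by the measurement density conditional on a curve turns this same construction into a distance between curves, which is the quantity the paper approximates.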

    Quantitative conversations: the importance of developing rapport in standardised interviewing

    When developing household surveys, much emphasis is understandably placed on developing survey instruments that can elicit accurate and comparable responses. To ensure that carefully crafted questions are not undermined by ‘interviewer effects’, standardised interviewing tends to be used in preference to conversational techniques. However, drawing on a behaviour-coding analysis of survey paradata from the 2012 UK Poverty and Social Exclusion Survey, we show that in practice standardised survey interviewing often involves extensive unscripted conversation between the interviewer and the respondent. Whilst these interactions can enhance response accuracy, cooperation and ethicality, unscripted conversations can also be problematic for survey reliability and the ethical conduct of survey interviews, and they raise more basic epistemological questions concerning the degree of standardisation typically assumed within survey research. We conclude that better training in conversational techniques is necessary, even when applying standardised interviewing methodologies. We also draw out some theoretical implications regarding the usefulness of the qualitative–quantitative dichotomy.

    Multiple volcanic episodes of flood basalts caused by thermochemical mantle plumes

    The hypothesis that a single mushroom-like mantle plume head can generate a large igneous province within a few million years has been widely accepted [1]. The Siberian Traps at the Permian–Triassic boundary [2] and the Deccan Traps at the Cretaceous–Tertiary boundary [3] were probably erupted within one million years. These large eruptions have been linked to mass extinctions. But recent geochronological data [4-11] reveal more than one pulse of major eruptions, with diverse magma flux, within several flood basalts extending over tens of millions of years. This observation indicates that the processes leading to large igneous provinces are more complicated than the purely thermal, single-stage plume model suggests. Here we present numerical experiments to demonstrate that the entrainment of dense eclogite-derived material at the base of the mantle by thermal plumes can develop secondary instabilities due to the interaction between thermal and compositional buoyancy forces. The characteristic timescales of the development of the secondary instabilities and the variation of the plume strength are compatible with the observations. Such a process may contribute to multiple episodes of large igneous provinces.

    Choosing Organic Pesticides over Synthetic Pesticides May Not Effectively Mitigate Environmental Risk in Soybeans

    Background: Selection of pesticides with small ecological footprints is a key factor in developing sustainable agricultural systems. Policy guiding the selection of pesticides often emphasizes natural products and organic-certified pesticides to increase sustainability, because of the prevailing public opinion that natural products are uniformly safer, and thus more environmentally friendly, than synthetic chemicals. Methodology/Principal Findings: We report the results of a study examining the environmental impact of several new synthetic and certified organic insecticides under consideration as reduced-risk insecticides for soybean aphid (Aphis glycines) control, using established and novel methodologies to directly quantify pesticide impact in terms of biocontrol services. We found that, in addition to reduced efficacy against aphids compared to novel synthetic insecticides, organic-approved insecticides had a similar or even greater negative impact on several natural enemy species in lab studies, were more detrimental to biological control organisms in field experiments, and had higher Environmental Impact Quotients at field use rates. Conclusions/Significance: These data call into question the widely held assumption that organic pesticides are more environmentally benign than synthetic ones. All pesticides must be evaluated using an empirically based risk assessment.
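The Environmental Impact Quotient comparison at field use rates rests on a simple calculation: the per-ingredient EIQ (Kovach et al.'s scale) is scaled by the active-ingredient fraction and the application rate. A minimal sketch, with invented numbers (the study's actual products and rates are not reproduced here):

```python
def field_use_eiq(eiq, fraction_ai, rate):
    """Field Use EIQ = per-ingredient EIQ x active-ingredient fraction x rate.

    eiq         -- Environmental Impact Quotient of the active ingredient
    fraction_ai -- active-ingredient fraction of the formulated product (0-1)
    rate        -- application rate (e.g. product per unit area)
    """
    return eiq * fraction_ai * rate

# Illustrative only: a product with EIQ 30, 25% a.i., applied at 2 units/area.
print(field_use_eiq(30.0, 0.25, 2.0))  # → 15.0
```

The point the comparison exploits is that an organic product with a modest per-ingredient EIQ can still score worse at field use rates if it must be applied at higher rates or more frequently than a synthetic alternative.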

    Azimuthal anisotropy and correlations at large transverse momenta in p+p and Au+Au collisions at sqrt(s_NN) = 200 GeV

    Results on high transverse momentum charged particle emission with respect to the reaction plane are presented for Au+Au collisions at sqrt(s_NN) = 200 GeV. Two- and four-particle correlation results are presented, as well as a comparison of azimuthal correlations in Au+Au collisions to those in p+p at the same energy. Elliptic anisotropy, v_2, is found to reach its maximum at p_t ~ 3 GeV/c, then decrease slowly and remain significant up to p_t ≈ 7–10 GeV/c. Stronger suppression is found in the back-to-back high-p_t particle correlations for particles emitted out-of-plane compared to those emitted in-plane. The centrality dependence of v_2 at intermediate p_t is compared to simple models based on jet quenching. Comment: 4 figures. Published version as PRL 93, 252301 (2004).
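The elliptic anisotropy v_2 is, by definition, the second Fourier coefficient of the azimuthal distribution relative to the reaction plane, v_2 = <cos 2(φ − Ψ)>. A toy Monte Carlo (not the STAR analysis, which uses multi-particle cumulants to suppress nonflow) shows how the coefficient is recovered from sampled angles:

```python
import math
import random

def sample_phi(v2, psi, n, rng):
    """Rejection-sample azimuthal angles from dN/dphi ∝ 1 + 2 v2 cos 2(phi - psi)."""
    out = []
    fmax = 1 + 2 * v2  # maximum of the (unnormalized) distribution
    while len(out) < n:
        phi = rng.uniform(0.0, 2 * math.pi)
        if rng.uniform(0.0, fmax) < 1 + 2 * v2 * math.cos(2 * (phi - psi)):
            out.append(phi)
    return out

def measure_v2(phis, psi):
    """Estimate v2 as the event average <cos 2(phi - psi)>."""
    return sum(math.cos(2 * (p - psi)) for p in phis) / len(phis)

rng = random.Random(1)                      # fixed seed for reproducibility
phis = sample_phi(0.10, 0.0, 20000, rng)    # generate with v2 = 0.10
print(measure_v2(phis, 0.0))                # ≈ 0.10
```

With the reaction plane Ψ known exactly, as in this toy, the estimator is unbiased; the experimental difficulty is that Ψ must itself be reconstructed from the particles, which is what the two- and four-particle correlation methods address.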

    Azimuthal anisotropy in Au+Au collisions at sqrt(s_NN) = 200 GeV

    The results from the STAR Collaboration on directed flow (v_1), elliptic flow (v_2), and the fourth harmonic (v_4) in the anisotropic azimuthal distribution of particles from Au+Au collisions at sqrt(s_NN) = 200 GeV are summarized and compared with results from other experiments and theoretical models. Results for identified particles are presented and fit with a Blast Wave model. Different anisotropic flow analysis methods are compared and nonflow effects are extracted from the data. For v_2, scaling with the number of constituent quarks and parton coalescence is discussed. For v_4, scaling with v_2^2 and quark coalescence is discussed. Comment: 26 pages. As accepted by Phys. Rev. C. Text rearranged, figures modified, but data the same. However, in Fig. 35 the hydro calculations are corrected in this version. The data tables are available at http://www.star.bnl.gov/central/publications/ by searching for "flow" and then this paper.

    A Climatic Stability Approach to Prioritizing Global Conservation Investments

    Climate change is impacting species and ecosystems globally. Many existing templates to identify the most important areas to conserve terrestrial biodiversity at the global scale neglect the future impacts of climate change. Unstable climatic conditions are predicted to undermine conservation investments in the future. This paper presents an approach to developing a resource allocation algorithm for conservation investment that incorporates the ecological stability of ecoregions under climate change. We discover that allocating funds in this way changes the optimal schedule of global investments both spatially and temporally. This allocation reduces the biodiversity loss of terrestrial endemic species from protected areas due to climate change by 22% for the period 2002–2052, when compared to allocations that do not consider climate change. To maximize the resilience of global biodiversity to climate change, we recommend that funding be increased in ecoregions located in the tropics and/or mid-elevation habitats, where climatic conditions are predicted to remain relatively stable. Accounting for the ecological stability of ecoregions provides a realistic approach to incorporating climate change into global conservation planning, with potential to save more species from extinction in the long term.
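The flavor of a stability-weighted resource allocation algorithm can be sketched with a greedy scheduler that spends a budget in small increments on whichever ecoregion currently offers the best return (species protected per unit spend) weighted by climatic stability, with diminishing marginal returns. All region names, numbers, and the diminishing-returns rule are invented; the paper's actual algorithm is substantially richer.

```python
# Hypothetical ecoregions: (name, endemic species protected per unit spend,
# climatic stability score in [0, 1]).
regions = [
    ("tropical-montane",      12.0, 0.90),
    ("lowland-temperate",     15.0, 0.40),
    ("mid-elevation-tropics", 10.0, 0.95),
]

def allocate(regions, budget, step=1.0):
    """Greedy allocation: each step funds the region with the highest
    stability-weighted marginal return, which declines as spend accumulates."""
    spend = {name: 0.0 for name, _, _ in regions}
    for _ in range(int(budget / step)):
        best = max(regions, key=lambda r: r[1] * r[2] / (1 + spend[r[0]]))
        spend[best[0]] += step
    return spend

plan = allocate(regions, 10.0)
print(plan)  # stable tropical/mid-elevation regions absorb most of the budget
```

Even in this toy, the stability weight shifts funds away from the high-richness but climatically unstable lowland region, which is the qualitative behavior the paper reports for its optimized schedule.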