
    All-Versus-Nothing Proof of Einstein-Podolsky-Rosen Steering

    Einstein-Podolsky-Rosen steering is a form of quantum nonlocality intermediate between entanglement and Bell nonlocality. Although Schrödinger already mooted the idea in 1935, steering still defies a complete understanding. In analogy to "all-versus-nothing" proofs of Bell nonlocality, here we present a proof of steering without inequalities, rendering the detection of correlations that violate steering inequalities unnecessary. We show that, given any two-qubit entangled state, the existence of a projective measurement by Alice such that Bob's normalized conditional states can be regarded as two different pure states provides a criterion for Alice-to-Bob steerability. A steering inequality equivalent to the all-versus-nothing proof is also obtained. Our result clearly demonstrates that there exist many quantum states which do not violate any previously known steering inequality but are indeed steerable. Our method offers advantages over existing methods for experimentally testing steerability, and sheds new light on the asymmetric steering problem. Comment: 7 pages, 2 figures. Accepted in Sci. Re
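    A minimal numerical sketch of the kind of check described above, not the paper's actual construction: it assumes an illustrative pure entangled state cos(t)|00> + sin(t)|11> and an arbitrarily chosen projective measurement angle for Alice, then verifies that Bob's normalized conditional states are (two different) pure states by computing their purity.

```python
import numpy as np

# Illustrative two-qubit entangled state |psi> = cos(t)|00> + sin(t)|11>
t = 0.3
psi = np.zeros(4)
psi[0], psi[3] = np.cos(t), np.sin(t)   # basis order |00>, |01>, |10>, |11>
rho = np.outer(psi, psi.conj())

# Alice measures in an (arbitrarily chosen) projective basis {|a0>, |a1>}
theta = 0.7
a0 = np.array([np.cos(theta), np.sin(theta)])
a1 = np.array([-np.sin(theta), np.cos(theta)])

for a in (a0, a1):
    # Projector |a><a| on Alice's qubit, identity on Bob's
    P = np.kron(np.outer(a, a.conj()), np.eye(2))
    p = np.real(np.trace(P @ rho))                # probability of this outcome
    post = P @ rho @ P / p                        # post-measurement global state
    # Partial trace over Alice gives Bob's normalized conditional state
    rho_b = np.zeros((2, 2), dtype=complex)
    for i in range(2):
        for j in range(2):
            rho_b[i, j] = post[i, j] + post[2 + i, 2 + j]
    purity = np.real(np.trace(rho_b @ rho_b))     # purity 1 means a pure conditional state
    print(f"outcome probability {p:.3f}, purity of Bob's conditional state {purity:.3f}")
```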

    Fish Oil Supplementation During Late Pregnancy Does Not Influence Plasma Lipids or Lipoprotein Levels in Young Adult Offspring

    Nutritional influences on cardiovascular disease operate throughout life. Studies in both experimental animals and humans have suggested that changes in peri- and early postnatal nutrition can affect the development of the various components of the metabolic syndrome in adult life. This has led to the hypothesis that n-3 fatty acid supplementation in pregnancy may have a beneficial effect on the lipid profile of the offspring. The aim of the present study was to investigate the effect of supplementation with n-3 fatty acids during the third trimester of pregnancy on lipids and lipoproteins in the 19-year-old offspring. The study was based on the follow-up of a randomized controlled trial from 1990 in which 533 pregnant women were randomized to fish oil (n = 266), olive oil (n = 136) or no oil (n = 131). In 2009, the offspring were invited to a physical examination including blood sampling. A total of 243 of the offspring participated. Lipid values did not differ between the fish oil and olive oil groups. The relative adjusted difference (95% confidence interval) in lipid concentrations was −3% (−11; 7) for LDL cholesterol, 3% (−3; 10) for HDL cholesterol, −1% (−6; 5) for total cholesterol, −4% (−16; 10) for TAG concentrations, 2% (−2; 7) for apolipoprotein A1, −1% (−9; 7) for apolipoprotein B and 3% (−7; 15) in the relative abundance of small dense LDL. In conclusion, there was no effect of fish oil supplementation during the third trimester of pregnancy on offspring plasma lipids and lipoproteins in adolescence.
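    A hedged sketch of how a "relative adjusted difference" with a 95% confidence interval is commonly obtained: regress log-transformed lipid values on treatment group plus covariates and back-transform the group coefficient as exp(beta) − 1. The data, covariate, and group coding below are invented for illustration and are not the study's analysis code.

```python
import numpy as np
import statsmodels.api as sm

# Simulated, purely illustrative data
rng = np.random.default_rng(0)
n = 200
group = rng.integers(0, 2, n)        # 1 = fish oil, 0 = olive oil (assumed coding)
sex = rng.integers(0, 2, n)          # example adjustment covariate
log_ldl = 1.0 - 0.03 * group + 0.05 * sex + rng.normal(0, 0.25, n)

# Linear model on the log scale; exp(beta) - 1 is the relative difference
X = sm.add_constant(np.column_stack([group, sex]))
fit = sm.OLS(log_ldl, X).fit()
beta = fit.params[1]
lo, hi = fit.conf_int()[1]
print(f"relative adjusted difference {100 * (np.exp(beta) - 1):+.0f}% "
      f"(95% CI {100 * (np.exp(lo) - 1):.0f}; {100 * (np.exp(hi) - 1):.0f})")
```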

    The native architecture of a photosynthetic membrane

    In photosynthesis, the harvesting of solar energy and its subsequent conversion into a stable charge separation are dependent upon an interconnected macromolecular network of membrane-associated chlorophyll–protein complexes. Although the detailed structure of each complex has been determined, the size and organization of this network are unknown. Here we show the use of atomic force microscopy to directly reveal a native bacterial photosynthetic membrane. This first view of any multi-component membrane shows the relative positions and associations of the photosynthetic complexes and reveals crucial new features of the organization of the network: we found that the membrane is divided into specialized domains, each with a different network organization and in which one type of complex predominates. Two types of organization were found for the peripheral light-harvesting LH2 complex. In the first, groups of 10–20 molecules of LH2 form light-capture domains that interconnect linear arrays of dimers of core reaction centre (RC)–light-harvesting 1 (RC–LH1–PufX) complexes; in the second, they were found outside these arrays in larger clusters. The LH1 complex is ideally positioned to function as an energy collection hub, temporarily storing energy before transfer to the RC where photochemistry occurs: the elegant economy of the photosynthetic membrane is demonstrated by the close packing of these linear arrays, which are often only separated by narrow 'energy conduits' of LH2 just two or three complexes wide.

    Scalar and vector Slepian functions, spherical signal estimation and spectral analysis

    It is a well-known fact that mathematical functions that are timelimited (or spacelimited) cannot be simultaneously bandlimited (in frequency). Yet the finite precision of measurement and computation unavoidably bandlimits our observation and modeling of scientific data, and we often only have access to, or are only interested in, a study area that is temporally or spatially bounded. In the geosciences we may be interested in spectrally modeling a time series defined only on a certain interval, or we may want to characterize a specific geographical area observed using an effectively bandlimited measurement device. It is clear that analyzing and representing scientific data of this kind will be facilitated if a basis of functions can be found that are "spatiospectrally" concentrated, i.e. "localized" in both domains at the same time. Here, we give a theoretical overview of one particular approach to this "concentration" problem, as originally proposed for time series by Slepian and coworkers in the 1960s. We show how this framework leads to practical algorithms and statistically performant methods for the analysis of signals and their power spectra in one and two dimensions, and, particularly for applications in the geosciences, for scalar and vectorial signals defined on the surface of a unit sphere. Comment: Submitted to the 2nd Edition of the Handbook of Geomathematics, edited by Willi Freeden, Zuhair M. Nashed and Thomas Sonar, and to be published by Springer Verlag. This is a slightly modified but expanded version of the paper arxiv:0909.5368 that appeared in the 1st Edition of the Handbook, when it was called: Slepian functions and their use in signal estimation and spectral analysis
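    The discrete, one-dimensional version of this concentration problem is solved by the discrete prolate spheroidal (Slepian) sequences, which are available in SciPy. The sketch below uses illustrative parameter choices (window length, time-halfbandwidth product, and number of tapers are not taken from the paper) to show how the concentration eigenvalues quantify how much of each timelimited taper's energy falls inside the target frequency band.

```python
import numpy as np
from scipy.signal.windows import dpss

# Illustrative parameters for the 1-D discrete concentration problem
N = 256      # window length (timelimited to N samples)
NW = 4.0     # time-halfbandwidth product
K = 7        # number of tapers; roughly 2*NW - 1 are well concentrated

tapers, concentrations = dpss(N, NW, Kmax=K, return_ratios=True)

# Each ratio is the fraction of that taper's energy inside |f| <= NW/N;
# the first ~2NW - 1 eigenvalues are very close to 1, then they fall off sharply.
for k, lam in enumerate(concentrations):
    print(f"taper {k}: spectral concentration = {lam:.6f}")
```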

    Genetic support for a quantitative trait nucleotide in the ABCG2 gene affecting milk composition of dairy cattle

    Background: Our group has previously identified a quantitative trait locus (QTL) affecting fat and protein percentages on bovine chromosome 6, and refined the QTL position to a 420-kb interval containing six genes. Studies performed in other cattle populations have proposed polymorphisms in two different genes (ABCG2 and OPN) as the underlying functional QTL nucleotide. Due to these conflicting results, we have included these QTNs, together with a large collection of new SNPs produced from PCR sequencing, in a dense marker map spanning the QTL region, and reanalyzed the data using a combined linkage and linkage disequilibrium approach. Results: Our results clearly exclude the OPN SNP (OPN_3907) as the causal site for the QTL. Among the 91 SNPs included in the study, the ABCG2 SNP (ABCG2_49) is clearly the best QTN candidate. The analyses revealed the presence of only one QTL for the percentage traits in the tested region. This QTL was completely removed by correcting the analysis for ABCG2_49. Concordance between the sires' marker genotypes and segregation status for the QTL was found for ABCG2_49 only. The C allele of ABCG2_49 is found in a marker haplotype that has an extremely negative effect on fat and protein percentages and a positive effect on milk yield. Of the 91 SNPs, ABCG2_49 was the only marker in perfect linkage disequilibrium with the QTL. Conclusion: Based on our results, OPN_3907 can be excluded as the polymorphism underlying the QTL. The results of this and other papers strongly suggest the [A/C] mutation in ABCG2_49 as the causal mutation, although the possibility that ABCG2_49 is only a marker in perfect LD with the true mutation cannot be completely ruled out.
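    A small sketch, with made-up phased haplotype data, of the r² statistic behind statements such as "the only marker in perfect linkage disequilibrium with the QTL"; the allele coding and simulated haplotypes below are assumptions for illustration, not the study's data or software.

```python
import numpy as np

def r_squared(hap_a, hap_b):
    """LD r^2 between two biallelic loci from phased haplotype alleles coded 0/1."""
    pa, pb = hap_a.mean(), hap_b.mean()
    d = (hap_a * hap_b).mean() - pa * pb              # disequilibrium coefficient D
    return d ** 2 / (pa * (1 - pa) * pb * (1 - pb))   # r^2 = D^2 / (pA qA pB qB)

# Made-up phased haplotypes (1 = minor allele) for 500 chromosomes
rng = np.random.default_rng(1)
qtl = rng.integers(0, 2, 500)
snp_perfect = qtl.copy()                              # identical alleles -> r^2 = 1
snp_partial = np.where(rng.random(500) < 0.9, qtl, 1 - qtl)

print("r^2 with a perfectly linked SNP:", r_squared(qtl, snp_perfect))
print("r^2 with a partially linked SNP:", round(r_squared(qtl, snp_partial), 3))
```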

    A human MAP kinase interactome.

    Mitogen-activated protein kinase (MAPK) pathways form the backbone of signal transduction in the mammalian cell. Here we applied a systematic experimental and computational approach to map 2,269 interactions between human MAPK-related proteins and other cellular machinery and to assemble these data into functional modules. Multiple lines of evidence including conservation with yeast supported a core network of 641 interactions. Using small interfering RNA knockdowns, we observed that approximately one-third of MAPK-interacting proteins modulated MAPK-mediated signaling. We uncovered the Na-H exchanger NHE1 as a potential MAPK scaffold, found links between HSP90 chaperones and MAPK pathways and identified MUC12 as the human analog to the yeast signaling mucin Msb2. This study makes available a large resource of MAPK interactions and clone libraries, and it illustrates a methodology for probing signaling networks based on functional refinement of experimentally derived protein-interaction maps.

    Quality of medication use in primary care - mapping the problem, working to a solution: a systematic review of the literature

    Background: The UK, USA and the World Health Organization have identified improved patient safety in healthcare as a priority. Medication error has been identified as one of the most frequent forms of medical error and is associated with significant medical harm. Errors are the result of the systems that produce them. In industrial settings, a range of systematic techniques have been designed to reduce error and waste, and the first stage of these processes is to map out the whole system and its reliability at each stage. To date, however, studies of medication error and solutions have concentrated on individual parts of the whole system. In this paper we conducted a systematic review of the literature in order to map out the medication system with its associated errors and failures in quality, to assess the strength of the evidence, and to use approaches from quality management to identify ways in which the system could be made safer. Methods: We mapped out the medicines management system in primary care in the UK. We conducted a systematic literature review in order to refine our map of the system and to establish the quality of the research and the reliability of the system. Results: The map demonstrated that the proportion of errors in the management system for medicines in primary care is very high. Several stages of the process had error rates of 50% or more: repeat prescribing reviews, interface prescribing and communication, and patient adherence. When the efficacy of the medicine is included in the system, the available evidence suggested that only between 4% and 21% of patients achieved the optimum benefit from their medication. Whilst there were some limitations in the evidence base, including the error rate measurement and the sampling strategies employed, there was sufficient information to indicate the ways in which the system could be improved using management approaches. The first step to improving the overall quality would be routine monitoring of adherence, clinical effectiveness and hospital admissions. Conclusion: By adopting a whole system approach from a management perspective we have found where failures in quality occur in medication use in primary care in the UK, and where weaknesses occur in the associated evidence base. Quality management approaches have allowed us to develop a coherent change and research agenda in order to tackle these, so far, fairly intractable problems.
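    The whole-system arithmetic behind figures like "only between 4% and 21% of patients achieved the optimum benefit" can be illustrated by chaining per-stage reliabilities: the end-to-end proportion is at most the product of the stage reliabilities. The stage names and numbers below are invented placeholders, not the review's measured rates.

```python
# Illustrative reliability chain for a medicines-management process.
# All values are hypothetical; only the multiplication logic is the point.
stages = {
    "prescribing decision":      0.85,
    "repeat prescribing review": 0.50,   # several stages had error rates of 50% or more
    "interface communication":   0.50,
    "dispensing":                0.95,
    "patient adherence":         0.50,
    "intrinsic drug efficacy":   0.60,
}

overall = 1.0
for stage, reliability in stages.items():
    overall *= reliability
    print(f"{stage:28s} reliability {reliability:.2f} -> cumulative {overall:.2%}")
```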

    Estimation of the national disease burden of influenza-associated severe acute respiratory illness in Kenya and Guatemala : a novel methodology

    Background: Knowing the national disease burden of severe influenza in low-income countries can inform policy decisions around influenza treatment and prevention. We present a novel methodology using locally generated data for estimating this burden. Methods and Findings: This method begins with calculating the hospitalized severe acute respiratory illness (SARI) incidence for children <5 years old and persons ≥5 years old from population-based surveillance in one province. This base rate of SARI is then adjusted for each province based on the prevalence of risk factors and healthcare-seeking behavior. The percentage of SARI with influenza virus detected is determined from provincial-level sentinel surveillance and applied to the adjusted provincial rates of hospitalized SARI. Healthcare-seeking data from healthcare utilization surveys are used to estimate non-hospitalized influenza-associated SARI. Rates of hospitalized and non-hospitalized influenza-associated SARI are applied to census data to calculate the national number of cases. The method was field-tested in Kenya, and validated in Guatemala, using data from August 2009–July 2011. In Kenya (2009 population 38.6 million persons), the annual number of hospitalized influenza-associated SARI cases ranged from 17,129 to 27,659 for children <5 years old (2.9–4.7 per 1,000 persons) and from 6,882 to 7,836 for persons ≥5 years old (0.21–0.24 per 1,000 persons), depending on year and base rate used. In Guatemala (2011 population 14.7 million persons), the annual number of hospitalized cases of influenza-associated pneumonia ranged from 1,065 to 2,259 (0.5–1.0 per 1,000 persons) among children <5 years old and from 779 to 2,252 cases (0.1–0.2 per 1,000 persons) for persons ≥5 years old, depending on year and base rate used. In both countries, the number of non-hospitalized influenza-associated cases was several-fold higher than the hospitalized cases. Conclusions: Influenza virus was associated with a substantial amount of severe disease in Kenya and Guatemala. This method can be performed in most low and lower-middle income countries.
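    A minimal sketch of the extrapolation arithmetic described above, with entirely hypothetical provinces, base rates, adjustment factors and positivity percentages (the real method derives these from population-based surveillance, sentinel surveillance, and healthcare-utilization surveys).

```python
# Hypothetical base rate from the sentinel province (children <5 years):
# hospitalized SARI per person-year.
base_sari_rate_under5 = 12.0 / 1000

provinces = {
    # name: (population <5y, adjustment for risk factors & care seeking,
    #        fraction of SARI positive for influenza, fraction of SARI hospitalized)
    "Province A": (400_000, 1.00, 0.10, 0.40),
    "Province B": (250_000, 0.80, 0.12, 0.25),
    "Province C": (600_000, 1.30, 0.08, 0.30),
}

total_hosp = total_nonhosp = 0.0
for name, (pop, adjust, flu_pos, hosp_frac) in provinces.items():
    hosp_flu_sari = base_sari_rate_under5 * adjust * flu_pos * pop
    # Non-hospitalized cases inferred from the fraction of SARI that seeks hospital care
    nonhosp_flu_sari = hosp_flu_sari * (1 - hosp_frac) / hosp_frac
    total_hosp += hosp_flu_sari
    total_nonhosp += nonhosp_flu_sari
    print(f"{name}: {hosp_flu_sari:,.0f} hospitalized, {nonhosp_flu_sari:,.0f} non-hospitalized")

print(f"National (<5y): {total_hosp:,.0f} hospitalized, {total_nonhosp:,.0f} non-hospitalized")
```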

    Dynamic Computational Model Suggests That Cellular Citizenship Is Fundamental for Selective Tumor Apoptosis

    Computational models in the field of cancer research have focused primarily on estimates of biological events based on laboratory generated data. We introduce a novel in-silico technology that takes us to the next level of prediction models and facilitates innovative solutions through a mathematical system. The model's building blocks are cells defined phenotypically as normal or tumor, with biological processes translated into equations describing the life protocols of the cells in a quantitative and stochastic manner. The essentials of communication in a society composed of normal and tumor cells are explored to reveal “protocols” for selective tumor eradication. Results consistently identify “citizenship properties” among cells that are essential for the induction of healing processes in a healthy system invaded by cancer. These properties act via inter-cellular communication protocols that can be optimized to induce tumor eradication along with system recovery. Within the computational systems, the protocols universally succeed in removing a wide variety of tumors defined by proliferation rates, initial volumes, and apoptosis-resistant phenotypes; they show high adaptability to biological details and allow incorporation of population heterogeneity. These protocols work as long as at least 32% of cells obey extra-cellular commands and at least 28% of cancer cells report their deaths. This low percentage implies that the protocols are resilient to the suboptimal situations often seen in biological systems. We conclude that our in-silico model is a powerful tool to investigate, to propose, and to exercise logical anti-cancer solutions. Functional results should be confirmed in a biological system and molecular findings should be loaded into the computational model for the next level of directed experiments.
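    A toy stochastic sketch only, not the authors' model: it caricatures the "citizenship" thresholds by treating eradication as repeated kill commands, where each tumor cell obeys a command with probability p_obey and a dead cell reports its death (triggering the next command) with probability p_report. All dynamics, parameters, and the stalling rule are assumptions made for illustration.

```python
import numpy as np

def eradicated(p_obey, p_report, n_tumor=1000, max_rounds=200, rng=None):
    """Return True if the toy protocol clears all tumor cells before stalling."""
    rng = rng or np.random.default_rng(0)
    alive = n_tumor
    for _ in range(max_rounds):
        killed = rng.binomial(alive, p_obey)          # cells obeying the kill command
        reported = rng.binomial(killed, p_report) if killed else 0
        alive -= killed
        if alive == 0:
            return True
        if reported == 0:                             # no death reports -> no next command
            return False
    return False

# Compliance below / around / above the kind of threshold quoted in the abstract
for p_obey in (0.10, 0.32, 0.60):
    print(f"p_obey={p_obey:.2f}: eradicated =", eradicated(p_obey, p_report=0.28))
```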