
    Ultrahigh and persistent optical depths of caesium in Kagomé-type hollow-core photonic crystal fibres

    Alkali-filled hollow-core fibres are a promising medium for investigating light-matter interactions, especially at the single-photon level, due to the tight confinement of light and the high optical depths achievable by light-induced atomic desorption. However, until now these large optical depths could only be generated for seconds at most once per day, severely limiting the practicality of the technology. Here we report the generation of the highest observed transient ($>10^5$ for up to a minute) and highest observed persistent ($>2000$ for hours) optical depths of alkali vapours in a light-guiding geometry to date, using a caesium-filled Kagomé-type hollow-core photonic crystal fibre. Our results pave the way to light-matter interaction experiments in confined geometries requiring long operation times and large atomic number densities, such as the generation of single-photon-level nonlinearities and the development of single-photon quantum memories. Comment: Author Accepted version
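
    For scale, optical depth (OD) fixes the on-resonance transmission through the medium via the Beer-Lambert relation $T = e^{-\mathrm{OD}}$. A minimal sketch (not from the paper) of what the reported depths imply:

    ```python
    import math

    def log10_transmission(od: float) -> float:
        """Base-10 logarithm of the on-resonance transmission for optical
        depth `od`, via the Beer-Lambert relation T = exp(-OD)."""
        return -od / math.log(10)

    # The reported persistent OD > 2000 already corresponds to a resonant
    # transmission of about 10^-869 -- effectively total extinction.
    for od in (1, 100, 2000, 1e5):
        print(f"OD = {od:g} -> T ~ 10^{log10_transmission(od):.0f}")
    ```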

    Erratum to: Model Convolution: A Computational Approach to Digital Image Interpretation


    Model Convolution: A Computational Approach to Digital Image Interpretation

    Digital fluorescence microscopy is commonly used to track individual proteins and their dynamics in living cells. However, extracting molecule-specific information from fluorescence images is often limited by the noise and blur intrinsic to the cell and the imaging system. Here we discuss a method called “model-convolution,” which uses experimentally measured noise and blur to simulate the process of imaging fluorescent proteins whose spatial distribution cannot be resolved. We then compare model-convolution to the more standard approach of experimental deconvolution. In some circumstances, standard experimental deconvolution approaches fail to yield the correct underlying fluorophore distribution. In these situations, model-convolution removes the uncertainty associated with deconvolution and therefore allows direct statistical comparison of experimental and theoretical data. Thus, if there are structural constraints on molecular organization, the model-convolution method better utilizes information gathered via fluorescence microscopy, and naturally integrates experiment and theory.
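
    As a rough illustration of the model-convolution idea, the sketch below simulates the image of a hypothesized fluorophore distribution by blurring it with a Gaussian point-spread function and adding background and noise. The PSF width, background, and noise levels are placeholder values standing in for experimentally measured ones; this is not the authors' code:

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(seed=0)

    def model_convolve(model, psf_sigma, noise_sigma, background=100.0):
        """Simulate imaging of a model fluorophore distribution: blur with a
        Gaussian point-spread function of width `psf_sigma` (pixels), then
        add the measured camera background and Gaussian noise."""
        blurred = gaussian_filter(model, sigma=psf_sigma)
        return blurred + background + rng.normal(0.0, noise_sigma, size=model.shape)

    # Hypothesized distribution: two point sources separated by less than
    # the PSF width, i.e. below the optical resolution limit.
    model = np.zeros((64, 64))
    model[32, 30] = model[32, 34] = 1000.0

    simulated = model_convolve(model, psf_sigma=2.0, noise_sigma=5.0)
    # `simulated` can now be compared statistically against experimental
    # images, rather than deconvolving the experimental data.
    ```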

    Realistic triaxial density--potential--force profiles for stellar systems and dark matter halos

    Popular models for describing the luminosity-density profiles of dynamically hot stellar systems (e.g., Jaffe, Hernquist, Dehnen) were constructed to match the deprojected form of de Vaucouleurs' $R^{1/4}$ light-profile. However, we now know that elliptical galaxies and bulges display a mass-dependent range of structural profiles. To compensate for this, the model in Terzic & Graham was designed to closely match the deprojected form of Sersic $R^{1/n}$ light-profiles, including deprojected exponential light-profiles and galaxies with partially depleted cores. It is thus applicable for describing bulges in spiral galaxies, dwarf elliptical galaxies, both "power-law" and "core" elliptical galaxies, and also dark matter halos formed in $\Lambda$CDM cosmological simulations. In this paper, we present a new family of triaxial density-potential-force triplets, which generalizes the spherical model reported in Terzic & Graham to three dimensions. If the (optional) power-law core is present, it is a 5-parameter family; in the absence of the core it reduces to 3 parameters. The isodensity contours of the new family are stratified on confocal ellipsoids, and the potential and forces are expressed in terms of integrals that are easy to evaluate numerically. We provide the community with a suite of numerical routines for orbit integration, featuring optimized computations of the potential and forces for this family, the ability to run simulations on parallel platforms, and a modular, easily editable design. Comment: Accepted for publication in MNRAS. 13 pages, including 6 figures
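
    A minimal sketch of the ellipsoidal-stratification idea behind such models; the axis ratios and the double power-law profile below are placeholders, not the Sersic-matched Terzic & Graham profile or the confocal construction presented in the paper:

    ```python
    import numpy as np

    def ellipsoidal_radius(x, y, z, a=1.0, b=0.8, c=0.6):
        """Ellipsoidal coordinate m on which the isodensity surfaces are
        stratified: m^2 = x^2/a^2 + y^2/b^2 + z^2/c^2."""
        return np.sqrt((x / a) ** 2 + (y / b) ** 2 + (z / c) ** 2)

    def density(m, rho0=1.0, gamma=1.0, beta=4.0):
        """Placeholder double power-law profile: rho ~ m^-gamma at small m
        and rho ~ m^-beta at large m. The paper instead uses a Sersic-matched
        profile with an optional power-law core."""
        return rho0 * m ** (-gamma) * (1.0 + m) ** (gamma - beta)

    # Density at an off-axis point of the triaxial model:
    m = ellipsoidal_radius(0.5, 0.3, 0.1)
    print(density(m))
    ```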

    Quantum-over-classical Advantage in Solving Multiplayer Games

    We study the applicability of quantum algorithms in computational game theory and generalize some results related to Subtraction games, which are sometimes referred to as one-heap Nim games. In quantum game theory, a subset of Subtraction games became the first explicitly defined class of zero-sum combinatorial games with a provable separation between the quantum and classical complexity of solving them. For a narrower subset of Subtraction games, an exact quantum sublinear algorithm is known that surpasses all deterministic algorithms for finding solutions with probability 1. Typically, both Nim and Subtraction games are defined for only two players. We extend some known results to games for three or more players, while maintaining the same classical and quantum complexities: $\Theta(n^2)$ and $\tilde{O}(n^{1.5})$, respectively.
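
    For reference, the classical baseline: a one-heap game with an arbitrary move table can be solved by dynamic programming over winning and losing positions, which in the worst case inspects all $O(n^2)$ position pairs. A sketch of the standard two-player version (the multiplayer generalization is the paper's subject):

    ```python
    def solve_subtraction_game(n, allowed):
        """Classical dynamic programming for a two-player one-heap game.
        `allowed(i, j)` says whether a move from i tokens down to j < i
        tokens is legal. Position i is losing iff every legal move leads
        to a winning position; an arbitrary move table costs O(n^2) queries."""
        winning = [False] * (n + 1)  # winning[i]: the player to move at i wins
        for i in range(1, n + 1):
            winning[i] = any(allowed(i, j) and not winning[j] for j in range(i))
        return winning

    # Example: remove 1, 2, or 3 tokens per turn.
    win = solve_subtraction_game(10, lambda i, j: 1 <= i - j <= 3)
    print([i for i in range(11) if not win[i]])  # losing positions: 0, 4, 8
    ```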

    Regulation of Signaling at Regions of Cell-Cell Contact by Endoplasmic Reticulum-Bound Protein-Tyrosine Phosphatase 1B

    Protein-tyrosine phosphatase 1B (PTP1B) is a ubiquitously expressed PTP that is anchored to the endoplasmic reticulum (ER). PTP1B dephosphorylates activated receptor tyrosine kinases after endocytosis, as they transit past the ER. However, PTP1B also can access some plasma membrane (PM)-bound substrates at points of cell-cell contact. To explore how PTP1B interacts with such substrates, we utilized quantitative cellular imaging approaches and mathematical modeling of protein mobility. We find that the ER network comes in close proximity to the PM at apparently specialized regions of cell-cell contact, enabling PTP1B to engage substrate(s) at these sites. Studies using PTP1B mutants show that the ER anchor plays an important role in restricting its interactions with PM substrates mainly to regions of cell-cell contact. In addition, treatment with a PTP1B inhibitor leads to increased tyrosine phosphorylation of EphA2, a PTP1B substrate, specifically at regions of cell-cell contact. Collectively, our results identify PM-proximal sub-regions of the ER as important sites of cellular signaling regulation by PTP1B.

    Systematic review on quality control for drug management programs: Is quality reported in the literature?

    Background: Maintaining quality of care while managing limited healthcare resources is an ongoing challenge in healthcare. The objective of this study was to evaluate how the impact of drug management programs is reported in the literature and to identify potentially existing quality standards.
    Methods: This analysis relates to the published research on the impact of drug management on economic, clinical, or humanistic outcomes in managed care, indemnity insurance, VA, or Medicaid settings in the USA, published between 1996 and 2007. Included articles were systematically analyzed for study objective, study endpoints, and drug management type. They were further categorized by drug management tool, primary objective, and study endpoints.
    Results: None of the 76 included publications assessed the overall quality of drug management tools. The impact of 9 different drug management tools, used alone or in combination, was studied in pharmacy claims, medical claims, electronic medical records, or survey data from either the patient, plan, or provider perspective, using an average of 2.1 of 11 possible endpoints. A total of 68% of the studies reported the impact on plan-focused endpoints, while the clinical, patient, and provider perspectives were studied to a much lower degree (45%, 42%, and 12% of the studies, respectively). Health outcomes were only accounted for in 9.2% of the studies.
    Conclusion: Comprehensive assessment of quality considering plan, patient, and clinical outcomes is not yet applied. There is no defined quality standard. Benchmarks including health outcomes should be determined and used to improve the overall clinical and economic effectiveness of drug management programs.