    Cardiorespiratory Responses during Aquatic Treadmill Exercise and Land Treadmill Exercise in Adults with Diabetes

    The purpose of this study was to compare the effect of aquatic treadmill (ATM) exercise to land treadmill (LTM) exercise in adults with type 2 diabetes. Five participants with type 2 diabetes (T2D group; 4 females, 1 male; age = 51±6 years; height = 170±7 cm; weight = 96±24 kg; body fat = 31.6±2.2%) and five participants without type 2 diabetes (control group; 4 females, 1 male; age = 51±6 years; height = 170±6 cm; weight = 71±15 kg; body fat = 26.8±4.6%) completed the study. Protocols for both ATM exercise and LTM exercise began at 2 mph with 0% grade and increased by 1 mph after 5 minutes at each stage. Termination occurred after participants completed the protocol or reached 85% of heart rate reserve. Heart rate, absolute and relative VO2, and systolic and diastolic blood pressure were measured at rest and during steady-state exercise at each intensity. Mean arterial pressure (MAP) was calculated. A 2 x 2 x 3 mixed factorial ANOVA and Bonferroni post hoc tests with a significance level of .0125 were used. There was a significant difference (p < .0125) in the relative VO2 of the two groups at 4 mph while performing the land treadmill exercise (T2D: 14.1±1.4 ml/kg/min vs. control: 18.4±1.6 ml/kg/min). There were no other significant differences in VO2 between participant groups or modes of exercise. Those with type 2 diabetes had an increased MAP versus those without type 2 diabetes while performing the land treadmill exercise at 2 mph (T2D: 93±3 mmHg vs. control: 81±5 mmHg, p < .0125). Otherwise, heart rate, VO2, and MAP respond similarly in both groups during ATM and LTM exercise at most treadmill speeds.
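
    The abstract does not give the formulas used; as a minimal sketch, MAP is commonly estimated as diastolic pressure plus one third of pulse pressure, and the 85% heart-rate-reserve cutoff follows the Karvonen formula (function names here are hypothetical):

```python
def mean_arterial_pressure(sbp, dbp):
    """Common MAP estimate: diastolic pressure plus one third of pulse pressure."""
    return dbp + (sbp - dbp) / 3.0

def hrr_cutoff(hr_rest, hr_max, fraction=0.85):
    """Karvonen heart-rate-reserve target, e.g. the 85% termination criterion."""
    return hr_rest + fraction * (hr_max - hr_rest)

print(mean_arterial_pressure(120, 80))  # ~93.3 mmHg
print(hrr_cutoff(70, 170))              # 155.0 bpm
```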

    An accurate test for homogeneity of odds ratios based on Cochran's Q-statistic

    Background: A frequently used statistic for testing homogeneity in a meta-analysis of K independent studies is Cochran's Q. For a standard test of homogeneity the Q statistic is referred to a chi-square distribution with K - 1 degrees of freedom. For the situation in which the effects of the studies are logarithms of odds ratios, the chi-square distribution is much too conservative for moderate size studies, although it may be asymptotically correct as the individual studies become large. Methods: Using a mixture of theoretical results and simulations, we provide formulas to estimate the shape and scale parameters of a gamma distribution to fit the distribution of Q. Results: Simulation studies show that the gamma distribution is a good approximation to the distribution of Q. Conclusions: Use of the gamma distribution instead of the chi-square distribution for Q should eliminate inaccurate inferences in assessing homogeneity in a meta-analysis. (A computer program for implementing this test is provided.) This hypothesis test is competitive with the Breslow-Day test both in accuracy of level and in power.
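
    To make the test concrete, the sketch below computes Cochran's Q from study log odds ratios with inverse-variance weights and refers it both to the usual chi-square distribution and to a gamma distribution. The gamma moments used here are illustrative placeholders; the paper derives its own shape and scale formulas, which the abstract does not reproduce:

```python
import numpy as np
from scipy import stats

def cochran_q(log_or, se):
    """Cochran's Q for K study log odds ratios with standard errors."""
    w = 1.0 / se**2                              # inverse-variance weights
    pooled = np.sum(w * log_or) / np.sum(w)      # fixed-effect pooled estimate
    return np.sum(w * (log_or - pooled) ** 2)

log_or = np.array([0.42, 0.11, -0.05, 0.30, 0.25])   # made-up study effects
se     = np.array([0.21, 0.18, 0.25, 0.30, 0.22])
K = len(log_or)
Q = cochran_q(log_or, se)

p_chi2 = stats.chi2.sf(Q, df=K - 1)          # standard chi-square reference
mean_q, var_q = K - 1, 2.5 * (K - 1)         # placeholder null moments of Q
shape, scale = mean_q**2 / var_q, var_q / mean_q
p_gamma = stats.gamma.sf(Q, a=shape, scale=scale)
print(f"Q = {Q:.2f}, chi-square p = {p_chi2:.3f}, gamma p = {p_gamma:.3f}")
```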

    Two-photon quantum walks in an elliptical direct-write waveguide array

    Integrated optics provides an ideal test bed for the emulation of quantum systems via continuous-time quantum walks. Here we study the evolution of two-photon states in an elliptic array of waveguides. We characterise the photonic chip via coherent-light tomography and use the results to predict distinct differences between temporally indistinguishable and distinguishable two-photon inputs, which we then compare with experimental observations. Our work highlights the feasibility of emulating coherent quantum phenomena in three-dimensional waveguide structures.
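
    As a rough sketch of the physics being emulated (not the paper's characterised chip), here is a continuous-time quantum walk on a hypothetical uniformly coupled array, with the standard two-photon correlation expressions (up to normalisation) for indistinguishable versus distinguishable input photons:

```python
import numpy as np
from scipy.linalg import expm

N, C, z = 6, 1.0, 1.0                         # guides, coupling, length (placeholders)
H = C * (np.eye(N, k=1) + np.eye(N, k=-1))    # nearest-neighbour coupling Hamiltonian
U = expm(1j * H * z)                          # continuous-time quantum walk unitary

a, b = 2, 3                                   # input guides of the two photons
amp = np.outer(U[:, a], U[:, b])              # amp[q, r] = U_qa * U_rb
G_indist = np.abs(amp + amp.T) ** 2           # indistinguishable: amplitudes interfere
G_dist   = np.abs(amp) ** 2 + np.abs(amp.T) ** 2   # distinguishable: probabilities add
print(np.round(G_indist - G_dist, 3))         # the predicted difference pattern
```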

    Quantum-inspired interferometry with chirped laser pulses

    We introduce and implement an interferometric technique based on chirped femtosecond laser pulses and nonlinear optics. The interference manifests as a high-visibility (> 85%) phase-insensitive dip in the intensity of an optical beam when the two interferometer arms are equal to within the coherence length of the light. This signature is unique in classical interferometry, but is a direct analogue to Hong-Ou-Mandel quantum interference. Our technique exhibits all the metrological advantages of the quantum interferometer, but with signals at least 10^7 times greater. In particular we demonstrate enhanced resolution, robustness against loss, and automatic dispersion cancellation. Our interferometer offers significant advantages over previous technologies, both quantum and classical, in precision time delay measurements and biomedical imaging.
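
    A toy model of the reported dip shape, assuming a Gaussian coherence envelope; the 85% visibility and 100 fs coherence time below are placeholders, not the paper's measured values:

```python
import numpy as np

def dip_signal(tau, visibility=0.85, tau_c=100e-15):
    """Phase-insensitive intensity dip versus arm delay tau, analogous in
    shape to a Hong-Ou-Mandel coincidence dip."""
    return 1.0 - visibility * np.exp(-(tau / tau_c) ** 2)

delays = np.linspace(-300e-15, 300e-15, 7)
print(dip_signal(delays))   # minimum at zero delay, near unity outside tau_c
```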

    Optical one-way quantum computing with a simulated valence-bond solid

    One-way quantum computation proceeds by sequentially measuring individual spins (qubits) in an entangled many-spin resource state. It remains a challenge, however, to efficiently produce such resource states. Is it possible to reduce the task of generating these states to simply cooling a quantum many-body system to its ground state? Cluster states, the canonical resource for one-way quantum computing, do not naturally occur as ground states of physical systems. This led to a significant effort to identify alternative resource states that appear as ground states in spin lattices. An appealing candidate is a valence-bond-solid state described by Affleck, Kennedy, Lieb, and Tasaki (AKLT). It is the unique, gapped ground state for a two-body Hamiltonian on a spin-1 chain, and can be used as a resource for one-way quantum computing. Here, we experimentally generate a photonic AKLT state and use it to implement single-qubit quantum logic gates.
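
    For reference, the AKLT Hamiltonian is a sum over neighbouring spin-1 sites of projectors onto total spin 2; a minimal numerical check of the two-site term (a sketch of the underlying model, not the photonic implementation):

```python
import numpy as np

# Spin-1 operators in the basis |+1>, |0>, |-1>
Sp = np.sqrt(2) * np.diag([1.0, 1.0], k=1)    # raising operator S+
Sx = (Sp + Sp.T) / 2
Sy = (Sp - Sp.T) / 2j
Sz = np.diag([1.0, 0.0, -1.0])

# Heisenberg coupling S_i . S_{i+1} on two sites, then the AKLT term,
# which equals the projector onto total spin 2 up to an additive constant
SS = sum(np.kron(S, S) for S in (Sx, Sy, Sz))
P2 = SS / 2 + (SS @ SS) / 6 + np.eye(9) / 3
print(np.round(np.linalg.eigvalsh(P2), 10))   # four 0s (kernel) and five 1s
```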

    Quantum computing with mixed states

    We discuss a model for quantum computing with initially mixed states. Although such a computer is known to be less powerful than a quantum computer operating with pure (entangled) states, it may efficiently solve some problems for which no efficient classical algorithms are known. We suggest a new implementation of quantum computation with initially mixed states in which an algorithm realization is achieved by means of optimal basis independent transformations of qubits.

    Enumeration of leukocyte infiltration in solid tumors by confocal laser scanning microscopy

    BACKGROUND: Leukocytes commonly infiltrate solid tumors, and have been implicated in the mechanism of spontaneous regression in some cancers. Conventional techniques for the quantitative estimation of leukocyte infiltrates in tumors rely on light microscopy of immunostained thin tissue sections, in which an arbitrary assessment (based on low, medium or high levels of infiltration) of antigen density is made by the pathologist. These estimates are relatively subjective and often require the opinion of a second pathologist. In addition, since thin tissue sections are cut, no data regarding the three-dimensional distribution of antigen can be obtained. RESULTS: To overcome these problems, we have designed a method to enumerate leukocyte infiltration into tumors, using confocal laser scanning microscopy of fluorescently immunostained leukocytes in thick tissue sections. Using image analysis software, a threshold was applied to eliminate unstained tissue and residual noise. The total antigen volume in the scanned tissue was calculated and divided by the mean cell volume (calculated by "seeding" ten individual cells) to obtain the cell count. Using this method, we compared the calculated leukocyte counts with those obtained manually by ten laboratory personnel. There was no significant difference (P > 0.05) between the cell counts obtained by either method. We then compared leukocyte infiltration into seven tumors and matched non-malignant tissue obtained from the periphery of the resected tissue. There was a significant increase in the infiltration of all leukocyte subsets into the tumors compared to minimal numbers in the non-malignant tissue. CONCLUSION: From these results we conclude that this method may be of considerable use for the enumeration of cells in tissues. Furthermore, since it can be performed by laboratory technical staff, less time input is required by the pathologist in assessing the degree of leukocyte infiltration into tumors.
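
    The counting step itself reduces to simple arithmetic on the thresholded image stack; a minimal sketch (the study used commercial image-analysis software, and the array, threshold, and volumes below are placeholders):

```python
import numpy as np

def estimate_cell_count(stack, threshold, voxel_volume_um3, mean_cell_volume_um3):
    """Cell count = total above-threshold stained volume / mean single-cell volume."""
    stained_voxels = np.count_nonzero(stack > threshold)
    total_antigen_volume = stained_voxels * voxel_volume_um3
    return total_antigen_volume / mean_cell_volume_um3

stack = np.random.rand(20, 512, 512)          # placeholder confocal z-stack
print(estimate_cell_count(stack, threshold=0.95,
                          voxel_volume_um3=0.05, mean_cell_volume_um3=270.0))
```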

    Quantifying, displaying and accounting for heterogeneity in the meta-analysis of RCTs using standard and generalised Q statistics

    Background: Clinical researchers have often preferred to use a fixed effects model for the primary interpretation of a meta-analysis. Heterogeneity is usually assessed via the well-known Q and I2 statistics, along with the random effects estimate they imply. In recent years, alternative methods for quantifying heterogeneity have been proposed that are based on a 'generalised' Q statistic. Methods: We review 18 IPD meta-analyses of RCTs into treatments for cancer, in order to quantify the amount of heterogeneity present and also to discuss practical methods for explaining heterogeneity. Results: Differing results were obtained when the standard Q and I2 statistics were used to test for the presence of heterogeneity. The two meta-analyses with the largest amount of heterogeneity were investigated further, and on inspection the straightforward application of a random effects model was not deemed appropriate. Compared to the standard Q statistic, the generalised Q statistic provided a more accurate platform for estimating the amount of heterogeneity in the 18 meta-analyses. Conclusions: Explaining heterogeneity via the pre-specification of trial subgroups, graphical diagnostic tools and sensitivity analyses produced a more desirable outcome than an automatic application of the random effects model. Generalised Q statistic methods for quantifying and adjusting for heterogeneity should be incorporated as standard into statistical software. Software is provided to help achieve this aim.
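
    One common way to operationalise a generalised Q is to include a between-study variance tau^2 in the weights and choose tau^2 so that Q equals its expectation K - 1 (a Paule-Mandel-type estimate); a minimal sketch with made-up inputs, not the paper's 18 IPD datasets:

```python
import numpy as np
from scipy.optimize import brentq

def generalised_q(theta, se, tau2):
    """Q with inverse-variance weights that include between-study variance."""
    w = 1.0 / (se**2 + tau2)
    pooled = np.sum(w * theta) / np.sum(w)
    return np.sum(w * (theta - pooled) ** 2)

theta = np.array([0.42, 0.11, -0.05, 0.30, 0.25])   # made-up study effects
se    = np.array([0.21, 0.18, 0.25, 0.30, 0.22])
K = len(theta)

# Q is decreasing in tau2, so solve generalised_q(tau2) = K - 1 by bracketing
if generalised_q(theta, se, 0.0) <= K - 1:
    tau2 = 0.0                                       # no excess heterogeneity
else:
    tau2 = brentq(lambda t2: generalised_q(theta, se, t2) - (K - 1), 0.0, 10.0)
print(f"tau^2 estimate: {tau2:.4f}")
```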

    Accounting for assay performance when estimating the temporal dynamics in SARS-CoV-2 seroprevalence in the U.S.

    Reconstructing the incidence of SARS-CoV-2 infection is central to understanding the state of the pandemic. Seroprevalence studies are often used to assess cumulative infections as they can identify asymptomatic infection. Since July 2020, commercial laboratories have conducted nationwide serosurveys for the U.S. CDC. They employed three assays, with different sensitivities and specificities, potentially introducing biases in seroprevalence estimates. Using models, we show that accounting for assays explains some of the observed state-to-state variation in seroprevalence, and when integrating case and death surveillance data, we show that when using the Abbott assay, estimates of proportions infected can differ substantially from seroprevalence estimates. We also found that states with higher proportions infected (before or after vaccination) had lower vaccination coverages, a pattern corroborated using a separate dataset. Finally, to understand vaccination rates relative to the increase in cases, we estimated the proportions of the population that received a vaccine prior to infection.
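
    The core adjustment behind accounting for assay performance is the classical Rogan-Gladen correction of apparent prevalence for sensitivity and specificity; a minimal sketch (the study fits fuller statistical models, and the numbers here are placeholders):

```python
def rogan_gladen(apparent_prev, sensitivity, specificity):
    """Correct raw assay positivity for imperfect sensitivity/specificity."""
    return (apparent_prev + specificity - 1.0) / (sensitivity + specificity - 1.0)

# e.g. 10% raw positivity on a 90%-sensitive, 99%-specific assay
print(rogan_gladen(0.10, 0.90, 0.99))   # ~0.101
```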

    On opportunistic software reuse

    The availability of open source assets for almost all imaginable domains has led the software industry to opportunistic design, an approach in which people develop new software systems in an ad hoc fashion by reusing and combining components that were not designed to be used together. In this paper we investigate this emerging approach. We demonstrate the approach with an industrial example in which Node.js modules and various subsystems are used in an opportunistic way. Furthermore, to study opportunistic reuse as a phenomenon, we present the results of three contextual interviews and a survey with reuse practitioners to understand to what extent opportunistic reuse offers improvements over traditional systematic reuse approaches.