
    Hybrid world object tracking for a virtual teaching agent

    Fast algorithms and heuristics for real-time object recognition and tracking have enabled a new hybrid-world technology in which one can manipulate a real-world object and have its virtual-world counterpart move correspondingly. This technology has been developed as part of a teaching-head platform that was initially designed for language teaching but is now also being used in a range of health-oriented contexts. In this paper, the requirements of the technology are motivated and elucidated, with a direct comparison of our proposed heuristics against well-known object recognition algorithms.

    Need for Laboratory Ecosystems To Unravel the Structures and Functions of Soil Microbial Communities Mediated by Chemistry

    The chemistry underpinning microbial interactions provides an integrative framework for linking the activities of individual microbes, microbial communities, plants, and their environments. Currently, we know very little about the functions of genes and metabolites within these communities because genome annotations and functions are derived from the minority of microbes that have been propagated in the laboratory. Meanwhile, the diversity, complexity, inaccessibility, and irreproducibility of native microbial consortia limit our ability to interpret chemical signaling and map metabolic networks. In this perspective, we contend that standardized laboratory ecosystems are needed to dissect the chemistry of soil microbiomes. We argue that dissemination and application of standardized laboratory ecosystems will be transformative for the field, much as model organisms have played critical roles in advancing biochemistry and molecular and cellular biology. Community consensus on fabricated ecosystems (“EcoFABs”), along with protocols and data standards, will integrate efforts and enable rapid improvements in our understanding of the biochemical ecology of microbial communities.

    Mutually-Antagonistic Interactions in Baseball Networks

    We formulate the head-to-head matchups between Major League Baseball pitchers and batters from 1954 to 2008 as a bipartite network of mutually antagonistic interactions. We consider both the full network and single-season networks, which exhibit interesting structural changes over time, and we examine the networks' sensitivity to baseball's rule changes. We then study a biased random walk on the matchup networks as a simple and transparent way to compare the performance of players who competed under different conditions and to incorporate information about which particular players a given player has faced. We find that a player's position in the network does not correlate with his success in the random-walker ranking but instead has a substantial effect on the ranking's sensitivity to changes in his own aggregate performance.
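    The biased-random-walk ranking described above can be sketched in a few lines. This is a hedged illustration, not the paper's actual algorithm: the matchup data, node names, and transition weights below are hypothetical, and the walk simply favors whichever side of each matchup performed better.

```python
import random
from collections import defaultdict

# Hypothetical matchup data: (pitcher, batter, pitcher's success fraction).
matchups = [
    ("P1", "B1", 0.7), ("P1", "B2", 0.4),
    ("P2", "B1", 0.5), ("P2", "B2", 0.6),
]

def rank_by_biased_walk(matchups, steps=100_000, seed=0):
    """Rank players by how often a biased random walker visits them.

    From a batter, the walker crosses to an opposing pitcher with
    probability proportional to that pitcher's success against the
    batter, and symmetrically from pitchers back to batters.  Players
    who win their matchups therefore attract the walker more often.
    """
    rng = random.Random(seed)
    # Bipartite adjacency: node -> list of (neighbor, weight favoring neighbor).
    adj = defaultdict(list)
    for p, b, w in matchups:
        adj[b].append((p, w))        # batter -> pitcher, weight = pitcher success
        adj[p].append((b, 1.0 - w))  # pitcher -> batter, weight = batter success
    visits = defaultdict(int)
    node = rng.choice(list(adj))
    for _ in range(steps):
        nbrs, wts = zip(*adj[node])
        node = rng.choices(nbrs, weights=wts, k=1)[0]
        visits[node] += 1
    total = sum(visits.values())
    return {n: v / total for n, v in visits.items()}

scores = rank_by_biased_walk(matchups)
```

    The visit frequencies serve as the ranking; because the walk is defined on the matchup graph itself, a player's score automatically reflects the strength of the particular opponents he faced.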

    Upregulation of Inflammatory Cytokines in Pulmonary Embolism Using Biochip-Array Profiling.

    The complex pathophysiology of pulmonary embolism (PE) involves hemostatic activation, inflammatory processes, cellular dysfunction, and hemodynamic derangements. Due to the heterogeneity of this disease, risk stratification and diagnosis remain challenging. Biochip-array technology provides an integrated high-throughput method for analyzing blood plasma samples and simultaneously measuring multiple biomarkers for potential risk stratification. Using the biochip-array method, this study aimed to quantify inflammatory biomarkers, including interleukin (IL)-1α, IL-1β, IL-2, IL-4, IL-6, IL-8, IL-10, vascular endothelial growth factor (VEGF), interferon gamma (IFN-γ), tumor necrosis factor alpha (TNF-α), monocyte chemoattractant protein-1 (MCP-1), and epidermal growth factor (EGF), in 109 clinically confirmed PE patients compared with a control group of plasma samples collected from 48 healthy subjects. Cytokines IL-4, IL-6, IL-8, IL-10, IL-1β, and MCP-1 demonstrated varying levels of significant increase (P < 0.05) in massive-risk PE patients compared to submassive- and low-risk PE patients. The upregulation of inflammatory cytokines in PE patients observed in this study suggests that inflammation plays an important role in the overall pathophysiology of this disease. The application of biochip-array technology may provide a useful approach to evaluate these biomarkers to understand the pathogenesis and risk stratification of PE patients.

    Epidemiological Surveillance of Birth Defects Compatible with Thalidomide Embryopathy in Brazil

    The thalidomide tragedy of the 1960s resulted in thousands of children being born with severe limb reduction defects (LRD), among other malformations. In Brazil, babies are still born with thalidomide embryopathy (TE) because of leprosy prevalence, the availability of thalidomide, and deficiencies in the control of drug dispensation. Our objective was to implement a system of proactive surveillance to identify birth defects compatible with TE. Over the course of one year, newborns with LRD were assessed in the Brazilian hospitals participating in the Latin-American Collaborative Study of Congenital Malformations (ECLAMC). A phenotype of LRD called the thalidomide embryopathy phenotype (TEP) was established for surveillance. Children with TEP born between 2000 and 2008 were monitored, and during the 2007–2008 period we clinically investigated all cases with TEP in greater detail (proactive period). The period from 1982 to 1999 was defined as the baseline period for the cumulative sum statistics. The frequency of TEP during the surveillance period, at 3.10/10,000 births (95% CI: 2.50–3.70), was significantly higher than that observed in the baseline period (1.92/10,000 births; 95% CI: 1.60–2.20), and was not uniformly distributed across different Brazilian regions. During the proactive surveillance (2007–2008), two cases of suspected TE were identified, although both mothers had denied use of the drug during pregnancy. Our results suggest that TEP has probably increased in recent years, coinciding with the period of greater thalidomide availability. Our proactive surveillance identified two newborns with suspected TE, proving to be a sensitive tool for detecting TE. The high frequency of leprosy and the widespread use of thalidomide reinforce the need for continuous monitoring of TEP across Brazil.
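    The cumulative sum (CUSUM) statistics used to compare the surveillance period against the 1982–1999 baseline can be illustrated with a minimal one-sided detector. The counts, baseline mean, and thresholds below are hypothetical, not the study's data or parameterization.

```python
import math

def cusum_alarms(counts, baseline_mean, k=0.5, h=4.0):
    """One-sided CUSUM for detecting a rise above a Poisson baseline.

    Each count is standardized against the baseline (for a Poisson
    process the mean equals the variance); the running sum accumulates
    excesses above the reference value k and raises an alarm once it
    crosses the decision threshold h.
    """
    sigma = math.sqrt(baseline_mean)
    s, alarms = 0.0, []
    for x in counts:
        z = (x - baseline_mean) / sigma     # standardized deviation
        s = max(0.0, s + z - k)             # accumulate only excesses
        alarms.append(s > h)
    return alarms

# A baseline of ~4 cases per period, followed by a sustained rise:
counts = [4, 3, 5, 4, 9, 10, 11, 12]
print(cusum_alarms(counts, baseline_mean=4.0))
# -> [False, False, False, False, False, True, True, True]
```

    Because the statistic resets to zero while counts stay near baseline but accumulates during a sustained excess, CUSUM is well suited to flagging gradual increases in rare birth-defect frequencies.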

    Overcoming leakage in scalable quantum error correction

    Leakage of quantum information out of computational states into higher energy states represents a major challenge in the pursuit of quantum error correction (QEC). In a QEC circuit, leakage builds over time and spreads through multi-qubit interactions. This leads to correlated errors that degrade the exponential suppression of logical error with scale, challenging the feasibility of QEC as a path towards fault-tolerant quantum computation. Here, we demonstrate the execution of a distance-3 surface code and distance-21 bit-flip code on a Sycamore quantum processor where leakage is removed from all qubits in each cycle. This shortens the lifetime of leakage and curtails its ability to spread and induce correlated errors. We report a ten-fold reduction in steady-state leakage population on the data qubits encoding the logical state and an average leakage population of less than 1 × 10⁻³ throughout the entire device. The leakage removal process itself efficiently returns leakage population back to the computational basis, and adding it to a code circuit prevents leakage from inducing correlated error across cycles, restoring a fundamental assumption of QEC. With this demonstration that leakage can be contained, we resolve a key challenge for practical QEC at scale.

    Suppressing quantum errors by scaling a surface code logical qubit

    Practical quantum computing will require error rates that are well below what is achievable with physical qubits. Quantum error correction offers a path to algorithmically-relevant error rates by encoding logical qubits within many physical qubits, where increasing the number of physical qubits enhances protection against physical errors. However, introducing more qubits also increases the number of error sources, so the density of errors must be sufficiently low in order for logical performance to improve with increasing code size. Here, we report the measurement of logical qubit performance scaling across multiple code sizes, and demonstrate that our system of superconducting qubits has sufficient performance to overcome the additional errors from increasing qubit number. We find our distance-5 surface code logical qubit modestly outperforms an ensemble of distance-3 logical qubits on average, both in terms of logical error probability over 25 cycles and logical error per cycle (2.914% ± 0.016% compared to 3.028% ± 0.023%). To investigate damaging, low-probability error sources, we run a distance-25 repetition code and observe a 1.7 × 10⁻⁶ logical error per round floor set by a single high-energy event (1.6 × 10⁻⁷ when excluding this event). We are able to accurately model our experiment, and from this model we can extract error budgets that highlight the biggest challenges for future systems. These results mark the first experimental demonstration where quantum error correction begins to improve performance with increasing qubit number, illuminating the path to reaching the logical error rates required for computation.
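    A quick check of the quoted per-cycle logical error rates shows the modest margin by which the distance-5 code outperforms the distance-3 ensemble:

```python
d3 = 3.028e-2  # distance-3 ensemble logical error per cycle (from the abstract)
d5 = 2.914e-2  # distance-5 logical error per cycle (from the abstract)
ratio = d3 / d5
print(f"suppression factor: {ratio:.3f}")  # ~1.039, i.e. a ~4% improvement
```

    For exponential error suppression, each step up in code distance should multiply this factor, so a ratio only slightly above 1 is exactly the "modest" improvement the abstract describes.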