
    Potential reduction in diabetes treatment costs associated with improved glycemic control

    INTRODUCTION. Only limited data are available in the literature on the effect of glycemic control on the costs of treating patients with diabetes. The aim of this study was to examine the potential effect of tighter glycemic control on selected early complications of diabetes and the costs of treating them. MATERIAL AND METHODS. A retrospective study was conducted in a large group of diabetic patients registered in the Fallon clinic computer database between January 1, 1994 and June 30, 1998. Patients were divided into three groups according to HbA1c concentration: controlled, relatively controlled, and uncontrolled diabetes (HbA1c > 10%). The frequency of hospitalization for conditions accompanying diabetes, such as certain infections, episodes of hyper- and hypoglycemia, and electrolyte disturbances, was assessed, along with treatment costs. To eliminate the influence of incidental differences between the groups, a multivariate statistical analysis covering a 3-year period was applied. RESULTS. Of 2394 diabetic patients, about 10% (251 people) were hospitalized at least once for early complications of the disease, for a total of 447 admissions. Over the period analyzed there were 13 hospitalizations per 100 patients in the controlled-diabetes group, 16 per 100 in the relatively controlled group, and 31 per 100 in the uncontrolled group (p < 0.05). Adjusted mean costs were approximately 970, 1380, and 3040 USD, respectively. Among patients with late complications of the disease, who made up 30% of the study population, admission rates and hospitalization costs were higher: 30, 38, and 74 admissions per 100 patients in the respective groups, with mean treatment costs of 2610, 3810, and 8320 USD over the 3-year period analyzed. CONCLUSIONS. In typical medical practice, improved glycemic control is associated with a lower rate of hospitalization for early complications of diabetes and hence with reduced treatment costs over a 3-year period. These potential benefits may influence decisions to introduce new methods of diabetes treatment.

    Infinite factorization of multiple non-parametric views

    Combined analysis of multiple data sources is of growing interest, in particular for distinguishing shared and source-specific aspects. We extend this rationale of classical canonical correlation analysis into a flexible, generative, non-parametric clustering setting by introducing a novel non-parametric hierarchical mixture model. The lower level of the model describes each source with a flexible non-parametric mixture, and the top level combines these to describe commonalities of the sources. The lower-level clusters arise from hierarchical Dirichlet processes, inducing an infinite-dimensional contingency table between the views. The commonalities between the sources are modeled by an infinite block model of the contingency table, interpretable as a non-negative factorization of infinite matrices or as a prior for infinite contingency tables. With Gaussian mixture components plugged in for continuous measurements, the model is applied to two views of genes, mRNA expression and abundance of the produced proteins, to expose groups of genes that are co-regulated in either or both of the views. Cluster analysis of co-expression is a standard, simple way of screening for co-regulation, and the two-view analysis extends the approach to distinguishing between pre- and post-translational regulation.
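As a rough illustration of the idea (not the authors' model or code), one can approximate each view's infinite lower-level mixture with a truncated Dirichlet-process Gaussian mixture and inspect the cross-view contingency table directly; the data, cluster count, and parameters below are synthetic assumptions:

```python
# Sketch: per-view truncated DP Gaussian mixtures, then the contingency
# table between the views whose block structure the paper's model captures.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
n = 300
# Synthetic "genes": view 1 (e.g. mRNA) and view 2 (e.g. protein) share
# structure through a common latent label z.
z = rng.integers(0, 3, size=n)
view1 = rng.normal(loc=z[:, None] * 4.0, scale=1.0, size=(n, 2))
view2 = rng.normal(loc=z[:, None] * 4.0, scale=1.0, size=(n, 2))

def dp_cluster(x, max_k=10):
    # "dirichlet_process" gives a truncated DP mixture, a finite stand-in
    # for the paper's infinite non-parametric mixtures.
    m = BayesianGaussianMixture(
        n_components=max_k,
        weight_concentration_prior_type="dirichlet_process",
        random_state=0,
    ).fit(x)
    return m.predict(x)

a, b = dp_cluster(view1), dp_cluster(view2)
# Contingency table between the per-view cluster assignments: aligned heavy
# blocks indicate shared structure, scattered mass indicates view-specific
# structure.
table = np.zeros((a.max() + 1, b.max() + 1), dtype=int)
np.add.at(table, (a, b), 1)
print(table)
```

This sketch only cross-tabulates hard assignments; the model in the abstract instead places an infinite block-model prior over the (infinite) contingency table and infers everything jointly.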

    The Hamiltonian of the V15 Spin System from First-Principles Density-Functional Calculations

    We report first-principles all-electron density-functional studies of the electronic structure, magnetic ordering, and anisotropy of the V15 molecular magnet. From these calculations, we determine a Heisenberg Hamiltonian with four antiferromagnetic couplings and one ferromagnetic coupling. We perform direct diagonalization to determine the temperature dependence of the susceptibility. This Hamiltonian reproduces the experimentally observed spin S = 1/2 ground state and low-lying S = 3/2 excited state. A small anisotropy term is necessary to account for the temperature-independent part of the magnetization curve.
    Comment: 4 pages in RevTeX format + 2 ps-figures, accepted by PRL Feb. 2001 (previous version was an older version of the paper).
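The "direct diagonalization" step can be sketched on a toy system. The following is a minimal illustration on a spin-1/2 antiferromagnetic triangle (not the 15-spin V15 cluster), with an assumed unit coupling and units g·μB = kB = 1: exact diagonalization of a Heisenberg Hamiltonian and a zero-field susceptibility computed from its spectrum.

```python
# Sketch: exact diagonalization of a small Heisenberg model and
# chi(T) ~ <Mz^2>_thermal / T from the resulting spectrum.
import numpy as np

sx = np.array([[0, 1], [1, 0]]) / 2
sy = np.array([[0, -1j], [1j, 0]]) / 2
sz = np.array([[1, 0], [0, -1]]) / 2
I2 = np.eye(2)

def op(single, site, n):
    # Embed a single-site spin operator at `site` in an n-site tensor product.
    mats = [single if k == site else I2 for k in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

n, J = 3, 1.0  # J > 0: antiferromagnetic triangle
H = np.zeros((2**n, 2**n), dtype=complex)
for i in range(n):
    j = (i + 1) % n
    for s in (sx, sy, sz):
        H += J * op(s, i, n) @ op(s, j, n)

E, V = np.linalg.eigh(H)
Mz = sum(op(sz, i, n) for i in range(n))
mz2 = np.real(np.diag(V.conj().T @ Mz @ Mz @ V))  # <E_k| Mz^2 |E_k>

def chi(T):
    # Zero-field susceptibility from the fluctuation formula,
    # chi(T) = <Mz^2>_thermal / T (with <Mz> = 0 at zero field).
    w = np.exp(-(E - E.min()) / T)
    return (w @ mz2) / w.sum() / T

# The AFM triangle has a fourfold-degenerate S = 1/2 ground state.
print(np.sum(np.isclose(E, E.min())))  # prints 4
```

For V15 itself the Hilbert space is 2^15-dimensional, so the same procedure applies with sparse matrices and the couplings determined from the density-functional calculations.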

    Black Holes and Instabilities of Negative Tension Branes

    We consider the collision in 2+1 dimensions of a black hole and a negative tension brane on an orbifold. Because there is no gravitational radiation in 2+1 dimensions, the horizon area shrinks when part of the brane falls through. This provides a potential violation of the generalized second law of thermodynamics. However, tracing the details of the dynamical evolution, one finds that it does not proceed from equilibrium configuration to equilibrium configuration. Instead, a catastrophic space-time singularity develops, similar to the 'big crunch' of Ω > 1 FRW space-times. In the context of classical general relativity, our result demonstrates a new instability of constructions with negative tension branes.
    Comment: 18 pages, 3 figures, uses RevTeX. Minor typos fixed. References and one footnote added.

    Large-amplitude driving of a superconducting artificial atom: Interferometry, cooling, and amplitude spectroscopy

    Superconducting persistent-current qubits are quantum-coherent artificial atoms with multiple, tunable energy levels. In the presence of large-amplitude harmonic excitation, the qubit state can be driven through one or more of the constituent energy-level avoided crossings. The resulting Landau-Zener-Stueckelberg (LZS) transitions mediate a rich array of quantum-coherent phenomena. We review here three experimental works based on LZS transitions: Mach-Zehnder-type interferometry between repeated LZS transitions, microwave-induced cooling, and amplitude spectroscopy. These experiments exhibit a remarkable agreement with theory, and are extensible to other solid-state and atomic qubit modalities. We anticipate they will find application to qubit state-preparation and control methods for quantum information science and technology.
    Comment: 13 pages, 5 figures.
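A single LZS passage is governed by the standard Landau-Zener formula; the sketch below evaluates it in the common convention H = (ε(t)·σz + Δ·σx)/2 with ε(t) = v·t, where the gap and sweep-rate values are purely illustrative assumptions, not numbers from the reviewed experiments:

```python
# Sketch: Landau-Zener probability of a diabatic passage through an
# avoided crossing of gap Delta at energy sweep rate v.
import numpy as np

hbar = 1.0545718e-34  # J*s

def p_lz(delta, v):
    """P = exp(-pi * delta^2 / (2 * hbar * v)) for H = (eps*sz + delta*sx)/2,
    eps(t) = v*t. P -> 1 for fast sweeps (diabatic), -> 0 for slow sweeps
    (adiabatic)."""
    return np.exp(-np.pi * delta**2 / (2 * hbar * v))

h = 2 * np.pi * hbar
delta = 10e6 * h        # assumed 10 MHz * h avoided-crossing gap, in joules
v = 1e9 * h / 1e-9      # assumed sweep rate: 1 GHz * h per nanosecond
print(p_lz(delta, v))   # fast sweep here -> P close to 1 (mostly diabatic)
```

Repeated passages then interfere with a phase accumulated between crossings, which is what produces the Mach-Zehnder-type fringes discussed in the abstract.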

    Chronic and Acute Exposures to the World Trade Center Disaster and Lower Respiratory Symptoms: Area Residents and Workers

    Objectives. We assessed associations between new-onset (post–September 11, 2001 [9/11]) lower respiratory symptoms reported on 2 surveys, administered 3 years apart, and acute and chronic 9/11-related exposures among New York City World Trade Center–area residents and workers enrolled in the World Trade Center Health Registry. Methods. World Trade Center–area residents and workers were categorized as case participants or control participants on the basis of lower respiratory symptoms reported in surveys administered 2 to 3 and 5 to 6 years after 9/11. We created composite exposure scales after principal components analyses of detailed exposure histories obtained during face-to-face interviews. We used multivariate logistic regression models to determine associations between lower respiratory symptoms and composite exposure scales. Results. Both acute and chronic exposures to the events of 9/11 were independently associated, often in a dose-dependent manner, with lower respiratory symptoms among individuals who lived and worked in the area of the World Trade Center. Conclusions. Study findings argue for detailed exposure assessments during and after future events in which potentially toxic materials may be released, and for rapid interventions to minimize exposures and screen for potential adverse health effects.
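The analysis pattern described, principal components of correlated exposure variables feeding a logistic regression on case status, can be sketched on synthetic data; the variables, sample size, and effect sizes below are invented for illustration and bear no relation to the registry data:

```python
# Sketch: PCA-derived composite exposure scales as predictors in a
# logistic regression for case/control status.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 500
latent = rng.normal(size=n)  # underlying (unobserved) exposure intensity
# Four noisy, correlated exposure measurements of the same latent intensity.
exposures = latent[:, None] + rng.normal(scale=0.5, size=(n, 4))
# Symptom case status, with odds increasing in the latent exposure.
case = (rng.random(n) < 1 / (1 + np.exp(-latent))).astype(int)

scales = PCA(n_components=2).fit_transform(exposures)  # composite scales
model = LogisticRegression().fit(scales, case)
odds_ratios = np.exp(model.coef_)  # OR per unit of each composite scale
print(odds_ratios)
```

A dose-dependent association of the kind reported would show up as an odds ratio materially different from 1 on the dominant composite scale.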

    Cosmic Bell Test: Measurement Settings from Milky Way Stars

    Bell’s theorem states that some predictions of quantum mechanics cannot be reproduced by a local-realist theory. That conflict is expressed by Bell’s inequality, which is usually derived under the assumption that there are no statistical correlations between the choices of measurement settings and anything else that can causally affect the measurement outcomes. In previous experiments, this “freedom of choice” was addressed by ensuring that selection of measurement settings via conventional “quantum random number generators” was spacelike separated from the entangled particle creation. This, however, left open the possibility that an unknown cause affected both the setting choices and measurement outcomes as recently as mere microseconds before each experimental trial. Here we report on a new experimental test of Bell’s inequality that, for the first time, uses distant astronomical sources as “cosmic setting generators.” In our tests with polarization-entangled photons, measurement settings were chosen using real-time observations of Milky Way stars while simultaneously ensuring locality. Assuming fair sampling for all detected photons, and that each stellar photon’s color was set at emission, we observe statistically significant ≳7.31σ and ≳11.93σ violations of Bell’s inequality with estimated p values of ≲1.8×10^-13 and ≲4.0×10^-33, respectively, thereby pushing back by ∼600 years the most recent time by which any local-realist influences could have engineered the observed Bell violation.
    Funding: Austrian Academy of Sciences; Austrian Science Fund (projects SFB F40 (FOQUS) and CoQuS W1210-N16); Austrian Federal Ministry of Science, Research, and Economy; National Science Foundation (U.S.) (INSPIRE grant PHY-1541160 and SES-1056580); Massachusetts Institute of Technology Undergraduate Research Opportunities Program.
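The quantum prediction such experiments test is easy to reproduce: for the singlet state, the polarization correlation at analyzer angles a and b is E(a, b) = −cos(a − b), and the standard CHSH combination at the optimal settings reaches 2√2, exceeding the local-realist bound of 2. A minimal sketch:

```python
# Sketch: the quantum CHSH value for the singlet state at the
# standard optimal analyzer settings.
import numpy as np

def E(a, b):
    # Singlet-state correlation between outcomes at angles a and b.
    return -np.cos(a - b)

a1, a2 = 0.0, np.pi / 2          # Alice's two settings
b1, b2 = np.pi / 4, 3 * np.pi / 4  # Bob's two settings
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ≈ 2.828 > 2, the local-realist (CHSH) bound
```

The experimental innovation in the abstract is not this value but the way the four settings are chosen, in real time, from the colors of photons emitted by Milky Way stars centuries ago.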

    Amplitude Spectroscopy of a Solid-State Artificial Atom

    The energy-level structure of a quantum system plays a fundamental role in determining its behavior and manifests itself in a discrete absorption and emission spectrum. Conventionally, spectra are probed via frequency spectroscopy, whereby the frequency ν of a harmonic driving field is varied to fulfill the condition ΔE = hν, i.e., the driving field is resonant with the level separation ΔE (h is Planck's constant). Although this technique has been successfully employed in a variety of physical systems, including natural and artificial atoms and molecules, its application is not universally straightforward, and it becomes extremely challenging for frequencies in the range of tens to hundreds of gigahertz. Here we demonstrate an alternative approach, whereby a harmonic driving field at fixed frequency sweeps the atom through its energy-level avoided crossings, surmounting many of the limitations of the conventional approach. Spectroscopic information is obtained from the amplitude dependence of the system response. The resulting “spectroscopy diamonds” contain interference patterns and population inversion that serve as a fingerprint of the atom's spectrum. By analyzing these features, we determine the energy spectrum of a manifold of states with energies from 0.01 to 120 GHz × h in a superconducting artificial atom, using a driving frequency near 0.1 GHz. This approach provides a means to manipulate and characterize systems over a broad bandwidth, using only a single driving frequency that may be orders of magnitude smaller than the energy scales being probed.
    Comment: 12 pages, 13 figures.
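The reason the amplitude axis carries spectroscopic information can be sketched simply: with a drive ε(t) = ε0 + A·sin(2πft), an avoided crossing located at bias ε_k becomes reachable, and its interference pattern turns on, once A ≥ |ε_k − ε0|, so each new feature along the amplitude axis marks a crossing position. The crossing positions and bias below are hypothetical numbers for illustration, not the measured spectrum:

```python
# Sketch: which avoided crossings a sinusoidal drive of amplitude A
# can reach from a dc bias eps0.
import numpy as np

crossings = np.array([-30.0, 0.0, 45.0, 90.0])  # assumed positions (GHz*h)
eps0 = 10.0                                      # assumed dc bias (GHz*h)

def reachable(A):
    # eps(t) spans [eps0 - A, eps0 + A], so a crossing at eps_k is swept
    # through exactly when A >= |eps_k - eps0|.
    return crossings[np.abs(crossings - eps0) <= A]

for A in (5.0, 15.0, 50.0, 100.0):
    print(A, reachable(A))
```

Reading off the amplitude thresholds at which new diamonds open thus maps out crossing positions far beyond the drive frequency itself.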

    Gene Regulatory Networks from Multifactorial Perturbations Using Graphical Lasso: Application to the DREAM4 Challenge

    A major challenge in systems biology is predicting gene regulatory networks from different kinds of training data. Within the DREAM4 initiative, we took part in the multifactorial sub-challenge, which aimed to predict gene regulatory networks of size 100 from training data consisting of steady-state levels obtained after applying multifactorial perturbations to the original in silico network.
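A minimal sketch of the graphical-lasso approach on synthetic steady-state data (scikit-learn's GraphicalLasso; the data, network size, and regularization value are assumptions for illustration, not the DREAM4 setup):

```python
# Sketch: estimate a sparse inverse covariance (precision) matrix with the
# graphical lasso and read nonzero off-diagonal entries as candidate edges.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(2)
n_samples, n_genes = 200, 6
# Ground truth for the toy data: gene 0 drives genes 1 and 2; the rest
# are independent noise.
g0 = rng.normal(size=n_samples)
X = rng.normal(scale=0.5, size=(n_samples, n_genes))
X[:, 0] += g0
X[:, 1] += g0
X[:, 2] += g0

model = GraphicalLasso(alpha=0.1).fit(X)
prec = model.precision_
edges = [(i, j) for i in range(n_genes) for j in range(i + 1, n_genes)
         if abs(prec[i, j]) > 1e-6]
print(edges)  # edges among genes {0, 1, 2} should dominate
```

The regularization strength alpha controls sparsity: larger values prune weaker partial correlations, which is the main tuning knob when scaling this idea to networks of size 100.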