
    Trajectory versus probability density entropy

    We study the problem of entropy increase for the Bernoulli-shift map without recourse to the concept of trajectory, and we discuss whether, and under which conditions, the distribution density entropy coincides with the Kolmogorov-Sinai entropy, namely the trajectory entropy. Comment: 24 pages
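As a toy illustration of the entropy growth the abstract refers to (my sketch, not the paper's calculation): for the Bernoulli shift x → 2x mod 1, the Gibbs entropy of a coarse-grained ensemble density grows by ln 2 per iteration, matching the Kolmogorov-Sinai (trajectory) entropy h_KS = ln 2.

```python
import numpy as np

rng = np.random.default_rng(0)
# Ensemble initially confined to a narrow cell of width 2**-8.
x = rng.uniform(0.0, 2.0**-8, size=200_000)

def shannon_entropy(samples, bins=256):
    """Differential-entropy estimate from a histogram (natural log)."""
    p, edges = np.histogram(samples, bins=bins, range=(0.0, 1.0), density=True)
    dx = edges[1] - edges[0]
    p = p[p > 0]
    return -np.sum(p * np.log(p)) * dx

entropies = [shannon_entropy(x)]
for _ in range(8):
    x = (2.0 * x) % 1.0          # Bernoulli shift map
    entropies.append(shannon_entropy(x))

growth = np.diff(entropies)
print(growth)  # each step is close to ln 2 ~ 0.693 until the density fills [0, 1)
```

The per-step increase stays at ln 2 until the coarse-grained density saturates the unit interval, after which the entropy stops growing.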

    Comorbid diabetes and COPD: impact of corticosteroid use on diabetes complications

    OBJECTIVE: To identify whether there is a dose-dependent risk of diabetes complications in patients treated with corticosteroids who have both diabetes and chronic obstructive pulmonary disease (COPD). RESEARCH DESIGN AND METHODS: A retrospective study of administrative claims data from the Australian Government Department of Veterans' Affairs, from 1 July 2001 to 30 June 2008, of diabetes patients newly initiated on metformin or sulfonylurea. COPD was identified by dispensings of tiotropium or ipratropium in the 6 months preceding study entry. Total corticosteroid use (inhaled and systemic) in the 12 months after study entry was determined. The outcome was time to hospitalization for a diabetes-related complication. Competing risks and Cox proportional hazards regression analyses were conducted with adjustment for a number of covariates. RESULTS: A total of 18,226 subjects with diabetes were identified, of whom 5.9% had COPD. Of those with COPD, 67.2% were dispensed corticosteroids in the 12 months from study entry. Stratification by dose of corticosteroids demonstrated a 94% increased likelihood of hospitalization for a diabetes complication in those who received a total defined daily dose (DDD) of corticosteroids ≥ 0.83/day (subhazard ratio 1.94 [95% CI 1.14-3.28], P = 0.014), by comparison with those who did not receive a corticosteroid. Lower doses of corticosteroid (<0.83 DDD/day) were not associated with an increased risk of diabetes-related hospitalization. CONCLUSIONS: In patients with diabetes and COPD, an increased risk of diabetes-related hospitalizations was evident only with use of high doses of corticosteroids. This highlights the need for ongoing review of corticosteroid dose in those with diabetes and COPD, to ensure that the minimally effective dose is used, together with review of appropriate response to therapy. Gillian E. Caughey, Adrian K. Preiss, Agnes I. Vitry, Andrew L. Gilbert, Elizabeth E. Roughead
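The exposure measure in this study is average defined daily doses (DDDs) per day over follow-up, dichotomized at 0.83 DDD/day. A minimal sketch of how such a rate might be computed from dispensing records (the records and the DDD reference values here are illustrative assumptions, not the study's data):

```python
# Assumed WHO DDD reference values (mg) for two corticosteroids; illustrative only.
WHO_DDD_MG = {"prednisolone": 10.0, "budesonide_inhaled": 0.8}

def avg_ddd_per_day(dispensings, followup_days=365):
    """Total DDDs dispensed over follow-up, divided by days of follow-up.
    dispensings: list of (drug_name, total_mg_dispensed) tuples."""
    total_ddd = sum(total_mg / WHO_DDD_MG[drug] for drug, total_mg in dispensings)
    return total_ddd / followup_days

# Hypothetical patient: 12 monthly scripts of 25 mg/day prednisolone, 30 days each.
records = [("prednisolone", 25.0 * 30)] * 12
rate = avg_ddd_per_day(records)
high_exposure = rate >= 0.83   # the study's high-dose threshold
print(round(rate, 2), high_exposure)  # -> 2.47 True
```

This simply normalizes heterogeneous corticosteroid products onto a common dose scale before stratifying exposure.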

    Introductory Clifford analysis

    In this chapter an introduction is given to Clifford analysis and the underlying Clifford algebras. The functions under consideration are defined on Euclidean space and take values in the universal real or complex Clifford algebra, the structure and properties of which are also recalled in detail. The function theory is centered around the notion of a monogenic function, which is a null solution of a generalized Cauchy–Riemann operator, which is rotation invariant and factorizes the Laplace operator. In this way, Clifford analysis may be considered as both a generalization to higher dimension of the theory of holomorphic functions in the complex plane and a refinement of classical harmonic analysis. A notion of monogenicity may also be associated with the vectorial part of the Cauchy–Riemann operator, which is called the Dirac operator; some attention is paid to the intimate relation between both notions. Since a product of monogenic functions is, in general, no longer monogenic, it is crucial to possess some tools for generating monogenic functions: such tools are provided by Fueter’s theorem on the one hand and the Cauchy–Kovalevskaya extension theorem on the other. A cornerstone in this function theory is the Cauchy integral formula for representation of a monogenic function in the interior of its domain of monogenicity. Starting from this representation formula and related integral formulae, it is possible to consider integral transforms such as Cauchy, Hilbert, and Radon transforms, which are important both within the theoretical framework and in view of possible applications.
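The factorization of the Laplace operator mentioned in the abstract can be made explicit using standard Clifford-analysis conventions, with generators satisfying e_i e_j + e_j e_i = -2δ_ij:

```latex
\underline{\partial} = \sum_{j=1}^{m} e_j\,\partial_{x_j}
\quad\Longrightarrow\quad
\underline{\partial}^{\,2}
  = \sum_{i,j} e_i e_j\,\partial_{x_i}\partial_{x_j}
  = -\sum_{j=1}^{m}\partial_{x_j}^{2} = -\Delta_m,
\qquad
\overline{D}\,D
  = \bigl(\partial_{x_0} - \underline{\partial}\bigr)
    \bigl(\partial_{x_0} + \underline{\partial}\bigr)
  = \partial_{x_0}^{2} + \Delta_m = \Delta_{m+1}.
```

Here ∂ is the Dirac operator, D = ∂_{x_0} + ∂ is the generalized Cauchy–Riemann operator, and a monogenic function is a null solution Df = 0.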

    Current status of turbulent dynamo theory: From large-scale to small-scale dynamos

    Several recent advances in turbulent dynamo theory are reviewed. High resolution simulations of small-scale and large-scale dynamo action in periodic domains are compared with each other and contrasted with similar results at low magnetic Prandtl numbers. It is argued that all the different cases show similarities at intermediate length scales. On the other hand, in the presence of helicity of the turbulence, power develops on large scales, which is not present in non-helical small-scale turbulent dynamos. At small length scales, differences occur in connection with the dissipation cutoff scales associated with the respective value of the magnetic Prandtl number. These differences are found to be independent of whether or not there is large-scale dynamo action. However, large-scale dynamos in homogeneous systems are shown to suffer from resistive slow-down even at intermediate length scales. The results from simulations are connected to mean field theory and its applications. Recent work on helicity fluxes to alleviate large-scale dynamo quenching, shear dynamos, nonlocal effects and magnetic structures from strong density stratification are highlighted. Several insights which arise from analytic considerations of small-scale dynamos are discussed. Comment: 36 pages, 11 figures, Spa. Sci. Rev., submitted to the special issue "Magnetism in the Universe" (ed. A. Balogh)
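For reference, the dimensionless numbers the abstract relies on are the standard ones (definitions not specific to this review), built from a characteristic velocity u_rms and length ℓ:

```latex
\mathrm{Re} \equiv \frac{u_{\mathrm{rms}}\,\ell}{\nu}, \qquad
\mathrm{Rm} \equiv \frac{u_{\mathrm{rms}}\,\ell}{\eta}, \qquad
P_{\mathrm{m}} \equiv \frac{\nu}{\eta} = \frac{\mathrm{Rm}}{\mathrm{Re}},
```

where ν is the kinematic viscosity and η the magnetic diffusivity; the magnetic Prandtl number P_m controls where the viscous and resistive dissipation cutoffs fall relative to each other.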

    Spallation reactions. A successful interplay between modeling and applications

    Spallation reactions are a type of nuclear reaction that occurs in space through the interaction of cosmic rays with interstellar bodies. The first spallation reactions induced with an accelerator took place in 1947 at the Berkeley cyclotron (University of California) with 200 MeV deuteron and 400 MeV alpha beams. They highlighted the multiple emission of neutrons and charged particles and the production of a large number of residual nuclei far different from the target nuclei. The same year, R. Serber described the reaction in two steps: a first, fast one with high-energy particle emission leading to an excited remnant nucleus, and a second, much slower one, the de-excitation of the remnant. In 2010 the IAEA organized a workshop to present the results of the most widely used spallation codes within a benchmark of spallation models. While one of the goals was to understand the deficiencies, if any, in each code, one remarkable outcome was the overall high quality of some models, reflecting the great improvements achieved since Serber. Particle transport codes can then rely on such spallation models to treat reactions between a light particle and an atomic nucleus at energies spanning from a few tens of MeV up to some GeV. An overview of spallation reaction modeling is presented in order to point out the incomparable contribution of models based on basic physics to the numerous applications where such reactions occur. Validations and benchmarks, which are necessary steps in the improvement process, are also addressed, as well as potential future domains of development. Spallation reaction modeling is a representative case of continuous studies aiming at understanding a reaction mechanism, which end up in a powerful tool. Comment: 59 pages, 54 figures, Review

    Demographic reconstruction from ancient DNA supports rapid extinction of the great auk

    The great auk was once abundant and distributed across the North Atlantic. It is now extinct, having been heavily exploited for its eggs, meat, and feathers. We investigated the impact of human hunting on its demise by integrating genetic data, GPS-based ocean current data, and analyses of population viability. We sequenced complete mitochondrial genomes of 41 individuals from across the species’ geographic range and reconstructed population structure and population dynamics throughout the Holocene. Taken together, our data do not provide any evidence that great auks were at risk of extinction prior to the onset of intensive human hunting in the early 16th century. In addition, our population viability analyses reveal that even if the great auk had not been under threat by environmental change, human hunting alone could have been sufficient to cause its extinction. Our results emphasise the vulnerability of even abundant and widespread species to intense and localised exploitation.
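The population viability analysis (PVA) mentioned above can be caricatured in a few lines. This is a toy Monte Carlo with entirely hypothetical parameters (initial population, growth rate, harvest level, noise), not the paper's model; it only shows how a fixed harvest can make extinction near-certain even for a large, growing population:

```python
import numpy as np

rng = np.random.default_rng(1)

def extinction_probability(n0=50_000, growth=1.02, harvest=3_000,
                           years=350, trials=1_000, env_sd=0.05):
    """Toy PVA: geometric growth with lognormal environmental noise and a
    fixed annual harvest. All parameter values are hypothetical."""
    extinct = 0
    for _ in range(trials):
        n = float(n0)
        for _ in range(years):
            n = n * growth * rng.lognormal(0.0, env_sd) - harvest
            if n <= 0:
                extinct += 1
                break
    return extinct / trials

print(extinction_probability())           # harvest drives the toy population extinct
print(extinction_probability(harvest=0))  # without harvest it persists
```

With these (made-up) numbers the harvest exceeds the population's annual increase, so every trajectory is driven below zero well within the simulated horizon.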

    An Integrated TCGA Pan-Cancer Clinical Data Resource to Drive High-Quality Survival Outcome Analytics

    For a decade, The Cancer Genome Atlas (TCGA) program collected clinicopathologic annotation data along with multi-platform molecular profiles of more than 11,000 human tumors across 33 different cancer types. TCGA clinical data contain key features representing the democratized nature of the data collection process. To ensure proper use of this large clinical dataset associated with genomic features, we developed a standardized dataset named the TCGA Pan-Cancer Clinical Data Resource (TCGA-CDR), which includes four major clinical outcome endpoints. In addition to detailing major challenges and statistical limitations encountered during the effort of integrating the acquired clinical data, we present a summary that includes endpoint usage recommendations for each cancer type. These TCGA-CDR findings appear to be consistent with cancer genomics studies independent of the TCGA effort and provide opportunities for investigating cancer biology using clinical correlates at an unprecedented scale. Analysis of clinicopathologic annotations for over 11,000 cancer patients in the TCGA program leads to the generation of the TCGA Clinical Data Resource, which provides recommendations of clinical outcome endpoint usage for 33 cancer types.
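Survival endpoints like those curated in the TCGA-CDR (e.g. overall survival) are typically analysed with the Kaplan-Meier estimator, which handles the censored follow-up that makes raw event rates misleading. A minimal self-contained sketch on made-up (time, event) pairs:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve. events: 1 = event observed, 0 = censored.
    Returns [(t, S(t))] at each distinct time where an event occurred."""
    pairs = sorted(zip(times, events))
    n = len(pairs)
    surv, curve, i = 1.0, [], 0
    while i < n:
        t = pairs[i][0]
        deaths = sum(e for tt, e in pairs if tt == t)
        at_risk = n - i                       # everyone with time >= t
        if deaths:
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        while i < n and pairs[i][0] == t:     # consume all records at time t
            i += 1
    return curve

# Hypothetical follow-up data: times in months, 0 marks a censored patient.
curve = kaplan_meier([5, 6, 10, 12, 12, 15], [1, 0, 1, 1, 0, 0])
print(curve)  # [(5, ~0.833), (10, 0.625), (12, ~0.417)]
```

Each factor (1 - d/n) discounts survival by the event rate among those still at risk, which is exactly why endpoint definitions (what counts as an event, when follow-up starts) matter so much in the TCGA-CDR recommendations.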

    Search for a W' boson decaying to a bottom quark and a top quark in pp collisions at sqrt(s) = 7 TeV

    Results are presented from a search for a W' boson using a dataset corresponding to 5.0 inverse femtobarns of integrated luminosity collected during 2011 by the CMS experiment at the LHC in pp collisions at sqrt(s) = 7 TeV. The W' boson is modeled as a heavy W boson, but different scenarios for the couplings to fermions are considered, involving both left-handed and right-handed chiral projections of the fermions, as well as an arbitrary mixture of the two. The search is performed in the decay channel W' to t b, leading to a final state signature with a single lepton (e, mu), missing transverse energy, and jets, at least one of which is tagged as a b-jet. A W' boson that couples to fermions with the same coupling constant as the W, but to the right-handed rather than left-handed chiral projections, is excluded for masses below 1.85 TeV at the 95% confidence level. Constraints on the W' gauge coupling for a set of left- and right-handed coupling combinations are placed for the first time using LHC data. These results represent a significant improvement over previously published limits. Comment: Submitted to Physics Letters B. Replaced with version published

    Search for the standard model Higgs boson decaying into two photons in pp collisions at sqrt(s)=7 TeV

    A search for a Higgs boson decaying into two photons is described. The analysis is performed using a dataset recorded by the CMS experiment at the LHC from pp collisions at a centre-of-mass energy of 7 TeV, which corresponds to an integrated luminosity of 4.8 inverse femtobarns. Limits are set on the cross section of the standard model Higgs boson decaying to two photons. The expected exclusion limit at 95% confidence level is between 1.4 and 2.4 times the standard model cross section in the mass range between 110 and 150 GeV. The analysis of the data excludes, at 95% confidence level, the standard model Higgs boson decaying into two photons in the mass range 128 to 132 GeV. The largest excess of events above the expected standard model background is observed for a Higgs boson mass hypothesis of 124 GeV with a local significance of 3.1 sigma. The global significance of observing an excess with a local significance greater than 3.1 sigma anywhere in the search range 110-150 GeV is estimated to be 1.8 sigma. More data are required to ascertain the origin of this excess. Comment: Submitted to Physics Letters
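The gap between the quoted local (3.1σ) and global (1.8σ) significances is the look-elsewhere effect: an excess that large could have appeared anywhere in the 110-150 GeV window. Converting both to one-sided p-values shows the implied effective trials factor; this is a back-of-envelope illustration derived from the abstract's two numbers, not a figure from the paper:

```python
import math

def p_from_sigma(z):
    """One-sided Gaussian tail p-value for a z-sigma significance."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

p_local = p_from_sigma(3.1)    # local significance quoted in the abstract
p_global = p_from_sigma(1.8)   # global significance after look-elsewhere
trials_factor = p_global / p_local
print(f"p_local={p_local:.2e}  p_global={p_global:.2e}  trials~{trials_factor:.0f}")
```

The implied trials factor of a few tens is plausible for a ~40 GeV search window with a mass resolution of a couple of GeV.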

    Measurement of the polarisation of W bosons produced with large transverse momentum in pp collisions at sqrt(s) = 7 TeV with the ATLAS experiment

    This paper describes an analysis of the angular distribution of W->enu and W->munu decays, using data from pp collisions at sqrt(s) = 7 TeV recorded with the ATLAS detector at the LHC in 2010, corresponding to an integrated luminosity of about 35 pb^-1. Using the decay lepton transverse momentum and the missing transverse energy, the W decay angular distribution projected onto the transverse plane is obtained and analysed in terms of helicity fractions f0, fL and fR over two ranges of W transverse momentum (ptw): 35 < ptw < 50 GeV and ptw > 50 GeV. Good agreement is found with theoretical predictions. For ptw > 50 GeV, the values of f0 and fL-fR, averaged over charge and lepton flavour, are measured to be: f0 = 0.127 +/- 0.030 +/- 0.108 and fL-fR = 0.252 +/- 0.017 +/- 0.030, where the first uncertainties are statistical, and the second include all systematic effects. Comment: 19 pages plus author list (34 pages total), 9 figures, 11 tables, revised author list, matches European Physical Journal C version
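For context, the helicity fractions f0, fL and fR parametrize the standard decomposition of the decay-lepton angular distribution in the W rest frame (written here in its textbook form; sign conventions differ between W+ and W-, and the paper works with a transverse-plane projection of this angle):

```latex
\frac{1}{\sigma}\,\frac{d\sigma}{d\cos\theta}
  = \frac{3}{4}\,f_0\,\sin^2\theta
  + \frac{3}{8}\,f_L\,(1-\cos\theta)^2
  + \frac{3}{8}\,f_R\,(1+\cos\theta)^2,
\qquad f_0 + f_L + f_R = 1.
```

Each term integrates to its fraction over cos θ ∈ [-1, 1], so the three fractions sum to one by construction.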