    Does metformin improve vascular health in children with Type 1 diabetes? Protocol for a one year, double blind, randomised, placebo controlled trial

    Background: Cardiovascular disease is the leading cause of mortality in Type 1 diabetes (T1D). Vascular dysfunction is an early and critical event in the development of cardiovascular disease. Children with T1D have vascular dysfunction; therefore, early interventions to improve vascular health are essential to reduce cardiovascular mortality in T1D. Metformin is an insulin-sensitising agent known to improve vascular health outcomes in type 2 diabetes (T2D) and in other individuals with insulin resistance, and it has been used safely in children and adolescents with T2D for over 10 years. This study aims to assess the effect of metformin on vascular health in children with T1D. Methods/Design: This study is a 12 month, double blind, randomised, placebo controlled trial to determine the effect of metformin on vascular health in children (age 8–18) with T1D. The sample size is 76, with 38 children in the metformin group and 38 in the placebo group. Vascular health and biochemical markers will be measured at baseline and at 3, 6 and 12 months. Vascular function will be measured using flow mediated dilatation and glyceryl trinitrate mediated dilatation of the brachial artery, and vascular structure will be measured with carotid and aortic intima media thickness, using standardised protocols. Discussion: This study will be the first to investigate the effect of metformin on vascular health in children with T1D. It will provide important information on a potential intervention to improve cardiovascular morbidity and mortality in this population at high risk from cardiovascular disease.
    Jemma Anderson, Alexia S Peña, Thomas Sullivan, Roger Gent, Bronwen D’Arcy, Timothy Olds, Brian Coppin and Jennifer Coupe

    Phase Transition in the Number Partitioning Problem

    Number partitioning is an NP-complete problem of combinatorial optimization. A statistical mechanics analysis reveals the existence of a phase transition that separates the easy-to-solve from the hard-to-solve instances and that reflects the pseudo-polynomiality of number partitioning. The phase diagram and the value of the typical ground state energy are calculated.
    Comment: minor changes (references, typos and discussion of results)
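    The easy/hard transition can be seen directly on small instances by exhaustive search. The sketch below is an illustration only, not code from the paper: it generates random instances at a given number of bits per item and reports how often a perfect partition exists; the instance sizes and bit counts are arbitrary choices.

        # Illustration only: brute-force number partitioning at a given
        # bits-per-item ratio; parameters are arbitrary, not from the paper.
        import itertools
        import random

        def random_instance(n_items, bits, rng):
            """Draw n_items integers uniformly from [1, 2**bits)."""
            return [rng.randrange(1, 2**bits) for _ in range(n_items)]

        def best_partition_cost(numbers):
            """Minimise |sum_i s_i a_i| over signs s_i = +/-1 by exhaustive
            search; the first sign is fixed to +1 since flipping all signs
            gives the same partition."""
            total = sum(numbers)
            lower = total % 2          # cost 0 is impossible if the total is odd
            best = total
            rest = numbers[1:]
            for signs in itertools.product((1, -1), repeat=len(rest)):
                cost = abs(numbers[0] + sum(s * a for s, a in zip(signs, rest)))
                if cost < best:
                    best = cost
                    if best == lower:  # provably optimal, stop early
                        break
            return best

        if __name__ == "__main__":
            rng = random.Random(0)
            n = 16
            for bits in (8, 16, 24):   # few bits per item vs. many bits per item
                costs = [best_partition_cost(random_instance(n, bits, rng))
                         for _ in range(10)]
                perfect = sum(c <= 1 for c in costs) / len(costs)
                print(f"bits/item = {bits / n:.2f}  "
                      f"fraction of perfect partitions = {perfect:.2f}")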

    Random Costs in Combinatorial Optimization

    The random cost problem is the problem of finding the minimum in an exponentially long list of random numbers. By definition, this problem cannot be solved faster than by exhaustive search. It is shown that a classical NP-hard optimization problem, number partitioning, is essentially equivalent to the random cost problem. This explains the bad performance of heuristic approaches to the number partitioning problem and allows us to calculate the probability distributions of the optimum and sub-optimum costs.
    Comment: 4 pages, Revtex, 2 figures (eps), submitted to PR
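    A toy way to look at the claimed equivalence (an illustration under arbitrary parameters, not the paper's derivation) is to enumerate every partition cost of one instance and compare its lowest values with those of an equally long i.i.d. sample drawn from the same empirical cost distribution: if the cost landscape behaves like an unstructured random list, the two sets of low-lying values look statistically alike.

        # Toy comparison: the full list of NPP costs vs. an i.i.d. resample of
        # the same length drawn from the same empirical distribution.
        import itertools
        import random

        rng = random.Random(1)
        N, BITS = 18, 30
        a = [rng.randrange(1, 2**BITS) for _ in range(N)]

        # All 2^(N-1) costs |sum_i s_i a_i| with the first sign fixed to +1.
        costs = [abs(a[0] + sum(s * x for s, x in zip(signs, a[1:])))
                 for signs in itertools.product((1, -1), repeat=N - 1)]

        # Same number of values, drawn independently from the empirical cost
        # distribution, so only correlations between configurations could
        # make the two lists differ.
        iid = rng.choices(costs, k=len(costs))

        print("lowest 5 NPP costs:   ", sorted(costs)[:5])
        print("lowest 5 i.i.d. costs:", sorted(iid)[:5])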

    Optimization by Quantum Annealing: Lessons from hard 3-SAT cases

    The Path Integral Monte Carlo simulated Quantum Annealing algorithm is applied to the optimization of a large hard instance of the Random 3-SAT Problem (N=10000). The dynamical behavior of quantum and classical annealing is compared, showing important qualitative differences in the way the two explore the complex energy landscape of the combinatorial optimization problem. At variance with the results obtained for the Ising spin glass and for the Traveling Salesman Problem, in the present case the performance of linear-schedule Quantum Annealing is definitely worse than that of Classical Annealing. Nevertheless, a quantum cooling protocol based on field-cycling and able to outperform standard classical simulated annealing over short time scales is introduced.
    Comment: 10 pages, 6 figures, submitted to PR
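    For reference, the classical baseline in such comparisons is plain simulated annealing on the number of unsatisfied clauses. The sketch below is a minimal version of that baseline only; the paper's Path Integral Monte Carlo quantum annealing is not reproduced here, and the instance size, clause-to-variable ratio and schedule are illustrative choices.

        # Minimal classical simulated annealing for random 3-SAT; energy is
        # the number of unsatisfied clauses.  Parameters are illustrative.
        import math
        import random

        def random_3sat(n_vars, n_clauses, rng):
            """Each clause: 3 distinct variables with random negations."""
            return [[(v, rng.random() < 0.5) for v in rng.sample(range(n_vars), 3)]
                    for _ in range(n_clauses)]

        def energy(assign, clauses):
            """Count clauses with no satisfied literal."""
            return sum(1 for clause in clauses
                       if not any(assign[v] != neg for v, neg in clause))

        def simulated_annealing(clauses, n_vars, steps, t_start, t_end, rng):
            assign = [rng.random() < 0.5 for _ in range(n_vars)]
            e = energy(assign, clauses)
            best = e
            for k in range(steps):
                t = t_start + (t_end - t_start) * k / steps   # linear schedule
                v = rng.randrange(n_vars)
                assign[v] = not assign[v]                     # propose a flip
                e_new = energy(assign, clauses)
                if e_new <= e or rng.random() < math.exp((e - e_new) / t):
                    e = e_new                                 # accept
                else:
                    assign[v] = not assign[v]                 # reject, flip back
                best = min(best, e)
            return best

        if __name__ == "__main__":
            rng = random.Random(0)
            n_vars, alpha = 100, 4.2      # ratio below the SAT/UNSAT threshold
            clauses = random_3sat(n_vars, int(alpha * n_vars), rng)
            print("lowest energy found:",
                  simulated_annealing(clauses, n_vars, 20_000, 2.0, 0.05, rng))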

    Observational constraints on the origin of the elements. V. Non-LTE abundance ratios of [Ni/Fe] in Galactic stars and enrichment by sub-Chandrasekhar mass SNe

    We constrain the role of different SN Ia channels in the chemical enrichment of the Galaxy by studying the abundances of nickel in Galactic stars. We investigate four different SN Ia sub-classes, including the classical single-degenerate near-Chandrasekhar mass SN Ia, the fainter SN Iax systems associated with He accretion from the companion, as well as two sub-Ch mass SN Ia channels. The latter include the double-detonation of a white dwarf accreting helium-rich matter and violent white dwarf mergers. NLTE models of Fe and Ni are used in the abundance analysis. In the GCE models, we include new delay time distributions arising from the different SN Ia channels, as well as recent yields for core-collapse supernovae and AGB stars. The data-model comparison is performed using a Markov chain Monte Carlo framework that allows us to explore the entire parameter space allowed by the diversity of explosion mechanisms and the Galactic SN Ia rate, taking into account the uncertainties of the observed data. We show that NLTE effects have a non-negligible impact on the observed [Ni/Fe] ratios in Galactic stars. The NLTE corrections to Ni abundances are not large, but strictly positive, lifting the [Ni/Fe] ratios by ~+0.15 dex at [Fe/H] = -2. We find that the distributions of [Ni/Fe] in LTE and in NLTE are very tight, with a scatter of < 0.1 dex at all metallicities, supporting earlier work. In LTE, most stars have scaled-solar Ni abundances, [Ni/Fe] = 0, with a slight tendency for sub-solar [Ni/Fe] ratios at lower [Fe/H]. In NLTE, however, we find a mild anti-correlation between [Ni/Fe] and metallicity, and slightly elevated [Ni/Fe] ratios at [Fe/H] < -1.0. The NLTE data can be explained by the GCE models calculated with a substantial, ~75%, fraction of sub-Ch SN Ia.
    Comment: accepted for publication in Astronomy & Astrophysics, abridged version of the abstract
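    As a schematic of the kind of MCMC data-model comparison described, here is a generic single-parameter Metropolis sketch. The enrichment tracks, synthetic data and error bar below are placeholders, not the paper's GCE models or stellar sample.

        # Generic Metropolis MCMC sketch for a one-parameter data-model fit.
        # All model ingredients here are placeholders.
        import math
        import random

        def model_nife(feh, f_subch):
            """Toy prediction: mixture of two assumed enrichment tracks."""
            track_mch = 0.05                      # near-Chandrasekhar-like, flat
            track_subch = -0.10 - 0.05 * feh      # sub-Chandrasekhar-like
            return (1.0 - f_subch) * track_mch + f_subch * track_subch

        def log_likelihood(f_subch, data, sigma):
            if not 0.0 <= f_subch <= 1.0:
                return -math.inf
            return -0.5 * sum(((nife - model_nife(feh, f_subch)) / sigma) ** 2
                              for feh, nife in data)

        rng = random.Random(42)
        sigma, true_f = 0.05, 0.75
        data = [(feh, model_nife(feh, true_f) + rng.gauss(0.0, sigma))
                for feh in (-2.5 + 0.05 * i for i in range(50))]

        # Metropolis random walk in the single parameter f_subch.
        f, logl = 0.5, log_likelihood(0.5, data, sigma)
        chain = []
        for step in range(20_000):
            f_prop = f + rng.gauss(0.0, 0.05)
            logl_prop = log_likelihood(f_prop, data, sigma)
            if rng.random() < math.exp(min(0.0, logl_prop - logl)):
                f, logl = f_prop, logl_prop           # accept the proposal
            if step >= 2_000:                         # discard burn-in
                chain.append(f)

        mean = sum(chain) / len(chain)
        std = math.sqrt(sum((x - mean) ** 2 for x in chain) / len(chain))
        print(f"posterior f_subch = {mean:.2f} +/- {std:.2f} (true value {true_f})")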

    Entropy-based analysis of the number partitioning problem

    In this paper we apply the multicanonical method of statistical physics to the number-partitioning problem (NPP). This problem is a basic NP-hard problem from computer science, and can be formulated as a spin-glass problem. We compute the spectral degeneracy, which gives us information about the number of solutions for a given cost E and cardinality m. We also study an extension of this problem to Q partitions. We show that a fundamental difference in the spectral degeneracy of the generalized (Q>2) NPP exists, which could explain why it is so difficult to find good solutions for this case. The information obtained with the multicanonical method can be very useful in the construction of new algorithms.
    Comment: 6 pages, 4 figures
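    On instances small enough for exact enumeration, the spectral degeneracy can be tabulated directly, which is a useful sanity check for sampling methods. The sketch below does this by exhaustive counting for one tiny random instance; it is not the multicanonical algorithm, and "cardinality" is taken here simply as the subset-size imbalance.

        # Exact enumeration of the spectral degeneracy of a tiny NPP instance:
        # number of sign configurations per (cost E, size imbalance).
        import itertools
        import random
        from collections import Counter

        rng = random.Random(2)
        N, BITS = 14, 10
        a = [rng.randrange(1, 2**BITS) for _ in range(N)]

        degeneracy = Counter()
        for signs in itertools.product((1, -1), repeat=N):
            cost = abs(sum(s * x for s, x in zip(signs, a)))   # E
            imbalance = abs(sum(signs))                        # |n_plus - n_minus|
            degeneracy[(cost, imbalance)] += 1

        # Print the low-cost end of the spectrum.
        for (cost, m), count in sorted(degeneracy.items())[:10]:
            print(f"E = {cost:5d}  imbalance = {m:2d}  degeneracy = {count}")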

    Oscillating Fracture in Rubber

    We have found an oscillating instability of fast-running cracks in thin rubber sheets. A well-defined transition from straight to oscillating cracks occurs as the amount of biaxial strain increases. Measurements of the amplitude and wavelength of the oscillation near the onset of this instability indicate that the instability is a Hopf bifurcation.

    Phase transition for cutting-plane approach to vertex-cover problem

    We study the vertex-cover problem, which is an NP-hard optimization problem and a prototypical model exhibiting phase transitions on random graphs, e.g., Erdős–Rényi (ER) random graphs. These phase transitions coincide with changes of the solution space structure, e.g., for the ER ensemble at connectivity c = e ≈ 2.7183 from replica symmetric to replica-symmetry broken. For the vertex-cover problem, the typical complexity of exact branch-and-bound algorithms, which proceed by exploring the landscape of feasible configurations, also changes close to this phase transition from "easy" to "hard". In this work, we consider an algorithm which has a completely different strategy: the problem is mapped onto a linear programming problem augmented by a cutting-plane approach, hence the algorithm operates in a space OUTSIDE the space of feasible configurations until the final step, where a solution is found. Here we show that this type of algorithm also exhibits an "easy-hard" transition around c = e, which strongly indicates that the typical hardness of a problem is fundamental to the problem and not due to a specific representation of the problem.
    Comment: 4 pages, 3 figures
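    To make the strategy concrete, here is a minimal LP-plus-cutting-plane sketch. It assumes SciPy and NumPy, uses only triangle inequalities x_u + x_v + x_w >= 2 as cuts (a simplified stand-in for the full cutting-plane scheme used in the paper), and the graph size and seed are arbitrary.

        # LP relaxation of vertex cover on an Erdos-Renyi graph, iteratively
        # tightened with violated triangle cuts.  Requires SciPy/NumPy.
        import itertools
        import random
        import numpy as np
        from scipy.optimize import linprog

        def er_graph(n, c, rng):
            """G(n, p) with mean degree c."""
            p = c / (n - 1)
            return [(u, v) for u, v in itertools.combinations(range(n), 2)
                    if rng.random() < p]

        def solve_lp(n, edges, cuts):
            """min sum x_v  s.t.  x_u + x_v >= 1 per edge, plus cuts, 0 <= x <= 1."""
            rows, rhs = [], []
            for u, v in edges:                      # edge constraints
                row = np.zeros(n); row[u] = row[v] = -1.0
                rows.append(row); rhs.append(-1.0)
            for u, v, w in cuts:                    # triangle cuts: x_u+x_v+x_w >= 2
                row = np.zeros(n); row[u] = row[v] = row[w] = -1.0
                rows.append(row); rhs.append(-2.0)
            res = linprog(np.ones(n), A_ub=np.array(rows), b_ub=np.array(rhs),
                          bounds=[(0.0, 1.0)] * n, method="highs")
            return res.x, res.fun

        rng = random.Random(3)
        n, c = 40, 2.7
        edges = er_graph(n, c, rng)
        adj = {v: set() for v in range(n)}
        for u, v in edges:
            adj[u].add(v); adj[v].add(u)
        triangles = [(u, v, w) for u, v in edges for w in adj[u] & adj[v] if w > v]

        cuts = []
        for rnd in range(10):
            x, bound = solve_lp(n, edges, cuts)
            violated = [t for t in triangles if t not in cuts
                        and x[t[0]] + x[t[1]] + x[t[2]] < 2.0 - 1e-6]
            print(f"round {rnd}: LP bound = {bound:.3f}, "
                  f"new violated triangle cuts = {len(violated)}")
            if not violated:
                break
            cuts.extend(violated)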

    Phase transition and landscape statistics of the number partitioning problem

    The phase transition in the number partitioning problem (NPP), i.e., the transition from a region in the space of control parameters in which almost all instances have many solutions to a region in which almost all instances have no solution, is investigated by examining the energy landscape of this classic optimization problem. This is achieved by coding the information about the minimum energy paths connecting pairs of minima into a tree structure, termed a barrier tree, the leaves and internal nodes of which represent, respectively, the minima and the lowest energy saddles connecting those minima. Here we apply several measures of shape (balance and symmetry) as well as of branch lengths (barrier heights) to the barrier trees that result from the landscape of the NPP, aiming at identifying traces of the easy/hard transition. We find that it is not possible to tell the easy regime from the hard one by visual inspection of the trees or by measuring the barrier heights. Only the "difficulty" measure, given by the maximum value of the ratio between the barrier height and the energy surplus of local minima, succeeded in detecting traces of the phase transition in the tree. In addition, we show that the barrier trees associated with the NPP are very similar to random trees, contrasting dramatically with trees associated with the p-spin glass and random energy models. We also examine critically a recent conjecture on the equivalence between the NPP and a truncated random energy model.
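    The basic ingredients of such an analysis, local minima and the saddles (barriers) between them, can be computed exactly on tiny instances by "flooding" the single-spin-flip landscape in order of increasing energy. The sketch below does this for one small random instance; it is an illustration of the construction, not the paper's barrier-tree code, and the parameters are arbitrary.

        # Flooding sketch: enumerate the single-spin-flip landscape of a small
        # NPP instance, list its local minima, and record the saddle energies
        # at which disconnected basins of the sub-level set merge.
        import random

        rng = random.Random(4)
        N, BITS = 12, 10
        a = [rng.randrange(1, 2**BITS) for _ in range(N)]

        n_states = 1 << N
        spin = lambda s, i: 1 if (s >> i) & 1 else -1
        energy = [abs(sum(spin(s, i) * a[i] for i in range(N)))
                  for s in range(n_states)]
        neighbors = lambda s: [s ^ (1 << i) for i in range(N)]

        local_minima = [s for s in range(n_states)
                        if all(energy[s] <= energy[t] for t in neighbors(s))]

        parent = {}        # union-find over already-processed states
        depth = {}         # component root -> lowest energy inside the component

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        saddles = []       # (saddle energy, barrier height of the shallower basin)
        for s in sorted(range(n_states), key=energy.__getitem__):
            roots = list({find(t) for t in neighbors(s) if t in parent})
            if not roots:                       # s opens a new basin
                parent[s], depth[s] = s, energy[s]
                continue
            base = roots[0]
            parent[s] = base                    # s joins an existing basin
            for r in roots[1:]:                 # s is a saddle between basins
                saddles.append((energy[s], energy[s] - max(depth[base], depth[r])))
                parent[r] = base
                depth[base] = min(depth[base], depth[r])

        print(f"{len(local_minima)} local minima, ground-state energy {min(energy)}")
        print("highest barriers (saddle energy, barrier):",
              sorted(saddles, key=lambda e: -e[1])[:5])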