
    Development of the Rio Grande Compact of 1938


    Can the UNAIDS 90-90-90 target be achieved? A systematic analysis of national HIV treatment cascades

    Background In 2014, the Joint United Nations Programme on HIV and AIDS (UNAIDS) and partners set the ‘90-90-90’ targets: diagnose 90% of all HIV-positive people, provide antiretroviral therapy (ART) for 90% of those diagnosed, and achieve viral suppression for 90% of those treated, all by 2020. Meeting these targets would put 81% of all HIV-positive people on treatment and bring 73% of all HIV-positive people to viral suppression. We aimed to analyse how effective national HIV treatment programmes are at meeting these targets, using HIV care continuums or cascades. Methods We searched for HIV treatment cascades for 196 countries in published papers, conference presentations, UNAIDS databases and national reports. Cascades were constructed using reliable, generalisable, recent data from national, cross-sectional and longitudinal study cohorts. Data were collected for four stages: total HIV-positive people, diagnosed, on treatment and virally suppressed. The cascades were categorised as complete (four stages) or partial (three stages), and analysed for ‘break points’, defined as a drop of more than 10% in coverage between consecutive 90-90-90 targets. Results 69 country cascades were analysed (32 complete, 37 partial). Diagnosis (target one: 90%) ranged from 87% (the Netherlands) to 11% (Yemen). Treatment coverage (target two: 81% on ART) ranged from 71% (Switzerland) to 3% (Afghanistan). Viral suppression (target three: 73% virally suppressed) ranged from 68% (Switzerland) to 7% (China). Conclusions No country analysed met the 90-90-90 targets. Diagnosis was the greatest break point globally, but the most frequent key break point for individual countries was providing ART to those diagnosed. Large disparities were identified between countries. Without commitment to standardised reporting methodologies, international comparisons remain complex.
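    The target arithmetic above compounds multiplicatively (0.90 × 0.90 = 81% on ART; 0.90 × 0.90 × 0.90 ≈ 73% suppressed). A minimal sketch of that arithmetic and of a break-point check, using hypothetical figures (the function names and example numbers are illustrative, not data from the study):

```python
def cascade_coverage(diagnosed, on_art, suppressed):
    """Given the conditional proportion reached at each stage, return the
    cumulative share of all HIV-positive people reached at each stage."""
    return [diagnosed,
            diagnosed * on_art,
            diagnosed * on_art * suppressed]

# The 90-90-90 targets compound multiplicatively:
targets = cascade_coverage(0.90, 0.90, 0.90)
print([round(x, 3) for x in targets])  # [0.9, 0.81, 0.729]

def break_points(cumulative, targets, gap=0.10):
    """Flag stages whose coverage falls more than `gap` below the
    corresponding 90-90-90 target (an approximation of the abstract's
    'break point' idea, for illustration only)."""
    return [i for i, (c, t) in enumerate(zip(cumulative, targets))
            if t - c > gap]

# Hypothetical country: 87% diagnosed, 55% on ART, 40% suppressed overall.
print(break_points([0.87, 0.55, 0.40], targets))  # stages 1 and 2 flagged
```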

    Expanding Disease Definitions in Guidelines and Expert Panel Ties to Industry: A Cross-sectional Study of Common Conditions in the United States

    BACKGROUND: Financial ties between health professionals and industry may unduly influence professional judgments and some researchers have suggested that widening disease definitions may be one driver of over-diagnosis, bringing potentially unnecessary labeling and harm. We aimed to identify guidelines in which disease definitions were changed, to assess whether any proposed changes would increase the numbers of individuals considered to have the disease, whether potential harms of expanding disease definitions were investigated, and the extent of members' industry ties. METHODS AND FINDINGS: We undertook a cross-sectional study of the most recent publication between 2000 and 2013 from national and international guideline panels making decisions about definitions or diagnostic criteria for common conditions in the United States. We assessed whether proposed changes widened or narrowed disease definitions, rationales offered, mention of potential harms of those changes, and the nature and extent of disclosed ties between members and pharmaceutical or device companies. Of 16 publications on 14 common conditions, ten proposed changes widening and one narrowing definitions. For five, impact was unclear. Widening fell into three categories: creating “pre-disease”; lowering diagnostic thresholds; and proposing earlier or different diagnostic methods. Rationales included standardising diagnostic criteria and new evidence about risks for people previously considered to not have the disease. No publication included rigorous assessment of potential harms of proposed changes. Among 14 panels with disclosures, the average proportion of members with industry ties was 75%. Twelve were chaired by people with ties. For members with ties, the median number of companies to which they had ties was seven. Companies with ties to the highest proportions of members were active in the relevant therapeutic area. 
Limitations arise from reliance on only disclosed ties, and from the exclusion of conditions too broad to enable analysis of single panel publications. CONCLUSIONS: For the common conditions studied, a majority of panels proposed changes to disease definitions that increased the number of individuals considered to have the disease, none reported rigorous assessment of potential harms of that widening, and most had a majority of members disclosing financial ties to pharmaceutical companies. Please see later in the article for the Editors' Summary.

    Comparing Greedy Constructive Heuristic Subtour Elimination Methods for the Traveling Salesman Problem

    Purpose — This paper aims to divide the class of fragment constructive heuristics used to compute feasible solutions for the traveling salesman problem (TSP) into edge-greedy and vertex-greedy subclasses. As these subclasses of heuristics can create subtours, two known methodologies for subtour elimination on symmetric instances are reviewed and expanded to cover asymmetric problem instances. This paper introduces a third, novel subtour elimination methodology, the greedy tracker (GT), and compares it to both known methodologies. Design/methodology/approach — Computational results for all three subtour elimination methodologies are generated across 17 symmetric instances ranging in size from 29 vertices to 5,934 vertices, as well as 9 asymmetric instances ranging in size from 17 to 443 vertices. Findings — The results demonstrate the GT is the fastest method for preventing subtours for instances below 400 vertices. Additionally, a distinction between fragment constructive heuristics and the subtour elimination methodology used to ensure the feasibility of resulting solutions enables the introduction of a new vertex-greedy fragment heuristic called ordered greedy. Originality/value — This research has two main contributions: first, it introduces a novel subtour elimination methodology. Second, it introduces the concept of ordered lists, which remaps the TSP into a new space with promising initial computational results.
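    The edge-greedy construction with subtour elimination that the abstract describes can be sketched as follows. This is a generic greedy-edge TSP heuristic using union-find to reject edges that would close a cycle early, not the paper's GT method; the instance below is illustrative:

```python
import itertools

def greedy_edge_tour(dist):
    """Edge-greedy TSP construction on a symmetric n x n distance matrix.
    Edges are examined in order of increasing length; an edge is accepted
    only if both endpoints have degree < 2 and it does not merge two ends
    of the same fragment (which would create a premature subtour)."""
    n = len(dist)
    parent = list(range(n))          # union-find over tour fragments
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    degree = [0] * n
    chosen = []
    edges = sorted((dist[i][j], i, j)
                   for i, j in itertools.combinations(range(n), 2))
    for _, i, j in edges:
        if degree[i] >= 2 or degree[j] >= 2:
            continue                 # vertex already has two tour edges
        ri, rj = find(i), find(j)
        if ri == rj:
            continue                 # would close a subtour too early
        parent[ri] = rj
        degree[i] += 1
        degree[j] += 1
        chosen.append((i, j))
        if len(chosen) == n - 1:
            break
    # Close the tour by joining the two remaining degree-1 endpoints.
    ends = [v for v in range(n) if degree[v] == 1]
    chosen.append((ends[0], ends[1]))
    return chosen

# Unit square: the greedy tour should use the four sides (length 4).
square = [[0, 1, 2**0.5, 1],
          [1, 0, 1, 2**0.5],
          [2**0.5, 1, 0, 1],
          [1, 2**0.5, 1, 0]]
tour = greedy_edge_tour(square)
print(sum(square[i][j] for i, j in tour))  # 4.0
```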

    A helium film coated quasi‐parabolic mirror to focus a beam of ultra‐cold spin polarized atomic hydrogen

    A 350 mK helium‐4‐coated mirror was used to increase the intensity of an ultra‐cold electron‐spin‐polarized atomic hydrogen beam. The mirror uses the observed specular reflection of atomic hydrogen from a superfluid‐helium‐covered surface. A quasi‐parabolic polished copper mirror was installed with its focus at the 5 mm diameter exit aperture of an atomic hydrogen stabilization cell in the gradient of an 8 T solenoid field. The four‐coned mirror shape, which was designed specifically for operation in the gradient, increased the beam intensity focused by a sextupole magnet into a compression tube detector by a factor of about 7.5. Peer reviewed: http://deepblue.lib.umich.edu/bitstream/2027.42/87512/2/40_1.pd

    A New Fossil Amiid from the Eocene of Senegal and the Persistence of Extinct Marine Amiids after the Cretaceous–Paleogene Boundary

    We report a new fossil amiid from Eocene rocks of West Africa representing the first record of this clade from Senegal. The new specimen has a maxilla that is very similar in size to that of Amia calva. It is distinctly smaller than reported remains of another West African Eocene taxon, Maliamia gigas. We tentatively refer the Senegal specimen to Vidalamiini because it has the large postmaxillary process diagnostic of this clade; however, it also exhibits anatomical features not previously described in extinct amiids. We recovered the specimen in rocks of the Lam-Lam Formation in Central-Western Senegal that we interpret to have been a shallow marine depositional environment. The occurrence of an Eocene marine amiid contradicts existing hypotheses that marine amiids were generally absent after the Cretaceous–Paleogene boundary, having been replaced by freshwater taxa. Research completed since the initial discovery of Maliamia gigas indicates that this Eocene taxon was also found in shallow marine rocks.

    Cost-effectiveness thresholds: pros and cons

    Cost-effectiveness analysis is used to compare the costs and outcomes of alternative policy options. Each resulting cost-effectiveness ratio represents the magnitude of additional health gained per additional unit of resources spent. Cost-effectiveness thresholds allow cost-effectiveness ratios that represent good or very good value for money to be identified. In 2001, the World Health Organization’s Commission on Macroeconomics in Health suggested cost-effectiveness thresholds based on multiples of a country’s per-capita gross domestic product (GDP). In some contexts, these thresholds have been used as decision rules in choosing which health interventions to fund and which not to fund. However, experience with the use of such GDP-based thresholds in decision-making processes at country level shows them to lack country specificity, and this, in addition to uncertainty in the modelled cost-effectiveness ratios, can lead to the wrong decision on how to spend health-care resources. Cost-effectiveness information should be used alongside other considerations, such as budget impact and feasibility, in a transparent decision-making process, rather than in isolation based on a single threshold value. Although cost-effectiveness ratios are undoubtedly informative in assessing value for money, countries should be encouraged to develop a context-specific process for decision-making that is supported by legislation, has stakeholder buy-in (for example, the involvement of civil society organizations and patient groups), and is transparent, consistent and fair.
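    The GDP-based decision rule discussed above can be sketched as follows. The conventional cut-offs are 1× and 3× per-capita GDP per DALY averted; all numbers in the example are hypothetical:

```python
def icer(delta_cost, delta_dalys_averted):
    """Incremental cost-effectiveness ratio:
    extra cost per extra DALY averted versus the comparator."""
    return delta_cost / delta_dalys_averted

def gdp_threshold_verdict(icer_value, gdp_per_capita):
    """The GDP-multiple rule of thumb: below 1x GDP per DALY averted is
    'highly cost-effective', below 3x is 'cost-effective', otherwise
    'not cost-effective'. The abstract argues this rule should not be
    applied in isolation as a funding decision rule."""
    if icer_value < gdp_per_capita:
        return "highly cost-effective"
    if icer_value < 3 * gdp_per_capita:
        return "cost-effective"
    return "not cost-effective"

# Hypothetical intervention: $400,000 of extra spending averts 250 DALYs,
# in a country with per-capita GDP of $1,500.
ratio = icer(400_000, 250)                   # 1600 dollars per DALY averted
print(gdp_threshold_verdict(ratio, 1_500))   # cost-effective (between 1x and 3x)
```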

    Propagation of ultra-high energy protons in the nearby universe

    We present a new calculation of the propagation of protons with energies above 10^19 eV over distances of up to several hundred Mpc. The calculation is based on a Monte Carlo approach using the event generator SOPHIA for the simulation of hadronic nucleon-photon interactions and a realistic integration of the particle trajectories in a random extragalactic magnetic field. Accounting for the proton scattering in the magnetic field noticeably affects the nucleon energy as a function of the distance to the source and allows us to give realistic predictions on arrival energy, time delay, and arrival angle distributions and correlations, as well as secondary particle production spectra. Comment: 12 pages, 9 figures, ReVTeX. Physical Review D, accepted.