    Health and economic impact of combining metformin with nateglinide to achieve glycemic control: Comparison of the lifetime costs of complications in the U.K.

    BACKGROUND: To reduce the likelihood of complications in persons with type 2 diabetes, it is critical to control hyperglycaemia. Monotherapy with metformin or insulin secretagogues may fail to sustain control after an initial reduction in glycemic levels. Thus, combining metformin with other agents is frequently necessary. These analyses model the potential long-term economic and health impact of using combination therapy to improve glycemic control. METHODS: An existing model that simulates the long-term course of type 2 diabetes in relation to glycosylated haemoglobin (HbA(1c)) and post-prandial glucose (PPG) was used to compare the combination of nateglinide with metformin to monotherapy with metformin. Complication rates were estimated for major diabetes-related complications (macrovascular and microvascular) based on existing epidemiologic studies and clinical trial data. Utilities and costs were estimated using data collected in the United Kingdom Prospective Diabetes Study (UKPDS). Survival, life years gained (LYG), quality-adjusted life years (QALY), complication rates and associated costs were estimated. Costs were discounted at 6% and benefits at 1.5% per year. RESULTS: Combination therapy was predicted to reduce complication rates and associated costs compared with metformin monotherapy. Survival increased by 0.39 years (0.32 discounted) and QALYs by 0.46 (0.37 discounted), implying costs of £6,772 per discounted LYG and £5,609 per discounted QALY. Sensitivity analyses showed the results to be consistent over broad ranges. CONCLUSION: Although drug treatment costs are increased by combination therapy, this cost is expected to be partially offset by a reduction in the costs of treating long-term diabetes complications.
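
    The cost-per-QALY figures follow from discounting costs and benefits at the differential rates stated above. A minimal sketch of that arithmetic in Python, with hypothetical incremental cost and benefit streams (the abstract reports only the resulting ratios, not the yearly values):

```python
def discounted_sum(yearly_values, rate):
    """Present value of a stream of yearly values under annual discounting."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(yearly_values, start=1))

# Hypothetical incremental streams for combination therapy vs. monotherapy.
incremental_costs = [300.0] * 20    # extra treatment cost per year (GBP)
incremental_lys   = [0.02] * 20     # extra life years accrued per year
incremental_qalys = [0.023] * 20    # extra QALYs accrued per year

cost = discounted_sum(incremental_costs, rate=0.06)    # costs discounted at 6%
lyg  = discounted_sum(incremental_lys,   rate=0.015)   # benefits at 1.5%
qaly = discounted_sum(incremental_qalys, rate=0.015)

print(f"Cost per discounted LYG:  GBP {cost / lyg:,.0f}")
print(f"Cost per discounted QALY: GBP {cost / qaly:,.0f}")
```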

    Modeling good research practices - overview: a report of the ISPOR-SMDM modeling good research practices task force - 1.

    Models—mathematical frameworks that facilitate estimation of the consequences of health care decisions—have become essential tools for health technology assessment. Evolution of the methods since the first ISPOR modeling task force reported in 2003 has led to a new task force, jointly convened with the Society for Medical Decision Making, and this series of seven papers presents the updated recommendations for best practices in conceptualizing models; implementing state-transition approaches, discrete event simulations, or dynamic transmission models; dealing with uncertainty; and validating and reporting models transparently. This overview introduces the work of the task force, provides all the recommendations, and discusses some quandaries that require further elucidation. The audience for these papers includes those who build models, stakeholders who utilize their results, and, indeed, anyone concerned with the use of models to support decision making.
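
    Of the model types the series covers, the state-transition (Markov) cohort model is the simplest to illustrate. A minimal sketch with made-up transition probabilities follows; it is not the task force's code, only an example of the approach:

```python
import numpy as np

# Three-state Markov cohort model (Well -> Sick -> Dead) run in yearly cycles.
P = np.array([
    [0.90, 0.08, 0.02],   # transitions from Well
    [0.00, 0.85, 0.15],   # transitions from Sick
    [0.00, 0.00, 1.00],   # Dead is absorbing
])

cohort = np.array([1.0, 0.0, 0.0])   # whole cohort starts in Well
life_years = 0.0
for _ in range(40):                  # 40 yearly cycles
    life_years += cohort[:2].sum()   # each alive fraction accrues a life year
    cohort = cohort @ P              # advance the cohort one cycle

print(f"Undiscounted life expectancy: {life_years:.2f} years")
```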

    Adult Vaccination Strategies for the Control of Pertussis in the United States: An Economic Evaluation Including the Dynamic Population Effects

    BACKGROUND: Prior economic evaluations of adult and adolescent vaccination strategies against pertussis have reached disparate conclusions. Using static approaches only, previous studies failed to analytically include the indirect benefits derived from herd immunity as well as the impact of vaccination on the evolution of disease incidence over time. METHODS: We assessed the impact of different pertussis vaccination strategies using a dynamic compartmental model able to consider pertussis transmission. We then combined the results with economic data to estimate the relative cost-effectiveness of pertussis immunization strategies for adolescents and adults in the US. The analysis compares combinations of programs targeting adolescents, parents of newborns (i.e. the cocoon strategy), or adults of various ages. RESULTS: In the absence of adolescent or adult vaccination, pertussis incidence among adults is predicted to more than double in 20 years. Implementing an adult program in addition to childhood and adolescent vaccination, based either on 1) a cocoon strategy and a single booster dose or 2) decennial routine vaccination, would maintain a low level of pertussis incidence in the long run for all age groups (30 and 20 cases per 100,000 person-years, respectively). These strategies would also result in significant reductions of pertussis costs (between 77% and 80%, including additional vaccination costs). The cocoon strategy complemented by a single booster dose is the most cost-effective one, whereas decennial adult vaccination is slightly more effective in the long run. CONCLUSIONS: By providing a high level of disease control, the implementation of an adult vaccination program against pertussis appears to be highly cost-effective and often cost-saving.
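
    The published model is age-structured and calibrated to US data; as a minimal illustration of the dynamic, transmission-capturing approach it relies on, here is a single-population SIRS sketch with vaccination and waning immunity, using made-up parameters:

```python
# Euler integration of a susceptible-infectious-recovered model with
# vaccination (nu) and waning immunity (omega); all rates are per day.
beta, gamma, omega, nu = 0.30, 0.10, 0.001, 0.002
S, I, R = 0.90, 0.01, 0.09        # initial compartment fractions
dt, years = 1.0, 20               # daily time steps over a 20-year horizon

for _ in range(int(365 * years / dt)):
    new_inf = beta * S * I        # transmission scales with S and I: this is
    dS = -new_inf - nu * S + omega * R   # what static models cannot capture
    dI = new_inf - gamma * I
    dR = gamma * I + nu * S - omega * R
    S, I, R = S + dS * dt, I + dI * dt, R + dR * dt

print(f"Infectious prevalence after {years} years: {I:.4%}")
```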

    Trusting the results of model-based economic analyses: is there a pragmatic validation solution?

    Models have become a nearly essential component of health technology assessment. This is because the efficacy and safety data available from clinical trials are insufficient to provide the required estimates of the impact of new interventions over long periods of time and for other populations and subgroups. Despite more than five decades of use of these decision-analytic models, decision makers are still often presented with poorly validated models, and thus trust in their results is impaired. Among the reasons for this vexing situation are the artificial nature of the models, which impairs their validation against observable data; the complexity of their formulation and implementation; the lack of data against which to validate the model results; and the challenges of short timelines and insufficient resources. This article addresses this crucial problem of achieving models that produce results that can be trusted and the resulting requirements for validation and transparency, areas where our field is currently deficient. Based on their differing perspectives and experiences, the authors characterize the situation and outline the requirements for improvement and pragmatic solutions to the problem.
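
    One pragmatic step the authors argue for is routine external validation: comparing model predictions against observed data that were not used to build the model. A minimal sketch with hypothetical predicted and observed outcome rates and an arbitrary tolerance:

```python
# Hypothetical model predictions vs. independently observed rates.
predicted = {"5-year mortality": 0.21, "stroke rate": 0.034, "MI rate": 0.051}
observed  = {"5-year mortality": 0.19, "stroke rate": 0.031, "MI rate": 0.056}

TOLERANCE = 0.15   # flag relative deviations above 15%; the threshold is a choice

for outcome, pred in predicted.items():
    obs = observed[outcome]
    rel_err = (pred - obs) / obs
    flag = "OK" if abs(rel_err) <= TOLERANCE else "REVIEW"
    print(f"{outcome:17s} predicted={pred:.3f} observed={obs:.3f} "
          f"error={rel_err:+.1%} [{flag}]")
```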

    Cost-effectiveness analysis of HPV extended versus partial genotyping for cervical cancer screening in Singapore

    Human papillomavirus (HPV) partial genotyping (PGT) identifies HPV16 and HPV18 individually, alongside 12 other high-risk HPV genotypes (hrHPV) collectively. HPV extended genotyping (XGT) identifies four additional hrHPV individually (HPV31, 45, 51, and 52), and reports the remaining eight in three groups (HPV33|58; 56|59|66; 35|39|68). Quality-adjusted life years (QALY), health care resource use, and costs of XGT were compared to PGT for cervical cancer screening in Singapore using DICE simulation. Women with one of the three hrHPV categories identified by XGT (HPV35|39|68; 56|59|66; 51) and atypical squamous cells of undetermined significance (ASCUS) on cytology are recalled for a repeat screening in one year, instead of undergoing an immediate colposcopy as with PGT. At the repeat screening, a colposcopy is performed only for persistent same-genotype infections under XGT, whereas with PGT all women with persistent HPV have a colposcopy. Screening 500,122 women, aged 30–69, with XGT provided an incremental cost-effectiveness ratio (ICER) versus PGT of SGD 16,370/QALY, with 7130 (19.4%) fewer colposcopies, 6027 (7.0%) fewer cytology tests, 9787 (1.6%) fewer clinic consultations, yet 2446 (0.5%) more HPV tests. The XGT ICER remains well below SGD 100,000 in sensitivity analyses (-SGD 17,736/QALY to SGD 50,474/QALY). XGT is cost-effective compared to PGT, utilizes fewer resources, and provides a risk-based approach as the primary cervical cancer screening method.
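
    The ICER itself is simple arithmetic over the simulated totals. A minimal sketch with hypothetical cost and QALY totals (the abstract reports the resulting ratio, not the underlying totals):

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

THRESHOLD_SGD = 100_000   # the willingness-to-pay benchmark cited above

# Hypothetical totals for the screened cohort, chosen for illustration only.
ratio = icer(cost_new=152_000_000, cost_old=150_000_000,
             qaly_new=6_100_122.2, qaly_old=6_100_000.0)
verdict = "cost-effective" if ratio < THRESHOLD_SGD else "not cost-effective"
print(f"ICER: SGD {ratio:,.0f}/QALY ({verdict})")
```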

    The use of MCDA in HTA: great potential, but more effort needed

    The potential for multi-criteria decision analysis (MCDA) to support health technology assessment (HTA) has been much discussed, and various HTA agencies are piloting or applying MCDA. Alongside these developments, good practice guidelines for the application of MCDA in health care have been developed. An assessment of current applications of MCDA to HTA in light of good practice guidelines reveals, however, that many have methodologic flaws that undermine their usefulness. Three challenges are considered: the use of additive models, a lack of connection between criteria scales and weights, and the use of MCDA in economic evaluation. More attention needs to be paid to MCDA good practice by researchers, journal editors, and decision makers, and further methodologic developments are required if MCDA is to achieve its potential to support HTA.
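
    The "additive models" under critique are weighted sums of criterion scores. A minimal sketch with illustrative criteria, weights, and scores shows the form; the paper's caution is that such weights are only meaningful when elicited with reference to the criterion scales, which this naive version does not enforce:

```python
# Illustrative weights (summing to 1) and criterion scores on a 0-100 scale.
weights = {"effectiveness": 0.5, "safety": 0.3, "affordability": 0.2}
scores = {
    "Technology A": {"effectiveness": 80, "safety": 60, "affordability": 40},
    "Technology B": {"effectiveness": 65, "safety": 85, "affordability": 70},
}

for tech, crit_scores in scores.items():
    total = sum(weights[c] * s for c, s in crit_scores.items())
    print(f"{tech}: weighted score = {total:.1f}")
```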

    Characterization of hARD2, a processed hARD1 gene duplicate, encoding a human protein N-α-acetyltransferase

    BACKGROUND: Protein acetylation is increasingly recognized as an important mechanism regulating a variety of cellular functions. Several human protein acetyltransferases have been characterized, most of them catalyzing ε-acetylation of histones and transcription factors. We recently described the human protein acetyltransferase hARD1 (human Arrest Defective 1). hARD1 interacts with NATH (N-Acetyl Transferase Human), forming a complex expressing protein N-terminal α-acetylation activity. RESULTS: We here describe a human protein, hARD2, with 81% sequence identity to hARD1. The gene encoding hARD2 most likely originates from a eutherian mammal-specific retrotransposition event. hARD2 mRNA and protein are expressed in several human cell lines. Immunoprecipitation experiments show that hARD2 protein potentially interacts with NATH, suggesting that hARD2-NATH complexes may be responsible for protein N-α-acetylation in human cells. In NB4 cells undergoing retinoic acid-mediated differentiation, the level of endogenous hARD1 and NATH protein decreases while the level of hARD2 protein remains stable. CONCLUSION: A human protein N-α-acetyltransferase is herein described. ARD2 potentially complements the functions of ARD1, adding more flexibility and complexity to protein N-α-acetylation in human cells as compared to lower organisms, which have only one ARD.

    Applying the win ratio method in clinical trials of orphan drugs: an analysis of data from the COMET trial of avalglucosidase alfa in patients with late-onset Pompe disease

    Background: Clinical trials for rare diseases often include multiple endpoints that capture the effects of treatment on different disease domains. In many rare diseases, the primary endpoint is not standardized across trials. The win ratio approach was designed to analyze multiple endpoints of interest in clinical trials and has mostly been applied in cardiovascular trials. Here, we applied the win ratio approach to data from COMET, a phase 3 trial in late-onset Pompe disease, to illustrate how this approach can be used to analyze multiple endpoints in the orphan drug context. Methods: All possible participant pairings from both arms of COMET were compared sequentially on changes at week 49 in upright forced vital capacity (FVC) % predicted and six-minute walk test (6MWT) distance. Each participant’s response for the two endpoints was first classified as a meaningful improvement, no meaningful change, or a meaningful decline using thresholds based on published minimal clinically important differences (FVC ± 4% predicted, 6MWT ± 39 m). Each comparison assessed whether the outcome with avalglucosidase alfa (AVA) was better than (win), worse than (loss), or equivalent to (tie) the outcome with alglucosidase alfa (ALG). If a pair was tied on FVC, 6MWT was compared. In this approach, the treatment effect is the ratio of wins to losses (the “win ratio”), with ties excluded. Results: In the 2499 possible pairings (51 receiving AVA × 49 receiving ALG), the win ratio was 2.37 (95% confidence interval [CI], 1.30–4.29; p = 0.005) when FVC was compared before 6MWT. When the order was reversed, the win ratio was 2.02 (95% CI, 1.13–3.62; p = 0.018). Conclusion: The win ratio approach can be used in clinical trials of rare diseases to provide meaningful insight into treatment benefits from multiple endpoints and across disease domains.
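
    A minimal sketch of the computation described above, using the stated MCID thresholds and the FVC-before-6MWT hierarchy, but made-up changes for a handful of participants rather than the COMET data:

```python
from itertools import product

def classify(change, mcid):
    """-1 = meaningful decline, 0 = no meaningful change, +1 = improvement."""
    if change >= mcid:
        return 1
    if change <= -mcid:
        return -1
    return 0

def compare_pair(treated, control):
    """+1 (win), -1 (loss), or 0 (tie) for the treated member of the pair."""
    for endpoint, mcid in (("fvc", 4.0), ("6mwt", 39.0)):  # FVC compared first
        a, b = classify(treated[endpoint], mcid), classify(control[endpoint], mcid)
        if a != b:
            return 1 if a > b else -1
    return 0   # tied on both endpoints

# Hypothetical week-49 changes (FVC in % predicted, 6MWT in meters).
ava = [{"fvc": 5.1, "6mwt": 42.0}, {"fvc": 0.8, "6mwt": -50.0}]
alg = [{"fvc": -4.5, "6mwt": 12.0}, {"fvc": 1.0, "6mwt": -5.0}]

results = [compare_pair(t, c) for t, c in product(ava, alg)]
wins, losses = results.count(1), results.count(-1)
print(f"wins={wins}, losses={losses}, win ratio={wins / losses:.2f}")
```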

    The time course of subsequent hospitalizations and associated costs in survivors of an ischemic stroke in Canada

    BACKGROUND: Documentation of hospitalization rates following a stroke provides the inputs required for planning health services and for evaluating the economic efficiency of new therapies. METHODS: Hospitalization rates by cause were examined using administrative data on 18,695 patients diagnosed with ischemic stroke (first or subsequent, excluding transient ischemic attack) in Saskatchewan, Canada between 1990 and 1995. Medical history was available retrospectively to January 1980, and follow-up was complete to March 2000. Analyses evaluated the rate and timing of all-cause and cardiovascular hospitalizations within discrete periods in the five years following the index stroke. Cardiovascular hospitalizations included patients with a primary diagnosis of ischemic stroke, transient ischemic attack, myocardial infarction, stable or unstable angina, heart failure or peripheral arterial disease. RESULTS: One-third (36%) of patients were identified by a hospitalized stroke. Mean age was 70.5 years, 48.0% were male, and half had a history of stroke or transient ischemic attack at the time of their index stroke. Nearly three-quarters of the patients (72.7%) were hospitalized at least once during a mean follow-up of 4.6 years, accruing CAD $24 million in costs in the first year alone. Of all hospitalizations, 20.4% were related to cardiovascular disease and 1.6% to bleeds. In the month following the index stroke, 12.5% of patients were admitted, an average of 1.04 times per patient hospitalized. Strokes accounted for 33% of all hospitalizations in the first month. The rate diminished steadily throughout the year and stabilized in the second year, when approximately one-third of patients required hospitalization, at a rate of about one hospitalization for every two patient-years. Mean lengths of stay ranged from nine days to nearly 40 days. Close-fitting Weibull functions allow highly specific probability estimates. Other cardiovascular risk factors significantly increased hospitalization rates. CONCLUSION: After stroke, hospitalizations are frequent and account for substantial additional costs. Though rates drop after the first year, they remain high over time. The number of other cardiovascular causes of hospitalization confirms that stroke is a manifestation of disseminated atherothrombotic disease.
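
    As an illustration of how a fitted Weibull function turns into the period-specific probabilities mentioned above, here is a minimal sketch; the shape and scale values are made up, not the study's fitted parameters:

```python
import math

def weibull_event_prob(t0, t1, shape, scale):
    """Probability of a first event in (t0, t1], given event-free at t0."""
    surv = lambda t: math.exp(-((t / scale) ** shape))
    return (surv(t0) - surv(t1)) / surv(t0)

# shape < 1 gives a hazard that declines over time, matching the early peak
# in hospitalizations described in the abstract.
shape, scale = 0.7, 4.0   # illustrative parameters, time in years

for start in range(5):    # conditional yearly probabilities for years 1-5
    p = weibull_event_prob(start, start + 1, shape, scale)
    print(f"Year {start + 1}: P(hospitalization) = {p:.1%}")
```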

    Software architecture for the measurement of operational risk in financial sector entities

    The measurement of financial risks such as operational, liquidity, and credit risk is one of the most frequent concerns in the financial sector. The materialization of operational risk leads to large monetary losses arising from failures in people, processes, and procedures that affect the entity's operations. To systematize the measurement of operational risk, the Operational Risk Management information system was developed. It facilitates the measurement of operational risk by deriving the matrix of expected and unexpected losses and estimating the Value at Operational Risk (Op-VaR) for the different failures that may occur in each of the entity's business lines. This paper shows how a filter-based architecture facilitates and speeds up calculations that require large volumes of financial data. The system is currently used by Colombian financial sector entities, which through its use have improved both their profits and the productivity of their staff, since the system has enabled the generation of contingency plans to respond to an operational risk crisis.
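
    The abstract describes the system only at the architecture level; as one plausible reading of its Op-VaR step, here is a minimal loss-distribution sketch for a single business line, with Poisson event frequencies, lognormal severities, and made-up parameters:

```python
import numpy as np

rng = np.random.default_rng(42)
n_years = 50_000   # Monte Carlo replications of one year of operation

# For each simulated year: draw the number of loss events, then a severity
# for each event, and aggregate. All parameters are illustrative.
counts = rng.poisson(lam=12, size=n_years)
losses = np.array([
    rng.lognormal(mean=9.0, sigma=1.2, size=n).sum() for n in counts
])

expected_loss = losses.mean()                  # the "expected loss" entry
op_var = np.quantile(losses, 0.999)            # Op-VaR at the 99.9% quantile
unexpected_loss = op_var - expected_loss       # the "unexpected loss" entry

print(f"Expected loss:   {expected_loss:,.0f}")
print(f"Op-VaR (99.9%):  {op_var:,.0f}")
print(f"Unexpected loss: {unexpected_loss:,.0f}")
```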