
    β-Blockers for Primary Prevention in Hypertension: Era Bygone?

    β-Blockers are commonly used worldwide in clinical practice for lowering blood pressure. Most agents in this class are efficacious in reducing blood pressure, although they vary widely in their pharmacokinetic and pharmacodynamic properties. This variability may confer significant differences in the clinical benefits associated with specific agents. Although commonly used in managing hypertension, the role of β-blockers for primary prevention in uncomplicated hypertension has been controversial. Evidence from recent trials suggests relatively poor efficacy for primary prevention and worse outcomes for certain end points when compared with other blood pressure-lowering agents. The recently updated National Institute for Health and Clinical Excellence guidelines for England and Wales reflect this concern and have changed the indication for β-blockers in blood pressure control from primary agents to add-on agents in patients requiring multiple-drug therapy. In this review, considering the extended debate on this topic, we discuss the general properties of β-blockers and the evidence from clinical trials supporting or refuting their use in various clinical scenarios. Newer β-blockers have additional properties that may be beneficial, although whether these will translate into benefit for primary prevention is not yet clear. © 2006 Elsevier Inc. All rights reserved

    The J-Curve Between Blood Pressure and Coronary Artery Disease or Essential Hypertension: Exactly How Essential?

    The J-curve relationship between blood pressure and coronary artery disease (CAD) has been the subject of much controversy over the past several decades. An inverse relationship between diastolic pressure and adverse cardiac ischemic events (i.e., the lower the diastolic pressure, the greater the risk of coronary heart disease and adverse outcomes) has been observed in numerous studies. This effect is even more pronounced in patients with underlying CAD. Indeed, a J-shaped relationship between diastolic pressure and coronary events was documented in treated patients with CAD in most large trials that scrutinized this relationship. In contrast to any other vascular bed, the coronary circulation receives its perfusion mostly during diastole; hence, an excessive decrease in diastolic pressure can significantly hamper perfusion. This adverse effect of too low a diastolic pressure on coronary heart disease leaves the practicing physician with the disturbing possibility that, in patients at risk, lowering blood pressure to levels that prevent stroke or renal disease might actually precipitate myocardial ischemia. However, these concerns should not deter physicians from pursuing more aggressive control of hypertension, because blood pressure is currently brought to recommended target levels in only approximately one-third of patients. © 2009 American College of Cardiology Foundation
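    To make the perfusion argument above concrete, coronary perfusion pressure (CPP) is commonly approximated by the textbook relation below, where DBP is aortic diastolic blood pressure and LVEDP is left ventricular end-diastolic pressure; this relation is standard physiology rather than something stated in the abstract itself:

    $$\mathrm{CPP} \approx \mathrm{DBP} - \mathrm{LVEDP}$$

    Under this approximation, driving DBP down against a fixed LVEDP directly narrows the coronary perfusion gradient, which is one plausible mechanism for the J-shaped relationship described above.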

    The J-point phenomenon in aggressive therapy of hypertension: New insights

    In the era of aggressive control of cardiovascular risk factors such as hypertension, the mantra of "lower is better" has taken a strong foothold. Although there is clear epidemiologic evidence that lower blood pressure improves specific organ-related outcomes, this rule does not apply to all patients, and certainly not to all target organs. The concept of a J-curve, that is, adverse outcomes at lower blood pressures, was proposed more than three decades ago but has recently come under increasing scrutiny. Specifically, a relationship between adverse cardiovascular outcomes and low diastolic blood pressure has been observed in multiple clinical trials. In this article we review advances in the understanding of the J-curve phenomenon and discuss specific populations that might be at higher risk because of the J-curve relationship. © Springer Science+Business Media, LLC 2012

    Potentiation of Doxorubicin Cardiotoxicity by Iron Loading in a Rodent Model

    Objectives: The role of iron in doxorubicin (DOX) cardiotoxicity was studied using a rodent model of dietary carbonyl iron loading. Background: Doxorubicin, a commonly used anticancer drug, is known to cause serious and potentially life-threatening cardiotoxicity, which is thought to be mediated through free-radical injury. Methods: Male Sprague-Dawley rats fed iron-rich chow (n = 8) or regular chow (n = 8) were treated with DOX or saline (4 animals in each arm). Cardiotoxicity was assessed using mortality, weight changes, Tc-99m annexin-V imaging, histopathology, and immunohistochemistry. Results: Animals fed iron-rich chow showed significantly greater DOX cardiotoxicity, as evidenced by greater weight loss (107 ± 14 g vs. 55 ± 10 g, p < 0.05), higher annexin uptake (0.14 ± 0.01% vs. 0.08 ± 0.01% injected dose/g of myocardium, p < 0.05), more severe myocyte injury on electron microscopy, and significantly higher cleaved caspase-3 staining compared with regular-chow-fed rats given DOX. Feeding iron-rich chow alone did not result in any cardiotoxicity. Conclusions: Dietary iron loading substantially increased DOX cardiotoxicity in rats. Body iron stores, as well as tissue iron bioavailability, may be important independent predictors of susceptibility to DOX cardiotoxicity in humans. Further clinical studies are warranted. © 2007 American College of Cardiology Foundation
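    For reference, the annexin uptake metric quoted above, percent injected dose per gram of myocardium (%ID/g), follows the conventional radiotracer uptake definition (assumed here; the abstract does not define it):

    $$\%\mathrm{ID/g} = \frac{\text{tissue activity}}{\text{injected activity}} \times \frac{100}{\text{tissue mass (g)}}$$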