The Malaria-High Blood Pressure Hypothesis.
RATIONALE: Several studies have demonstrated links between infectious diseases and cardiovascular conditions. Malaria and hypertension are widespread in many low- and middle-income countries, but the possible link between them has not been considered. OBJECTIVE: In this article, we outline the basis for a possible link between malaria and hypertension and discuss how the hypothesis could be confirmed or refuted. METHODS AND RESULTS: We reviewed published literature on factors associated with hypertension and checked whether any of these were also associated with malaria. We then considered various study designs that could be used to test the hypothesis. Malaria causes low birth weight, malnutrition, and inflammation, all of which are associated with hypertension in high-income countries. The hypothetical link between malaria and hypertension can be tested through the use of ecological, cohort, or Mendelian randomization studies, each of which poses specific challenges. CONCLUSIONS: Confirmation of the existence of a causative link with malaria would be a paradigm shift in efforts to prevent and control hypertension and would stimulate wider research on the links between infectious and noncommunicable diseases.
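As a minimal sketch of the Mendelian randomization design mentioned above: a genetic variant known to affect malaria risk (for example, the sickle cell trait) could serve as an instrument, with the causal effect on blood pressure estimated by a Wald ratio. All numbers and variable names below are hypothetical placeholders, not estimates from any study.

```python
import math

# Hypothetical summary statistics (illustrative only, not real data):
# effect of the instrument (e.g., sickle cell trait) on malaria risk,
# and its effect on systolic blood pressure, each with a standard error.
beta_gx, se_gx = -0.45, 0.08   # instrument -> malaria exposure (log odds)
beta_gy, se_gy = -1.10, 0.40   # instrument -> systolic BP (mm Hg)

# Wald ratio: implied causal effect of malaria exposure on blood pressure.
beta_mr = beta_gy / beta_gx

# First-order delta-method standard error of the ratio.
se_mr = abs(beta_mr) * math.sqrt((se_gy / beta_gy) ** 2 + (se_gx / beta_gx) ** 2)

ci_low, ci_high = beta_mr - 1.96 * se_mr, beta_mr + 1.96 * se_mr
print(f"MR estimate: {beta_mr:.2f} mm Hg per unit exposure "
      f"(95% CI {ci_low:.2f} to {ci_high:.2f})")
```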
Burn wound classification model using spatial frequency-domain imaging and machine learning.
Accurate assessment of burn severity is critical for wound care and the course of treatment. Delays in classification translate to delays in burn management, increasing the risk of scarring and infection. To this end, numerous imaging techniques have been used to examine tissue properties to infer burn severity. Spatial frequency-domain imaging (SFDI) has also been used to characterize burns based on the relationships between histologic observations and changes in tissue properties. Recently, machine learning has been used to classify burns by combining optical features from multispectral or hyperspectral imaging. Rather than employ models of light propagation to deduce tissue optical properties, we investigated the feasibility of using SFDI reflectance data at multiple spatial frequencies, with a support vector machine (SVM) classifier, to predict severity in a porcine model of graded burns. Calibrated reflectance images were collected using SFDI at eight wavelengths (471 to 851 nm) and five spatial frequencies (0 to 0.2 mm⁻¹). Three models were built from subsets of this initial dataset. The first subset included data taken at all wavelengths with the planar (0 mm⁻¹) spatial frequency, the second comprised data at all wavelengths and spatial frequencies, and the third used all collected data at values relative to unburned tissue. These data subsets were used to train and test cubic SVM models, and compared against burn status 28 days after injury. Model accuracy was established through leave-one-out cross-validation testing. The model based on images obtained at all wavelengths and spatial frequencies predicted burn severity at 24 h with 92.5% accuracy. The model composed of all values relative to unburned skin was 94.4% accurate. By comparison, the model that employed only planar illumination was 88.8% accurate. This investigation suggests that the combination of SFDI with machine learning has potential for accurately predicting burn severity.
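As a rough illustration of the classification approach described above, the sketch below trains a cubic-kernel SVM on multi-wavelength, multi-frequency reflectance features and scores it with leave-one-out cross-validation. The data are randomly generated placeholders standing in for calibrated SFDI reflectance values; the feature layout and label count are assumptions.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)

# Placeholder dataset: one row per burn site, with mean reflectance at
# 8 wavelengths x 5 spatial frequencies (40 features) and a severity label.
n_sites, n_features = 60, 8 * 5
X = rng.normal(size=(n_sites, n_features))
y = rng.integers(0, 4, size=n_sites)   # e.g., four graded burn severities

# A "cubic SVM" is an SVC with a degree-3 polynomial kernel.
model = make_pipeline(StandardScaler(), SVC(kernel="poly", degree=3))

# Leave-one-out cross-validation, as used for the accuracy estimates above.
scores = cross_val_score(model, X, y, cv=LeaveOneOut())
print(f"LOOCV accuracy: {scores.mean():.1%}")
```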
Mismatch-based delayed thrombolysis: a meta-analysis
Background and Purpose: Clinical benefit from thrombolysis is reduced as stroke onset to treatment time increases. The use of "mismatch" imaging to identify patients for delayed treatment has face validity and has been used in case series and clinical trials. We undertook a meta-analysis of relevant trials to examine whether present evidence supports delayed thrombolysis among patients selected according to mismatch criteria.
Methods: We collated outcome data for patients who were enrolled after 3 hours of stroke onset in thrombolysis trials and had mismatch on pretreatment imaging. We selected the trials on the basis of a systematic search of the Web of Knowledge. We compared favorable outcome, reperfusion and/or recanalization, mortality, and symptomatic intracerebral hemorrhage between the thrombolyzed and nonthrombolyzed groups of patients, and compared the probability of a favorable outcome among patients with successful reperfusion and clinical findings at 3 to 6 versus 6 to 9 hours after stroke onset. Results are expressed as adjusted odds ratios (a-ORs) with 95% CIs. Heterogeneity was explored by test statistics for clinical heterogeneity, I² (inconsistency), and the L'Abbé plot.
Results: We identified articles describing the DIAS, DIAS II, DEDAS, DEFUSE, and EPITHET trials, giving a total of 502 mismatch patients thrombolyzed beyond 3 hours. The combined a-ORs for favorable outcomes were greater for patients who had successful reperfusion (a-OR=5.2; 95% CI, 3 to 9; I²=0%). Favorable clinical outcome was not significantly improved by thrombolysis (a-OR=1.3; 95% CI, 0.8 to 2.0; I²=20.9%). Odds for reperfusion/recanalization were increased among patients who received thrombolytic therapy (a-OR=3.0; 95% CI, 1.6 to 5.8; I²=25.7%). The combined data showed a significant increase in mortality after thrombolysis (a-OR=2.4; 95% CI, 1.2 to 4.9; I²=0%), but this was not confirmed when we excluded data from desmoteplase doses that were abandoned in clinical development (a-OR=1.6; 95% CI, 0.7 to 3.7; I²=0%). Symptomatic intracerebral hemorrhage was significantly increased after thrombolysis (a-OR=6.5; 95% CI, 1.2 to 35.4; I²=0%) but not significantly so after exclusion of abandoned doses of desmoteplase (a-OR=5.4; 95% CI, 0.9 to 31.8; I²=0%).
Conclusions: Delayed thrombolysis amongst patients selected according to mismatch imaging is associated with increased reperfusion/recanalization. Recanalization/reperfusion is associated with improved outcomes. However, delayed thrombolysis in mismatch patients was not confirmed to improve clinical outcome, although a useful clinical benefit remains possible. Thrombolysis carries a significant risk of symptomatic intracerebral hemorrhage and possibly increased mortality. Criteria to diagnose mismatch are still evolving. Validation of the mismatch selection paradigm is required with a phase III trial. Pending these results, delayed treatment, even according to mismatch selection, cannot be recommended as part of routine care.
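For readers unfamiliar with the pooling mechanics behind the a-ORs and I² values above, the sketch below shows a generic fixed-effect inverse-variance meta-analysis of odds ratios together with Cochran's Q and the I² inconsistency statistic. The per-trial numbers are illustrative placeholders, not the trial data reported above.

```python
import numpy as np

# Illustrative per-trial odds ratios and 95% CIs (placeholders only).
or_est  = np.array([2.8, 3.5, 2.4])
ci_low  = np.array([1.2, 1.4, 0.9])
ci_high = np.array([6.5, 8.7, 6.4])

# Work on the log-odds scale; recover standard errors from the CI width.
log_or = np.log(or_est)
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)

# Fixed-effect inverse-variance pooling.
w = 1 / se**2
pooled = np.sum(w * log_or) / np.sum(w)
pooled_se = np.sqrt(1 / np.sum(w))

# Cochran's Q and I² (inconsistency).
q = np.sum(w * (log_or - pooled) ** 2)
df = len(or_est) - 1
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

lo, hi = np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se)
print(f"Pooled OR = {np.exp(pooled):.2f} (95% CI {lo:.2f} to {hi:.2f}), I² = {i2:.0f}%")
```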
The journey to delivered value in Australian procurement
In line with developments overseas, Australian clients are turning to considerations of value in project procurement. Until the 1980s the industry operated in a largely traditional manner; however, the extremely adversarial behaviour exhibited towards the end of that decade led to a number of significant events and initiatives, including the publication of "No Dispute", the Gyles Royal Commission into the Building Industry, the Construction Industry Development Agency (CIDA), and the work of the Australian Procurement and Construction Council (APCC). A number of research projects in progress in the CRC for Construction Innovation (CRC CI) are focussing on the assessment of value and on methodologies to support the delivery of value in the procurement and management of engineering and construction projects. This paper charts the emergence of several key drivers in the process and illustrates how they can be integrated into a comprehensive Decision Support System that balances value to stakeholders with project imperatives and incorporates a lessons-learned database that enriches the decision-making process to optimise delivery method design and selection.
Clinical and Epidemiological Implications of 24-Hour Ambulatory Blood Pressure Monitoring for the Diagnosis of Hypertension in Kenyan Adults: A Population-Based Study.
BACKGROUND: The clinical and epidemiological implications of using ambulatory blood pressure monitoring (ABPM) for the diagnosis of hypertension have not been studied at a population level in sub-Saharan Africa. We examined the impact of ABPM use among Kenyan adults. METHODS AND RESULTS: We performed a nested case-control study of diagnostic accuracy. We selected an age-stratified random sample of 1248 adults from the list of residents of the Kilifi Health and Demographic Surveillance System in Kenya. All participants underwent a screening blood pressure (BP) measurement. All those with screening BP ≥140/90 mm Hg and a random subset of those with screening BP <140/90 mm Hg were invited to undergo ABPM. Based on the 2 tests, participants were categorized as sustained hypertensive, masked hypertensive, "white coat" hypertensive, or normotensive. Analyses were weighted by the probability of undergoing ABPM. Screening BP ≥140/90 mm Hg was present in 359 of 986 participants, translating to a crude population prevalence of 23.1% (95% CI 16.5-31.5%). Age-standardized prevalence of screening BP ≥140/90 mm Hg was 26.5% (95% CI 19.3-35.6%). On ABPM, 186 of 415 participants were confirmed to be hypertensive, with crude prevalence of 15.6% (95% CI 9.4-23.1%) and age-standardized prevalence of 17.1% (95% CI 11.0-24.4%). Age-standardized prevalence of masked and white coat hypertension were 7.6% (95% CI 2.8-13.7%) and 3.8% (95% CI 1.7-6.1%), respectively. The sensitivity and specificity of screening BP measurements were 80% (95% CI 73-86%) and 84% (95% CI 79-88%), respectively. BP indices and validity measures showed strong age-related trends. CONCLUSIONS: Screening BP measurement significantly overestimated hypertension prevalence while failing to identify ≈50% of true hypertension diagnosed by ABPM. Our findings suggest significant clinical and epidemiological benefits of ABPM use for diagnosing hypertension in Kenyan adults.
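A minimal sketch of the weighting logic described above: because only a subset of screen-negative participants underwent ABPM, each participant is weighted by the inverse of their probability of receiving ABPM before prevalence, sensitivity, and specificity are computed. The counts and sampling probabilities below are hypothetical, not the study's data.

```python
import numpy as np

# Hypothetical ABPM subsample: screening result, ABPM (reference) result,
# and each participant's probability of being invited for ABPM.
screen_pos = np.array([1, 1, 1, 0, 0, 0, 0, 1, 0, 0])
abpm_pos   = np.array([1, 0, 1, 1, 0, 0, 0, 1, 0, 0])
p_abpm     = np.array([1.0, 1.0, 1.0, 0.25, 0.25, 0.25,
                       0.25, 1.0, 0.25, 0.25])  # screen-positives sampled with certainty

w = 1 / p_abpm  # inverse-probability-of-selection weights

# Weighted population prevalence of ABPM-confirmed hypertension.
prevalence = np.sum(w * abpm_pos) / np.sum(w)

# Weighted sensitivity/specificity of the screening BP measurement
# against ABPM as the reference standard.
sens = np.sum(w * screen_pos * abpm_pos) / np.sum(w * abpm_pos)
spec = np.sum(w * (1 - screen_pos) * (1 - abpm_pos)) / np.sum(w * (1 - abpm_pos))
print(f"prevalence={prevalence:.1%}, sensitivity={sens:.1%}, specificity={spec:.1%}")
```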
Managing Surface Water Inputs to Reduce Phosphorus Losses from Cranberry Farms
Abstract:
In Massachusetts, cranberry (Vaccinium macrocarpon Ait.) production accounts for one-fourth of US cranberry supply, but water quality concerns, water use, and wetland protection laws threaten the sustainability and future viability of the state's cranberry industry. Pond water used for harvest and winter flooding accounts for up to two-thirds of phosphorus (P) losses in drainage waters. Consequently, use of P-sorbing salts to treat pond water holds promise in the mitigation of P losses from cranberry farms. Laboratory evaluation of aluminum (Al)-, iron (Fe)-, and calcium (Ca)-based salts was conducted to determine the application rate required for reducing P in shallow (0.4 m) and deep (3.2 m) water ponds used for cranberry production. Limited P removal (<22%) with calcium carbonate and calcium sulfate was consistent with their relatively low solubility in water. Calcium hydroxide reduced total P up to 66%, but increases in pond water pH (>8) could be detrimental to cranberry production. Ferric sulfate and aluminum sulfate applications of 15 mg L−1 (ppm) resulted in near-complete removal of total P, which decreased from 49 ± 3 to <10 µg P L−1 (ppb). However, ferric sulfate application lowered pH below the recommended range for cranberry soils. Field testing of aluminum sulfate demonstrated that at a dose of 15 mg L−1 (~1.4 mg Al L−1), total P in pond water was reduced by 78 to 93%. Laboratory and field experiments support the recommendation of aluminum sulfate as a cost-effective remedial strategy for reducing elevated P in surface water used for cranberry production.
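To make the dosing arithmetic concrete, the sketch below converts an aluminum sulfate dose to elemental aluminum and to the product mass needed for a pond. It assumes the common ~14-hydrate commercial form (Al2(SO4)3·14H2O), which reproduces the ~1.4 mg Al L−1 figure above; the pond dimensions are assumptions for illustration.

```python
# Convert an aluminum sulfate dose to elemental Al and to product mass
# for a pond. Assumes the ~14-hydrate commercial form; values illustrative.
MW_AL = 26.98
MW_ALUM_14H2O = 2 * 26.98 + 3 * (32.06 + 4 * 16.00) + 14 * 18.02  # ~594 g/mol

dose_mg_per_l = 15.0                        # aluminum sulfate dose, mg/L
al_fraction = (2 * MW_AL) / MW_ALUM_14H2O   # mass fraction of Al (~9%)
al_mg_per_l = dose_mg_per_l * al_fraction   # ~1.4 mg Al/L, matching the study

# Hypothetical shallow pond: 1 ha surface area, 0.4 m deep.
pond_volume_l = 10_000 * 0.4 * 1000         # m^2 * m * (L per m^3)
alum_kg = dose_mg_per_l * pond_volume_l / 1e6
print(f"{al_mg_per_l:.2f} mg Al/L; {alum_kg:.0f} kg alum for the pond")
```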
Focused Deterrence and the Prevention of Violent Gun Injuries: Practice, Theoretical Principles, and Scientific Evidence
Focused deterrence strategies are a relatively new addition to a growing portfolio of evidence-based violent gun injury prevention practices available to policy makers and practitioners. These strategies seek to change offender behavior by understanding the underlying violence-producing dynamics and conditions that sustain recurring violent gun injury problems and by implementing a blended strategy of law enforcement, community mobilization, and social service actions. Consistent with documented public health practice, the focused deterrence approach identifies underlying risk factors and causes of recurring violent gun injury problems, develops tailored responses to these underlying conditions, and measures the impact of implemented interventions. This article reviews the practice, theoretical principles, and evaluation evidence on focused deterrence strategies. Although more rigorous randomized studies are needed, the available empirical evidence suggests that these strategies generate noteworthy gun violence reduction impacts and should be part of a broader portfolio of violence prevention strategies available to policy makers and practitioners.
Hyper- and hypo- nutrition studies of the hepatic transcriptome and epigenome suggest that PPARα regulates anaerobic glycolysis
Diet plays a crucial role in shaping human health and disease. Diets promoting obesity and insulin resistance can lead to severe metabolic diseases, while calorie-restricted (CR) diets can improve health and extend lifespan. In this work, we fed mice either a chow diet (CD), a 16-week high-fat diet (HFD), or a CR diet to compare and contrast the effects of these diets on mouse liver biology. We collected transcriptomic and epigenomic datasets from these mice using RNA-Seq and DNase-Seq. We found that both CR and HFD induce extensive transcriptional changes, in some cases altering the same genes in the same direction. We used our epigenomic data to infer transcriptional regulatory proteins bound near these genes that likely influence their expression levels. In particular, we found evidence for critical roles played by PPARα and RXRα. We used ChIP-Seq to profile the binding locations for these factors in HFD and CR livers. We found extensive binding of PPARα near genes involved in glycolysis/gluconeogenesis and uncovered a role for this factor in regulating anaerobic glycolysis. Overall, we generated extensive transcriptional and epigenomic datasets from livers of mice fed these diets and uncovered new functions and gene targets for PPARα.
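A minimal sketch of one comparison described above: identifying genes that both the HFD and CR contrasts alter in the same direction relative to chow. The gene names and log2 fold changes are invented placeholders, not the study's RNA-Seq results.

```python
# Hypothetical log2 fold changes vs. the chow diet (placeholders only).
hfd_vs_cd = {"Pck1": 1.8, "Pfkl": -0.9, "Pdk4": 2.1, "Scd1": 1.2, "Ldha": -0.7}
cr_vs_cd  = {"Pck1": 1.1, "Pfkl": -0.4, "Pdk4": 2.7, "Scd1": -1.5, "Acot1": 0.8}

# Genes significantly changed in both contrasts.
shared = set(hfd_vs_cd) & set(cr_vs_cd)

# Genes changed in the same direction by both diets (signs of log2FC agree).
same_direction = sorted(g for g in shared if hfd_vs_cd[g] * cr_vs_cd[g] > 0)

print("shared genes:", sorted(shared))
print("same direction under HFD and CR:", same_direction)
```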