229 research outputs found
Ultrasensitive, rapid and inexpensive detection of DNA using paper based lateral flow assay
Sensitive, specific, rapid, inexpensive and easy-to-use nucleic acid tests for use at the point-of-need are critical for the emerging field of personalised medicine for which companion diagnostics are essential, as well as for application in low resource settings. Here we report on the development of a point-of-care nucleic acid lateral flow test for the direct detection of isothermally amplified DNA. The recombinase polymerase amplification method is modified slightly to use tailed primers, resulting in an amplicon with a duplex flanked by two single stranded DNA tails. This tailed amplicon facilitates detection via hybridisation to a surface immobilised oligonucleotide capture probe and a gold nanoparticle labelled reporter probe. A detection limit of 1.7 × 10^−11 M (190 amol), equivalent to 8.67 × 10^5 copies of DNA, was achieved, with the entire assay, both amplification and detection, being completed in less than 15 minutes at a constant temperature of 37 °C. The use of the tailed primers obviates the need for hapten labelling and the consequent use of capture and reporter antibodies, whilst also avoiding the need for any post-amplification processing for the generation of single stranded DNA, thus presenting an assay that can readily find application at the point of need.
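As a side note on the unit arithmetic behind a detection limit like the one reported above, the short sketch below (Python) shows how a molar concentration translates into an absolute amount and a molecule count; the 11 µL sample volume is our own illustrative assumption, not a figure from the study, so the printed numbers should not be read as the study's values.

# Illustrative unit-conversion sketch (not from the study): converting a molar
# detection limit into an absolute amount (amol) and a molecule count.
AVOGADRO = 6.022e23  # molecules per mole

def amount_mol(concentration_molar: float, volume_litres: float) -> float:
    """Amount of analyte (mol) contained in a sample of the given volume."""
    return concentration_molar * volume_litres

def copies(amount_in_mol: float) -> float:
    """Number of DNA molecules corresponding to an amount in moles."""
    return amount_in_mol * AVOGADRO

detection_limit = 1.7e-11   # mol/L, as reported above
sample_volume = 11e-6       # L; an assumed ~11 uL sample, purely illustrative
n = amount_mol(detection_limit, sample_volume)
print(f"{n / 1e-18:.0f} amol")        # ~190 amol for these example inputs
print(f"{copies(n):.2e} molecules")   # molecule count for the assumed volume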
Surgical management of Diabetic foot ulcers: A Tanzanian university teaching hospital experience
Diabetic foot ulcers (DFUs) pose a therapeutic challenge to surgeons, especially in developing countries where health care resources are limited and the vast majority of patients present to health facilities late with advanced foot ulcers. A prospective descriptive study was done at Bugando Medical Centre (BMC) from February 2008 to January 2010 to describe our experience in the surgical management of DFUs in our local environment and compare it with what is known in the literature. Of the 4238 diabetic patients seen at BMC during the period under study, 136 (3.2%) had DFUs. Males outnumbered females by a ratio of 1.2:1. The mean age was 54.32 years (range 21-72 years). Thirty-eight (27.9%) patients were newly diagnosed diabetic patients. The majority of patients (95.5%) had type 2 diabetes mellitus. The mean duration of diabetes was 8.2 years, while the mean duration of DFUs was 18.34 weeks. Fourteen (10.3%) patients had a previous history of foot ulcers and six (4.4%) had previous amputations. The forefoot was the most commonly affected site (60.3% of cases). Neuropathic ulcers were the most common type of DFU (57.4% of cases). Wagner stage 4 and 5 ulcers were the most prevalent, at 29.4% and 23.5% respectively. The majority of patients (72.1%) were treated surgically. Lower limb amputation was the most common surgical procedure, performed in 56.7% of cases. The complication rate was 33.5%, and surgical site infection was the most common complication (18.8%). The bacterial profile was polymicrobial, and Staphylococcus aureus was the most frequent microorganism isolated. All the microorganisms isolated showed high resistance to commonly used antibiotics, except meropenem and imipenem, to which all isolates were sensitive. The mean hospital stay was 36.24 ± 12.62 days (range 18-128 days). The mortality rate was 13.2%. Diabetic foot ulceration constitutes a major source of morbidity and mortality among patients with diabetes mellitus at Bugando Medical Centre and is the leading cause of non-traumatic lower limb amputation. A multidisciplinary team approach targeting good glycaemic control, education on foot care and appropriate footwear, control of infection, and early surgical intervention is required in order to reduce the morbidity and mortality associated with DFUs. Given the polymicrobial pattern of infection and the high rates of antibiotic resistance observed, early surgical intervention should be strongly considered.
Lighting during grow-out and Salmonella in broiler flocks
Background Lighting is used during conventional broiler grow-out to modify bird behaviour to reach the goals of production and improve bird welfare. The protocols for lighting intensity vary. In a field study, we evaluated if the lighting practices impact the burden of Salmonella in broiler flocks. Methods Conventional grow-out flocks reared in the states of Alabama, Mississippi and Texas, USA in 2003 to 2006 were sampled 1 week before harvest (n = 58) and upon arrival for processing (n = 56) by collecting feathered carcass rinsate, crop and one cecum from each of 30 birds, and during processing by collecting rinsate of 30 carcasses at pre-chilling (n = 56) and post-chilling points (n = 54). Litter samples and drag swabs of litter were collected from the grow-out houses after bird harvest (n = 56). Lighting practices for these flocks were obtained with a questionnaire completed by the growers. Associations between the lighting practices and the burden of Salmonella in the flocks were tested while accounting for variation between the grow-out farms, their production complexes and companies. Results Longer relative duration of reduced lights during the grow-out period was associated with reduced detection of Salmonella on the exterior of birds 1 week before harvest and on the broiler carcasses at the post-chilling point of processing. In addition, starting reduced lights for ≥18 hours per day later in the grow-out period was associated with decreased detection of Salmonella on the exterior of broilers arriving for processing and in the post-harvest drag swabs of litter from the grow-out house. Conclusions The results of this field study show that lighting practices implemented during broiler rearing can impact the burden of Salmonella in the flock. The underlying mechanisms are likely to be interactive.
Global, regional, and national comparative risk assessment of 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks, 1990-2015: a systematic analysis for the Global Burden of Disease Study 2015
Background The Global Burden of Diseases, Injuries, and Risk Factors Study 2015 provides an up-to-date synthesis of the evidence for risk factor exposure and the attributable burden of disease. By providing national and subnational assessments spanning the past 25 years, this study can inform debates on the importance of addressing risks in context. Methods We used the comparative risk assessment framework developed for previous iterations of the Global Burden of Disease Study to estimate attributable deaths, disability-adjusted life-years (DALYs), and trends in exposure by age group, sex, year, and geography for 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks from 1990 to 2015. This study included 388 risk-outcome pairs that met World Cancer Research Fund-defined criteria for convincing or probable evidence. We extracted relative risk and exposure estimates from randomised controlled trials, cohorts, pooled cohorts, household surveys, census data, satellite data, and other sources. We used statistical models to pool data, adjust for bias, and incorporate covariates. We developed a metric that allows comparisons of exposure across risk factors—the summary exposure value. Using the counterfactual scenario of theoretical minimum risk level, we estimated the portion of deaths and DALYs that could be attributed to a given risk. We decomposed trends in attributable burden into contributions from population growth, population age structure, risk exposure, and risk-deleted cause-specific DALY rates. We characterised risk exposure in relation to a Socio-demographic Index (SDI). Findings Between 1990 and 2015, global exposure to unsafe sanitation, household air pollution, childhood underweight, childhood stunting, and smoking each decreased by more than 25%. Global exposure for several occupational risks, high body-mass index (BMI), and drug use increased by more than 25% over the same period. All risks jointly evaluated in 2015 accounted for 57·8% (95% CI 56·6–58·8) of global deaths and 41·2% (39·8–42·8) of DALYs. In 2015, the ten largest contributors to global DALYs among Level 3 risks were high systolic blood pressure (211·8 million [192·7 million to 231·1 million] global DALYs), smoking (148·6 million [134·2 million to 163·1 million]), high fasting plasma glucose (143·1 million [125·1 million to 163·5 million]), high BMI (120·1 million [83·8 million to 158·4 million]), childhood undernutrition (113·3 million [103·9 million to 123·4 million]), ambient particulate matter (103·1 million [90·8 million to 115·1 million]), high total cholesterol (88·7 million [74·6 million to 105·7 million]), household air pollution (85·6 million [66·7 million to 106·1 million]), alcohol use (85·0 million [77·2 million to 93·0 million]), and diets high in sodium (83·0 million [49·3 million to 127·5 million]). From 1990 to 2015, attributable DALYs declined for micronutrient deficiencies, childhood undernutrition, unsafe sanitation and water, and household air pollution; reductions in risk-deleted DALY rates rather than reductions in exposure drove these declines. Rising exposure contributed to notable increases in attributable DALYs from high BMI, high fasting plasma glucose, occupational carcinogens, and drug use. Environmental risks and childhood undernutrition declined steadily with SDI; low physical activity, high BMI, and high fasting plasma glucose increased with SDI.
In 119 countries, metabolic risks, such as high BMI and fasting plasma glucose, contributed the most attributable DALYs in 2015. Regionally, smoking still ranked among the leading five risk factors for attributable DALYs in 109 countries; childhood underweight and unsafe sex remained primary drivers of early death and disability in much of sub-Saharan Africa. Interpretation Declines in some key environmental risks have contributed to declines in critical infectious diseases. Some risks appear to be invariant to SDI. Increasing risks, including high BMI, high fasting plasma glucose, drug use, and some occupational exposures, contribute to rising burden from some conditions, but also provide opportunities for intervention. Some highly preventable risks, such as smoking, remain major causes of attributable DALYs, even as exposure is declining. Public policy makers need to pay attention to the risks that are increasingly major contributors to global burden. Funding Bill & Melinda Gates Foundation
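For readers unfamiliar with the comparative risk assessment framework used above, the core counterfactual calculation can be sketched as follows (the notation is ours, not taken from the study): with RR(x) the relative risk at exposure level x, P(x) the observed exposure distribution, and P*(x) the exposure distribution at the theoretical minimum risk level,

\mathrm{PAF} = \frac{\int RR(x)\,P(x)\,dx \;-\; \int RR(x)\,P^{*}(x)\,dx}{\int RR(x)\,P(x)\,dx},
\qquad
\text{attributable DALYs} = \mathrm{PAF} \times \text{total DALYs}.

The attributable deaths and DALYs quoted above correspond to this fraction applied to the total burden of each risk-outcome pair, with the decomposition into population growth, age structure, exposure, and risk-deleted rates acting on the same quantity.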
Complicated intra-abdominal infections worldwide: the definitive data of the CIAOW Study.
The CIAOW study (Complicated intra-abdominal infections worldwide observational study) is a multicentre observational study conducted in 68 medical institutions worldwide during a six-month study period (October 2012-March 2013). The study included patients older than 18 years undergoing surgery or interventional drainage to address complicated intra-abdominal infections (IAIs). 1,898 patients with a mean age of 51.6 years (range 18-99) were enrolled in the study. 777 patients (41%) were women and 1,121 (59%) were men. Among these patients, 1,645 (86.7%) were affected by community-acquired IAIs while the remaining 253 (13.3%) suffered from healthcare-associated infections. Intraperitoneal specimens were collected from 1,190 (62.7%) of the enrolled patients. 827 patients (43.6%) were affected by generalized peritonitis while 1,071 (56.4%) suffered from localized peritonitis or abscesses. The overall mortality rate was 10.5% (199/1,898). According to stepwise multivariate analysis (PR = 0.005 and PE = 0.001), several criteria were found to be independent variables predictive of mortality, including patient age (OR = 1.1; 95% CI = 1.0-1.1; p < 0.0001), the presence of small bowel perforation (OR = 2.8; 95% CI = 1.5-5.3; p < 0.0001), a delayed initial intervention (a delay exceeding 24 hours) (OR = 1.8; 95% CI = 1.5-3.7; p < 0.0001), ICU admission (OR = 5.9; 95% CI = 3.6-9.5; p < 0.0001) and patient immunosuppression (OR = 3.8; 95% CI = 2.1-6.7; p < 0.0001). © 2014 Sartelli et al.; licensee BioMed Central Ltd
A mature macrophage is a principal HIV-1 cellular reservoir in humanized mice after treatment with long acting antiretroviral therapy
BACKGROUND: Despite improved clinical outcomes seen following antiretroviral therapy (ART), resting CD4+ T cells continue to harbor latent human immunodeficiency virus type one (HIV-1). However, such cells are not likely the solitary viral reservoir, and as such defining where and how others harbor virus is imperative for eradication measures. To such ends, we used HIV-1(ADA)-infected NOD.Cg-Prkdc(scid) Il2rg(tm1Wjl)/SzJ mice reconstituted with a human immune system to explore two long-acting ART regimens, investigating their abilities to affect viral cell infection and latency. At 6 weeks of infection animals were divided into four groups. One received long-acting (LA) cabotegravir (CAB) and rilpivirine (RVP) (2ART), a second received LA CAB, lamivudine, abacavir and RVP (4ART), a third was left untreated and a fourth served as an uninfected control. After 4 weeks of LA ART treatment, blood, spleen and bone marrow (BM) cells were collected and then phenotypically characterized. CD4+ T cell subsets, macrophages and hematopoietic progenitor cells were analyzed for HIV-1 nucleic acids by droplet digital PCR. RESULTS: Plasma viral loads were reduced by two log10 or to undetectable levels in the 2ART and 4ART regimens, respectively. Numbers and distributions of CD4+ memory and regulatory T cells, macrophages and hematopoietic progenitor cells were significantly altered by HIV-1 infection and by both ART regimens. ART reduced viral DNA and RNA in all cell and tissue compartments. While memory cells were the dominant T cell reservoir, integrated HIV-1 DNA was also detected in BM and spleen macrophages in both regimen-treated mice. CONCLUSION: Despite vigorous ART regimens, HIV-1 DNA and RNA were easily detected in mature macrophages, supporting their potential role as an infectious viral reservoir. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1186/s12977-017-0344-7) contains supplementary material, which is available to authorized users.
Bioactive Electrospun Scaffolds Delivering Growth Factors and Genes for Tissue Engineering Applications
A biomaterial scaffold is one of the key factors for successful tissue engineering. In recent years, an increasing tendency has been observed toward the combination of scaffolds and biomolecules, e.g. growth factors and therapeutic genes, to achieve bioactive scaffolds, which not only provide physical support but also express biological signals to modulate tissue regeneration. Considerable effort has been devoted to exploring strategies for preparing bioactive scaffolds. Within the past five years, electrospun scaffolds have gained exponentially increasing popularity in this area because of their ultrathin fiber diameter and large surface-to-volume ratio, which are favorable for biomolecule delivery. This paper reviews current techniques that can be used to prepare bioactive electrospun scaffolds, including physical adsorption, blend electrospinning, coaxial electrospinning, and covalent immobilization. In addition, this paper also analyzes the existing challenges (i.e., protein instability, low gene transfection efficiency, and difficulties in accurate kinetics prediction) in achieving biomolecule release from electrospun scaffolds, which necessitate further research to fully exploit the biomedical applications of these bioactive scaffolds.
Evolution and patterns of global health financing 1995-2014: development assistance for health, and government, prepaid private, and out-of-pocket health spending in 184 countries
Background An adequate amount of prepaid resources for health is important to ensure access to health services and for the pursuit of universal health coverage. Previous studies on global health financing have described the relationship between economic development and health financing. In this study, we further explore global health financing trends and examine how the sources of funds used, types of services purchased, and development assistance for health disbursed change with economic development. We also identify countries that deviate from the trends. Methods We estimated national health spending by type of care and by source, including development assistance for health, based on a diverse set of data including programme reports, budget data, national estimates, and 964 National Health Accounts. These data represent health spending for 184 countries from 1995 through 2014. We converted these data into a common inflation-adjusted and purchasing power-adjusted currency, and used non-linear regression methods to model the relationship between health financing, time, and economic development. Findings Between 1995 and 2014, economic development was positively associated with total health spending and a shift away from a reliance on development assistance and out-of-pocket (OOP) spending towards government spending. The largest absolute increase in spending was in high-income countries, which increased to purchasing power-adjusted 914 and 51 to 37.6 billion in 2016. Nonetheless, there is a great deal of variation revolving around these averages. 29 countries spend at least 50% more than expected per capita, based on their level of economic development alone, whereas 11 countries spend less than 50% of their expected amount. Interpretation Health spending remains disparate, with low-income and lower-middle-income countries increasing spending in absolute terms the least, and relying heavily on OOP spending and development assistance. Moreover, tremendous variation shows that neither time nor economic development guarantees adequate prepaid health resources, which are vital for the pursuit of universal health coverage.
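As a rough illustration of the currency standardisation step described in the Methods above, the sketch below (Python) shows one common way of expressing nominal local-currency spending in an inflation- and purchasing-power-adjusted currency; the function name, deflator ratio, and PPP factor are our own illustrative stand-ins, not values or code from the study.

# Hypothetical sketch: expressing nominal local-currency health spending in a
# common inflation- and purchasing-power-adjusted currency.
def to_constant_ppp(nominal_lcu: float, deflator_ratio: float, ppp_factor: float) -> float:
    """Convert nominal local-currency units (LCU) into base-year PPP-adjusted dollars.

    deflator_ratio: base-year price level divided by the observation-year price level.
    ppp_factor: LCU per international (PPP) dollar in the base year.
    """
    real_lcu = nominal_lcu * deflator_ratio   # strip out domestic inflation
    return real_lcu / ppp_factor              # express in PPP dollars

# Example with made-up numbers: 5,000 LCU spent in some year, a deflator ratio of
# 1.4 to the base year, and 2.5 LCU per PPP dollar -> 2,800 PPP dollars.
print(to_constant_ppp(5_000, 1.4, 2.5))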