
    Comparison between Doppler Ultrasound and Biopsy Findings in Patients with Suspected Kidney Transplant Rejection

    Introduction: The role of Doppler ultrasound in diagnosing kidney allograft rejection is controversial. Our goal in this study was to investigate the utility of the Doppler-measured resistive index (RI) as a screening tool for kidney transplant rejection. Methods: We retrospectively studied a random sample of 188 kidney transplant recipients who had a Doppler ultrasound examination followed within two weeks by transplant biopsy. We evaluated the specificity and sensitivity of Doppler ultrasound in diagnosing rejection at different RI thresholds, using the reported biopsy findings as the gold standard. Results: The RI values of the study population had a mean of 0.7 ± 0.11 (mean ± SD) and a range of 0.4-1.0. There was no significant difference in mean RI between patients with biopsy-proven rejection and patients without rejection (0.68 ± 0.09 versus 0.71 ± 0.12, P = 0.16). The sensitivity and specificity of the Doppler-measured RI in diagnosing rejection were highly variable depending on the chosen cut-off value, ranging from 4.1% to 98.6% and from 2.6% to 92.2%, respectively. Acceptable specificity was achieved only at the expense of very low sensitivity. Acute tubular necrosis (ATN) and interstitial edema (IE) were associated with higher RI values than other pathological entities, while very low RI values had high specificity and low sensitivity for transplanted renal artery stenosis (TRAS). Conclusion: The Doppler-measured RI lacks accuracy in diagnosing transplanted kidney rejection, making it a poor screening tool for rejection. Keywords: Doppler Ultrasound; Rejection; Resistive Index (RI); Transplant kidney
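
    As an illustration of the threshold analysis described above, the sketch below (not the study's code) computes sensitivity and specificity of candidate RI cut-offs against biopsy-proven rejection as the gold standard; the data are simulated and the cut-offs are arbitrary examples.

    import numpy as np

    def sens_spec_at_cutoffs(ri_values, rejection, cutoffs):
        """Return (cutoff, sensitivity, specificity) tuples.

        ri_values : array of Doppler-measured RI per patient
        rejection : boolean array, True if biopsy showed rejection
        """
        ri = np.asarray(ri_values, dtype=float)
        rej = np.asarray(rejection, dtype=bool)
        results = []
        for c in cutoffs:
            predicted_pos = ri >= c          # call "rejection" when RI >= cut-off
            tp = np.sum(predicted_pos & rej)
            fn = np.sum(~predicted_pos & rej)
            tn = np.sum(~predicted_pos & ~rej)
            fp = np.sum(predicted_pos & ~rej)
            sens = tp / (tp + fn) if (tp + fn) else float("nan")
            spec = tn / (tn + fp) if (tn + fp) else float("nan")
            results.append((c, sens, spec))
        return results

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Simulated cohort with overlapping RI distributions, as the study reports
        ri = np.clip(rng.normal(0.70, 0.11, 188), 0.4, 1.0)
        rejection = rng.random(188) < 0.4
        for cutoff, sens, spec in sens_spec_at_cutoffs(ri, rejection, [0.6, 0.7, 0.8, 0.9]):
            print(f"RI >= {cutoff:.2f}: sensitivity {sens:.2f}, specificity {spec:.2f}")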

    Hepatitis B Virus, Hepatitis C Virus and Human Immunodeficiency Virus Infections among Pregnant Women in Central Sudan

    Background: The epidemiology of viral hepatitis and human immunodeficiency virus (HIV) infection during pregnancy is of great importance for health planners and program managers. However, few published data on viral hepatitis and HIV in Sudan are available, especially during pregnancy. Objectives: The current study was conducted to investigate the seropositivity of hepatitis B, hepatitis C, and HIV among pregnant women in central Sudan. Materials and methods: A cross-sectional study was conducted in which 396 pregnant women were investigated for the presence of hepatitis B, hepatitis C, and HIV. Enzyme-linked immunosorbent assay (ELISA) was used to detect HBsAg and anti-HCV antibodies. Antibodies to HIV were detected by three different methods, as per Strategy III of the National AIDS Control Organization, utilizing different testing systems to make a diagnosis of HIV. Results: Twenty (5.1%), five (1.3%), and six (1.5%) women were seropositive for HBsAg, anti-HCV antibodies, and HIV, respectively. One (0.25%) woman was seropositive for both HBsAg and anti-HCV antibodies. While age and parity were not associated with HBsAg seropositivity, home delivery was the only significant risk factor for HBsAg seropositivity (OR = 4.5, 95% CI = 1.2-16.7). Conclusion: The prevalence of HBV and HCV among pregnant women in this setting is in the intermediate zone of endemicity. This is alarming and should draw medical authorities' attention if vertical transmission is to be reduced. Key words: Sudan, hepatitis B, hepatitis C, HIV, seropositivity, pregnancy
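
    For readers unfamiliar with how a risk-factor estimate such as the reported OR = 4.5 (95% CI 1.2-16.7) is obtained, the sketch below computes an odds ratio and a Woolf 95% confidence interval from a 2x2 table; it is a generic illustration with made-up counts, not the study's data or code.

    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """2x2 table: a=exposed cases, b=exposed non-cases,
        c=unexposed cases, d=unexposed non-cases."""
        or_ = (a * d) / (b * c)
        se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # Woolf's method
        lo = math.exp(math.log(or_) - z * se_log_or)
        hi = math.exp(math.log(or_) + z * se_log_or)
        return or_, lo, hi

    if __name__ == "__main__":
        # hypothetical counts for illustration only
        or_, lo, hi = odds_ratio_ci(a=8, b=60, c=12, d=316)
        print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")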

    Monitoring the millennium development goals: the potential role of the INDEPTH Network

    The Millennium Declaration, adopted by the United Nations (UN) in 2000, set out a series of Millennium Development Goals (MDGs) as priorities for UN member countries, committing governments to realising eight major MDGs and 18 associated targets by 2015. Progress towards these goals is being assessed by tracking a series of 48 technical indicators that have since been unanimously adopted by experts. This concept paper outlines the role that member Health and Demographic Surveillance Systems (HDSSs) of the INDEPTH Network could play in monitoring progress towards achieving the MDGs. The unique value of the data generated by HDSSs lies in the opportunity they provide to measure or evaluate interventions longitudinally, through long-term follow-up of defined populations.
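
    As a concrete example of the kind of longitudinal indicator an HDSS could report for MDG monitoring, the sketch below computes a crude under-five death rate per 1,000 child-years of follow-up; the record layout is hypothetical and the figures are illustrative only.

    from datetime import date

    def child_years_and_deaths(records, period_start, period_end):
        """records: dicts with 'entry' and 'exit' dates and a 'died' flag,
        already restricted to children under five years of age."""
        years = 0.0
        deaths = 0
        for r in records:
            start = max(r["entry"], period_start)
            end = min(r["exit"], period_end)
            if end <= start:
                continue
            years += (end - start).days / 365.25      # child-years observed in period
            if r["died"] and period_start <= r["exit"] <= period_end:
                deaths += 1
        return years, deaths

    if __name__ == "__main__":
        demo = [
            {"entry": date(2013, 1, 1), "exit": date(2014, 12, 31), "died": False},
            {"entry": date(2013, 5, 1), "exit": date(2013, 11, 12), "died": True},
            {"entry": date(2014, 2, 1), "exit": date(2014, 12, 31), "died": False},
        ]
        y, d = child_years_and_deaths(demo, date(2013, 1, 1), date(2014, 12, 31))
        print(f"{d} deaths over {y:.2f} child-years -> {1000 * d / y:.1f} per 1,000 child-years")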

    CMR for Assessment of Diastolic Function

    Heart failure with preserved left ventricular ejection fraction accounts for 50% of all cases of heart failure. Its diagnosis requires evidence of left ventricular diastolic dysfunction. Currently, echocardiography is the method of choice for diastolic function testing in clinical practice; various applications are in use, and recommended criteria are followed for classifying the severity of dysfunction. Cardiovascular magnetic resonance (CMR) offers a variety of alternative applications for the evaluation of diastolic function, some superior to echocardiography in accuracy and reproducibility and some complementary. In this article, the role of the available CMR applications for diastolic function testing in clinical practice and research is reviewed and compared with echocardiography.

    Insights into SCP/TAPS Proteins of Liver Flukes Based on Large-Scale Bioinformatic Analyses of Sequence Datasets

    Background: SCP/TAPS proteins of parasitic helminths have been proposed to play key roles in fundamental biological processes linked to the invasion of and establishment in their mammalian host animals, such as the transition from free-living to parasitic stages and the modulation of host immune responses. Despite the evidence that SCP/TAPS proteins of parasitic nematodes are involved in host-parasite interactions, there is a paucity of information on this protein family for parasitic trematodes of socio-economic importance. Methodology/Principal Findings: We conducted the first large-scale study of SCP/TAPS proteins of a range of parasitic trematodes of both human and veterinary importance (including the liver flukes Clonorchis sinensis, Opisthorchis viverrini, Fasciola hepatica and F. gigantica as well as the blood flukes Schistosoma mansoni, S. japonicum and S. haematobium). We mined all current transcriptomic and/or genomic sequence datasets from public databases, predicted secondary structures of full-length protein sequences, undertook systematic phylogenetic analyses and investigated the differential transcription of SCP/TAPS genes in O. viverrini and F. hepatica, with an emphasis on those that are up-regulated in the developmental stages infecting the mammalian host. Conclusions: This work, which sheds new light on SCP/TAPS proteins, guides future structural and functional explorations of key SCP/TAPS molecules associated with diseases caused by flatworms. Future fundamental investigations of these molecules in parasites and the integration of structural and functional data could lead to new approaches for the control of parasitic diseases.
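
    To make the data-mining step concrete, the sketch below shows a crude keyword-based first pass over a predicted-proteome FASTA file for candidate SCP/TAPS sequences; it is not the authors' pipeline, which would rely on proper domain searches (e.g., HMMER against a Pfam SCP/CAP model), and the file name and keywords are assumptions for illustration.

    from Bio import SeqIO  # pip install biopython

    KEYWORDS = ("scp", "taps", "cap superfamily", "venom allergen", "ancylostoma secreted")

    def candidate_scp_taps(fasta_path):
        """Yield records whose description matches any SCP/TAPS-related keyword."""
        for record in SeqIO.parse(fasta_path, "fasta"):
            desc = record.description.lower()
            if any(k in desc for k in KEYWORDS):
                yield record

    if __name__ == "__main__":
        # hypothetical input file name
        hits = list(candidate_scp_taps("opisthorchis_viverrini_proteins.fasta"))
        print(f"{len(hits)} candidate SCP/TAPS sequences")
        for rec in hits[:5]:
            print(rec.id, len(rec.seq), "aa")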

    Identification and genomic location of a reniform nematode (Rotylenchulus reniformis) resistance locus (Renari) introgressed from Gossypium aridum into upland cotton (G. hirsutum)

    In this association mapping study, a tri-species hybrid, [Gossypium arboreum × (G. hirsutum × G. aridum)2], was crossed with MD51ne (G. hirsutum), and progeny from the cross were used to identify and map SSR markers associated with reniform nematode (Rotylenchulus reniformis) resistance. Seventy-six progeny plants (the 50 most resistant and the 26 most susceptible) were genotyped with 104 markers. Twenty-five markers were associated with a resistance locus that we designated Renari, and two markers, BNL3279_132 and BNL2662_090, mapped within 1 cM of Renari. Because the SSR fragments associated with resistance were found in G. aridum and the bridging line G 371, G. aridum is the likely source of this resistance. The resistance is simply inherited, possibly controlled by a single dominant gene. The markers identified in this project are a valuable resource for breeders and geneticists in the quest to produce cotton cultivars with a high level of resistance to reniform nematode.
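
    As an illustration of how a marker-trait association can be tested in the resistant and susceptible tails of such a population, the sketch below applies Fisher's exact test to a 2x2 table of marker band presence versus phenotype class; the counts are hypothetical placeholders, not the study's data.

    from scipy.stats import fisher_exact

    def marker_association(band_resistant, no_band_resistant,
                           band_susceptible, no_band_susceptible):
        """Test whether presence of an SSR band differs between phenotype classes."""
        table = [[band_resistant, no_band_resistant],
                 [band_susceptible, no_band_susceptible]]
        odds, p = fisher_exact(table)
        return odds, p

    if __name__ == "__main__":
        # e.g., band present in 48/50 resistant and 2/26 susceptible plants (made up)
        odds, p = marker_association(48, 2, 2, 24)
        print(f"odds ratio {odds:.1f}, p = {p:.2e}")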

    Evolution and patterns of global health financing 1995-2014: development assistance for health, and government, prepaid private, and out-of-pocket health spending in 184 countries

    Background: An adequate amount of prepaid resources for health is important to ensure access to health services and for the pursuit of universal health coverage. Previous studies on global health financing have described the relationship between economic development and health financing. In this study, we further explore global health financing trends and examine how the sources of funds used, the types of services purchased, and the development assistance for health disbursed change with economic development. We also identify countries that deviate from the trends. Methods: We estimated national health spending by type of care and by source, including development assistance for health, based on a diverse set of data including programme reports, budget data, national estimates, and 964 National Health Accounts. These data represent health spending for 184 countries from 1995 through 2014. We converted these data into a common inflation-adjusted and purchasing power-adjusted currency, and used non-linear regression methods to model the relationship between health financing, time, and economic development. Findings: Between 1995 and 2014, economic development was positively associated with total health spending and a shift away from reliance on development assistance and out-of-pocket (OOP) spending towards government spending. The largest absolute increase in spending was in high-income countries, which increased to a purchasing power-adjusted $5221 per capita, based on an annual growth rate of 3.0%. The largest health spending growth rates were in the upper-middle-income (5.9%) and lower-middle-income (5.0%) groups, which both increased spending at more than 5% per year and spent $914 and $267 per capita in 2014, respectively. Spending in low-income countries grew nearly as fast, at 4.6%, and health spending increased from $51 to $120 per capita. In 2014, 59.2% of all health spending was financed by the government, although in low-income and lower-middle-income countries, 29.1% and 58.0% of spending was OOP spending and 35.7% and 3.0% of spending was development assistance. Recent growth in development assistance for health has been tepid; between 2010 and 2016, it grew annually at 1.8% and reached US$37.6 billion in 2016. Nonetheless, there is a great deal of variation around these averages: 29 countries spend at least 50% more than expected per capita, based on their level of economic development alone, whereas 11 countries spend less than 50% of their expected amount. Interpretation: Health spending remains disparate, with low-income and lower-middle-income countries increasing spending the least in absolute terms and relying heavily on OOP spending and development assistance. Moreover, the tremendous variation shows that neither time nor economic development guarantees adequate prepaid health resources, which are vital for the pursuit of universal health coverage.
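
    To illustrate the kind of relationship modelled here, the sketch below fits health spending per capita against economic development on the log-log scale and flags countries spending far more or less than expected for their income level; it uses simulated data and ordinary least squares rather than the study's non-linear regression methods.

    import numpy as np

    rng = np.random.default_rng(1)
    gdp_pc = np.exp(rng.uniform(np.log(500), np.log(60000), 184))   # simulated GDP per capita (PPP)
    true_elasticity = 1.1
    health_pc = 0.02 * gdp_pc**true_elasticity * np.exp(rng.normal(0, 0.35, 184))

    # Fit log(health spending) = a + b*log(GDP); b is the income elasticity.
    b, a = np.polyfit(np.log(gdp_pc), np.log(health_pc), 1)
    print(f"estimated elasticity: {b:.2f}")

    # Countries spending >= 50% more than (or < 50% of) the level expected for their GDP
    expected = np.exp(a + b * np.log(gdp_pc))
    ratio = health_pc / expected
    print("spend >= 1.5x expected:", int(np.sum(ratio >= 1.5)))
    print("spend < 0.5x expected:", int(np.sum(ratio < 0.5)))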

    Effectiveness of Mechanisms and Models of Coordination between Organizations, Agencies and Bodies Providing or Financing Health Services in Humanitarian Crises: A Systematic Review.

    BACKGROUND: Effective coordination between organizations, agencies and bodies providing or financing health services in humanitarian crises is required to ensure efficiency of services, avoid duplication, and improve equity. The objective of this review was to assess how, during and after humanitarian crises, different mechanisms and models of coordination between organizations, agencies and bodies providing or financing health services compare in terms of access to health services and health outcomes. METHODS: We registered a protocol for this review in the PROSPERO International Prospective Register of Systematic Reviews under number PROSPERO2014:CRD42014009267. Eligible studies included randomized and non-randomized designs, process evaluations and qualitative methods. We electronically searched Medline, PubMed, EMBASE, the Cochrane Central Register of Controlled Trials, CINAHL, PsycINFO, and the WHO Global Health Library, as well as the websites of relevant organizations. We followed standard systematic review methodology for selection, data abstraction, and risk of bias assessment. We assessed the quality of evidence using the GRADE approach. RESULTS: Of 14,309 citations identified from databases and organizations' websites, we identified four eligible studies. Two studies used mixed methods, one used quantitative methods, and one used qualitative methods. The available evidence suggests that information coordination between bodies providing health services in humanitarian crisis settings may be effective in improving health system inputs. There is additional evidence suggesting that management/directive coordination, such as the cluster model, may improve health system inputs in addition to access to health services. None of the included studies assessed coordination through common representation or framework coordination. The evidence was judged to be of very low quality. CONCLUSION: This systematic review provides evidence of the possible effectiveness of information coordination and management/directive coordination between organizations, agencies and bodies providing or financing health services in humanitarian crises. Our findings can inform the research agenda and highlight the need to improve the conduct and reporting of research in this field.
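
    As a small illustration of one routine step in handling 14,309 retrieved citations, the sketch below deduplicates records by normalized DOI or title before screening; the record format is hypothetical and this is not the review's actual workflow.

    import re

    def normalize_title(title):
        return re.sub(r"[^a-z0-9]+", " ", title.lower()).strip()

    def deduplicate(records):
        """records: iterable of dicts with optional 'doi' and 'title' keys."""
        seen, unique = set(), []
        for r in records:
            key = (r.get("doi") or "").lower() or normalize_title(r.get("title", ""))
            if key and key in seen:
                continue
            seen.add(key)
            unique.append(r)
        return unique

    if __name__ == "__main__":
        demo = [
            {"doi": "10.1000/abc", "title": "Coordination in crises"},
            {"doi": "10.1000/ABC", "title": "Coordination in Crises"},   # duplicate DOI
            {"doi": "", "title": "Cluster model evaluation"},
        ]
        print(len(deduplicate(demo)), "unique records")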

    Cardiac resynchronization therapy guided by cardiovascular magnetic resonance

    Cardiac resynchronization therapy (CRT) is an established treatment for patients with symptomatic heart failure, severely impaired left ventricular (LV) systolic function and a wide (>120 ms) QRS complex. As with any other treatment, the response to CRT is variable. The degree of pre-implant mechanical dyssynchrony, scar burden and scar localization in the vicinity of the LV pacing stimulus are known to influence response and outcome. In addition to its recognized role in the assessment of LV structure and function as well as myocardial scar, cardiovascular magnetic resonance (CMR) can be used to quantify global and regional LV dyssynchrony. This review focuses on the role of CMR in the assessment of patients undergoing CRT, with emphasis on risk stratification and LV lead deployment.
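
    As an example of how segmental CMR-derived curves can be reduced to a single dyssynchrony measure, the sketch below computes the standard deviation of time-to-peak contraction across LV segments (a systolic dyssynchrony index); it is a generic illustration with simulated curves, not a method prescribed by this review.

    import numpy as np

    def systolic_dyssynchrony_index(time_ms, segmental_curves):
        """time_ms: 1-D array of frame times (ms);
        segmental_curves: 2-D array (segments x frames) of, e.g., radial strain."""
        curves = np.asarray(segmental_curves, dtype=float)
        peak_frames = np.argmax(curves, axis=1)           # frame of peak per segment
        time_to_peak = np.asarray(time_ms)[peak_frames]
        return float(np.std(time_to_peak, ddof=1))        # SD of time-to-peak (ms)

    if __name__ == "__main__":
        t = np.arange(0, 800, 25.0)                       # 25 ms temporal resolution
        rng = np.random.default_rng(2)
        peaks = rng.normal(320, 60, 16)                   # 16-segment model, made-up peak times
        curves = np.exp(-((t[None, :] - peaks[:, None]) / 90.0) ** 2)
        print(f"SDI = {systolic_dyssynchrony_index(t, curves):.1f} ms")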