
    Handwriting-Based Gender Classification Using End-to-End Deep Neural Networks

    Handwriting-based gender classification is a well-researched problem that has been approached mainly by traditional machine learning techniques. In this paper, we propose a novel deep learning-based approach for this task. Specifically, we present a convolutional neural network (CNN), which performs automatic feature extraction from a given handwritten image, followed by classification of the writer's gender. We also introduce a new dataset of labeled handwritten samples, in Hebrew and English, of 405 participants. Comparing the gender classification accuracy on this dataset against human examiners, our results show that the proposed deep learning-based approach is substantially more accurate than the human examiners.
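
    The abstract does not describe the network in detail; the sketch below (Python/PyTorch, with a hypothetical 128x128 grayscale input and illustrative layer widths) only shows the general pattern it names: convolutional feature extraction from a handwriting image followed by a binary classifier for the writer's gender.

```python
# Minimal CNN sketch: conv/pool feature extractor + binary classification head.
# Input size (1x128x128) and layer widths are illustrative assumptions,
# not the architecture used in the paper.
import torch
import torch.nn as nn

class HandwritingGenderCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # Automatic feature extraction: stacked convolution/pooling blocks.
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Classification head: pooled features -> single logit for the writer's gender.
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(128, 1),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Forward pass on a dummy batch of four grayscale handwriting patches.
model = HandwritingGenderCNN()
probs = torch.sigmoid(model(torch.randn(4, 1, 128, 128)))  # shape (4, 1)
```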

    Protocol: does sodium nitrite administration reduce ischaemia-reperfusion injury in patients presenting with acute ST segment elevation myocardial infarction? Nitrites in acute myocardial infarction (NIAMI)

    BACKGROUND: Whilst advances in reperfusion therapies have reduced early mortality from acute myocardial infarction, heart failure remains a common complication, and may develop very early or long after the acute event. Reperfusion itself leads to further tissue damage, a process described as ischaemia-reperfusion injury (IRI), which contributes up to 50% of the final infarct size. In experimental models, nitrite administration potently protects against IRI in several organs, including the heart. In the current study we investigate whether intravenous sodium nitrite administration immediately prior to percutaneous coronary intervention (PCI) in patients with acute ST segment elevation myocardial infarction will reduce myocardial infarct size. This is a phase II, randomised, placebo-controlled, double-blinded and multicentre trial. METHODS AND OUTCOMES: The aim of this trial is to determine whether a 5-minute systemic injection of sodium nitrite, administered immediately before opening of the infarct-related artery, results in a significant reduction of IRI in patients with a first acute ST elevation myocardial infarction (MI). The primary clinical end point is the difference in infarct size between the sodium nitrite and placebo groups, measured using cardiovascular magnetic resonance imaging (CMR) performed at 6-8 days following the AMI and corrected for area at risk (AAR) using the endocardial surface area technique. Secondary end points include (i) plasma creatine kinase and troponin I measured in blood samples taken pre-injection of the study medication and over the following 72 hours; (ii) infarct size at six months; (iii) infarct size corrected for AAR measured at 6-8 days using T2-weighted triple inversion recovery (T2-W SPAIR or STIR) CMR imaging; (iv) left ventricular (LV) ejection fraction measured by CMR at 6-8 days and six months following injection of the study medication; and (v) LV end-systolic volume index at 6-8 days and six months. FUNDING, ETHICS AND REGULATORY APPROVALS: This study is funded by a grant from the UK Medical Research Council. This protocol is approved by the Scotland A Research Ethics Committee and has also received clinical trial authorisation from the Medicines and Healthcare products Regulatory Agency (MHRA) (EudraCT number: 2010-023571-26).

    Risk and clinical-outcome indicators of delirium in an emergency department intermediate care unit (EDIMCU) : an observational prospective study

    We are thankful to the staff at the EDIMCU of Hospital de Braga. Background: Identification of delirium in emergency departments (ED) is often underestimated; within EDs, studies on delirium assessment and its relation with patient outcome in Intermediate Care Units (IMCU) appear to be missing in European hospital settings. Here we aimed to determine delirium prevalence in an EDIMCU (Hospital de Braga, Braga, Portugal) and to assess routine biochemical parameters that might be delirium indicators. Methods: The study was prospective and observational. Sedation level was assessed via the Richmond Agitation-Sedation Scale and delirium status by the Confusion Assessment Method for the ICU. Information collected included age and gender, admission type, Charlson Comorbidity Index combined condition score (Charlson score), systemic inflammatory response syndrome criteria (SIRS), biochemical parameters (blood concentration of urea nitrogen, creatinine, hemoglobin, sodium and potassium, arterial blood gases, and other parameters as needed depending on clinical diagnosis) and EDIMCU length of stay (LOS). Statistical analyses were performed as appropriate to determine whether baseline features differed between the ‘Delirium’ and ‘No Delirium’ groups. Multivariate logistic regression was performed to assess the effect of delirium on the 1-month outcome. Results: Inclusion and exclusion criteria were met in 283 patients; 238 were evaluated at 1 month for outcome follow-up after EDIMCU discharge (“good”: recovery without complications requiring hospitalization or institutionalization; “poor”: institutionalization in permanent care units/assisted living, or death). Delirium was diagnosed in 20.1% of patients and was significantly associated with longer EDIMCU LOS. At admission, Delirium patients were significantly older and had significantly higher blood urea, creatinine and osmolarity levels and significantly lower hemoglobin levels when compared with No Delirium patients. Delirium was an independent predictor of increased EDIMCU LOS (odds ratio 3.65, 95% CI 1.97-6.75) and poor outcome at 1 month after discharge (odds ratio 3.51, CI 1.84-6.70), adjusted for age, gender, admission type, presence of SIRS criteria, Charlson score and osmolarity at admission. Conclusions: In an EDIMCU setting, delirium was associated with longer LOS and poor outcome at 1 month post-discharge. Altogether, findings support the need for delirium screening and management in emergency settings. NCS is supported by the post-doctoral fellowship UMINHO/BPD/013/2011 by the European Commission (FP7) “SwitchBox” Project (Contract HEALTH-F2-2010-259772).
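
    As a rough illustration of the adjusted odds ratios reported above (for example, delirium as a predictor of poor 1-month outcome), the sketch below fits a multivariate logistic regression with statsmodels on synthetic data; the variable names and values are hypothetical stand-ins for the study's covariates, not its dataset.

```python
# Logistic regression sketch yielding adjusted odds ratios and 95% CIs.
# All data are synthetic; column names are illustrative placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 238
df = pd.DataFrame({
    "delirium": rng.integers(0, 2, n),
    "age": rng.normal(70, 10, n),
    "male": rng.integers(0, 2, n),
    "charlson": rng.integers(0, 6, n),
})
# Synthetic outcome that loosely depends on delirium and age.
linpred = -4.0 + 1.2 * df["delirium"] + 0.04 * df["age"]
df["poor_outcome"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-linpred)))

fit = smf.logit("poor_outcome ~ delirium + age + male + charlson", data=df).fit(disp=0)
odds_ratios = np.exp(fit.params)       # adjusted odds ratios
conf_int = np.exp(fit.conf_int())      # 95% confidence intervals on the OR scale
print(odds_ratios["delirium"], conf_int.loc["delirium"].tolist())
```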

    Prevalence of HCV and HIV infections in 2005-Earthquake-affected areas of Pakistan

    Background: On October 8, 2005, an earthquake of magnitude 7.6 hit the northern parts of Pakistan. In the post-earthquake scenario, overcrowding, improper sewage disposal, contamination of food and drinking water, hasty surgical procedures, and unscreened blood transfusions to earthquake victims most likely promoted the spread of infections already prevalent in the area. Objective: The objective of the study reported here was to determine the prevalence of the human immunodeficiency and hepatitis C viruses (HIV and HCV, respectively) in the earthquake-affected communities of Pakistan. The samples were analyzed 2 months and then again 11 months after the earthquake to estimate the burden of HIV and HCV in these areas, and to determine any rise in the prevalence of these viral infections as a result of the earthquake. Methods: Blood samples were initially collected between December 2005 and March 2006 from 245 inhabitants of the earthquake-affected areas. These samples were screened for HCV and HIV using immunochromatography and enzyme-linked immunosorbent assay (ELISA). Results: Out of 245 samples tested, 8 (3.26%) were found positive for HCV and 0 (0.0%) for HIV, indicating the existence of HCV infection in the earthquake-stricken areas. The same methods were used to analyze the samples collected in the second round of screening in the same area in September 2006, 11 months after the earthquake. This time 290 blood samples were collected, out of which 16 (5.51%) were positive for HCV and 0 for HIV. Conclusion: A slightly higher prevalence of HCV was recorded 11 months after the earthquake; this increase, however, was not statistically significant. None of the study participants was found to be HIV-infected.
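
    The before/after comparison above amounts to a simple two-proportion test; the sketch below (Python, assuming scipy is available) reproduces the prevalence figures from the reported counts (8/245 and 16/290) and applies a chi-square test of independence, consistent with the conclusion that the increase was not statistically significant.

```python
# HCV prevalence comparison from the reported counts: 8/245 positives
# ~2 months after the earthquake vs 16/290 positives ~11 months after.
from scipy.stats import chi2_contingency

positives = [8, 16]
totals = [245, 290]
table = [[positives[0], totals[0] - positives[0]],
         [positives[1], totals[1] - positives[1]]]

chi2, p_value, dof, expected = chi2_contingency(table)
prev = [100.0 * p / t for p, t in zip(positives, totals)]
print(f"prevalence {prev[0]:.2f}% vs {prev[1]:.2f}%, p = {p_value:.3f}")
```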

    Perceptions of Pakistani medical students about drugs and alcohol: a questionnaire-based survey

    BACKGROUND: Drug abuse is hazardous and known to be prevalent among young adults, warranting efforts to increase awareness about harmful effects and to change attitudes. This study was conducted to assess the perceptions of a group of medical students from Pakistan, a predominantly Muslim country, regarding four drugs, namely heroin, charas, benzodiazepines and alcohol. RESULTS: In total, 174 self-reported questionnaires were received (87% response rate). The most commonly cited reasons why some students take these drugs were peer pressure (96%), academic stress (90%) and curiosity (88%). The most commonly cited justifiable reason was to go to sleep (34%). According to 77%, living in the college male hostel predisposed one to using these drugs. Sixty percent of students said that the drugs did not improve exam performance, while 54% said they alleviated stress. Seventy-eight percent said they did not intend to ever take drugs in the future. Females and day-scholars were more willing to discourage a friend who took drugs. Morality (78%), religion (76%) and the harmful effects of drugs (57%) were the most common deterrents against drug intake. Five suggestions to decrease drug abuse included better counseling facilities (78%) and more recreational facilities (60%). CONCLUSION: Efforts need to be made to increase student awareness regarding the effects and side effects of drugs. Our findings suggest that educating students about the adverse effects as well as the moral and religious implications of drug abuse is more likely to have a positive impact than increased policing. Proper student-counseling facilities and healthier avenues for recreation are also required.

    Improving the implementation of health workforce policies through governance: a review of case studies

    Introduction: Responsible governance is crucial to national development and a catalyst for achieving the Millennium Development Goals. To date, governance seems to have been a neglected issue in the field of human resources for health (HRH), which could be an important reason why HRH policy formulation and implementation is often poor. This article aims to describe how governance issues have influenced HRH policy development and to identify governance strategies that have been used, successfully or not, to improve HRH policy implementation in low- and middle-income countries (LMIC). Methods: We performed a descriptive literature review of HRH case studies which describe or evaluate a governance-related intervention at country or district level in LMIC. In order to systematically address the term 'governance', a framework was developed and governance aspects were regrouped into four dimensions: 'performance', 'equity and equality', 'partnership and participation' and 'oversight'. Results and discussion: In total, 16 case studies were included in the review and most of the selected studies covered several governance dimensions. The dimension 'performance' covered several elements at the core of governance of HRH, decentralization being particularly prominent. Although improved equity and/or equality was a goal in a number of interventions, inclusiveness in policy development and fairness and transparency in policy implementation often did not seem adequate to guarantee the corresponding desirable health workforce scenario. The forms of partnership and participation described in the case studies are numerous and offer different lessons. Strikingly, in none of the articles was 'partnerships' a core focus. A common theme in the dimension of 'oversight' is local-level corruption, affecting, amongst other things, accountability and local-level trust in governance, and its cultural guises. Experiences with accountability mechanisms for HRH policy development and implementation were lacking. Conclusion: This review shows that the term 'governance' is neither prominent nor frequent in recent HRH literature. It provides initial lessons regarding the influence of governance on HRH policy development and implementation. The review also shows that the evidence base needs to be improved in this field in order to better understand how governance influences HRH policy development and implementation. Tentative lessons are discussed, based on the case studies.

    Management of intracranial tuberculous mass lesions: How long should we treat for? [version 2; peer review: 1 approved, 2 approved with reservations]

    Tuberculous intracranial mass lesions are common in settings with high tuberculosis (TB) incidence and HIV prevalence. The diagnosis of such lesions, which include tuberculomas and tuberculous abscesses, is often presumptive and based on radiological features, supportive evidence of TB elsewhere and response to TB treatment. However, the treatment response is unpredictable, with lesions frequently enlarging paradoxically or persisting for many years despite appropriate TB treatment and corticosteroid therapy. Most international guidelines recommend a 9-12 month course of TB treatment for central nervous system TB when the infecting Mycobacterium tuberculosis (M.tb) strain is sensitive to first-line drugs. However, there is variation in opinion and practice with respect to the duration of TB treatment in patients with tuberculomas or tuberculous abscesses. A major reason for this is the lack of prospective clinical trial evidence. Some experts suggest continuing treatment until radiological resolution of enhancing lesions has been achieved, but this may unnecessarily expose patients to prolonged periods of potentially toxic drugs. It is currently unknown whether persistent radiological enhancement of intracranial tuberculomas after 9-12 months of treatment represents active disease, an inflammatory response in a sterilized lesion or merely revascularization. The consequences of stopping TB treatment prior to resolution of lesional enhancement have rarely been explored. These important issues were discussed at the 3rd International Tuberculous Meningitis Consortium meeting. Most clinicians were of the opinion that continued enhancement does not necessarily represent treatment failure and that prolonged TB therapy was not warranted in patients presumably infected with M.tb strains susceptible to first-line drugs. In this manuscript we highlight current medical treatment practices, the benefits and disadvantages of different TB treatment durations, and the need for evidence-based guidelines regarding the treatment duration of patients with intracranial tuberculous mass lesions.

    SMAD4 - Molecular gladiator of the TGF-β signaling is trampled upon by mutational insufficiency in colorectal carcinoma of Kashmiri population: an analysis with relation to KRAS proto-oncogene

    Background: The development and progression of colorectal cancer has been extensively studied and the genes responsible have been well characterized. However, the correlation between SMAD4 gene mutations and KRAS mutant status has not been explored by many studies so far. Here, in this study, we aimed to investigate the role of SMAD4 gene aberrations in the pathogenesis of CRC in the Kashmir valley and to correlate it with various clinicopathological variables and KRAS mutant genotype. Methods: We examined the paired tumor and normal tissue specimens of 86 CRC patients for the occurrence of aberrations in the mutation cluster region (MCR) of SMAD4 and exon 1 of KRAS by PCR-SSCP and/or PCR-direct sequencing. Results: The overall mutation rate in the MCR of the SMAD4 gene among the 86 patients was 18.6% (16 of 86). 68.75% (11/16) of the SMAD4 gene mutants were found to have mutations in the KRAS gene as well. The association between the KRAS mutant genotype and SMAD4 mutants was found to be significant (P ≤ 0.05). Furthermore, we found a significant association of tumor location, tumor grade, node status, occupational exposure to pesticides and bleeding PR/constipation with the mutation status of the SMAD4 gene (P ≤ 0.05). Conclusion: Our study suggests that SMAD4 gene aberrations are a common event in CRC development but play a differential role in the progression of CRC to higher tumor grades (C+D); their association with the KRAS mutant status suggests that these two molecules together are responsible for the progression of the tumor to a higher/advanced stage.
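
    The SMAD4/KRAS association above is the kind of result usually assessed with a contingency-table test; the sketch below uses Fisher's exact test in Python. Only the SMAD4-mutant counts (11 KRAS-mutant out of 16) come from the abstract; the KRAS-mutant count among the 70 SMAD4-wildtype tumors is not reported, so the value used here is a purely hypothetical placeholder.

```python
# Fisher's exact test sketch for the SMAD4/KRAS association.
from scipy.stats import fisher_exact

smad4_mut_kras_mut = 11          # reported: 11 of 16 SMAD4 mutants also carried KRAS mutations
smad4_mut_kras_wt = 16 - 11
smad4_wt_kras_mut = 20           # hypothetical placeholder, NOT reported in the abstract
smad4_wt_kras_wt = 70 - smad4_wt_kras_mut

table = [[smad4_mut_kras_mut, smad4_mut_kras_wt],
         [smad4_wt_kras_mut, smad4_wt_kras_wt]]
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, P = {p_value:.4f}")
```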