
    An Efficient Deep Learning Model To Detect COVID-19 Using Chest X-Ray Images

    The tragic pandemic of COVID-19, due to the Severe Acute Respiratory Syndrome coronavirus-2 or SARS-CoV-2, has shaken the entire world and has significantly disrupted healthcare systems in many countries. Because of the existing challenges and controversies surrounding testing for COVID-19, improved and cost-effective methods are needed to detect the disease. For this purpose, machine learning (ML) has emerged as a strong forecasting method for detecting COVID-19 from chest X-ray images. In this paper, we used a Deep Learning Method (DLM) to detect COVID-19 using chest X-ray (CXR) images. Radiographic images are readily available and can be used effectively for COVID-19 detection compared to other expensive and time-consuming pathological tests. We used a dataset of 10,040 samples, of which 2143 had COVID-19, 3674 had pneumonia (but not COVID-19), and 4223 were normal (neither COVID-19 nor pneumonia). Our model had a detection accuracy of 96.43% and a sensitivity of 93.68%. The area under the ROC curve was 99% for COVID-19, 97% for pneumonia (but not COVID-19), and 98% for normal cases. In conclusion, ML approaches may be used for rapid analysis of CXR images and thus enable radiologists to filter potential candidates in a time-effective manner to detect COVID-19.
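
    The abstract does not include the model itself; purely as an illustrative sketch of the kind of deep-learning pipeline it describes, a three-class CXR classifier could be set up as below. The backbone, image size, directory layout, and hyperparameters are assumptions for illustration, not the authors' architecture.

        # Sketch of a 3-class chest X-ray classifier (COVID-19 / pneumonia / normal).
        # Architecture, data layout, and hyperparameters are illustrative assumptions.
        import tensorflow as tf
        from tensorflow.keras import layers, models

        NUM_CLASSES = 3          # covid, pneumonia, normal
        IMG_SIZE = (224, 224)    # assumed input resolution

        def build_model():
            base = tf.keras.applications.ResNet50(
                include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,))
            base.trainable = False  # transfer learning: freeze the pretrained backbone
            model = models.Sequential([
                base,
                layers.GlobalAveragePooling2D(),
                layers.Dense(128, activation="relu"),
                layers.Dropout(0.5),
                layers.Dense(NUM_CLASSES, activation="softmax"),
            ])
            model.compile(optimizer="adam",
                          loss="categorical_crossentropy",
                          metrics=["accuracy", tf.keras.metrics.AUC(name="auc")])
            return model

        # Assumed directory layout: cxr_data/{covid,pneumonia,normal}/*.png
        # (preprocessing and a validation split are omitted for brevity)
        train_ds = tf.keras.utils.image_dataset_from_directory(
            "cxr_data", image_size=IMG_SIZE, batch_size=32, label_mode="categorical")
        model = build_model()
        model.fit(train_ds, epochs=10)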

    Analysis of Function Component Complexity for Hypercube Homotopy Algorithms

    Probability-one homotopy algorithms are a class of methods for solving nonlinear systems of equations that are globally convergent from an arbitrary starting point with probability one. The essence of these homotopy algorithms is the construction of a homotopy map ρ_a and the subsequent tracking of a smooth curve γ in the zero set ρ_a⁻¹(0) of ρ_a. Tracking the zero curve γ requires repeated evaluation of the map ρ_a, its n × (n + 1) Jacobian matrix Dρ_a, and numerical linear algebra for calculating the kernel of Dρ_a. This paper analyzes parallel homotopy algorithms on a hypercube, considering the numerical linear algebra, several communication topologies and problem decomposition strategies, function component complexity, problem size, and the effect of different component complexity distributions. These parameters interact in complicated ways, but some general principles can be inferred from empirical results.
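
    As a serial illustration of the method being parallelized (the hypercube decomposition itself is omitted), the sketch below tracks the zero curve of a fixed-point homotopy ρ_a(λ, x) = λF(x) + (1 − λ)(x − a) with a simple Newton corrector. Stepping in λ rather than arc length, and the example system F, are simplifying assumptions.

        # Serial sketch of homotopy curve tracking for the fixed-point homotopy
        #   rho_a(lam, x) = lam*F(x) + (1 - lam)*(x - a).
        # Production trackers parametrize by arc length; stepping in lam is a
        # simplifying assumption that suffices when the curve is monotone in lam.
        import numpy as np

        def F(x):   # example nonlinear system F(x) = 0 (chosen for illustration)
            return np.array([x[0]**2 + x[1] - 3.0,
                             x[0] + x[1]**2 - 5.0])

        def dF(x):  # Jacobian of F
            return np.array([[2*x[0], 1.0],
                             [1.0, 2*x[1]]])

        def rho(lam, x, a):
            return lam * F(x) + (1.0 - lam) * (x - a)

        def drho_dx(lam, x):
            return lam * dF(x) + (1.0 - lam) * np.eye(len(x))

        def track(a, steps=100, newton_iters=5):
            x = a.copy()                      # at lam = 0 the zero curve starts at x = a
            for lam in np.linspace(0.0, 1.0, steps + 1)[1:]:
                for _ in range(newton_iters):  # corrector: Newton in x with lam frozen
                    x -= np.linalg.solve(drho_dx(lam, x), rho(lam, x, a))
            return x                          # at lam = 1, rho_a = F, so F(x) = 0

        root = track(np.array([0.5, 0.5]))
        print(root, F(root))                  # residual should be near zero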

    Unit Tangent Vector Computation for Homotopy Curve Tracking on a Hypercube

    Probability-one homotopy methods are a class of methods for solving nonlinear systems of equations that are globally convergent from an arbitrary starting point. The essence of all such algorithms is the construction of an appropriate homotopy map ρ_a and subsequent tracking of some smooth curve γ in the zero set of the homotopy map. Tracking a homotopy curve involves finding the unit tangent vectors at different points along the zero curve. Because of the way a homotopy map is constructed, the unit tangent vector at each point of the zero curve of a homotopy map ρ_a is in the kernel of the Jacobian matrix Dρ_a. Hence tracking the zero curve of a homotopy map involves finding the kernel of the Jacobian matrix Dρ_a, which is an n × (n + 1) matrix with full rank. Since the accuracy of the unit tangent vector is very important, an orthogonal factorization instead of an LU factorization of the Jacobian matrix is computed. Two related factorizations, namely the QR and LQ factorizations, are considered here. This paper presents computational results showing the performance of several different parallel orthogonal factorization/triangular system solving algorithms on a hypercube. Since the purpose of this study is to find ways to parallelize homotopy algorithms, it is assumed that the matrices are small, dense, and have a special structure such as that of the Jacobian matrix of a homotopy map.
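
    A minimal serial sketch of the tangent computation described above: for a full-rank n × (n + 1) Jacobian, the last column of Q in a complete QR factorization of the transpose spans the kernel. The toy matrix is an assumption, and the parallel hypercube distribution is omitted.

        # Unit tangent to the zero curve as the kernel of the n x (n+1) Jacobian:
        # with J^T = QR (Q of size (n+1) x (n+1)), the last column of Q is
        # orthogonal to every row of J, i.e. it spans ker(J) and has unit length.
        import numpy as np

        def unit_tangent(J, prev=None):
            """J: full-rank n x (n+1) Jacobian; returns a unit kernel vector."""
            n, m = J.shape
            assert m == n + 1
            Q, _ = np.linalg.qr(J.T, mode="complete")  # full QR of the transpose
            t = Q[:, -1]                               # kernel vector, unit length
            if prev is not None and np.dot(t, prev) < 0:
                t = -t           # keep a consistent orientation along the curve
            return t

        J = np.array([[1.0, 2.0, 3.0],
                      [4.0, 5.0, 6.0]])                # toy 2 x 3 Jacobian
        t = unit_tangent(J)
        print(t, J @ t)                                # J @ t should be near zero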

    Prospective observational study on peri-operative usage pattern of analgesics

    The present work reports a prospective observational study to determine the perioperative usage pattern of analgesics. Both general and regional anaesthesia were included, and different analgesics were compared to determine which is most effective for managing postoperative pain. 120 patients were selected randomly as per the study criteria. Techniques of anaesthesia and pre-, peri-, and postoperative vitals and pain scores were noted at selected time intervals. Pain scores were recorded as 0-3 mild, 4-6 moderate, and 7-10 severe. Of all the analgesics administered in the peri- and postoperative periods, fentanyl (F), alone and in combination with diclofenac (D), was used most often in the perioperative period; these two groups were compared. In postoperative GA cases, the majority of patients received paracetamol (P) and tramadol (T), and in regional blocks, a pethidine (PE) and phenergan (PH) combination and tramadol were used; these groups were compared to evaluate pain perception. Fentanyl alone was found to be more effective in maintaining hemodynamic stability: with the fentanyl-diclofenac combination, pulse rate and blood pressure were higher than with fentanyl alone. Pain scores were significantly higher in GA compared to regional blocks. In GA patients, the fentanyl-paracetamol combination decreased pain significantly compared to the fentanyl-tramadol combination, but in regional techniques, pethidine, phenergan, and tramadol in combination with perioperative fentanyl showed similar results for decreasing pain. In conclusion, fentanyl alone is a better analgesic than the fentanyl-diclofenac combination in the perioperative period, and in the postoperative period paracetamol is more effective than tramadol HCl in combination with fentanyl.
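
    The abstract does not state which statistical tests were applied; as an illustrative sketch only, ordinal 0-10 pain scores from two groups could be compared with a non-parametric test such as Mann-Whitney U. The scores below are hypothetical.

        # Illustrative sketch only: comparing ordinal 0-10 pain scores between two
        # groups with a non-parametric test. Data and choice of test are assumptions.
        from scipy.stats import mannwhitneyu

        def severity(score):
            """Categorise a 0-10 score as in the study: 0-3 mild, 4-6 moderate, 7-10 severe."""
            return "mild" if score <= 3 else "moderate" if score <= 6 else "severe"

        ga_scores = [6, 5, 7, 4, 6, 5, 8]       # hypothetical GA-group pain scores
        block_scores = [2, 3, 1, 4, 2, 3, 2]    # hypothetical regional-block scores

        print("GA severities:", [severity(s) for s in ga_scores])
        stat, p = mannwhitneyu(ga_scores, block_scores, alternative="two-sided")
        print(f"U = {stat}, p = {p:.4f}")       # small p: scores differ between groups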

    Granularity issues for solving polynomial systems via globally convergent algorithms on a hypercube

    Polynomial systems of equations frequently arise in many applications such as solid modeling, robotics, computer vision, chemistry, chemical engineering, and mechanical engineering. Locally convergent iterative methods such as quasi-Newton methods may diverge or fail to find all meaningful solutions of a polynomial system. Recently a homotopy algorithm has been proposed for polynomial systems that is guaranteed globally convergent (always converges from an arbitrary starting point) with probability one, finds all solutions to the polynomial system, and has a large amount of inherent parallelism. There are several ways the homotopy algorithms can be decomposed to run on a hypercube, and the granularity of a decomposition has a profound effect on the performance of the algorithm. The results of decompositions with two different granularities are presented. The experiments were conducted on an iPSC-16 hypercube using actual industrial problems.
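
    As a scalar illustration of the globally convergent polynomial homotopy (leaving the hypercube decomposition aside), the sketch below tracks all d roots of a degree-d polynomial p from the start system q(x) = x^d − 1 along H(λ, x) = (1 − λ)γq(x) + λp(x). The target polynomial, step counts, and the fixed γ are illustrative assumptions.

        # Total-degree polynomial homotopy sketch: track each root of the start
        # system q(x) = x^d - 1 to a root of the target p(x) along
        #   H(lam, x) = (1 - lam)*gamma*q(x) + lam*p(x).
        # Serial and univariate for clarity; the paper studies how to decompose
        # such trackers across hypercube nodes.
        import numpy as np

        p_coeffs = [1.0, -6.0, 11.0, -6.0]   # p(x) = (x-1)(x-2)(x-3), an example
        d = len(p_coeffs) - 1
        gamma = complex(0.6, 0.8)            # fixed generic constant ("gamma trick")

        def p(x):  return np.polyval(p_coeffs, x)
        def dp(x): return np.polyval(np.polyder(p_coeffs), x)
        def q(x):  return x**d - 1
        def dq(x): return d * x**(d - 1)

        def track(x, steps=200, newton_iters=4):
            for lam in np.linspace(0.0, 1.0, steps + 1)[1:]:
                for _ in range(newton_iters):        # Newton corrector at fixed lam
                    H  = (1 - lam) * gamma * q(x) + lam * p(x)
                    dH = (1 - lam) * gamma * dq(x) + lam * dp(x)
                    x -= H / dH
            return x

        starts = [np.exp(2j * np.pi * k / d) for k in range(d)]  # d-th roots of unity
        roots = [track(x) for x in starts]
        print(np.sort_complex(np.array(roots)))      # approximately 1, 2, 3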

    Follow-up of Indigenous-specific health assessments - a socioecological analysis

    Objectives: To describe patterns of uptake of Indigenous-specific health assessments and associated follow-up items, and to examine the barriers and enablers to delivery and billing of follow-up over the first 3 years of implementation of the Indigenous Chronic Disease Package (ICDP). Design, setting and participants: We used a socioecological approach to analyse data derived from the Sentinel Sites Evaluation of the ICDP, with data from 24 sites across Australia. Administrative data (1 May 2009 to 30 May 2012) and program data (1 March 2010 to 30 May 2012) were provided by the Department of Health. Data on barriers and enablers to follow-up of health assessments were obtained from community focus groups, in-depth interviews and discussions with key informants (1 November 2010 to 30 December 2012). Main outcome measures: Monthly number of Medicare Benefits Schedule items claimed for Indigenous-specific health services and follow-up; qualitative data on enablers and barriers categorised according to patient, patient-health service relationship, health service or organisation, community, and policy environment levels of influence. Results: There was an increase in the uptake of health assessments, but relatively limited delivery of follow-up care and billing for Indigenous-specific follow-up items. Follow-up was constrained by factors that operated at various levels: patient, interpersonal, health service, community and policy. Constraints included practitioners' lack of awareness of item numbers, staffing, the poor state of clinical information systems, billing against non-Indigenous-specific items or more general follow-up items, emphasis on health assessments with less attention to requirements for follow-up, limited capacity to arrange and facilitate follow-up, and communication and transport challenges for patients. Conclusions: Work is required across various levels of the system to address barriers to follow-up care. Enhancing follow-up care is vital to achieving health benefits from the large financial and human resource investment in health assessments.

    VOICE–Validating Outcomes by Including Consumer Experience: A Study Protocol to Develop a Patient Reported Experience Measure for Aboriginal and Torres Strait Islander Peoples Accessing Primary Health Care

    Aboriginal and Torres Strait Islander peoples’ (hereafter respectfully referred to as Indigenous Australians) experiences of health care are shaped by historical, social and cultural factors, with cultural security critical to effective care provision and engagement between services and community. Positive patient experiences are associated with better health outcomes. Consequently, it is an accreditation requirement that primary health care (PHC) services must formally gather and respond to patient feedback. However, currently available patient feedback tools were not developed with Indigenous Australians, and do not reflect their values and world views. Existing tools do not capture important experiences of care of Indigenous Australians in PHC settings, nor return information that assists services to improve care. Consistent with the principles of Indigenous Data Sovereignty, we will co-design and validate an Indigenous-specific Patient Reported Experience Measure (PREM) that produces data by and for community, suitable for use in quality improvement in comprehensive PHC services. This paper presents the protocol of the study, outlining the rationale, methodologies and associated activities that are being applied in developing the PREM. Briefly, guided by an Aboriginal and Torres Strait Islander Advisory Group, our team of Indigenous and non-Indigenous researchers, service providers and policy makers will use a combination of Indigenous methodologies, participatory, and traditional Western techniques for scale development. We will engage PHC service staff and communities in eight selected sites across remote, regional, and metropolitan communities in Australia for iterative cycles of data collection and feedback throughout the research process. Yarning Circles with community members will identify core concepts to develop an “Experience of Care Framework”, which will be used to develop items for the PREM. Staff members will be interviewed regarding desirable characteristics and feasibility considerations for the PREM. The PREM will undergo cognitive and psychometric testing.

    A Rare HBV Subgenotype D4 with Unique Genomic Signatures Identified in North-Eastern India – An Emerging Clinical Challenge?

    BACKGROUND/AIMS: HBV has been classified into ten genotypes (A-J) and multiple subgenotypes, some of which strongly influence disease outcome, and their distribution also correlates with human migration. HBV infection is highly prevalent in India, and its diverse population provides an excellent opportunity to study the distinctiveness of HBV, its evolution and disease biology in variegated ethnic groups. North-East India, having international frontiers on three sides, is one of the most ethnically and linguistically diverse regions of the country. Given the paucity of information on the molecular epidemiology of HBV in this region, the study aimed to carry out an in-depth genetic characterization of HBV prevailing in the North-Eastern state of Tripura. METHODS: Biochemical/serological tests, HBV DNA quantification, PCR amplification, and sequencing of PreS/S or full-length HBV genomes were performed on sera from chronically HBV-infected patients. HBV genotype/subgenotype determination and sequence variability were assessed with the MEGA5 software. The evolutionary divergence times of different HBV subgenotypes were estimated with the DNAMLK/PHYLIP program, while the jpHMM method was used to detect any recombination event in HBV genomes. RESULTS: HBV genotypes D (89.5%), C (6.6%) and A (3.9%) were detected among chronic carriers. While all HBV/A and HBV/C isolates belonged to subgenotypes A1 and C1 respectively, five subgenotypes of HBV/D (D1-D5) were identified, including the first detection of the rare D4. These non-recombinant Indian D4 (IndD4) isolates formed a distinct phylogenetic clade, with 2.7% nucleotide divergence from, and a more recent evolutionary radiation than, other global D4. Ten unique amino acid substitutions and 9 novel nucleotide substitutions were identified as IndD4 signatures. All IndD4 carried T120 and R129 in ORF-S, which may cause immune/vaccine/diagnostic escape, and N128 in ORF-P, implicated as a compensatory lamivudine resistance mutation. CONCLUSIONS: IndD4 has the potential to undermine vaccination programs or anti-viral therapy, and its introduction to North-East India is believed to be linked with the settlement of ancient Tibeto-Burman migrants from East Asia.
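
    The 2.7% figure is a pairwise nucleotide divergence. As a minimal sketch (toy sequences, a plain p-distance rather than the model-corrected distances MEGA5 computes), divergence between aligned sequences can be estimated as follows.

        # Minimal sketch: pairwise p-distance (% nucleotide divergence) between
        # aligned sequences. Toy data, not real HBV/D4 sequences; MEGA5 uses
        # model-corrected distances, so this only approximates that step.
        from itertools import combinations

        def p_distance(seq1, seq2):
            """Proportion of differing sites between two aligned, gap-free sequences."""
            assert len(seq1) == len(seq2)
            return sum(a != b for a, b in zip(seq1, seq2)) / len(seq1)

        aligned = {  # hypothetical aligned fragments
            "IndD4_1": "ATGCTTACGGATCCGA",
            "IndD4_2": "ATGCTTACGGATCCGA",
            "D4_ref":  "ATGCTAACGGTTCCGA",
        }

        for (n1, s1), (n2, s2) in combinations(aligned.items(), 2):
            print(f"{n1} vs {n2}: {100 * p_distance(s1, s2):.1f}% divergence")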

    Search for dark matter produced in association with bottom or top quarks in √s = 13 TeV pp collisions with the ATLAS detector

    A search for weakly interacting massive particle dark matter produced in association with bottom or top quarks is presented. Final states containing third-generation quarks and missing transverse momentum are considered. The analysis uses 36.1 fb−1 of proton–proton collision data recorded by the ATLAS experiment at √s = 13 TeV in 2015 and 2016. No significant excess of events above the estimated backgrounds is observed. The results are interpreted in the framework of simplified models of spin-0 dark-matter mediators. For colour-neutral spin-0 mediators produced in association with top quarks and decaying into a pair of dark-matter particles, mediator masses below 50 GeV are excluded assuming a dark-matter candidate mass of 1 GeV and unitary couplings. For scalar and pseudoscalar mediators produced in association with bottom quarks, the search sets limits on the production cross-section of 300 times the predicted rate for mediators with masses between 10 and 50 GeV and assuming a dark-matter mass of 1 GeV and unitary coupling. Constraints on colour-charged scalar simplified models are also presented. Assuming a dark-matter particle mass of 35 GeV, mediator particles with mass below 1.1 TeV are excluded for couplings yielding a dark-matter relic density consistent with measurements.
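
    For readers unfamiliar with the key observable, missing transverse momentum is the magnitude of the negative vector sum of the transverse momenta of the reconstructed visible objects. A toy sketch follows (hypothetical values, none of the calibrated ATLAS reconstruction or soft terms):

        # Toy sketch of missing transverse momentum (MET): the magnitude of the
        # negative vector sum of the visible objects' transverse momenta.
        # Hypothetical values; real ATLAS reconstruction is far more involved.
        import numpy as np

        # (pT [GeV], phi [rad]) of reconstructed visible objects in one event
        visible = [(120.0, 0.3),    # leading b-jet
                   (85.0, 2.1),     # second jet
                   (40.0, -1.4)]    # lepton

        px = sum(pt * np.cos(phi) for pt, phi in visible)
        py = sum(pt * np.sin(phi) for pt, phi in visible)
        met = np.hypot(-px, -py)    # recoil attributed to invisible particles
        print(f"E_T^miss = {met:.1f} GeV")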

    Application of critical path analysis in clinical trials

    Clinical research operates in a strictly regulated environment under various management models, but a distinct management model for clinical trials (CTs) still needs exploration and research. Critical path analysis (CPA) is a management approach that can be used for monitoring, analysing, and predicting the success of time-bound operational activities. A model CT was compiled with 78 activities, which were merged into 35 major activities. After performing dependence analysis, the list was finalized at 25 activities, which were entered into an activity-predecessor table to create a network diagram and perform CPA, considering patients, conduct, and outcome. The activities were inclusive, described the trial entirely and accurately, and were in chronological and logical sequence. This approach does not replace an understanding of, or adherence to, the requirements contained in the applicable regulations, guidelines or standard operating procedures governing clinical studies, but it ensures the proper use of operational and decisional approaches, including optimal resource management. As the needs to meet deadlines and to produce good, stable project plans become more pressing, CPA is very useful for determining the activities that can lead to project delay. With this approach, a project may be effectively monitored and realistic schedules maintained.
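
    As a sketch of the computation CPA performs on such an activity-predecessor table (a hypothetical five-activity mini-network, not the paper's 25-activity model), a forward and backward pass yields the critical path:

        # Critical path analysis sketch: a forward pass computes earliest
        # start/finish times, a backward pass the latest ones, and activities
        # with zero slack form the critical path. Hypothetical mini-network.

        # activity: (duration in weeks, [predecessor activities])
        activities = {
            "protocol":   (4,  []),
            "ethics":     (6,  ["protocol"]),
            "site_setup": (5,  ["protocol"]),
            "enrolment":  (12, ["ethics", "site_setup"]),
            "analysis":   (3,  ["enrolment"]),
        }

        # Forward pass (the dict above is already in topological order).
        es, ef = {}, {}
        for act, (dur, preds) in activities.items():
            es[act] = max((ef[p] for p in preds), default=0)   # earliest start
            ef[act] = es[act] + dur                            # earliest finish

        project_end = max(ef.values())

        # Backward pass, in reverse topological order.
        lf = {a: project_end for a in activities}              # latest finish
        ls = {}                                                # latest start
        for act in reversed(list(activities)):
            dur, preds = activities[act]
            ls[act] = lf[act] - dur
            for pred in preds:                     # a predecessor must finish
                lf[pred] = min(lf[pred], ls[act])  # before this activity starts

        critical = [a for a in activities if es[a] == ls[a]]   # zero slack
        print("critical path:", " -> ".join(critical), f"({project_end} weeks)")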