
    Study the Effect of Substitution Filler on performance of Asphalt Mixture

    The major distresses in asphalt pavements are rutting, fatigue, and adhesion loss (moisture susceptibility). In this research study, two substitution fillers (cement and lime) were used with aggregate from two different quarries (selected on the basis of mineral composition) to identify the most beneficial combination of filler and aggregate quarry for extending the performance life of asphalt pavements, especially in under-developed countries. Four basic tests (Asphalt Pavement Analyzer, Four-Point Bending Beam, Dynamic Modulus, and Rolling Bottle Test), which target the most desired properties of any asphalt pavement, were used to assess the performance of the modified asphalt mixtures. Based on all laboratory test results, this research study concludes that replacing aggregate filler with hydrated lime and cement has a beneficial effect on asphalt mix performance and saves investment by using raw material. Substitution filler improves the high-temperature rutting performance and intermediate-temperature fatigue performance of the asphaltic concrete mixture by up to 25% relative to the conventional mixture. Substitution filler is even more beneficial for adhesion, improving adhesion properties by up to 70% relative to the conventional mixture.

    Is Grown Up Congenital Heart (GUCH) disease different in a developing country?

    Abstract In the current era, the number of grown-up congenital heart disease (GUCH) patients undergoing surgical interventions is increasing. Most of the interventions in developed countries are either complex or redo operations in patients who had previously undergone repair, palliation, or correction. However, in developing countries most of the interventions are primary and corrective. This descriptive retrospective study comprised GUCH patients who underwent surgical intervention for congenital heart disease (CHD) at Aga Khan University Hospital, Karachi, from January 2006 to December 2015. A total of 195 patients were treated surgically, with a mean age of 31.0±13.5 years. The majority of the patients underwent surgical interventions for closure of atrial (109, 55.3%) and ventricular (51, 26.2%) septal defects. The most common complication was prolonged ventilation (16, 8.1%). Overall mortality was 4 (2.1%). GUCH in our practice mostly involves primary procedures for simple diagnoses that should have been treated before adulthood, as is done in developed countries.

    Diagnosing isolated hepatosplenic tuberculosis in an immunocompetent patient: A case report

    For many years, tuberculosis (TB) has been endemic in Pakistan, and many rare and unusual presentations have been reported. TB presents with a myriad of non-specific symptoms, which always requires a high index of clinical suspicion. World Health Organization data suggest that Pakistan ranks as the fifth highest country burdened with TB and has the fourth highest prevalence of multi-drug resistant TB globally. With an annual incidence of 277 cases per 100,000, the importance of early diagnosis and treatment is self-evident. We present a case in which a strong suspicion of isolated hepatosplenic TB in an immunocompetent patient justified a directed approach.

    Incidence and Impact of Baseline Electrolyte Abnormalities in Patients Admitted with Chemotherapy Induced Febrile Neutropenia

    BACKGROUND: Febrile neutropenia (FN) and myelosuppression remain a challenging oncologic medical emergency and a dose-limiting toxicity of cancer chemotherapy. Various factors are known to affect the outcomes of patients diagnosed with FN. Electrolyte abnormalities are commonly observed, but their true incidence and impact have been only scarcely studied in the literature.

    Impact of Delayed Pain to Needle and Variable Door to Needle Time On In-Hospital Complications in Patients With ST-Elevation Myocardial Infarction Who Underwent Thrombolysis: A Single-Center Experience.

    Background Myocardial infarction is a life-threatening event, and timely intervention is essential to improve patient outcomes and mortality. Previous studies have shown that the time to thrombolysis should be less than 30 minutes from the patient's arrival at the emergency room. Pain-to-needle time is the time from onset of chest pain to the initiation of thrombolysis, and door-to-needle time is the time between arrival at the emergency room and initiation of thrombolytic treatment. Ideally, the target for door-to-needle time should be less than 30 minutes; however, it is unclear whether the door-to-needle time has a significant impact on patients presenting later than three hours from the onset of pain. As many of the previous studies were conducted in first-world countries, with established emergency medical services (EMS) systems and pre-hospital ST-elevation myocardial infarction (STEMI) triages and protocols, the data are not completely generalizable to developing countries. We therefore examined the impact of shorter and longer door-to-needle times on outcomes in patients who presented to the emergency room (ER) with delayed pain-to-needle times (more than three hours after pain onset). Objective To determine the impact of delayed pain-to-needle time (PNT) with variable door-to-needle time (DNT) on in-hospital complications (post-infarct angina, heart failure, left ventricular dysfunction, and death) in patients with ST-elevation myocardial infarction (STEMI) who underwent thrombolysis. Methods and results A total of 300 STEMI patients who underwent thrombolysis within 12 hours of symptom onset were included and divided into two groups based on PNT; these groups were further divided into subgroups based on DNT. The primary outcome was in-hospital complications between the two groups and between subgroups within each group. The pain-to-needle time was ≤3 hours in 73 (24.3%) patients and >3 hours in 227 (75.7%) patients. In-hospital complications were higher in group II with PNT >3 hours. Door-to-needle time, even in patients with delayed pain-to-needle time (>3 hours), has a significant impact on in-hospital complications, with no difference in mortality.

    Towards a machine learning-based framework for DDOS attack detection in software-defined IoT (SD-IoT) networks

    The Internet of Things (IoT) is a complex and diverse network consisting of resource-constrained sensors/devices/things that are vulnerable to various security threats, particularly Distributed Denial of Service (DDoS) attacks. Recently, the integration of Software Defined Networking (SDN) with IoT has emerged as a promising approach for improving security and access control mechanisms. However, DDoS attacks continue to pose a significant threat to IoT networks, as they can be executed through botnet or zombie attacks. Machine learning-based security frameworks offer a viable solution to scrutinize the behavior of IoT devices and compile a profile that enables the decision-making process to maintain the integrity of the IoT environment. In this paper, we present a machine learning-based approach to detect DDoS attacks in an SDN-WISE IoT controller. We have integrated a machine learning-based detection module into the controller and set up a testbed environment to simulate DDoS attack traffic generation. The traffic is captured by a logging mechanism added to the SDN-WISE controller, which writes network logs into a log file that is pre-processed and converted into a dataset. The machine learning DDoS detection module, integrated into the SDN-WISE controller, uses Naive Bayes (NB), Decision Tree (DT), and Support Vector Machine (SVM) algorithms to classify SDN-IoT network packets. We evaluate the performance of the proposed framework using different traffic simulation scenarios and compare the results generated by the machine learning DDoS detection module. The proposed framework achieved accuracy rates of 97.4%, 96.1%, and 98.1% for NB, SVM, and DT, respectively. The attack detection module uses at most 30% of memory and CPU, leaving about 70% of memory and CPU free to process the SD-IoT network traffic at an average throughput of 48 packets per second while achieving an accuracy of 97.2%. Our experimental results demonstrate the superiority of the proposed framework in detecting DDoS attacks in an SDN-WISE IoT environment. The proposed approach can be used to enhance the security of IoT networks and mitigate the risk of DDoS attacks.
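    To make the classification step concrete, the following is a minimal sketch of how packet records derived from the SDN-WISE log file might be classified with the three algorithms named in the abstract. The file name, feature columns, and label encoding are illustrative assumptions, since the abstract does not specify the log schema or feature set.

```python
# Minimal sketch: train NB, DT, and SVM classifiers on a dataset derived from
# pre-processed SDN-WISE network logs. The file name, column names, and label
# encoding below are hypothetical; they are not taken from the paper.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Assumed pre-processed dataset: one row per packet/flow, a "label" column
# marking benign (0) vs. DDoS (1) traffic.
df = pd.read_csv("sdn_wise_logs.csv")
X = df.drop(columns=["label"])   # e.g. packet size, rate, source/destination counts
y = df["label"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

models = {
    "Naive Bayes": GaussianNB(),
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "SVM": SVC(kernel="rbf"),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: accuracy = {acc:.3f}")
```

    In the paper's setting, an analogous classifier runs inside the detection module of the SDN-WISE controller rather than offline, but the training and evaluation flow is the same in outline.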

    Differences in angiographic profile and immediate outcome of primary percutaneous coronary intervention in otherwise risk-free young male smokers

    Introduction: Cigarette smoking is a well-established risk factor for the development and progression of coronary artery disease (CAD) and is strongly related to cardiac morbidity and mortality. Therefore, this study aimed to compare the angiographic profile and immediate clinical outcomes of young male smokers and non-smokers without any other cardiac risk factors presenting with ST-elevation myocardial infarction (STEMI). Methods: This study included young (≤40 years) male patients presenting without cardiac risk factors other than smoking. The angiographic profile and immediate outcome of primary percutaneous coronary intervention (PCI) were collected from the hospital database. Results: A total of 580 young male patients were included in this study, of whom 51.2% (297) were smokers. Baseline characteristics and presentation were similar for the smoker and non-smoker groups. The angiographic profile was not significantly different for smokers in terms of pre-procedure thrombolysis in myocardial infarction (TIMI) flow (p = 0.373), the number of vessels involved (p = 0.813), infarct-related artery (p = 0.834), and left ventricular dysfunction (p = 0.311). Similarly, in-hospital outcomes of primary PCI were not significantly different in smokers: post-procedure no-reflow occurred in 3.4% vs. 2.8% (p = 0.708), acute stent thrombosis in 1.7% vs. 0.4% (p = 0.114), and in-hospital mortality in 1.0% vs. 1.4% (p = 0.657) of the smoker and non-smoker groups, respectively. Conclusion: Our study concludes that smoking has no significant impact on the angiographic profile or immediate clinical outcomes of primary PCI after STEMI in young males without any other conventional cardiac risk factors. Given these findings, further multicenter prospective studies are needed to identify other potential causes in such patients.

    Medical thoracoscopy in evaluation of undiagnosed pleural effusion

    Background: Medical thoracoscopy, or pleuroscopy, has in the recent past received a lot of interest for both diagnostic and therapeutic purposes. In the evaluation of undiagnosed pleural effusion, it has become a key diagnostic modality, as it is a cost-effective and safe procedure. The aim of the present study was to assess the diagnostic yield of medical thoracoscopy in patients with undiagnosed exudative pleural effusion. Methods: This prospective study was conducted at the Government Chest Diseases Hospital, Srinagar, between December 2016 and June 2018. One hundred and twenty-five (125) patients who fulfilled the inclusion criteria were included in this study. Thoracoscopy was done using a rigid thoracoscope under local anesthesia. Thoracoscopic and histopathological data of the enrolled patients were collected prospectively and analysed. Results: Patients enrolled in the study were in the age range of 17 to 82 years and comprised 80 males and 45 females. The most common thoracoscopic finding was multiple variable-sized nodules (53.6%), followed by sago-grain infiltration (15.2%). Malignancy was the most common histopathological diagnosis (60.8%), with metastatic adenocarcinoma being the most common histopathological subtype (50%). The overall diagnostic yield of thoracoscopy was 90.4%. Conclusions: Medical thoracoscopy is a safe procedure with excellent diagnostic yield and minimal complication rates for the evaluation of undiagnosed pleural effusion.

    Incidence and prevalence of venous thromboembolism in chronic liver disease: a systematic review and meta-analysis

    Background and Aims: Historically, bleeding was thought to be a frequent and fatal complication of liver disease. However, thrombosis due to coagulation disorders in cirrhosis remains a real risk. We aimed to systematically analyse published articles to evaluate the epidemiology of venous thromboembolism (VTE) in chronic liver disease (CLD). Method: An electronic search was conducted on Ovid Medline, EMBASE and Scopus from inception to November 2021 to identify studies presenting the epidemiology of VTE (deep vein thrombosis and pulmonary embolism) in CLD in inpatient and/or community settings. Random-effects meta-analysis was performed to determine pooled per-year cumulative incidence, incidence rate and prevalence. Heterogeneity was measured by the I² test, and potential sources of heterogeneity by meta-regression and sensitivity analysis. PROSPERO registration: CRD42021239117. Results: Twenty-nine studies comprising 19,157,018 participants were included, of which 152,049 (0.79%) had VTE. None of the included studies were done in the community. In hospitalised patients with CLD, the pooled cumulative incidence of VTE was 1.07% (95% CI 0.80, 1.38) per year, the incidence rate was 157.15 (95% CI 14.74, 445.29) per 10,000 person-years, and the period prevalence was 1.10% (95% CI 0.85, 1.38) per year. There was significant heterogeneity and publication bias. The pooled relative risk (RR) of studies reporting incidence rate was 2.11 (95% CI 1.35, 3.31). CLD patients (n=1644) who did not receive pharmacological prophylaxis were at 2.78 times (95% CI 1.11, 6.98) increased risk of VTE compared with those receiving prophylaxis. Conclusion: Hospitalised patients with CLD may be at an increased risk of VTE. For every 1,000 hospitalised patients with CLD, ten have a new and eleven have a pre-existing diagnosis of VTE per year.
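    As a worked illustration of the random-effects pooling described in the abstract, the sketch below applies DerSimonian-Laird pooling to logit-transformed per-study proportions. The event counts and study sizes are made-up placeholders, not data from the review, and the logit transform is an assumption about one common way such proportions are pooled.

```python
# Minimal sketch of DerSimonian-Laird random-effects pooling of proportions on
# the logit scale. The counts below are hypothetical placeholders, not data
# from the systematic review.
import numpy as np

events = np.array([12, 30, 7])        # hypothetical VTE cases per study
totals = np.array([1500, 2800, 900])  # hypothetical study sizes

p = events / totals
yi = np.log(p / (1 - p))                         # logit-transformed proportions
vi = 1 / events + 1 / (totals - events)          # approximate variance of each logit

w = 1 / vi                                       # inverse-variance (fixed-effect) weights
y_fe = np.sum(w * yi) / np.sum(w)
Q = np.sum(w * (yi - y_fe) ** 2)                 # Cochran's Q
dfree = len(yi) - 1
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - dfree) / C)                 # between-study variance (DL estimator)
i2 = max(0.0, (Q - dfree) / Q) * 100             # I-squared heterogeneity (%)

w_re = 1 / (vi + tau2)                           # random-effects weights
y_re = np.sum(w_re * yi) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))

pooled = 1 / (1 + np.exp(-y_re))                 # back-transform to a proportion
lo, hi = 1 / (1 + np.exp(-(y_re + np.array([-1.96, 1.96]) * se_re)))
print(f"Pooled prevalence {pooled:.2%} (95% CI {lo:.2%} to {hi:.2%}), I^2 = {i2:.0f}%")
```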