63 research outputs found

    Calcium Silicide - Methods of production and their technological consideration

    Three industrial methods of production of calcium silicide are discussed with their merits and demerits, along with technological considerations. Calcium silicide was produced at NML in a 500 kVA submerged arc furnace using the partial charging method. Based on the results and observations of the smelting trials carried out, a model has been proposed to explain the mechanism of formation of calcium silicide. Inoculation trials with calcium silicide produced at NML compare favourably with those obtained from abroad.

    Leprosy & gangrene: A rare association; role of antiphospholipid antibodies

    BACKGROUND: Leprosy remains an important public health problem in many parts of the world. An association of gangrene with leprosy is rare and can have a number of causative mechanisms. We present a case of leprosy and gangrene with positive antiphospholipid antibody titers. CASE PRESENTATION: A 50-year-old non-diabetic, non-hypertensive woman presented with a 2-month history of progressive gangrene of the toes of both feet. She was found to have madarosis and hypopigmented, hypoaesthetic macular lesions on the upper limbs and thighs. The bilateral ulnar and popliteal nerves were thickened. A skin biopsy of the lesions revealed borderline tuberculoid leprosy, and slit skin smears showed a bacteriological index of 1+. She had no evidence of thromboembolic episodes or atherosclerosis. Anticardiolipin antibody (ACLA) was positive at presentation and again 6 weeks later; the ACLAs were of the IgM type on both occasions. Lupus anticoagulant and β2 GPI antibody were negative. Doppler examination of the lower limb arteries did not reveal any abnormality. The patient was successfully treated with multi-drug antileprotic therapy and anticoagulants. CONCLUSION: APLAs of infectious origin should be recognized as a cause of thrombosis in leprosy. Appropriate anticoagulation can salvage limb function.

    A process pattern model for tackling and improving big data quality

    Data seldom create value by themselves. They need to be linked and combined from multiple sources, which often come with variable data quality. The task of improving data quality is a recurring challenge. In this paper, we use a case study of a large telecom company to develop a generic process pattern model for improving data quality. The process pattern model is defined as a proven series of activities aimed at improving data quality given a certain context, a particular objective, and a specific set of initial conditions. Four different patterns are derived to deal with variations in the data quality of datasets. Instead of having to devise a way to improve the quality of big data for each situation, the process model provides data users with generic patterns that can be used as a reference model to improve big data quality. A minimal structural sketch of such a pattern is shown below.
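
    A process pattern in this sense can be thought of as a small record combining a context, an objective, a set of initial conditions and an ordered series of activities. The sketch below only illustrates that structure; the field names and the example pattern are hypothetical and not taken from the case study.

```python
from dataclasses import dataclass
from typing import List

# Illustrative sketch: one way to encode a data-quality process pattern
# (context + objective + initial conditions + activities) as described above.
# The example pattern is invented, not taken from the telecom case study.

@dataclass
class ProcessPattern:
    name: str
    context: str                    # situation in which the pattern applies
    objective: str                  # data-quality goal the pattern targets
    initial_conditions: List[str]   # what must hold before the pattern is used
    activities: List[str]           # ordered quality-improvement steps

dedup_pattern = ProcessPattern(
    name="Cross-source deduplication",
    context="Customer records combined from several operational systems",
    objective="Remove conflicting duplicates before analysis",
    initial_conditions=["A shared key or matching rule exists across sources"],
    activities=[
        "Profile each source and measure its duplicate rate",
        "Standardise the formats of the matching attributes",
        "Link records across sources with the matching rule",
        "Resolve conflicts with a survivorship rule and log the decisions",
    ],
)

if __name__ == "__main__":
    print(dedup_pattern.name, "->", len(dedup_pattern.activities), "activities")
```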

    Assessing health-related quality of life in patients with inflammatory bowel disease, in Crete, Greece

    BACKGROUND: Health-Related Quality of Life (HRQoL) is an important outcome measure in Inflammatory Bowel Disease (IBD). The aim of our study was to assess HRQoL in a population of 135 Greek patients with IBD. METHODS: A cohort of 135 patients with IBD, 81 with ulcerative colitis (UC) and 54 with Crohn's disease (CD), were enrolled in our study. Demographic and disease-related data were recorded. HRQoL was assessed with a disease-specific and a generic questionnaire, the IBDQ and SF-36, respectively. Disease activity was assessed with the Harvey-Bradshaw Index and the Colitis Activity Index for CD and UC patients, respectively. RESULTS: Among all variables recorded in our study, only disease activity had a significant effect on HRQoL. Patients with active disease scored significantly lower on both the IBDQ and SF-36 than those in remission. Only two of the four IBDQ dimensions, bowel and systemic, discriminated well between patients in remission and those with active disease. CONCLUSIONS: IBD has a negative impact on HRQoL. Patients with active disease are more impaired than patients in remission. In our population the bowel and systemic dimensions had a predominant value in patients' perception of quality of life. Patients in our study scored higher on the same instruments than previously reported.

    Feasibility study of computed tomography colonography using limited bowel preparation at normal and low-dose levels

    The purpose was to evaluate low-dose CT colonography without cathartic cleansing in terms of image quality, polyp visualization and patient acceptance. Sixty-one patients scheduled for colonoscopy started a low-fiber diet, lactulose and amidotrizoic acid for fecal tagging 2 days prior to the CT scan (standard dose, 5.8–8.2 mSv). The original raw data of 51 patients were modified and reconstructed at simulated 2.3 and 0.7 mSv levels. Two observers evaluated the standard-dose scan for image quality and polyps. A third observer evaluated the presence of polyps at all three dose levels in a blinded, prospective way. All observers were blinded to the reference standard, colonoscopy. Patients were given questionnaires about their experiences and preferences at three time points. Image quality was sufficient in all patients, but significantly lower in the cecum, sigmoid and rectum. On the standard-dose scan the two observers correctly identified 10/15 (67%) and 9/15 (60%) of polyps ≥10 mm, respectively, with 5 and 8 false-positive lesions. Dose reduction down to 0.7 mSv was not associated with significant changes in diagnostic value (polyps ≥10 mm). Eighty percent of patients preferred CT colonography and 13% preferred colonoscopy (P<0.001). CT colonography without cleansing is preferred over colonoscopy and shows sufficient image quality and moderate sensitivity, without impaired diagnostic value at dose levels as low as 0.7 mSv.

    Visualizing Big Data with augmented and virtual reality: challenges and research agenda

    This paper provides a multi-disciplinary overview of the research issues and achievements in the field of Big Data and its visualization techniques and tools. The main aim is to summarize the challenges in visualization methods for existing Big Data, as well as to offer novel solutions for issues related to the current state of Big Data visualization. The paper provides a classification of existing data types, analytical methods, visualization techniques and tools, with particular emphasis on surveying the evolution of visualization methodology over the past years. Based on the results, we identify the disadvantages of existing visualization methods. Despite the technological development of the modern world, human involvement (interaction), judgment and logical thinking remain necessary when working with Big Data. Therefore, the role of human perceptual limitations in dealing with large amounts of information is evaluated. Based on the results, a non-traditional approach is proposed: we discuss how the capabilities of Augmented Reality and Virtual Reality could be applied to the field of Big Data visualization. We discuss the promising utility of integrating Mixed Reality technology with applications in Big Data visualization. Placing the most essential data in the central area of the human visual field in Mixed Reality would allow one to take in the presented information in a short period of time without significant losses due to human perceptual issues. Furthermore, we discuss the impact of new technologies, such as Virtual Reality displays and Augmented Reality helmets, on Big Data visualization, as well as the classification of the main challenges of integrating these technologies.

    Normal parameter reduction algorithm in soft set based on hybrid binary particle swarm and biogeography optimizer

    Existing classification techniques previously proposed for eliminating data inconsistency could not achieve efficient parameter reduction in soft set theory, which affects the obtained decisions. Meanwhile, the computational cost incurred during the combination-generation process of soft sets can become intractable, a problem of nondeterministic polynomial time complexity. The contributions of this study focus on minimizing choice costs by adjusting the original classifications through a decision partition order, and on increasing the probability of searching the domain space using a developed Markov chain model. Furthermore, this study introduces an efficient soft set reduction-based binary particle swarm optimized by biogeography-based optimizer (SSR-BPSO-BBO) algorithm that generates an accurate decision for optimal and sub-optimal choices. The results show that the decision partition order technique achieves parameter reduction of up to 50%, while other algorithms could not obtain high reduction rates in some scenarios. In terms of accuracy, the proposed SSR-BPSO-BBO algorithm outperforms the other optimization algorithms in achieving a high accuracy percentage on a given soft dataset. Moreover, the proposed Markov chain model demonstrates the robustness of the parameter reduction technique in obtaining the optimal decision and minimizing the search domain. A rough sketch of the underlying search idea follows.
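
    The abstract does not reproduce the algorithm itself; the following is only a rough, self-contained sketch of the general idea of searching for a soft-set parameter reduction with a binary particle swarm, under the simplifying assumption that a valid reduction must preserve the ranking of objects by their choice values. All names (soft_table, ranking, fitness, bpso_reduce) are hypothetical, and the paper's decision-partition-order adjustment, Markov chain model and BBO migration step are not modelled here.

```python
import numpy as np

# Toy sketch only: binary-PSO search for a soft-set parameter subset that
# preserves the ranking of objects by choice values.  This is NOT the
# SSR-BPSO-BBO algorithm from the paper; it omits the decision partition
# order, the Markov chain model and the BBO migration step.

rng = np.random.default_rng(0)

# Toy soft set: rows = objects, columns = parameters (1 = parameter holds).
soft_table = rng.integers(0, 2, size=(8, 10))
N_PARAMS = soft_table.shape[1]

def ranking(mask: np.ndarray) -> tuple:
    """Order of objects by choice value under the selected parameters."""
    choice = soft_table[:, mask.astype(bool)].sum(axis=1)
    return tuple(np.argsort(-choice, kind="stable"))

FULL_RANKING = ranking(np.ones(N_PARAMS))

def fitness(mask: np.ndarray) -> float:
    """Higher is better: feasible subsets score more the fewer parameters they keep."""
    if mask.sum() == 0 or ranking(mask) != FULL_RANKING:
        return 0.0                      # infeasible: empty set or ranking changed
    return 1.0 + (N_PARAMS - mask.sum())

def bpso_reduce(n_particles=20, n_iter=100, w=0.7, c1=1.5, c2=1.5):
    pos = rng.integers(0, 2, size=(n_particles, N_PARAMS)).astype(float)
    vel = rng.normal(0.0, 1.0, size=pos.shape)
    pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_fit.argmax()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = np.clip(w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos), -6, 6)
        # Sigmoid transfer function turns velocities into bit probabilities.
        pos = (rng.random(pos.shape) < 1.0 / (1.0 + np.exp(-vel))).astype(float)
        fit = np.array([fitness(p) for p in pos])
        improved = fit > pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        gbest = pbest[pbest_fit.argmax()].copy()
    return gbest, fitness(gbest)

mask, score = bpso_reduce()
print("kept parameters:", np.flatnonzero(mask), "fitness:", score)
```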