
    Changing Profile of Leprosy in a Tertiary Care Hospital

    The World Health Organization (WHO) has set the targets of zero grade 2 disability (G2D) among paediatric leprosy patients and a reduction of new leprosy cases with G2D to less than one case per million population, to be achieved by 2020. It has also specified performance indicators for evaluating the progress of leprosy control programmes. We undertook this study to find out what changes the leprosy clinic at our hospital had witnessed in terms of the WHO performance indicators and whether we had progressed toward the goals fixed by WHO. The important indicators, such as the number of new cases and the percentages of multibacillary (MB) cases, child cases and G2D cases, were examined from the year 2012-13 to 2016-17. Although a significant reduction in G2D cases, MB cases and child cases was noted, which is quite encouraging, the number of new cases detected annually remained almost static during the study period, indicating persistent active transmission of infection and the need for augmented active surveillance (leprosy case detection campaigns), contact tracing, community awareness, stigma reduction and training

    Leveraging Data Recasting to Enhance Tabular Reasoning

    Creating challenging tabular inference data is essential for learning complex reasoning. Prior work has mostly relied on two data generation strategies. The first is human annotation, which yields linguistically diverse data but is difficult to scale. The second is synthetic generation, which is scalable and cost effective but lacks inventiveness. In this research, we present a framework for semi-automatically recasting existing tabular data so as to combine the benefits of both approaches. We use our framework to build tabular NLI instances from five datasets that were originally intended for tasks such as table2text generation, tabular Q/A, and semantic parsing. We demonstrate that the recast data can be used both as evaluation benchmarks and as augmentation data to enhance performance on tabular NLI tasks. Furthermore, we investigate the effectiveness of models trained on recast data in the zero-shot scenario and analyse performance trends across the different recast datasets.
    Comment: 14 pages, 10 tables, 3 figures, EMNLP 2022 (Findings)
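    To make the recasting idea concrete, here is a minimal Python sketch of how one tabular Q/A example might be turned into tabular NLI (premise, hypothesis, label) instances. The field names, linearisation scheme, templates and the wrong-answer source are illustrative assumptions, not the paper's actual framework.

    ```python
    # Hypothetical recasting of a table-QA pair into tabular NLI instances.
    from dataclasses import dataclass

    @dataclass
    class NLIInstance:
        premise: str      # linearised table
        hypothesis: str   # natural-language statement about the table
        label: str        # "entailment" or "contradiction"

    def linearise_table(table: dict) -> str:
        """Flatten a {column: [values]} table into a row-wise premise string."""
        cols = list(table.keys())
        rows = zip(*table.values())
        return " ; ".join(
            ", ".join(f"{c}: {v}" for c, v in zip(cols, row)) for row in rows
        )

    def recast_qa_pair(table: dict, question: str, answer: str,
                       wrong_answer: str) -> list[NLIInstance]:
        """Turn one Q/A pair into an entailed and a contradicted hypothesis."""
        premise = linearise_table(table)
        template = "The answer to '{q}' is {a}."
        return [
            NLIInstance(premise, template.format(q=question, a=answer), "entailment"),
            NLIInstance(premise, template.format(q=question, a=wrong_answer), "contradiction"),
        ]

    if __name__ == "__main__":
        table = {"Country": ["India", "Japan"], "Capital": ["New Delhi", "Tokyo"]}
        for inst in recast_qa_pair(table, "What is the capital of Japan?",
                                   "Tokyo", "New Delhi"):
            print(inst.label, "->", inst.hypothesis)
    ```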

    An Experimental Approach for Encryption and Decryption of Image using Canonical Transforms & Scrambling Technique

    Data security is a prime objective of many researchers and organizations. Because data must travel from one end to the other, it is very important to the sender that the information reaches the authorized receiver with minimum loss from the original data. Data security is required in fields such as banking, defense and medicine, so our objective here is to secure the data. This study was performed in MATLAB R2016b on standard-database grayscale images such as Barbara, Cameraman and Lenna, or on personalized images in a standard format. First, the image is scrambled and a new complex image is generated. A random phase mask (RPM 1) is applied to the complex image, which is then encrypted using a linear canonical transform (LCT) of the first order. A second phase mask (RPM 2) is applied to this intermediate result, followed by an LCT of the second order, to obtain the final encrypted image. The reverse process recovers the original image. Various parameters were calculated to assess different aspects: the change in MSE with the change in transform order indicates the quality of the encrypted image, and the correlation coefficient between the encrypted and decrypted images shows the difference between them. The original image is then reconstructed and the histograms of all these images are analyzed. The proposed method increases the robustness and imperceptibility of the images
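    A minimal Python sketch of the two-stage pipeline described above (scramble, RPM 1, first transform, RPM 2, second transform, then the reverse for decryption). The LCT is approximated here by a plain 2-D FFT purely for illustration, and the scrambling scheme and key seeds are assumptions, not the paper's implementation.

    ```python
    import numpy as np

    def scramble(img, seed):
        """Pixel scrambling via a seeded random permutation (an assumed scheme)."""
        perm = np.random.default_rng(seed).permutation(img.size)
        return img.ravel()[perm].reshape(img.shape)

    def unscramble(img, seed):
        perm = np.random.default_rng(seed).permutation(img.size)
        out = np.empty(img.size, dtype=img.dtype)
        out[perm] = img.ravel()
        return out.reshape(img.shape)

    def random_phase_mask(shape, seed):
        """Unit-magnitude random phase mask (RPM)."""
        return np.exp(2j * np.pi * np.random.default_rng(seed).random(shape))

    def encrypt(img, seed_s=1, seed_1=2, seed_2=3):
        x = scramble(img.astype(complex), seed_s)
        x = np.fft.fft2(x * random_phase_mask(x.shape, seed_1))  # stand-in for 1st-order LCT
        x = np.fft.fft2(x * random_phase_mask(x.shape, seed_2))  # stand-in for 2nd-order LCT
        return x

    def decrypt(enc, seed_s=1, seed_1=2, seed_2=3):
        x = np.fft.ifft2(enc) * np.conj(random_phase_mask(enc.shape, seed_2))
        x = np.fft.ifft2(x) * np.conj(random_phase_mask(enc.shape, seed_1))
        return unscramble(x, seed_s).real

    img = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(float)  # stand-in grayscale image
    enc = encrypt(img)
    dec = decrypt(enc)
    print("reconstruction MSE:", np.mean((img - dec) ** 2))   # ~0 up to floating-point error
    ```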

    Recent Advances in Diffuse Large B Cell Lymphoma

    Diffuse large B cell lymphoma (DLBCL) has recently witnessed advances in molecular profiling and in the treatment of patients with refractory and relapsed disease. DLBCL is a biologically and clinically heterogeneous disease. Despite its aggressive behavior, DLBCL is potentially curable, with overall survival of 94% and 55% in patients with low and high revised IPI (rIPI) scores, respectively. The combination of the anti-CD20 monoclonal antibody rituximab with cyclophosphamide, doxorubicin, Oncovin (vincristine) and prednisone (R-CHOP) chemotherapy every 3 weeks is the first-line treatment. Radiotherapy is reserved for patients with bulky disease who fail to achieve complete remission after first-line treatment. CNS prophylaxis is reserved for patients with high lactate dehydrogenase (LDH) levels and involvement of more than one extranodal site, and for patients with involvement of selected extranodal sites such as the testes and orbits (the sanctuary sites). Patients who relapse after first-line treatment receive high-dose chemotherapy supported by autologous stem cell transplantation (HDC/ASCT). Variants of DLBCL such as double-hit (presence of MYC and BCL2/BCL6) and triple-hit (presence of MYC, BCL2 and BCL6) lymphomas are treated differently, and these patients have a worse outcome. Several novel immunotherapeutic agents, such as checkpoint inhibitors and chimeric antigen receptor T (CAR-T) cells, are being investigated in randomized trials in patients with DLBCL

    A Review on Encryption and Decryption of Image using Canonical Transforms & Scrambling Technique

    Data security is a prime objective of many researchers and organizations. Because data must travel from one end to the other, it is very important to the sender that the information reaches the authorized receiver with minimum loss from the original data. Data security is required in fields such as banking, defence and medicine, so our objective here is to secure the data, and for this purpose encryption schemes are used. Encryption secures the data or information that has to be transmitted or stored, and various encryption methods have been proposed: some are based on random keys, and some on scrambling schemes. Chaotic maps, logistic maps, the Fourier transform and the fractional Fourier transform, among others, are widely used for encryption. Nowadays image encryption is a very popular approach: the information is encrypted in the form of an image, in a format that cannot be read; only a person who is authenticated or holds the authentication keys can read that data or information. This work is based on the same fundamental concept; here we use the linear canonical transform (LCT) for the encryption process
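    As a small illustration of one of the key-based approaches mentioned above, here is a minimal sketch of chaotic (logistic-map) pixel scrambling in Python. The map parameters, the key value and the toy image are illustrative assumptions only, not the scheme reviewed in the paper.

    ```python
    import numpy as np

    def logistic_sequence(n: int, x0: float = 0.3141, r: float = 3.9999) -> np.ndarray:
        """Iterate x_{k+1} = r * x_k * (1 - x_k) to obtain a chaotic key stream."""
        seq = np.empty(n)
        x = x0
        for k in range(n):
            x = r * x * (1.0 - x)
            seq[k] = x
        return seq

    def scramble(img: np.ndarray, key: float) -> np.ndarray:
        """Permute pixels by the sort order of the chaotic sequence."""
        order = np.argsort(logistic_sequence(img.size, x0=key))
        return img.ravel()[order].reshape(img.shape)

    def unscramble(img: np.ndarray, key: float) -> np.ndarray:
        order = np.argsort(logistic_sequence(img.size, x0=key))
        out = np.empty(img.size, dtype=img.dtype)
        out[order] = img.ravel()
        return out.reshape(img.shape)

    img = np.arange(16, dtype=np.uint8).reshape(4, 4)   # toy "image"
    assert np.array_equal(unscramble(scramble(img, 0.42), 0.42), img)
    ```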

    Non-Ideal Program-Time Conservation in Charge Trap Flash for Deep Learning

    Training deep neural networks (DNNs) is computationally intensive, but arrays of non-volatile memories like Charge Trap Flash (CTF) can accelerate DNN operations using in-memory computing. Specifically, the Resistive Processing Unit (RPU) architecture uses voltage-threshold programming by stochastically encoded pulse trains and analog memory features to accelerate the vector-vector outer product and weight update of gradient descent algorithms. Although CTF, offering high precision, has been regarded as an excellent choice for implementing the RPU, the accumulation of charge due to the applied stochastic pulse trains is ultimately of critical significance in determining the final weight update. In this paper, we report non-ideal program-time conservation in CTF through pulsed input measurements. We experimentally measure the effect of pulse width and pulse gap, keeping the total ON-time of the input pulse train constant, and report three non-idealities: (1) the cumulative V_T shift reduces when the total ON-time is fragmented into a larger number of shorter pulses, (2) the cumulative V_T shift drops abruptly for pulse widths below 2 µs, and (3) the cumulative V_T shift depends on the gap between consecutive pulses, with the reduction in V_T shift recovering for smaller gaps. We present an explanation based on a transient tunneling-field enhancement due to blocking-oxide trap-charge dynamics to explain these non-idealities. Identifying and modeling the responsible mechanisms and predicting their system-level effects during learning is critical. This non-ideal accumulation is expected to affect algorithms and architectures that rely on such devices to implement mathematically equivalent functions for in-memory computing-based acceleration
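    For context on the update scheme whose accuracy the measured non-ideality threatens, here is a minimal Python sketch of a stochastic pulse-train outer-product weight update in the RPU style: each pulse slot fires with probability proportional to the activation or error, and a cell is updated only where pulses coincide. The optional `efficiency` factor is a loose, assumed stand-in for the reported reduction in cumulative V_T shift when the same ON-time is fragmented into shorter pulses; it is not the paper's model or measurement code.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def stochastic_pulses(values: np.ndarray, num_slots: int) -> np.ndarray:
        """Encode |values| in [0, 1] as Bernoulli pulse trains of length num_slots."""
        p = np.clip(np.abs(values), 0.0, 1.0)
        return rng.random((values.size, num_slots)) < p[:, None]

    def rpu_update(W, x, delta, num_slots=32, lr=0.01, efficiency=1.0):
        """Approximate W += lr * outer(delta, x) via pulse coincidences."""
        x_tr = stochastic_pulses(x, num_slots)               # column pulse trains
        d_tr = stochastic_pulses(delta, num_slots)           # row pulse trains
        coincidences = d_tr.astype(int) @ x_tr.T.astype(int) # coincidence count per cell
        sign = np.outer(np.sign(delta), np.sign(x))
        dw_per_pulse = lr / num_slots
        # `efficiency` < 1 crudely mimics a reduced per-pulse V_T shift.
        return W + efficiency * dw_per_pulse * sign * coincidences

    W = np.zeros((3, 4))
    x = rng.random(4)
    delta = rng.random(3) - 0.5
    W_ideal = 0.01 * np.outer(delta, x)
    W_pulse = rpu_update(W, x, delta, num_slots=256)
    print(np.round(W_ideal, 4))
    print(np.round(W_pulse, 4))   # approaches the ideal update as num_slots grows
    ```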

    Performance Evaluation of Reduced Rule Base Fuzzy Logic Controller for Indirect Vector Controlled Induction Motor Drive

    This paper investigates the performance of a fuzzy logic speed controller with a reduced rule base for an indirect vector controlled induction motor drive. In the control of complex systems where high performance is required, traditional controllers generally do not meet the required performance. In this paper a fuzzy logic controller is developed in such a way that it can provide high performance while using fewer rules. The drive is simulated successfully using Simulink/MATLAB, and its performance has been examined under various rigorous working conditions. The simulation results show that the proposed fuzzy logic controller (FLC) works satisfactorily, making the drive more suitable for high-performance applications
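    A minimal Python sketch of what a reduced-rule-base fuzzy speed controller can look like: two inputs (speed error and change of error), three triangular membership sets each, giving a 3 x 3 = 9 rule table instead of a larger 5 x 5 or 7 x 7 base. The membership partitions, rule table and output values are illustrative assumptions, not the controller tuned in the paper.

    ```python
    import numpy as np

    def tri(x: float, a: float, b: float, c: float) -> float:
        """Triangular membership function with corners a <= b <= c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def fuzzify(x: float) -> dict:
        """Memberships in Negative / Zero / Positive over the normalised range [-1, 1]."""
        return {"N": tri(x, -2.0, -1.0, 0.0),
                "Z": tri(x, -1.0, 0.0, 1.0),
                "P": tri(x, 0.0, 1.0, 2.0)}

    # 9-rule table: crisp torque-command increment per (error, change-of-error) pair.
    RULES = {("N", "N"): -1.0, ("N", "Z"): -0.7, ("N", "P"): 0.0,
             ("Z", "N"): -0.5, ("Z", "Z"):  0.0, ("Z", "P"): 0.5,
             ("P", "N"):  0.0, ("P", "Z"):  0.7, ("P", "P"): 1.0}

    def flc_output(error: float, d_error: float) -> float:
        """Sugeno-style weighted average of the fired rules (min t-norm)."""
        mu_e, mu_de = fuzzify(error), fuzzify(d_error)
        num = den = 0.0
        for (e_set, de_set), out in RULES.items():
            w = min(mu_e[e_set], mu_de[de_set])   # rule firing strength
            num += w * out
            den += w
        return num / den if den > 1e-9 else 0.0

    print(flc_output(0.4, -0.1))   # small positive correction toward the set speed
    ```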