
    BIGMAC: breaking inaccurate genomes and merging assembled contigs for long read metagenomic assembly

    Background: The problem of de novo assembly for metagenomes using only long reads is gaining attention. We study whether post-processing metagenomic assemblies with the original input long reads can result in quality improvement. Previous approaches have focused on pre-processing reads and optimizing assemblers; BIGMAC takes an alternative perspective and focuses on the post-processing step.
    Results: Using both the assembled contigs and the original long reads as input, BIGMAC first breaks the contigs at potentially mis-assembled locations and subsequently scaffolds them. Our experiments on metagenomes assembled from long reads show that BIGMAC can improve assembly quality by reducing the number of mis-assemblies while maintaining or increasing N50 and N75. Moreover, BIGMAC shows the largest ratio of N75 to number of mis-assemblies on all tested datasets when compared with other post-processing tools.
    Conclusions: BIGMAC demonstrates the effectiveness of the post-processing approach in improving the quality of metagenomic assemblies.
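    The break-then-scaffold idea is easy to picture in code. Below is a minimal Python sketch of the two-phase pipeline; the helper names, data shapes, and the toy evidence inputs are all hypothetical illustrations, not BIGMAC's actual implementation.

```python
# Illustrative sketch (not BIGMAC's actual code) of the two-phase
# post-processing idea: break contigs where long-read evidence flags a
# mis-assembly, then re-join pieces whose adjacency the reads support.

def break_contigs(contigs, breakpoints):
    """Split each contig at suspected mis-assembly positions."""
    pieces = []
    for name, seq in contigs.items():
        prev = 0
        for cut in sorted(breakpoints.get(name, [])) + [len(seq)]:
            if cut > prev:
                pieces.append(seq[prev:cut])
            prev = cut
    return pieces

def scaffold(pieces, joins):
    """Chain pieces along read-supported successor links.

    `joins` maps a piece index to the index that should follow it; in a
    real tool this evidence comes from long reads spanning both pieces.
    Cycles are ignored in this toy version.
    """
    targets = set(joins.values())
    used, scaffolds = set(), []
    for i in range(len(pieces)):
        if i in used or i in targets:  # start chains at non-targets only
            continue
        chain, j = [], i
        while j is not None and j not in used:
            used.add(j)
            chain.append(pieces[j])
            j = joins.get(j)
        scaffolds.append("".join(chain))
    return scaffolds

# Toy run: break one chimeric contig at position 4, then re-join the
# pieces in the read-supported order (piece 1 followed by piece 0).
pieces = break_contigs({"ctg1": "ACGTTTGA"}, {"ctg1": [4]})
print(scaffold(pieces, {1: 0}))  # ['TTGAACGT']
```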

    Performance measurement of banks: an application of economic value added & balanced scorecard

    The new millennium has brought a sea change in economic activity, with the banking sector gearing up for survival, productivity, and an enlarged customer base. The performance of financial institutions is generally measured by applying quantitative techniques of financial measurement; such analysis is a post-mortem examination of a bank's achievements. Differences in the measured efficiency of banking institutions broadly arise from [1] the different efficiency concepts used; [2] the different measurement methods used to estimate efficiency; and [3] a host of other exogenous and endogenous factors. Nevertheless, to identify the performance drivers in an institution, both quantitative and qualitative aspects of performance measurement must be considered. The CAMEL rating system, a basically quantitative technique, is widely used for measuring the performance of banks. The Balanced Scorecard [BSC], which covers both quantitative and qualitative aspects of performance measurement, is recommended for measuring long-term performance prospects. This paper also examines the extent of awareness and adoption of the Economic Value Added [EVA] concept among Indian banks, and suitable suggestions are drawn.
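    For reference, EVA is conventionally computed as net operating profit after taxes (NOPAT) minus a charge for the capital employed. A minimal worked illustration in Python follows; all figures below are invented for demonstration and do not come from the paper.

```python
# Standard EVA formula: EVA = NOPAT - WACC * capital_employed.
# A positive EVA means the bank earned more than its cost of capital.
# All figures are hypothetical, for illustration only.

nopat = 1_200.0             # net operating profit after taxes
capital_employed = 9_000.0  # total invested capital
wacc = 0.11                 # weighted average cost of capital (11%)

eva = nopat - wacc * capital_employed
print(f"EVA = {eva:.1f}")   # EVA = 210.0
```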

    128-bit Vectorization on Cha-Cha20 Algorithm for Device-to-Device Communication

    In 5G networks, device-to-device (D2D) communication is an advanced technology with major advantages over traditional systems: device-to-device transmission increases the coverage area of the system while delivering low latency. Although D2D communication is an asset for upcoming technologies, this kind of transmission is still at risk in several respects. D2D communication transfers information from one device to another without involving the base station, so communication is possible with little delay, but it also leaves the link open to attackers. Encryption algorithms are used to secure the communication, and they must execute with little delay to keep communication fast. In particular, Cha-Cha20 offers secure communication by encrypting the data packets. Vectorizing the Cha-Cha20 algorithm provides this security with less delay than the AES encryption algorithm and the unvectorized Cha-Cha20 algorithm. Compared with other encryption algorithms in cryptography, Cha-Cha20 is well suited to resource-constrained devices.
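    The vectorization opportunity comes from the structure of a ChaCha20 round: each round applies the quarter-round below to four independent columns (then diagonals) of the 4x4 word state, so a single 128-bit SIMD register can hold one word from each column and update all four columns in parallel. This is a sketch of the standard quarter-round from RFC 8439, not the paper's vectorized implementation.

```python
# Standard ChaCha20 quarter-round (RFC 8439). Each round applies this
# to four independent columns, which is what 128-bit SIMD exploits:
# one vector instruction updates the same word of all four columns.

MASK32 = 0xFFFFFFFF

def rotl32(x, n):
    """Rotate a 32-bit word left by n bits."""
    return ((x << n) & MASK32) | (x >> (32 - n))

def quarter_round(a, b, c, d):
    a = (a + b) & MASK32; d = rotl32(d ^ a, 16)
    c = (c + d) & MASK32; b = rotl32(b ^ c, 12)
    a = (a + b) & MASK32; d = rotl32(d ^ a, 8)
    c = (c + d) & MASK32; b = rotl32(b ^ c, 7)
    return a, b, c, d

# Test vector from RFC 8439, section 2.1.1.
out = quarter_round(0x11111111, 0x01020304, 0x9B8D6F43, 0x01234567)
print([hex(w) for w in out])
# ['0xea2a92f4', '0xcb1cf8ce', '0x4581472e', '0x5881c4bb']
```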

    Update on the management of constipation in the elderly: new treatment options

    Constipation disproportionately affects older adults, with a prevalence of 50% in community-dwelling elderly and 74% in nursing-home residents. Loss of mobility, medications, underlying diseases, impaired anorectal sensation, and ignoring calls to defecate are as important as dyssynergic defecation or irritable bowel syndrome in causing constipation. A detailed medical history covering medications and co-morbid problems, together with a meticulous digital rectal examination, may help identify causes of constipation. Likewise, blood tests and colonoscopy may identify organic causes such as colon cancer. Physiological tests such as a colonic transit study with radio-opaque markers or a wireless motility capsule, anorectal manometry, and balloon expulsion tests can identify disorders of colonic and anorectal function. However, in the elderly there is usually more than one mechanism, requiring an individualized but multifactorial treatment approach. The management of constipation continues to evolve. Although osmotic laxatives such as polyethylene glycol remain the mainstay, several new agents that target different mechanisms appear promising, such as the chloride-channel activator lubiprostone, the guanylate cyclase agonist linaclotide, the 5HT4 agonist prucalopride, and the peripherally acting μ-opioid receptor antagonists alvimopan and methylnaltrexone for opioid-induced constipation. Biofeedback therapy is efficacious for treating dyssynergic defecation and fecal impaction with soiling. However, data on the efficacy and safety of these drugs in the elderly are limited and urgently needed.

    Unified Acceleration Method for Packing and Covering Problems via Diameter Reduction

    In a series of recent breakthroughs, Allen-Zhu and Orecchia [Allen-Zhu/Orecchia, STOC 2015; Allen-Zhu/Orecchia, SODA 2015] leveraged insights from the linear coupling method [Allen-Zhu/Orecchia, arXiv 2014], a first-order optimization scheme, to provide improved algorithms for packing and covering linear programs. The result in [Allen-Zhu/Orecchia, STOC 2015] is particularly interesting, as the algorithm for packing LPs achieves both width-independence and Nesterov-like acceleration, which was not known to be possible before. Somewhat surprisingly, however, while the dependence of the convergence rate on the error parameter epsilon for packing problems was improved to O(1/epsilon), which corresponds to what accelerated gradient methods are designed to achieve, the dependence for covering problems was only improved to O(1/epsilon^{1.5}), and even that required a different, more complicated algorithm rather than Nesterov-like acceleration. Given the primal-dual connection between packing and covering problems, and since previous algorithms for these closely related problems have achieved the same epsilon dependence, this discrepancy is surprising, and it leaves open the question of the exact role that linear coupling plays in coordinating the complementary gradient and mirror descent steps of the algorithm. In this paper, we clarify these issues, illustrating that the linear coupling method can lead to an improved O(1/epsilon) dependence for both packing and covering problems in a unified manner, i.e., with the same algorithm and almost identical analysis. Our main technical result is a novel dimension lifting method that reduces the coordinate-wise diameters of the feasible region for covering LPs, which is the key structural property enabling the same Nesterov-like acceleration as in the case of packing LPs. The technique is of independent interest and may be useful in applying the accelerated linear coupling method to other combinatorial problems.
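    For context, the standard primal-dual forms of these two problems (standard notation, assumed here rather than quoted from the paper) make the pairing explicit:

```latex
% Packing LP: maximize total weight subject to capacity constraints,
% where A has nonnegative entries.
\max_{x \ge 0} \ \mathbf{1}^{\top} x \quad \text{s.t.} \quad A x \le \mathbf{1}

% Covering LP: its LP dual, minimizing cost subject to coverage constraints.
\min_{y \ge 0} \ \mathbf{1}^{\top} y \quad \text{s.t.} \quad A^{\top} y \ge \mathbf{1}
```

    By LP duality the two optimal values coincide, which is why one would expect algorithms for the pair to exhibit matching epsilon dependences.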